Apple’s array of MacBooks and desktop Macs is starting to look a little dated on the inside, so it’s high time the company upgraded its range.
In a recent call with investors, Apple COO Tim Cook observed that you could arrange all the company’s products on one table, and yet its revenue in 2009 was $40 billion (about £26.65 billion). ‘I think any other company that could say that is an oil company,’ he then quipped.
Still, compact as Apple’s product line-up is, it’s clearly been spending a lot of time recently on the iPhone and iPad. Its laptops and iMac desktops still look great, but their insides are rapidly dating: all the MacBooks use Intel’s older range of Core 2 CPUs, and none boast quad-core chips – the same is true for the majority of iMacs. The brand new 27in model is the exception to the rule, as it features a quad-core chip from Intel’s new CPU range, which explains why, despite being clocked at 2.66GHz, it costs over £250 more than an otherwise identical 27in machine with the older 3.06GHz dual-core chip inside.
If you’re looking at the 27in machine, that £250 is well worth paying: the new chip will feel quicker and more responsive, and when you throw actual hard work at it – such as chomping through some HD video in Final Cut Pro – your tea won’t get cold while you wait for the file to render. In fact, considering you also get the Radeon HD 4850, the only graphics card in the iMac range a PC gamer would get excited about, that top-end machine really is the one to get.
So, what does your extra money get you? Why is the chip inside so much better when it’s actually clocked more than 10% slower?
This is the point where we need to get into Intel’s byzantine naming scheme; remember learning your times tables at school? How it was just a lot of numbers and didn’t seem to make much sense? That’s what we have here. Back in 2005, Intel started making CPUs with two cores in one package – so they were the same size as older chips, but packed in two cores: essentially, two CPUs. This meant they could work on two things at the same time, which is very handy when you consider that modern computers are almost always working on many different things at once – just load up Activity Monitor and look at all the system processes you have running.
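To make that concrete – and this is just a generic Python sketch, nothing Apple- or Intel-specific – here’s how a program can hand a batch of independent jobs to a pool of workers, one per core, so a dual-core chip really can do two at once:

```python
# Illustrative only: split a batch of independent jobs across CPU cores.
# On a dual-core chip, two of these jobs really can run at the same time.
from multiprocessing import Pool

def crunch(n):
    """A stand-in for real work, e.g. encoding one chunk of video."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [100_000] * 4
    with Pool() as pool:             # one worker process per core by default
        results = pool.map(crunch, jobs)
    print(f"{len(results)} jobs done")
```

The operating system does exactly this kind of juggling with all those background processes, spreading them across whatever cores it finds.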
The problem is that while multi-core CPUs are a great idea, they’re also quite a complex one to sell, particularly to the general public. This is where Intel got it quite spectacularly wrong, as it proceeded to make its branding almost completely incomprehensible. Its first few multi-core CPUs were called ‘Pentium D’ – the ‘D’ standing for dual-core, so not a bad first effort. These were relatively rare, though, and were mostly aimed at performance enthusiasts who built their own machines.
In 2006, Intel went mainstream with multi-core and called its new range of chips Core Duo. Not bad, although there’s a hint of Starbucks-style Italian flair in there. Next came the Core 2 Duo, which is sort of clear – I think we can all agree it’s better than the Core Duo, although whether the ‘2’ refers to cores or generations is anyone’s guess. Then you get to the Core 2 Quads, which is a very confusing name for a CPU with four cores that’s based on the Core 2 Duo design.
So, to make things clear, Intel decided to start again with its next range of CPUs. Only the company still wanted to use ‘Core’ as a brand (despite it being a technical term for a feature), so we have the Core i3, i5 and i7. Some are dual-core, some are quad-core, and by the time you read this, some will even have six cores. All clear?
Oh, and I forgot to say that some of these chips have Hyper-Threading, which means each physical core presents itself to the operating system as two logical cores, letting it juggle two streams of instructions at once – so the core count appears to double.
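Seen from software, the effect is simply that the operating system reports more processors than the chip has physical cores. A quick (generic, not Mac-specific) Python check:

```python
import os

# os.cpu_count() reports *logical* processors: on a hyper-threaded chip
# this is double the number of physical cores; otherwise the two match.
logical = os.cpu_count()
print(f"The OS sees {logical} logical processor(s)")
```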
Quick, what’s 8 x 9? Clear as mud, really. What is worth knowing about the Core i-series chips is that they’re significantly faster than the Core 2s they’re replacing. The major design innovation Intel made with them was to shift the memory controller from the motherboard to the chip itself. The keen-eyed geeks among you will say that this isn’t much of an innovation, since AMD has been doing it since 2003.
This is certainly true, but while AMD’s design has been advanced in this one respect, it’s lagged behind Intel’s chips in terms of power consumption, thermals and out-and-out speed. Intel has managed to combine raw pace with technical finesse in its Core i-series.
In addition to an on-chip memory controller, the Core i-series processors, particularly the high-end models, feature a lot of on-chip cache memory – 8MB to 12MB, depending on the model, compared to typically 3MB to 6MB for Core 2 chips. Pairing more cache with lower memory latency and higher bandwidth is all about getting the chip to its data faster and letting it shuffle that data around more quickly.
In addition to the much-improved memory design, Intel’s i5 and i7 CPUs also feature ‘Turbo Boost’. This allows the chip a certain amount of leeway to decide what frequency to run at – essentially, to push itself to run faster than it says on the box. CPUs are all designed with a certain thermal output in mind – the thermal design power, or TDP – which you can think of as a budget. It’s the maximum amount of heat the chip will put out, but it’s only going to hit that limit when all of its cores are working. If you’re only using two, there’s clearly some spare capacity – which is where Turbo Boost kicks in, allowing the chip to exceed its stated clock speed. It’s perfectly safe, validated by Intel as part of the design, and happens automatically.
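A toy model makes the budgeting idea clearer. Every number below is invented for illustration – real Turbo Boost speed steps are fixed per model by Intel – but the logic of spending spare thermal headroom on extra clock speed is the same:

```python
# Toy model of Turbo Boost (all figures invented for illustration only).
# The chip has a fixed thermal budget; when fewer cores are busy, the
# spare budget lets the active cores run above their rated clock speed.
TDP_WATTS = 95.0        # hypothetical thermal budget for the whole chip
WATTS_PER_GHZ = 8.9     # hypothetical heat cost of 1GHz on one core
MAX_TURBO_GHZ = 3.2     # ceiling set by design validation

def clock_for(active_cores):
    """Highest clock each busy core can run at within the budget."""
    per_core_watts = TDP_WATTS / active_cores
    return min(MAX_TURBO_GHZ, per_core_watts / WATTS_PER_GHZ)

for cores in (4, 2, 1):
    print(f"{cores} core(s) busy -> {clock_for(cores):.2f}GHz each")
```

With all four cores loaded the chip sits near its rated speed; drop to one or two busy cores and they surge up to the turbo ceiling – automatically, and within the same heat budget.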
It should be no surprise, then, that Core i-series chips are faster in benchmarks – around 15% to 25% faster, clock for clock, at tasks such as video encoding and image editing. Obviously, these figures vary depending on applications, codecs and settings, but we’re talking about a difference that’s measurable in benchmarks and easily noticeable in everyday use.
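It also answers the question from earlier: why the 2.66GHz chip beats the 3.06GHz one. A quick back-of-envelope sum, assuming a 20% clock-for-clock gain from the middle of that range:

```python
# Back-of-envelope: why a slower-clocked Core i5 can still win.
# Throughput ~ clock speed x work done per clock tick (IPC).
# Assume a 20% IPC gain, in the middle of the 15-25% range above.
core2 = 3.06 * 1.00      # 3.06GHz Core 2, baseline work per tick
core_i5 = 2.66 * 1.20    # 2.66GHz Core i5, 20% more work per tick
print(f"Relative speed: {core_i5 / core2:.2f}x")
```

Even before Turbo Boost lends a hand, the i5 edges ahead despite giving away 0.4GHz on the box.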
Another design feature Apple might be able to do something interesting with is that some of Intel’s Core i5s (and i3s) integrate not just multiple cores but a graphics processor (GPU) into the same package. The benefit is that you don’t need a separate GPU chip, so you save on power and space – much thinner laptops are an obvious result.
Hopefully that goes some way to explaining why, if I was after an iMac, I’d definitely pay the extra for the i5 model. It also explains why I’m not going to be looking at upgrading my MacBook Pro until Apple gets a move on and updates the line to include mobile Core i5s and Core i7s. These chips are already appearing in Windows laptops, so really, we’re just waiting on Apple to get its design right.