When people try to explain CPU speed, they often make a comparison to cars. As far as I can tell, there are two reasons for this. The first makes some kind of sense: it’s because the central processing unit ‘powers’ a computer – not in the sense of providing energy, but of determining how fast it can go – and comes in a range of speeds, just as cars come with engines in a variety of sizes. The second reason is that most technology writers are male and like cars. There are some who don’t like cars, but even they see them as the kind of thing the mythical everyman likes, and thus suitable for deployment in explanatory metaphors.
Neither reason should be enough to save the comparison from the bin. The everyman these days doesn’t even have a car. That’s why he’s also known as the man on the Clapham omnibus (if you’re primarily an iPhone user as opposed to a Mac user, look it up, it’s on Wikipedia).
The comparison between CPUs and cars doesn’t hold up because it ignores how they’re used: almost all cars are driven on the same roads, which are maintained to a certain standard. This helps ensure all cars offer not just a minimum level of performance, but a level that’s normal, expected and experienced by the vast majority of drivers. Very few, if any, new cars will have a problem reaching any legal speed limits.
When it comes to computers, I’d argue it’s actually not hard to find a machine that will offer a large number of people a deeply substandard experience; one that will take minutes to boot up, randomly rev its fans, kick the hard disk and stall and freeze when you have too many websites open.
Well, it’s not hard to find a PC like that, I mean. Apple’s carefully limited range means the minimum experience Macs offer is far more consistent. Not quite as consistent as iOS, though: it will be interesting to see whether consumers and critics of the Mac App Store can cope with the inevitable disappointment of discovering that Mac specs differ a lot, and that software doesn’t deliver exactly the same experience on every machine.
It’s not just the fact that Apple only sells a few different types of Macs that helps here. It’s that Apple doesn’t mess about when it comes to CPUs – for example, never touching Intel’s two budget CPU ranges, the Atom and the Celeron. Nor does it ever jump onto Intel’s newest and fastest CPUs as soon as they launch. This isn’t just because iOS devices, rather than Macs, are now the priority (though I suspect that plays a part); it’s because just as Apple wants a minimum level of performance, it’s also happy with a maximum level. That is, it won’t allow demands for speed to seriously impact other aspects of the experience – the size of the computer, the noise it makes, the power it draws.
While Apple has never been interested in the very fastest CPUs that Intel makes, it has, I’d argue, been careful to make sure Macs are never slow out of the box. Apple creates premium products, and ‘performance’ is an important component of this – particularly in light of Apple’s history with PowerPC, the series of processor chips it created with IBM and Motorola in the 1990s. These initially impressed, but by the time Apple announced its switch to Intel in 2005, the CPUs in its computers had drifted so far down in relative performance that they were regarded as inadequate. They were no longer fast enough to cope with general computing, and everyone knew it. Not just the geeks: the deficiencies of the PowerPC played into a general public perception that Macs were weird, expensive and different.
The choice of CPU played a part in creating this reputation. Intel worked long and hard through the 1990s to become the only CPU company most people have heard of, paying billions of dollars over the years to etch its da-ding-ding-ding jingle into the brains of the masses. Despite a good few car metaphors, few people have a clue exactly what Intel’s CPUs do. It doesn’t matter, though, because we do know they’re fast, and if a machine doesn’t have an Intel chip in it, the manufacturer often has a much more difficult job selling it – particularly if it’s not a budget machine or one that’s being sold to a hardware enthusiast.
All of this makes it interesting that rumours now suggest Apple is considering using CPUs developed by Intel’s main rival, AMD. The reasons cited range from the political to the practical. The political case rests on the fact that Intel is pouring millions of dollars into developing chips for smartphones. It’s had little success so far, as it has struggled to get its x86 architecture to a point where its performance per watt is comparable to the ARM architecture chips that currently power most smartphones. Apple has put its might behind ARM, buying firms with ARM engineering know-how – so, the reasoning goes, Apple must be nervous about funnelling money to Intel while that company is busy building a hardware platform to compete with the iPhone.
Conversely, if Apple stays loyal to ARM, will Intel still give it preferential access to new desktop and laptop CPU technology? Those less given to paranoia and Kremlinology point to the fact that Intel’s CPUs increasingly feature built-in graphics chips (GPUs), and that these tend to be underpowered and lacking in features. This means that when Apple uses a GPU from AMD or Nvidia, it needs to engineer around the Intel CPU’s built-in GPU. Even then, it’s still there, possibly using power, generating heat and, of course, costing money.
AMD’s long-overdue new CPUs for 2011 will also feature built-in GPUs, and because the company bought specialist graphics manufacturer ATI back in 2006, its integrated GPUs are likely to be far better. However, the downside is that the CPU performance of AMD’s chips currently lags way behind Intel’s – and the release last month of Intel’s latest-generation CPUs, codenamed Sandy Bridge, has widened the gap. AMD’s next-generation architecture, due to launch later this year, faces a huge battle.
From a performance point of view, then, Apple would likely lose as much as it would gain if it switched to AMD. What about using AMD chips only in low-end Macs? It’s possible, but that could confuse consumers. Currently, it’s easy to compare the Mac models, as they all feature Intel CPUs. Then there’s branding: as mentioned earlier, Intel has spent billions on advertising over the years, while AMD spends far less, and its approach to branding is typically less coherent and less appealing to mainstream consumers.
Still, Apple might gamble that consumers trust the Apple brand enough not to worry about who makes the CPU. On the one hand, it’s in a much stronger position than when it was selling PowerPC machines. On the other hand, people are typically very conservative when it comes to computers and Intel has been hammering its brand message into the market for years.
It seems to me there’s just too little to gain: living with Intel as it channels its profits into developing iPhone-beating smartphone chips might be a bit grating for Apple, but the reality is that most of Intel’s cash comes from selling business computing CPUs anyway. And surely Apple, of all companies, appreciates that it takes more than fast silicon to make a successful product.
There’s also a more radical option on the horizon. ARM is pressing on with plans to increase its architecture’s performance, arguing that its roots in mobile mean it can be fast enough while remaining cool-running and power-efficient.
At the huge CES technology show in January, Nvidia announced its Project Denver chip. This will combine one or more ARM CPU cores with numerous GPU cores and is targeted at PCs, not phones. Nvidia has recently become interested in using GPU cores to run non-graphics software, arguing that for highly parallel tasks, the typical design of a GPU makes more sense. Apple has already built general-purpose GPU computing into Snow Leopard in the form of OpenCL, so it will have some idea internally of how feasible this is.
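To give a rough idea of what that general-purpose GPU computing looks like in practice, here is a minimal sketch in C against the OpenCL API that Snow Leopard ships with. It squares 1,024 numbers on the GPU, one per work-item, which is exactly the sort of embarrassingly parallel job a GPU’s many small cores suit; the trivial task, the file name and the thin error handling are purely illustrative, not a reflection of how Apple or Nvidia actually deploy the technique.

```c
/* A minimal sketch of general-purpose GPU computing using OpenCL,
 * the framework Apple introduced in Snow Leopard. It squares 1,024
 * floats on the GPU, one work-item per element. Error handling is
 * deliberately thin to keep the example short.
 * Build on a Mac with:  cc square.c -framework OpenCL             */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

/* The kernel: each work-item squares exactly one element. */
static const char *src =
    "__kernel void square(__global const float *in, __global float *out) {\n"
    "    size_t i = get_global_id(0);\n"
    "    out[i] = in[i] * in[i];\n"
    "}\n";

int main(void)
{
    enum { N = 1024 };
    float in[N], out[N];
    for (int i = 0; i < N; i++) in[i] = (float)i;

    /* Find a device, preferring a GPU but falling back to whatever exists. */
    cl_platform_id platform;
    cl_device_id device;
    clGetPlatformIDs(1, &platform, NULL);
    if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL) != CL_SUCCESS)
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_DEFAULT, 1, &device, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, NULL);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, NULL);

    /* Compile the kernel source for this particular device at run time. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
    cl_kernel kernel = clCreateKernel(prog, "square", NULL);

    /* Copy the input to device memory and reserve space for the output. */
    cl_mem d_in  = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                                  sizeof in, in, NULL);
    cl_mem d_out = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof out, NULL, NULL);
    clSetKernelArg(kernel, 0, sizeof(cl_mem), &d_in);
    clSetKernelArg(kernel, 1, sizeof(cl_mem), &d_out);

    /* Launch N work-items; the runtime spreads them across the GPU's cores. */
    size_t global = N;
    clEnqueueNDRangeKernel(queue, kernel, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(queue, d_out, CL_TRUE, 0, sizeof out, out, 0, NULL, NULL);

    printf("32 squared is %.0f\n", out[32]);   /* prints 1024 */

    clReleaseMemObject(d_in);   clReleaseMemObject(d_out);
    clReleaseKernel(kernel);    clReleaseProgram(prog);
    clReleaseCommandQueue(queue);
    clReleaseContext(ctx);
    return 0;
}
```

The arithmetic is trivial, of course; the interesting part is the shape of the code, where the real work lives in a tiny kernel that the runtime can spread across however many cores the device happens to have.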
Microsoft has already announced it’s developing the next version of Windows to run on ARM chips, among others. In a few years’ time, Apple might be able to make very light and thin machines running a version of OS X, powered by high-performance ARM CPUs from companies such as Nvidia. Or it could draw on the same expertise that produced the Apple A4 – the ARM-based chip in the iPad and iPhone 4 – and make its own CPUs. Many consumers might not know AMD, but they certainly know Apple.