The end of the world is IT

by Howard Oakley on May 13, 2010


There’s no escaping technology’s impact on our planet, so it’s about time the computer industry stopped its excessive resource consumption.

As visual artists started to record their travels through scenery made dramatic by nature or man, so they rendered some into visions of the Apocalypse. Gone were the disturbingly grisly scenes of torment detailed by Hieronymus Bosch, replaced by near-Impressionist tectonic waves and clouds of colour in John Martin’s The Great Day of his Wrath and, of course, JMW Turner’s awe-inspiring depictions of the St Gotthard pass and trans-Alpine routes.

Perhaps global warming needs a visionary artist to bring it home to us all that, regardless of the possibly disgraceful conduct of some scientists involved, we do face serious, maybe truly apocalyptic problems. Yet the parochial behaviour of the world’s political leaders at last year’s Copenhagen conference showed that they still don’t accept the imperatives.

For the overwhelming majority of businesses, climate change has to be someone else’s problem, and the computing industry is no exception. All of the Next Big Things on the computing horizon seem set to increase our generation of greenhouse gases, because if they don’t, someone loses out on new product sales. When personal computing started to take over from mainframes in the 1980s, millions of offices drew more power to run their new computers and more power to remove the heat they produced.

In theory, cloud computing could produce great economies. Most firms could strip out their server suites, replacing them with capacious pipes connecting to remote data centres. Desktop systems with multiple hard disks and a handful of processor cores could go, in favour of ‘thin’ terminals. That should have happened when network operating systems became widespread in the 1990s, but hardware manufacturers fought back to avoid the fall in revenue. The past decade saw the rise of mobile devices, but how few of us have replaced our desktop systems with a laptop, let alone replaced that laptop with an iPhone? Even in recession, we’ve clung on to all the hardware that we can squeeze onto our desk and into our pockets.

No matter where you look in the computing industry, the drive is to deliver something new and compelling that will charm or cajole more money out of the customer. Apple has excelled at doing that, from 68K Macs to laser printers, Power Macs, Intel Macs, iPods and iPhones. Like every other successful Silicon Valley corporation, it has kept convincing us we need to keep buying new and better products. In contrast, the car industry has fallen into a slump of its own making, failing to innovate early enough to inspire consumption.

However, the name given to the area around Apple’s headquarters, Silicon Valley, tells of the precious natural resources that innovation is consuming, of the high cost of energy needed to transform silica and rare metals into superfast processors, and so into tomorrow’s hot products. Although we’re always told that it’s the software that makes the machine, each new major product release demands wasteful hardware upgrade or replacement instead of making better use of existing resources.

Just as motor manufacturers are belatedly grappling with the need for energy efficiency and greener power systems, the time has come for the computer industry to curb its reckless waste. This isn’t just a matter of slapping Energy Star stickers on monitors and cutting back on packaging; it demands a fundamental change in the industry’s development model.

While it may not be current generations that witness the consequences, I hope that our grandchildren won’t curse the legacy of our technological excesses.
