7 July 2010

The internet isn’t cool, it’s hot

We take it for granted that the internet is free. Yes, we pay to connect, but once we’re on it there’s very little that isn’t free (whether it’s all legal is another matter). But the internet has hidden costs — enormous ones — that are totally invisible to most users.

I mentioned last month that the net uses around 5% of the planet’s electricity. That’s probably an understatement. Futurist Kevin Kelly worked that rough figure out in 2007 (somewhat controversially). Since then Facebook alone has added about 50 000 servers, and we suspect Google now has well over one million servers — all packed into giant data centres whose locations are kept top secret.

According to some calculations, every time you search for something online you use enough energy to power an 11-watt light bulb for an hour. That’s quite a lot of energy for something that takes less than a second, and that most of us do at least a dozen times a day. The really surprising part? More than half that electricity goes into cooling.
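If you’re curious how quickly those little light bulbs add up over a year, the sums are simple enough to sketch in a few lines of Python. The only numbers in it are the ones quoted above (an 11-watt bulb for an hour, a dozen searches a day, half the energy spent on cooling); none of it is measured data.

```python
# A rough sanity check on the figures quoted above (these are the column's
# numbers, not measurements of any real search engine).

bulb_watts = 11           # the 11-watt bulb in the claim
hours_lit = 1             # lit for one hour per search
searches_per_day = 12     # "at least a dozen times a day"
cooling_share = 0.5       # "more than half that electricity goes into cooling"

wh_per_search = bulb_watts * hours_lit                          # 11 Wh
kwh_per_user_per_year = wh_per_search * searches_per_day * 365 / 1000
cooling_kwh = kwh_per_user_per_year * cooling_share

print(f"{wh_per_search} Wh per search")
print(f"~{kwh_per_user_per_year:.0f} kWh a year for one user's searching")
print(f"of which ~{cooling_kwh:.0f} kWh goes on keeping the servers cool")
```

That works out to roughly 48 kWh a year for one habitual searcher, about half of it spent on air conditioning.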

Although they have few moving parts, computers give off enormous amounts of heat as the bits and bytes race around their innards. If you pack thousands of them together, this obviously exacerbates the problem.

And, like most machines, computers don’t like being hot. Above 85°C bad things start to happen involving violent words like “crash” and “fry”. These are words that make the data centre elves very unhappy, and so they make darn sure that every inch of their server farm is properly cooled.

So what? Surely, on aggregate, the internet saves far more energy than it uses? All those car trips to the library and the post office, for instance. It’s a fair point, but what worries me is the growth: no other industry is expanding the way the internet is. Google alone fully expects to run 10 million servers in the not-too-distant future. While 5% of global electricity sounds acceptable, 25% or 30% is considerably less palatable.

But, as is often the case with a flawed technology, another technology comes to the rescue. The eggheads at IBM have teamed up with the Swiss Federal Institute of Technology Zurich (ETH Zurich) to come up with a deliciously simple solution: cooling computers with water instead of air.

Named Aquasar, the system cools a supercomputer right down at the level of individual CPUs, using the age-old principle of heat exchange. Because a given volume of water can carry away roughly 4 000 times more heat than the same volume of air, they have managed to make the whole system about 40% more energy-efficient at a stroke.
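If that “4 000 times” figure sounds far-fetched, you can sanity-check it with nothing more than textbook values for the density and specific heat of water and air. The little sketch below does exactly that comparison; it has nothing to do with IBM’s actual plumbing.

```python
# How much heat can a cubic metre of coolant soak up per degree of warming?
# Textbook constants only; nothing here is specific to Aquasar.

water_density = 1000.0         # kg per cubic metre
water_specific_heat = 4186.0   # joules per kg per degree C

air_density = 1.2              # kg per cubic metre, at room temperature
air_specific_heat = 1005.0     # joules per kg per degree C

water_j_per_m3 = water_density * water_specific_heat   # about 4.2 MJ per degree
air_j_per_m3 = air_density * air_specific_heat         # about 1.2 kJ per degree

ratio = water_j_per_m3 / air_j_per_m3
print(f"Water carries roughly {ratio:,.0f} times more heat per unit volume than air")
```

The answer comes out around 3 500, the same ballpark as the figure IBM quotes.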

And we’re not simply swapping electricity wastage for water wastage: Aquasar is a closed loop in which the warmed water is piped off to heat nearby buildings, then returns, cooled, to pick up more heat from the chips. At the moment Aquasar has the quaint steampunk look of a prototype, but when data centre owners around the world hear “40% savings” you can bet it will go from quaint to mainstream fairly quickly.

What’s more, Aquasar is only one of many innovations helping us get more out of computers for less. Technologies like virtualisation and cloud computing are turning computing into a true utility, like power or water: something efficiently managed at large scale and available on demand. Gone are the days of computers sitting idle, nibbling away at our energy reserves.
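To give a feel for the kind of waste virtualisation attacks, here is a purely illustrative sketch. Every number in it (server count, power draw, utilisation) is an assumption invented for the example, not data from any real data centre.

```python
# Illustrative only: why packing work onto fewer, busier machines saves power.
# An idle server still draws a large fraction of its peak power, so a room
# full of mostly idle boxes is expensive to run. All numbers are made up.

def room_power_kw(servers, utilisation, peak_watts=300, idle_fraction=0.6):
    """Total draw, in kW, of a room of servers at a given average utilisation."""
    per_server = (idle_fraction + (1 - idle_fraction) * utilisation) * peak_watts
    return servers * per_server / 1000

before = room_power_kw(servers=100, utilisation=0.10)   # 100 boxes, 10% busy
after = room_power_kw(servers=15, utilisation=0.67)     # same work on 15 boxes

print(f"Before consolidation: {before:.1f} kW")
print(f"After consolidation:  {after:.1f} kW")
```

Same workload, roughly a fifth of the electricity: that is the basic promise behind treating computing as a utility.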

So, thankfully, all our googling about “global warming” no longer has to feed the very problem we’re worried about. Now, if we can just find an equally efficient way to dispose of the world’s spam merchants, we’ll be good to go.