14 September 2011

How much is the internet’s electricity bill?

The internet, with few exceptions, is free. That makes it easy to overlook the biggest hidden cost of the net: electricity.

To most of us the internet is pretty close to magic. Type in a search, click a link, and the info just arrives on our screens. There isn’t any visible evidence that actual work is needed to make this happen; no grinding gears or roaring burners, and there certainly isn’t an exhaust pipe. It’s also, with few exceptions, free. That makes it easy to overlook the biggest hidden cost of the net: electricity.

We got a rare insight into that cost this week when the normally secretive Google revealed exactly how much electricity it uses to operate its business. In 2010 Google drew a continuous 260 megawatts of power to keep its home fires burning.

That means that in 2010 Google consumed over 2.2-million megawatt hours of electricity (2 277 gigawatt hours, to be more exact). If the company were based in South Africa, its yearly power bill would be about R940-million.
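That arithmetic is easy to check for yourself. A quick back-of-envelope sketch using the figures above (the implied tariff of roughly R0.41 per kilowatt hour is my own back-calculation from the R940-million figure, not a number from Google or Eskom):

```python
# Back-of-envelope check of the figures in the article.
# Assumed value: the rand tariff is inferred from the article's
# R940-million estimate, not quoted by any utility.

continuous_draw_mw = 260          # Google's reported average draw, 2010
hours_per_year = 24 * 365         # 8 760 hours in a year

annual_mwh = continuous_draw_mw * hours_per_year   # 2 277 600 MWh
annual_gwh = annual_mwh / 1000                     # ~2 278 GWh

# Implied South African tariff (back-calculated assumption)
annual_kwh = annual_mwh * 1000
implied_tariff_r_per_kwh = 940e6 / annual_kwh

print(f"{annual_gwh:.0f} GWh per year")             # ~2278 GWh
print(f"~R{implied_tariff_r_per_kwh:.2f} per kWh")  # ~R0.41
```

A continuous draw of 260 megawatts, multiplied out over the 8 760 hours in a year, lands almost exactly on the 2 277 gigawatt hour figure quoted above.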

That sounds like an awful lot, but in global terms it’s fairly small. Our smallest coal-fired power station, Komati, could power three Googles on its own with room to spare. By comparison the US’s aluminium smelting industry uses over one million gigawatt hours per year — over 400 times as much as Google.

But Google is only one player in a rapidly expanding industry. Facebook, Yahoo! and Microsoft attract comparable audiences, and presumably use comparable amounts of energy. The internet as a whole has literally billions of active users. No one is entirely certain how much energy all this consumes, but estimates range between 5% and 6% of global electricity supply. That means the internet’s annual electricity bill is somewhere between $120-billion and $200-billion. Ouch.

So what chews all this energy? The biggest culprits are enormous data centres: giant warehouses full of powerful servers that do the heavy lifting required to keep the internet running. All these computers packed together generate an enormous amount of heat, and so they must be continuously air conditioned. Google alone is rumoured to have literally millions of servers scattered around the globe, hence its large energy footprint.

Google argues that its services actually produce a net saving in energy, because they allow people to quickly and cheaply find the information they need without resorting to car trips. The same logic applies to the internet in general — telecommuting (working from home) and video conferencing save many millions of litres of fuel each year.

But, given how quickly the industry is expanding, many people are concerned about the net’s future power needs. The exploding popularity of cloud computing, which shifts computing power away from desktops and offices and into ever larger data centres, will be a huge factor in this growth. Greenpeace frets that the industry will consume triple the amount of power by 2020.

While this concern is understandable, it’s largely overblown. We’ve all heard about Moore’s Law — the tendency of computers to double in power and halve in price every 18 months — but a handy corollary has recently been discovered.

Engineers at Stanford University have proven that the energy efficiency of computers also doubles every 18 months. This has applied ever since the first computers were built in the 1950s, and continues to apply today. It’s most starkly illustrated by the recent leaps in cellphone technologies. A modern smartphone is now more powerful than a desktop computer was a decade ago, but it uses a tiny fraction of the power.

What’s more, we are only scratching the surface. Richard Feynman, a Nobel-prize-winning physicist, hypothesised in 1985 that computers could be made 100-billion times more efficient before they hit any theoretical limits. Computers are already 40 000 times more efficient than in 1985. And as Professor Jonathan Koomey, who led the Stanford study, says: “There’s so far to go. It’s only limited by our cleverness, not the physics.”
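A little arithmetic on those numbers shows just how far there is to go. This sketch simply works forward from the figures in the paragraph above — the 40 000-fold gain since 1985, Feynman’s 100-billion-fold ceiling, and the 18-month doubling period — so the outputs are illustrative, not predictions:

```python
import math

# Illustrative arithmetic on the efficiency figures quoted above.
# All inputs come from the article; the projections are rough.

gain_since_1985 = 40_000    # efficiency improvement, 1985 -> ~2011
ceiling = 100e9             # Feynman's theoretical limit vs 1985

# How many doublings does a 40 000-fold gain represent?
doublings_so_far = math.log2(gain_since_1985)            # ~15.3
implied_years = (2011 - 1985) / doublings_so_far         # ~1.7 yrs each

# How much headroom remains before Feynman's ceiling?
doublings_left = math.log2(ceiling / gain_since_1985)    # ~21.3
years_left = doublings_left * 1.5  # at one doubling per 18 months

print(f"{doublings_so_far:.1f} doublings since 1985")    # 15.3
print(f"~{years_left:.0f} more years at the historical rate")  # ~32
```

In other words, even at the historical pace there are decades of efficiency gains still on the table before physics gets in the way.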

As if on cue, Intel announced yesterday that its next generation of chips will use a twentieth of the energy of current models. This will allow you to run a laptop for 24 hours without recharging, or even to run it off a solar cell.

Whether the marketing fanfare proves completely accurate or not is irrelevant. Even if the technology delivers 50% of its promise, data centres will be many times more efficient to run than they currently are. And so the internet could feasibly triple in size by 2020 without tripling in energy hunger.
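To see why tripling in size needn’t mean tripling in energy, run the doubling trend forward. This is a hypothetical projection of my own, assuming Koomey’s 18-month doubling holds perfectly through to 2020, which it almost certainly won’t do exactly:

```python
# Hypothetical projection (my assumption, not a figure from the
# article): if efficiency doubles every 18 months, what would a
# tripled internet draw in 2020, relative to 2011?

years = 2020 - 2011               # 9 years
doublings = years / 1.5           # 6 doublings
efficiency_gain = 2 ** doublings  # 64x more efficient

growth = 3                        # internet triples in size
relative_energy = growth / efficiency_gain

# A 3x bigger internet on 64x more efficient hardware
print(f"~{relative_energy:.1%} of 2011's energy")  # ~4.7%
```

On that (admittedly idealised) trend, the efficiency gains swamp the growth many times over. Even if reality delivers only a fraction of it, there is ample slack.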

What’s more, data centre owners have a vested interest in efficiency. Unlike many other large industries, their costs are tied directly to the amount of electricity they consume. And so what is good for the planet is also good for their bottom lines.

I have often fretted about the impact of the internet on pollution and global warming. But, as is so often the case, technology has made a fool out of me. In this case I’m happy to eat my words as long as I get one of those 24 hour laptops in the bargain.

Follow Alistair on Twitter: @afairweather