If you want to start a rumour, how about that Google is going to build its own nuclear power station? The logic is easy. Larry Page, the company’s co-founder, reportedly sees “running out of power” as the biggest potential threat to Google, and the electricity needed to run its “server farms” — tens of thousands of power-hungry computers storing billions of internet pages — could soon cost more than the hardware. Partly this is because Google is based in California, where the state solved its 2001 energy crisis by borrowing $10 billion to buy electricity at massively inflated prices. But the rest of us are heading in the same direction.
We live in a world where the use of chip-based computers and consumer electronics devices is increasing rapidly, while supplies of oil and natural gas are diminishing perhaps even more rapidly. Worse, the threat of global warming means we should now be decreasing our energy use, like the Japanese, not increasing it. And although each individual PC or peripheral may not use much electricity, when you have a billion of them, it adds up.
For home users in the United Kingdom, of course, most conservation efforts should be applied to heating and hot water systems, which account for about 85% of the energy used in homes, compared with about 13% for lighting and electrical appliances. The top priority is loft insulation, which is why the government offers grants for it. Double glazing, blocking draughts, drawing curtains, lagging hot water tanks and turning down thermostats all make a difference. So does swapping 100W light bulbs for high-efficiency 20W versions, and turning off electrical equipment such as TVs and PCs at the plug. (Standby mode uses surprisingly large amounts of energy.)
Powering down
Sadly, it’s impossible to say how much power a PC uses without measuring it, because of variables such as the type of motherboard, the speed of the chip and the power of the graphics card. (A fast graphics card can use more power than the processor.) PC power supplies are rated from about 150W to about 650W, and a system draws the most power during peak loads. However, PCs use much less power when idling, and the US Energy Star programme — which PC manufacturers have been following since 1992 — is aiming to get idle power consumption below 50-60W.
The simplest approach is to use the PC’s power-saving software to turn off the screen and hard drive, then suspend the whole system after a specified time. The most expensive option is to run a screen saver that “donates” processor cycles to a worthy cause, since it keeps your PC working at full tilt most of the time. PC Pro magazine found that it cost £79 a year to run an Athlon Shuttle PC with normal use (eight hours a day, then switched off), but running SETI@home around the clock raised that to £400.
The situation is improving thanks to market trends towards flat screens and the use of portables rather than desktop computers. LCDs use much less power than traditional CRT monitors, and by design, most notebooks use less power than most desktops. At the extremes, the 1GHz Pentium M Ultra Low Voltage chip uses only 5W, whereas Intel’s hottest chip for gaming, the 3.73GHz Pentium 4 Extreme Edition, can consume up to 115W.
However, Intel has done a U-turn on its processor design goals, which should help. The Pentium design drove up clock speeds (and power consumption) to build the fastest chips. In 2002, Intel executives still assured me that “gigahertz is far from over” and looked forward to a 4.7GHz Pentium codenamed Tejas. In 2005, however, still short of 4GHz, they announced a new mantra: “performance per Watt”.
Alistair Kemp, a spokesperson for Intel, says the company has now developed “a new microarchitecture that will be coming out in the second half of this year”. New chips codenamed Merom (for notebooks), Conroe (for desktops) and Woodcrest (for workstations and servers) will, he says, “reduce average power use quite substantially”. With Conroe, the reduction will be from about 95W, for a fast Pentium 4, to about 65W.
Performance per Watt is also important for the arrays of servers in corporate data centres. Luiz Andre Barroso, principal engineer at Google, has already warned that “the possibility of computer equipment power consumption spiralling out of control could have serious consequences for the overall affordability of computing, not to mention the overall health of the planet.”
In an article called “The Price of Performance” in the professional journal ACM Queue (http://tinyurl.com/m4udr), Barroso expressed concern that the cost of the electricity needed to run computers could overtake the cost of buying the hardware in the first place. Running costs are exacerbated because companies generally try to utilise their servers as heavily as possible, 24 hours a day, seven days a week — 8,760 hours a year. Faster processors also generate more heat, so computer rooms require extra cooling, which uses more electricity and costs more money.
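Barroso’s arithmetic can be sketched the same way. The figures below are all assumptions for illustration, not numbers from his article: a £2,000 server drawing 300W, a cooling overhead that doubles the electrical load, and a 10p-per-kWh tariff.

```python
# Sketch of Barroso's concern with assumed figures: over a multi-year
# service life, electricity plus cooling can rival the purchase price.

HOURS_PER_YEAR = 24 * 365  # 8,760 hours for a server run 24/7

def lifetime_electricity(draw_watts, years, tariff=0.10, cooling_overhead=2.0):
    """Electricity cost in pounds over the server's service life.
    cooling_overhead models the extra power spent removing the heat
    the server generates (an assumed factor, here doubling the load)."""
    kwh = draw_watts / 1000 * cooling_overhead * HOURS_PER_YEAR * years
    return kwh * tariff

hardware_cost = 2000  # assumed purchase price in pounds
running_cost = lifetime_electricity(300, years=4)
print(round(running_cost, 2))
print(running_cost > hardware_cost)  # electricity overtakes the hardware
```

Under these assumptions, four years of electricity and cooling costs more than the server did — exactly the crossover Barroso warns about.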
Cool solution
Of course, there’s nothing new about any of this. The late Seymour Cray, the world’s greatest supercomputer designer, spent a lot of his time on plumbing. In 1985, he resorted to pumping a non-conducting liquid called Fluorinert over the Cray 2’s electronics to cool them. When mainframes ruled the world, IBM packed its chips in ceramic Thermal Conduction Modules, with chilled water flowing through pipes to conduct away the heat. Some of today’s high-performance games PCs use similar techniques, and it’s still an option for servers — but no one really wants to go back to plumbing.
In his article, Barroso suggests that multiprocessor chips are “the best (and perhaps only) chance to avoid the dire future envisioned above”. What has changed recently is that multicore processors — with more than one processing element on a single die — have finally entered the mainstream, and many PC manufacturers are now shipping systems with Intel Core Duo processors that deliver more performance per watt than their forebears.
Indeed, the idea has even reached the home market. Microsoft’s Xbox 360 games console has a processor with three IBM PowerPC cores, each of which can run two programming threads: the result is a single chip that can work like six. Sony’s forthcoming PlayStation 3 will use an IBM Cell chip with multiple processing elements.
Not even low-power multicore chips will solve the power consumption problem permanently, but they should at least buy us a few years’ breathing space.
And if people factor the cost of power consumption (and cooling, where required) into their computer purchasing decisions, both for commercial and ecological reasons, this will put pressure on the manufacturers to do even better in the future. – Guardian Unlimited