Problems and pay-offs of new-generation processors
A fundamental change in the design of microprocessors is presenting software developers with a challenge—and a huge financial opportunity.
Chip makers are no longer racing to have the fastest microprocessor and have shifted their focus away from building chips with a single, super-fast calculating core. Instead, to save energy and reduce heat, they’re putting multiple cores on the same chip—the equivalent of several computers on the same slice of silicon.
The cores run slower but are more energy-efficient, and are designed to break up big chores and work on the separate pieces simultaneously.
The resulting technology is ideal for the most demanding multimedia tasks, such as processing large video files, pulling information from multiple databases at the same time, or playing a computer game while downloading music and burning a DVD.
The problem is that many software applications were not written for chips with multiple cores, and the hardware is advancing so fast that the software runs the risk of being left behind.
“You can imagine a scenario where people stop buying laptops and PCs because we can’t figure this out,” said David Patterson, a computer-architecture expert and computer science professor at the University of California, Berkeley.
As processors sped up, software developers tagged along by making their programs faster and faster. But now that chip makers are no longer focused solely on speed, programmers must change their tactics and learn to send instructions to different parts of the chip instead of through a single processing core.
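The change in tactics described above can be sketched in a few lines of code. The article names no language or library, so the example below is an illustrative assumption using Python's standard multiprocessing module: the same batch of work is done once serially on a single core and once farmed out across several cores.

```python
# A minimal sketch of the shift described above: instead of one core
# crunching every chunk in turn, the work is split across several cores.
# The language and library choice here is illustrative; the article
# names no specific tools.
from multiprocessing import Pool

def crunch(n):
    """Stand-in for a CPU-heavy chunk of work (e.g. encoding one video segment)."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    chunks = [100_000, 200_000, 300_000, 400_000]

    # Old tactic: a single core handles each chunk, one after another.
    serial = [crunch(n) for n in chunks]

    # New tactic: a pool of worker processes handles chunks simultaneously,
    # one per available core.
    with Pool(processes=4) as pool:
        parallel = pool.map(crunch, chunks)

    assert serial == parallel  # same answers, computed concurrently
```

The point of the sketch is that the answers are identical either way; what changes is that the programmer must now explicitly divide the work and route the pieces to different cores, rather than relying on one ever-faster core.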
Intel and Advanced Micro Devices are making their latest microprocessors with two and four cores, with plans for more in the future.
Intel has even demonstrated an 80-core research chip so complex that no existing operating system is smart enough to work with it.
Supercomputers and corporate data centres have used machines with multiple processors for years—with specially written software that enables, say, the processing of multiple web searches at the same time.
That inspired chip makers to build multicore microprocessors for mass-market PCs; the chips began hitting the market in recent years.
The philosophy is similar. But software on the PC side has not traditionally been designed with multiple processors in mind.
The gap between hardware and software hasn’t become a problem for consumers yet, because operating systems such as Windows XP and Vista already work with the multicore chips out now, and basic applications such as word processing and email won’t need the extra cores or a software overhaul.
But experts predict dire consequences if the software for more complicated applications isn’t brought up to speed soon. They warn that programs could suddenly stop getting faster as chips with eight or more cores make their way into PCs. The software as it’s currently designed can’t take advantage of that level of complexity.
“We’d be in uncharted territory,” Patterson said. “We need to get some Manhattan Projects going here—somebody could solve this problem, and whoever solves this problem could have this gigantic advantage on everybody else.”
To be sure, industry and academia are working on ways to prevent PCs from being saddled with vast pools of untapped processing power. But it’s not easy.
Mark Lewin, a programme manager in external research and programmes at Microsoft Research, said the solution will require more than inventing new programming languages: developers need to invent entirely new ways to build software.
“It will take a lot of heavy lifting, a lot of rethinking, but the opportunity is huge,” said Lewin, whose group last week announced a $500 000 grant programme for universities with innovative proposals for studying software development for multicore computing.
In a speech in May, Craig Mundie, Microsoft’s chief research and strategy officer, declared that “the free lunch to some extent is over” for software companies that have counted on chips going faster and faster. He said Microsoft researchers have focused over the past five years on so-called parallel computing, in which tasks are performed at the same time on multiple processors.
“Clearly something is going to give and the question is what,” he said. “And I think the tools are going to evolve, people are going to get more creative, as they always have, in trying to figure out what to do with this capability.”
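Mundie's point that "the free lunch is over" can be made concrete with a back-of-the-envelope calculation. Once clock speeds stop rising, speedup must come from extra cores, and Amdahl's law caps what those cores can deliver: a program's serial portion limits the total gain no matter how many cores are added. The 90% figure below is an illustrative assumption, not a number from the article.

```python
# Amdahl's law: the maximum speedup for a program in which a given
# fraction of the work can run in parallel across a given number of cores.
def amdahl_speedup(parallel_fraction, cores):
    """Ideal speedup when `parallel_fraction` of the work parallelises."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Even a program that is 90% parallelisable (an assumed figure) gains
# less and less per added core, and can never exceed a 10x speedup:
for cores in (2, 4, 8, 80):
    print(f"{cores:3d} cores -> {amdahl_speedup(0.9, cores):.2f}x speedup")
```

This is why the article's experts worry that programs "could suddenly stop getting faster": software with a large serial portion sees almost no benefit from an eight-core or eighty-core chip until it is restructured.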
Race is on
Chip makers say the race to add cores isn’t about bravado.
Jerry Bautista, director of technology management for Intel’s Microprocessor Technology Lab, said the usefulness of multicore computers is apparent on Wall Street, where investors need complicated calculations quickly.
“People want to make decisions in real time—they don’t want to run some complex simulation overnight; they want to see the results then and there,” he said.
The same mentality applies in consumer electronics, where seamless video streaming and smooth computer game play with ever-richer graphics are in high demand, he said.
The march of multicore progress does raise the question: How many cores are practical for the average PC user?
Phil Hester, AMD’s chief technology officer, said it doesn’t make business sense to modify applications such as word processing and email that already work faster than users can input instructions.
“The fact is that a lot of the applications today are human-response-time limited,” he said. “If you took the word processor and made it four-core-capable, from a human standpoint, you wouldn’t notice a difference.”
However, Hester said some of the most promising and popular applications clearly can benefit from multiple cores, such as high-performance technical computing, facial- and pattern-recognition software, and search programs for large databases.
Companies that successfully migrate appropriate mass-market applications to the parallel computing environment—or create new ones that exploit the shift in chip technology—stand to profit mightily.
“The software industry would have been very happy if the processor industry could have been able to double performance every two years without having to go to this parallel world,” said Marc Tremblay, chief technology officer for Sun Microsystems’s microelectronics business, where he oversees the server and software maker’s processor road map.
“Unfortunately people ran into roadblocks, and the winners will be the people who can actually leverage this disruption.”—Sapa-AP