“I’m amazed. The original prediction was to look 10 years out, which I thought was a stretch,” Intel’s cofounder Gordon Moore said at an event on Monday to mark 50 years since he postulated that the number of transistors on a computer chip doubles every two years.
For years, critics – Moore himself among them – of the 1965 article that foresaw the growth of semiconductors into a $340-billion market said there would eventually be obstacles that no amount of innovation could overcome. But the assertion has held true for half a century.
Now, the exponential growth in computer-processing power may finally be slowing. Simply put, the cost of making things smaller – of shrinking the tiny circuits that make up the basic components of every chip – may start to outweigh the benefits.
“Everyone who has predicted the end of Moore’s law over the last 40 years has been wrong,” said James Plummer, a former head of Stanford University’s engineering department. “That said, it’s getting increasingly expensive and more difficult to continue on the trend.”
Computers have become faster every year. Apple’s iPods, iPads and iPhones have gained the ability to store more pictures and music, and mobile devices can tap into web-based data and computing resources with fast connections. The biggest beneficiary of that progress has been the end user, whose costs have remained relatively constant.
As well as predicting the personal computer, more than a decade before the device became a practical reality, Moore’s 1965 article also foresaw the mobile phone, big data and even self-driving cars. Compared with Intel’s first microprocessor, which debuted in 1971, current models are 4 000 times faster, use about 5 000 times less energy and are 50 000 times cheaper per transistor, according to the company’s Moore’s Law: Fun Facts website.
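Moore’s postulate is simple compound doubling, so its cumulative effect is easy to sketch. The function below is an illustrative calculation, not anything from Intel; the name and parameters are assumptions made here, and 50 years of doubling every two years works out to 25 doublings:

```python
# Moore's observation: transistor counts double roughly every two years.
# This computes the growth multiple that rule implies over a given span.

def moore_factor(years, doubling_period=2):
    """Growth multiple from doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# 50 years at a two-year doubling period -> 2**25, roughly a
# 33.5-million-fold increase in transistors per chip.
print(moore_factor(50))  # 33554432.0
```

The compounding is the point: the modest-sounding two-year doubling, sustained for half a century, is what produced the enormous speed, energy and cost improvements the article cites.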
Intel is far from ready to give up and is sticking to its view that its engineering wizards are going to invent the next miracle.
“It’s getting more difficult and progressively more difficult every generation,” said Mark Bohr, who heads Intel’s efforts to advance its production technology. “[But] engineers love a challenge.”
The challenge with making anything smaller, particularly when billions of circuits are crammed into something the size of a thumbnail, is that there are physical limits that can’t be ignored. Some of the layers of material being deposited on discs of silicon to build the chips are mere atoms thick. Circuits on chips are now getting tinier than the wavelength of light used to burn them onto coated silicon, requiring the use of a new technology called extreme ultraviolet lithography, or EUV.
But EUV hasn’t yet been deployed widely. Delays have happened in the past, but never for so long.
Moore said he was confident that progress could continue for another five to 10 years. “[But] some day it has to stop,” Moore said. “No exponential thing like this goes on forever.” – © Bloomberg