If the steady advance of chip technology known in the industry as Moore's Law had ended a decade ago, as some predicted, we wouldn't have smartphones or tablets.
The electronics world would be more primitive than it is today, and we might not have cool apps like Angry Birds, according to Bob Colwell, director of the Microsystems Technology Office at the government's Defense Advanced Research Projects Agency (DARPA).
Now, Moore's Law may be coming to an end, and chip designers will have to come up with tricks to keep technology moving forward even without the enormous free advances they have enjoyed in the past, Colwell said.
People have been predicting the end of Moore's Law almost since it was first articulated. Gordon Moore, chairman emeritus of Intel, predicted in 1965 that the number of transistors (the basic on-off switches that are the fundamental building blocks of modern electronics) on a chip would double every two years or so. Intel's first microprocessor, the 4004, had just 2,300 transistors when it was introduced in 1971. Today, Nvidia's Kepler-based Titan graphics chips pack more than 7 billion transistors on a single chip. That's what 42 years of doublings have gotten us.
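For readers who want to check that math, here is a quick back-of-the-envelope sketch (ours, not Colwell's; the 2013 endpoint and the function name are our assumptions, inferred from the article's 42-year span) showing how 2,300 transistors doubling every two years becomes billions:

```python
# Back-of-the-envelope check of the doubling math cited in the article:
# Intel's 4004 shipped in 1971 with 2,300 transistors, and Moore's Law
# (as paraphrased here) doubles the count roughly every two years.

START_YEAR = 1971          # 4004 introduction
START_TRANSISTORS = 2_300  # transistors on the 4004
DOUBLING_PERIOD = 2        # years per doubling

def projected_transistors(year: int) -> float:
    """Project a transistor count by doubling every DOUBLING_PERIOD years."""
    doublings = (year - START_YEAR) / DOUBLING_PERIOD
    return START_TRANSISTORS * 2 ** doublings

# 42 years means 21 doublings: about 4.8 billion transistors, the same
# order of magnitude as the ~7 billion in Nvidia's Kepler-based Titan.
print(f"{projected_transistors(2013):,.0f}")  # 4,823,449,600
```

The projection lands within a factor of two of the actual Titan figure, which is about as close as an exponential rule of thumb ever gets.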
But Colwell, who made his remarks Monday in a keynote at the Hot Chips engineering conference at Stanford University, believes it's time to think about that end. The laws of physics are closing off further gains from simple miniaturization, which for decades made chips smaller, faster, and less power hungry. Now problems such as power consumption and atomic-scale variation are creating barriers.
Silicon Valley has long wagered that Moore’s Law will keep going.
Colwell was a longtime chip architect at Intel, heading projects such as the extremely successful Pentium II processor. In 1997, he was named an Intel Fellow, the highest technical rank at the company.
He said that the basic chip manufacturing technique known as CMOS (complementary metal oxide semiconductor) has created the “illusion of binary digital crispness for several decades, but the game is up.” That means the on-off switches required for digital ones and zeroes, the foundation of computing, may no longer be reliable. Issues such as cross talk, metastability, thermal effects, and electromagnetic interference threaten that stability. Power consumption tells the story: the Pentium II could operate on 20 or so watts, while the Pentium 4 needed 80 or 90 watts. The trend was a very bad one.
Even if these barriers prove insurmountable, Colwell said, progress need not stop: chip designers will simply have to rely on more ingenuity.
“Post Moore’s Law, all roads are not blocked,” he said.
Engineers can work on tasks such as 3D stacking, improved packaging, better cooling, longer battery life, better input-output systems, improved memory, and better chip architectures, he said.
“There is plenty of room to improve on software,” he said. “Plenty of room for I/O, and memory. … We, as chip designers, will have to care about neighboring technologies because they affect us.”
He joked that chip designers should stop designing chips and computers that nobody can program.
DARPA itself is funding a bunch of alternatives that could drive technology forward when conventional methods wane, Colwell said.
Colwell believes, as Gordon Moore himself does, that economics, not physics, will bring an end to Moore's Law. At some point, chip makers won't be able to earn a return on their $4 billion-plus chip factories because they won't be able to sell chips at prices high enough to recoup those huge up-front costs.
“Everybody concentrates on how many atoms, and those things matter,” he said. “But my suspicion is there is so much money in this that this is what will break first.”
In response to a question, Colwell said it will be very hard for chip designers to come up with something that challenges the dominance of the two prevailing architectures from Intel and ARM.
“It would require somebody to say that such a machine needs to exist and it would lead to a huge profit,” Colwell said. There's too much fear in the industry for that to happen. Colwell, who left Intel 13 years ago, said, “Intel is terrible at looking five years ahead and saying let's get ready for it.”