IBM has helped create an electronic “brain chip” after a decade of research. If all goes as planned, it could be one of the greatest inventions of the computing era, and that is not an overstatement.

Above: IBM Synapse chip. (Image Credit: IBM)

The IBM Synapse chip is like a modern supercomputer in the space of a postage stamp, outperforming today’s fastest microprocessors because it processes data in a more efficient way — similar to the way that the brain works. It weighs just a few grams, and something as small as a hearing-aid battery could power it. It could find use in a variety of mobile, cloud, and distributed-sensor applications.

IBM may be exaggerating its achievement. The real test will come when programmers learn how to use the chip and create new applications for it. But for now, it’s worth noting that IBM has made steady, well-documented progress throughout the history of this research.

I got my first look at IBM’s previous chip back in 2011, when the company created a prototype with a single core and 256 neurons, roughly the scale of a worm’s brain. The new chip uses 15 times less area and 100 times less power, and it has more than 4,000 times as many cores.


“I’m holding in my hand a new machine for a new era,” said Dharmendra Modha, principal investigator on a project created by the IBM Almaden Research Center and several other research institutions. “It’s the culmination of over a decade of our research. Ten years ago, many believed this was impossible. The impossible has become possible, and the possible will very soon be real applications.”

Above: IBM Synapse chip. (Image Credit: IBM)

The ambition is huge. IBM’s so-called cognitive computing chips could one day simulate and emulate the brain’s ability to sense, perceive, interact, and recognize — all tasks that humans can currently do much better than computers can.

Modha asked for permission to “geek out” as he explained the chip’s features. It has a million neurons (the equivalent of brain cells), 256 million synapses, and 4,096 neurosynaptic cores arranged in a 64 x 64 array, along with 400 million bits of local on-chip memory. The cores are linked by an on-chip network.

“You can tile these chips seamlessly, and it becomes twice the chip,” Modha said. “We have already tiled 16 of them in a 4 x 4 array, to create a 16 million neuron, 4 billion synapse core.”
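The arithmetic behind those figures is straightforward to check. Here is a rough, back-of-the-envelope sketch (my own accounting, not IBM’s) showing how the per-core numbers roll up to the chip totals and to the 4 x 4 tiled array:

```python
# Rough arithmetic check of the figures quoted above (illustrative only).
cores = 64 * 64                    # 4,096 neurosynaptic cores in a 64 x 64 array
neurons_per_core = 256
synapses_per_core = 256 * 256      # each core crosses 256 inputs with 256 neurons

neurons = cores * neurons_per_core       # 1,048,576 -> "a million neurons"
synapses = cores * synapses_per_core     # 268,435,456 -> quoted as "256 million synapses"

# Tiling 16 chips in a 4 x 4 array scales both totals linearly.
print(f"{16 * neurons:,} neurons")       # ~16.8 million -> the "16 million neuron" system
print(f"{16 * synapses:,} synapses")     # ~4.3 billion -> the "4 billion synapse" system
```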

The chip is the outcome of a $53.5 million project funded by the Defense Advanced Research Projects Agency (DARPA).

Called SyNAPSE (short for Systems of Neuromorphic Adaptive Plastic Scalable Electronics), the chip is like a component of the human brain, and it can be paired with more than 4,000 other chips to form a thinking machine like no other we’ve seen.

“I believe that it is very significant, and would compare it to the Connection Machine,” said Horst Simon of Lawrence Berkeley National Laboratory, who was not involved in the Synapse project. “When the CM-1 was developed in the late ’80s, it was an architectural breakthrough, as was the Caltech hypercube project, that demonstrated the feasibility of what was then called massively parallel computing [MPP].”

But it’s not just about parallel computing, Simon said: It’s about the need to fundamentally rethink what we mean by “computers.”

“I think we are on the threshold of a similar transition to new architectural paradigms as we approach the limits of performance in von Neumann architecture,” Simon said. “The IBM Synapse project is an indicator of that change in the next 10 years. It is a remarkable achievement in terms of scalability and low power consumption.”

Big Blue had Samsung fabricate the chip using its low-power 28-nanometer manufacturing process, in which circuit features measure about 28 billionths of a meter across. The chip has 5.4 billion transistors, or basic electronic components. That makes it the largest chip IBM has ever made. But it consumes just 70 milliwatts.

Above: IBM’s Dharmendra Modha in 2011.

While other chips are measured in FLOPs, or floating point operations per second, IBM measures the chip in SOPs, or synaptic operations per second.

“This chip is capable of 46 billion SOPs per watt,” Modha said. “It’s a supercomputer the size of a postage stamp, the weight of a feather, and the power consumption of a hearing-aid battery.”

The chip’s power density is 20 milliwatts per square centimeter, roughly four orders of magnitude lower than that of a typical microprocessor, he said.
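Those numbers are easy to sanity-check. The sketch below uses only the figures quoted in this article, plus an assumed ballpark power density for a conventional hot microprocessor, which is my assumption rather than an IBM figure:

```python
# Back-of-the-envelope power arithmetic from the figures quoted in this article.
chip_power_w = 0.070            # 70 milliwatts
sops_per_watt = 46e9            # 46 billion synaptic operations per second per watt

sops = chip_power_w * sops_per_watt
print(f"{sops:.2e} synaptic ops/sec at 70 mW")   # ~3.2e9

# Power-density comparison. 100 W/cm^2 is an assumed ballpark for a hot
# conventional microprocessor, not a number from IBM.
synapse_density_w_cm2 = 0.020   # 20 mW per square centimeter
cpu_density_w_cm2 = 100.0
print(f"~{cpu_density_w_cm2 / synapse_density_w_cm2:,.0f}x lower power density")  # ~5,000x
```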

“It is an astonishing achievement to leverage a process traditionally used for commercially available, low-power mobile devices to deliver a chip that emulates the human brain by processing extreme amounts of sensory information with very little power,” said Shawn Han, the vice president of foundry marketing for Samsung Electronics, in a statement. “This is a huge architectural breakthrough that is essential as the industry moves toward the next-generation cloud and big-data processing.”

Above: Possible IBM Synapse application: head-mounted computers. (Image Credit: IBM)

Brave new architecture

The chip has a radical new architecture that differs from the basic computing architecture, first laid out by pioneering computer scientist John von Neumann, that has ruled computing for 68 years. In von Neumann machines, memory and processor are separated and linked via a data pathway known as a bus. Over the years, von Neumann machines have gotten faster by sending more and more data at higher speeds across the bus as processor and memory interact. But the speed of a computer is often limited by the capacity of that bus, leading some computer scientists to call it the “von Neumann bottleneck.”

Above: IBM Synapse power density chart. (Image Credit: IBM)

With the human brain, the memory is located in the same place as the processor — at least, that’s how it appears, based on our current understanding of what is admittedly a still-mysterious 3 pounds of meat in our heads.

“After years of collaboration with IBM, we are now a step closer to building a computer similar to our brain,” said Cornell University professor Rajit Manohar in a statement.

The brain-like processors with integrated memory don’t operate fast at all by traditional measurements, sending data at a mere 10 hertz, far slower than today’s 5-gigahertz computer processors. But the human brain does an awful lot of work in parallel, sending signals out in all directions and getting its neurons to work simultaneously. Because the brain has more than 10 billion neurons and 10 trillion connections (synapses) between those neurons, that adds up to an enormous amount of computing power.

IBM is emulating that architecture with its new chips.

This new computing unit, or core, is analogous to the brain. It has “neurons,” or digital processors that compute information. It has “synapses,” which are the foundation of learning and memory. And it has “axons,” or data pathways that connect the tissue of the computer.
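To make those terms concrete, here is a toy sketch of one such core as a leaky integrate-and-fire model: axons deliver incoming spikes, a synapse grid gates them onto neurons, and a neuron fires when its accumulated potential crosses a threshold. This is purely illustrative and is not IBM’s programming model or circuit design.

```python
import numpy as np

# Toy "neurosynaptic core": 256 axons (inputs) crossed with 256 neurons (outputs).
# Illustrative sketch of the concept only, not IBM's actual core design.
AXONS, NEURONS = 256, 256
rng = np.random.default_rng(0)

synapses = rng.integers(0, 2, size=(AXONS, NEURONS))  # 0/1 connectivity grid
weights = rng.normal(0.5, 0.2, size=NEURONS)          # per-neuron input weight
potential = np.zeros(NEURONS)                         # membrane potentials
LEAK, THRESHOLD = 0.1, 8.0

def tick(axon_spikes):
    """Advance the core one time step given a 0/1 vector of incoming axon spikes."""
    global potential
    potential += (axon_spikes @ synapses) * weights    # integrate weighted input
    potential = np.clip(potential - LEAK, 0.0, None)   # constant leak each step
    fired = potential >= THRESHOLD                     # neurons that cross threshold spike
    potential[fired] = 0.0                             # and reset
    return fired

out = tick(rng.integers(0, 2, size=AXONS))
print(int(out.sum()), "of", NEURONS, "neurons fired this tick")
```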

Above: IBM Synapse chip in a sensor application. (Image Credit: IBM)

The work combines supercomputing, nanotechnology, and neuroscience in an effort to move beyond calculation to perception.

“We are using totally different design techniques than those used to create current computer chips,” Modha said. “It’s scalable, efficient, and flexible.”

Modha said that this new kind of computing will likely complement, rather than replace, von Neumann machines, which have become good at solving problems involving math, serial processing, and business computations. The disadvantage is that those machines no longer scale well to big problems: they consume too much power and are becoming harder to program.

“This is like the milk to von Neumann’s cookies,” Modha said. “French fries and ketchup, yin and yang. This architecture puts the anatomy and physiology of the brain in today’s silicon. It’s a parallel, distributed, modular, event-driven, and fault-tolerant architecture.”

The more powerful a computer gets, the more power it consumes, and manufacturing it requires extremely precise and expensive technologies. And the more components are crammed onto a single chip, the more they “leak” power, even in standby mode, so they can’t easily be shut off to save energy. The advantage of the human brain is that it operates on very low power and can essentially turn off the parts that aren’t in use.


These new chips won’t be programmed in the traditional way. Cognitive computers are expected to learn through experiences, find correlations, create hypotheses, remember, and learn from the outcomes. They mimic the brain’s “structural and synaptic plasticity.” The processing is distributed and parallel, not centralized and serial.

With no set programming, the computing cores that the researchers have built can mimic the event-driven brain, which wakes up to perform a task.
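The difference from a clock-driven design can be shown in a few lines. In this hypothetical sketch (not IBM’s runtime), an event-driven core does work only when a spike is waiting for it, while a clock-driven loop burns a cycle on every core at every tick:

```python
from collections import deque

def clock_driven_work(cores: int, ticks: int) -> int:
    """Every core does something on every tick, whether or not there is input."""
    work = 0
    for _ in range(ticks):
        work += cores
    return work

def event_driven_work(events: deque) -> int:
    """Only the core addressed by each spike event does any work."""
    work = 0
    while events:
        _core, _spike = events.popleft()
        work += 1
    return work

events = deque([("core_7", "spike"), ("core_42", "spike"), ("core_7", "spike")])
print(clock_driven_work(cores=4096, ticks=1000))  # 4,096,000 units of work
print(event_driven_work(events))                  # 3 units of work for 3 spikes
```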

The so-called “neurosynaptic computing chips” re-create a phenomenon known in the brain as “spiking” between neurons and synapses. The system can handle complex tasks such as playing a game of Pong, the original computer game from Atari. IBM demonstrated the chip playing Pong a couple of years ago, and in 2013 it set a technical community to work on learning how to program the chip.

As we noted back in 2011, the eventual applications could have a huge impact on business, science, and government. The idea is to create computers that are better at handling real-world sensory problems than today’s computers are. IBM could also build a better Watson, the computer that became the world champion at the game show Jeopardy.

Richard Doherty, an analyst at Envisioneering Group, has followed the research for a long time. He compares the importance of the Synapse technology — and the kits for programming it — to the launch of the Apple II in 1977.

“Outside engineers, programmers, neurologists, psychologists will tinker, improve, quicken and pour it all into a better IBM Synapse and some new startups,” he said. “We must have processors which will use the human senses we all identify with and feel comfortable with” so we can sift information from noise.

Above: IBM Synapse chip application. (Image Credit: IBM)

A long time in the making

I wrote about this when IBM announced the project in November 2008 and again when it hit its first milestone in November 2009. The company hit new milestones in 2011 and 2013.

With the grant from DARPA, the project drew on the work of eight IBM labs, Samsung, a startup, four universities (Cornell, the University of Wisconsin, the University of California at Merced, and Columbia), and a number of government researchers from two Department of Energy labs.

It took so long because it required the creation of an unconventional architecture, custom design of hardware and software, and new manufacturing processes. The team built new simulations, algorithms, libraries, training programs, and applications.

Above: IBM Synapse architecture. (Image Credit: IBM)

“It was genuine collaboration and sheer hard work,” Modha said.

After the design was done, the leaders offered a $1,000 bottle of champagne to anyone who could find a bug in the chip. After a month, no one found one.

“We had cheap wine for everybody at the end of the month,” Modha said.

Applications are likely to involve analyzing sensor data in real time and sifting through the ambiguity of complex, real-world environments. Modha said companies could embed the brain chip in sensors.

Modha said the human eye sifts through a terabyte of data per day. Today, we record the world pixel by pixel, but the human retina responds to changes in a scene; when it detects a change, its neurons spike. A Swiss partner provides a dynamic vision camera that connects to the chip and enables similar vision-like processing.
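The retina analogy maps naturally onto that kind of camera: instead of shipping every pixel of every frame, it emits an event only where brightness changes by more than a threshold. A simplified sketch of the idea follows; the threshold and frame size are illustrative assumptions, and this is not the partner camera’s actual interface.

```python
import numpy as np

THRESHOLD = 15  # assumed brightness-change threshold, in 8-bit intensity units

def frame_to_events(prev_frame, new_frame, threshold=THRESHOLD):
    """Return (x, y, polarity) events only where the scene changed enough."""
    diff = new_frame.astype(int) - prev_frame.astype(int)
    ys, xs = np.nonzero(np.abs(diff) > threshold)
    return [(int(x), int(y), 1 if diff[y, x] > 0 else -1) for x, y in zip(xs, ys)]

rng = np.random.default_rng(1)
prev = rng.integers(0, 180, size=(240, 320), dtype=np.uint8)  # previous frame
new = prev.copy()
new[100:110, 150:160] += 50                                   # a small patch brightens

events = frame_to_events(prev, new)
print(len(events), "events instead of", prev.size, "pixels")  # 100 events vs 76,800 pixels
```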

“This can revolutionize the mobile experience as we know it,” Modha said. “It can take visual, auditory, multisensory data to mobile, to the cloud, and to a new generation of supercomputers and distributed sensors.”

In closing, Modha said, “We haven’t built a brain. But we have taken the functions, structure, behavior, and capability and put it into silicon to enable a new computing capability. The chip has the potential to transform business, government, science, technology, and society.

“That’s my story, and I’m sticking to it.”

As for shipping the chips, Modha said it is still a research project. More than 200 people worked on it, investing 200 person-years in the project.

“The commercialization part is where we are engaged with a number of discussions with business partners, universities, government agencies, and fellow IBMers to move it out of the lab,” Modha said.

Science magazine is publishing a paper describing the project today.

Above: IBM Synapse chip hardware. (Image Credit: IBM)
