IBM took Morpheus’ little red pill and has now discovered exactly how deep the rabbit hole goes with its SyNAPSE computer chip. With this new chip, IBM has brought us closer to artificial intelligence than we have ever been.
IBM’s new computer chip, the Systems of Neuromorphic Adaptive Plastic Scalable Electronics, or SyNAPSE, was designed to more closely mimic the human brain. More specifically, it makes sense of its surroundings, interpreting complex data from its environment and responding to it with decisive action.
The chip’s creators at IBM are hoping to launch a revolution in computing with this deliberate shift in thinking: the first step toward “cognitive computers.” This new cognitive computing paradigm will empower computers not only to learn through experience but also to use that experience to form theories about what it means.
We are still a few years away from becoming the human plantations pictured in The Matrix, but the first step is complete with IBM’s two new prototype chips. These two prototypes are hyped by IBM as the first giant leap toward computers having the power of “reason” in place of simply following predetermined equations. One prototype’s core consists of 262,144 programmable synapses, while the second prototype’s core consists of 65,536 learning synapses. All in all, the chips, with only a square centimeter of surface area, run with the equivalent of 10 billion synapses and 1 million neurons; as a point of reference, the human brain has an estimated 100 billion neurons and 100 trillion synapses.
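To give a sense of what a silicon “neuron” and “synapse” actually do, here is a toy sketch of a leaky integrate-and-fire neuron, a common conceptual model for the kind of spiking units neuromorphic chips implement. This is purely illustrative: it does not reflect IBM’s actual SyNAPSE circuit design, and every parameter value here is an arbitrary assumption.

```python
# Toy leaky integrate-and-fire neuron: it accumulates input current,
# slowly "leaks" charge over time, and fires a spike when its membrane
# potential crosses a threshold. Not IBM's design; values are arbitrary.

def simulate_lif(inputs, leak=0.9, threshold=1.0):
    """Return a list of 0/1 spikes, one per input current sample."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # integrate input, leak charge
        if potential >= threshold:              # fire once threshold is crossed
            spikes.append(1)
            potential = 0.0                     # reset after spiking
        else:
            spikes.append(0)
    return spikes

# Weak steady input takes several steps to trigger a spike;
# strong input triggers one much sooner.
print(simulate_lif([0.3, 0.3, 0.3, 0.3, 0.0, 0.9, 0.9]))  # → [0, 0, 0, 1, 0, 0, 1]
```

In a neuromorphic chip, a “synapse” is essentially the weighted connection feeding current into such a neuron, and learning means adjusting those weights; the hardware runs vast numbers of these simple units in parallel rather than executing one instruction stream.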
IBM researchers believe that their new “reasoning” chips will pave the way for more sophisticated technology and an increase in the types of jobs that can be done by computers. For example, these new computing systems could monitor the oceans, interpreting temperature, wave, and pressure data to provide earlier tsunami warnings. Another possible use IBM foresees is in traffic lights: “imagine traffic lights that can integrate sights, sounds and smells and flag unsafe intersections before disaster happens,” says Dharmendra Modha, IBM’s SyNAPSE project leader.
Neo, here we come.