Fast Forward // Neuromorphic Computing & Next-Gen AI Chips

By Steve Gotz, Silicon Foundry

Nov 14, 2019 · 5 min read

The human brain is one of the most advanced structures in the world. It is also one of the most efficient, using less energy than a standard 20W light bulb. For the past thirty years, a small but growing group of academic researchers has been forging a new approach to computer chip design called neuromorphic computing (NC). Their research focuses on understanding the mechanisms of the brain, in hopes of using what they learn to inspire and inform next-generation computer chips that mimic the biology and efficiency of the human brain. Recently, Silicon Foundry convened leading researchers, entrepreneurs and investors, as part of our Fast Forward Series, for a conversation about the frontiers of neuromorphic computing and next-generation AI chips. Here’s what we learned.

The Neuromorphic Inflection Point.

Hope for neuromorphic computing springs eternal. Ever since Carver Mead of CalTech started talking about neuromorphic-inspired designs back in the 1980s, the promise of specialized chips that mimic the human brain has always been just over the horizon. Despite several false starts over the past three decades, the consensus at our dinner was that society is entering a period where enough fundamental deep-technology building blocks (e.g., nanoscale memristive crossbar arrays) are finally in place to approach Carver Mead’s vision.

According to Professor Bruno Olshausen from UC Berkeley, who studied under Carver Mead at CalTech, “Two modern developments are converging that favor neuromorphic computing: 1) the end of Moore’s law, which forces us to consider the efficiencies inherent in low-power, stochastic circuit components for memory and computation, and 2) the growing demand for brain-like functions such as visual and auditory scene analysis, and reasoning, which stress the need for computations and answers that are more probabilistic in nature. In other words, it is becoming ever more difficult to make computers perfectly deterministic (as you make them smaller or consume less power), and you don’t really need to for most AI applications. What is needed are data representations and operations and computing fabrics that are less Turing/von Neumann-like and more brain-like.”
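One concrete, if simplified, example of trading strict determinism for efficiency is stochastic computing, in which a value in [0, 1] is encoded as the density of ones in a random bit-stream and multiplication collapses to a single AND gate. The sketch below is only an illustration of that general idea (not something proposed at the dinner, and not any particular chip’s design); accuracy comes from stream length rather than from precise, fully deterministic circuitry, so occasional bit errors merely nudge the estimate instead of corrupting it:

```python
import numpy as np

def to_bitstream(p, n, rng):
    """Encode a probability p in [0, 1] as a random bit-stream of length n."""
    return rng.random(n) < p

def stochastic_multiply(a, b, n=10_000, seed=0):
    """Multiply two values in [0, 1] by AND-ing independent bit-streams.

    A single AND gate stands in for a full multiplier; precision grows
    with stream length n, and individual bit flips only perturb the
    estimate slightly rather than invalidating the answer.
    """
    rng = np.random.default_rng(seed)
    stream = to_bitstream(a, n, rng) & to_bitstream(b, n, rng)
    return stream.mean()  # estimate of a * b

print(stochastic_multiply(0.8, 0.5))  # ~0.40, within sampling noise of the true product
```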

Beyond the fundamental technical requirements for producing a neuromorphic chip, we are clearly reaching a point where the constraints of traditional “von Neumann” architectures and the questionable future of Moore’s Law are encouraging a new generation of AI chip innovators. Couple that with the sheer volume of data now generated at the edge, and it is easy to see why energy-efficient chips capable of running inference at the edge have become an active category for venture capitalists and corporations alike. Key incumbents such as HP, IBM, Intel, SK hynix, Qualcomm and others have been exploring the space and generating headlines with big bets: “Korean Conglomerate SK Leads $600M Round for Chinese Chipmaker Horizon Robotics.”

Market Challenges vs. Technical Challenges.

While technical challenges remain, the most vexing hurdles may be market-related. A significant portion of our dinner conversation was devoted to semantics. More specifically, what can actually be considered a neuromorphic chip? Do “deep neural networks” implemented on traditional chip architectures qualify as neuromorphic computing? In reality, the answer depends on whom you ask. The recent debate between Facebook’s Yann LeCun and Intel’s Mike Davies highlights this stark reality: “Intel’s Neuro Guru Slams Deep Learning: ‘It’s Not Actually Learning’.” Nevertheless, researchers are well on their way toward making these definitional distinctions moot.

In July, a research team from Tsinghua University was featured on the cover of Nature for its groundbreaking work architecting a next-generation chip called Tianjic. The Tianjic chip uses a hybrid architecture to bridge the gap between computer science-based techniques (neural nets) and neuroscience-oriented approaches. Using a single chip, the researchers demonstrated a fully autonomous bicycle that performed real-time object detection, tracking, voice control, obstacle avoidance and balance management; a feat most Tesla owners still dream of.

Who Should We Be Paying Attention To?

IBM, a mainstay in computing, has developed “TrueNorth,” one of the best-known neuromorphic implementations. Each TrueNorth chip packs 4,096 compact neurosynaptic “cores” that place synapse memory directly alongside neuron circuits, yielding a massively parallel and energy-efficient NC fabric. IBM recently delivered a 16-chip TrueNorth supercomputing platform, with the equivalent of 16 million neurons and 4 billion synapses, to Lawrence Livermore National Laboratory. For reference, the human brain has roughly 86 billion neurons. Remarkably, the system runs on approximately 2.5W, about the power budget of a hearing-aid battery, demonstrating one of the most anticipated advantages of NC.
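TrueNorth itself is a digital design, but the memristive crossbar arrays mentioned earlier chase the same prize of collapsing memory and compute into one structure. As a rough, hypothetical NumPy sketch (not IBM’s or any vendor’s actual implementation): program each cross-point’s conductance to encode a weight, drive the rows with input voltages, and each column current sums the products in a single analog step, i.e., a matrix-vector multiply for roughly the energy of reading the array:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 4x3 crossbar: each cross-point holds a programmable
# conductance G[i, j] (siemens) that encodes one weight.
G = rng.uniform(1e-6, 1e-4, size=(4, 3))

# Input vector applied as row voltages (volts).
V = np.array([0.2, 0.0, 0.5, 0.3])

# By Ohm's and Kirchhoff's laws, each column current sums V[i] * G[i, j]
# over the rows: one analog step computes the matrix-vector product G.T @ V.
I_ideal = G.T @ V

# Real devices drift and add noise; model that as small multiplicative error,
# which is tolerable for the probabilistic workloads discussed above.
I_measured = I_ideal * (1 + rng.normal(0, 0.05, size=I_ideal.shape))

print(I_ideal)
print(I_measured)
```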

Not to be outdone, Intel introduced its “Loihi” research test chip in November 2017, built on a specialized silicon architecture optimized for spiking neural network (SNN) algorithms. The original Loihi chip included roughly 130,000 neurons, each of which can communicate with thousands of others, and it reportedly integrates a range of features novel for the field, such as hierarchical connectivity, dendritic compartments, synaptic delays and programmable synaptic learning rules. Fast forward to 2019, and Intel recently announced the next generation of the platform, Pohoiki Beach, which links 64 Loihi chips to scale to roughly 8 million neurons and which Intel says delivers up to 1,000x better efficiency than conventional CPUs on certain workloads.
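At the heart of chips like Loihi is the spiking neuron, which only communicates (and only burns energy) when an event occurs. Below is a minimal leaky integrate-and-fire neuron in Python; this is the textbook simplification, not Intel’s actual neuron model, and the parameters are illustrative only:

```python
import numpy as np

def lif_neuron(input_current, dt=1e-3, tau=20e-3, v_rest=0.0,
               v_thresh=1.0, v_reset=0.0):
    """Simulate a single leaky integrate-and-fire neuron.

    The membrane potential leaks toward v_rest, integrates the input
    current, and emits a spike (then resets) whenever it crosses
    v_thresh. Information is carried by spike timing, so the circuit
    can stay silent, and nearly powerless, when nothing is happening.
    """
    v = v_rest
    spike_times = []
    for t, i_in in enumerate(input_current):
        v += (-(v - v_rest) + i_in) * (dt / tau)
        if v >= v_thresh:
            spike_times.append(t * dt)
            v = v_reset
    return spike_times

# A constant drive yields a regular spike train; weaker drive spikes less often.
spikes = lif_neuron(np.full(1000, 1.5))
print(len(spikes), "spikes in one second of simulated input")
```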

At the same time, we are seeing an increasing number of well-funded startups competing head-to-head with traditional incumbents. Companies that came up in our conversation include: aiCTX, Analog Inference, GrAI Matter Labs, Rain Neuromorphics and Prophesee.

What’s Next?

With so much activity happening within and around the field of neuromorphic computing, it’s exciting to think we are entering a period where low-power, decentralized computing at the edge unlocks a range of transformative use cases. That said, Niels Bohr was fond of observing that “prediction is very difficult, especially if it’s about the future.” So instead of trying to predict what’s next in the neuromorphic computing field, we will leave you with some wise words from one of our dinner attendees:

“If there’s one thing I want people to know about neuromorphic engineering and brain-inspired computing, it’s that there is so much more than spiking neurons. Teams across Silicon Valley and the world innovating in materials (memristors, processing-in-memory), architectures (event-based processors) and algorithms (OpenAI, Numenta) are using inspiration from the brain to inform their technologies.” Gordon Wilson (CEO, Rain Neuromorphics)


Silicon Foundry

Silicon Foundry is an innovation advisory platform that builds bridges between leading multi-national corporations and global startup ecosystems.