Guided by brain-like 'spiking' computational frameworks, neuromorphic computing (brain-inspired computing for machine intelligence) promises to realize artificial intelligence while reducing the energy requirements of computing platforms. This interdisciplinary field began with the implementation of silicon circuits that emulate biological neural routines, but has since evolved to encompass the hardware implementation of algorithms with spike-based encoding and event-driven representations. Here we provide an overview of developments in neuromorphic computing for both algorithms and hardware, and highlight the fundamentals of learning and hardware frameworks. We discuss the main challenges and future prospects of neuromorphic computing, with emphasis on algorithm-hardware codesign.
Throughout history, the promise of creating technology with brain-like ability has been a source of innovation. Scientists once contended that information transfer in the brain occurs through different channels and frequencies, as in a radio. Today, scientists argue that the brain is like a computer. With the development of neural networks, computers have demonstrated extraordinary abilities in several cognitive tasks, for example, the ability of AlphaGo to defeat human players at the strategic board game Go1. Although this performance is truly impressive, a key question remains: what is the computing cost involved in such activities?
The human brain performs impressive feats (for example, simultaneous recognition, reasoning, control and movement) with a power budget2 of nearly 20 W. By contrast, a standard computer performing only recognition among 1,000 different kinds of objects3 expends about 250 W. Although the brain remains vastly unexplored, its remarkable capability may be attributed to three foundational observations from neuroscience: vast connectivity, structural and functional organizational hierarchy, and time-dependent neuronal and synaptic functionality4,5 (Fig. 1a). Neurons are the computational primitives of the brain, exchanging information through discrete action potentials, or 'spikes', and synapses are the storage elements that underlie memory and learning. The human brain comprises a network of billions of neurons interconnected through trillions of synapses. Spike-based temporal processing enables sparse and efficient information transfer in the brain. Studies have also revealed that the visual system of primates is organized as a hierarchical cascade of interconnected areas2 that gradually transforms the representation of an object into a robust format, facilitating perceptive abilities.
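As a concrete illustration of the spike-based processing described above, the following minimal Python sketch simulates a leaky integrate-and-fire neuron, one of the simplest spiking-neuron models: the membrane potential leaks toward rest, integrates incoming current, and emits a discrete spike when it crosses a threshold. All names and parameter values here (tau, v_thresh, v_reset, dt) are illustrative assumptions, not quantities taken from this article.

    import numpy as np

    # Minimal leaky integrate-and-fire (LIF) neuron sketch.
    # Parameter values are illustrative assumptions, not from the article.
    def lif_neuron(input_current, tau=20.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
        """Simulate one LIF neuron; return membrane trace and spike train."""
        v = 0.0
        voltages, spikes = [], []
        for i_t in input_current:
            # Leaky integration: the membrane potential decays toward rest
            # while accumulating the input current.
            v += (dt / tau) * (-v + i_t)
            if v >= v_thresh:        # threshold crossing emits a spike
                spikes.append(1)
                v = v_reset          # reset the membrane after the spike
            else:
                spikes.append(0)
            voltages.append(v)
        return np.array(voltages), np.array(spikes)

    # Example: a constant suprathreshold input drives periodic spiking.
    _, spike_train = lif_neuron(np.full(100, 1.5))
    print("spikes emitted:", spike_train.sum())

Note that the output is a sparse binary event stream rather than a dense real-valued activation: the neuron communicates only at the few time steps where a spike occurs, which is the property that makes spike-based encoding attractive for energy-efficient, event-driven hardware.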
Inspired by the brain's hierarchical structure and neuro-synaptic framework, state-of-the-art artificial intelligence is, by and large, implemented using...