Fig. 1: Neuromorphic computing vs. traditional computing from the view of dynamic computing.
From: Spike-based dynamic computing with asynchronous sensing-computing neuromorphic chip

a Spiking neuron vs. artificial neuron. Left: spiking neurons communicate through spike trains coded as binary spikes, and the major operation is synaptic Accumulation (AC) of weights. Right: neurons in ANNs communicate using activations coded as analog values, and Multiply-and-Accumulate (MAC) of inputs and weights is the major operation.

b From a dynamic computing perspective, we compare neuromorphic and traditional computing in three aspects: vanilla algorithm (top), dynamic algorithm (middle), and hardware (bottom). In traditional computing (right part), vanilla algorithms generally operate on a static, fixed computational graph. Even when some neurons have activation values of zero, all of the corresponding MAC operations must still be performed. By adapting the structure of static models to different inputs, dynamic ANNs can achieve notable gains in accuracy and computational efficiency. However, traditional computing hardware is mostly optimized for static models and is not friendly to dynamic networks, so there is a gap between the theoretical and practical efficiency of dynamic ANNs^21. In neuromorphic computing (left part), SNNs are born with dynamic computational graphs, and neuromorphic hardware is naturally suited to SNNs. However, we observe a dynamic imbalance in SNNs: they respond similarly to diverse inputs.

c Dynamic imbalance. SNNs have dynamic activation forms but lack dynamic functionality, i.e., they do not respond discriminatively to different inputs. Spatio-temporal invariance (Figs. S2, S3) is the fundamental assumption of SNNs because they share parameters across timesteps. Consequently, the LSFRs (defined in "Details of algorithm evaluation") at each timestep are similar, which indicates that the scales of the activated sub-networks of SNNs are similar for diverse inputs.
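The AC-vs-MAC contrast in panel a can be illustrated with a minimal sketch (an assumption-laden toy model, not the chip's or the paper's implementation): an ANN neuron must multiply and accumulate every input, even zero-valued ones, while a spiking neuron only accumulates the weights of synapses that actually received a spike.

```python
# Toy sketch of the operation counts behind panel a; the function names
# and example values are illustrative, not from the paper.

def ann_neuron(weights, activations):
    """ANN: Multiply-and-Accumulate (MAC) over analog activations.
    Every input costs one multiply and one add, even when it is zero."""
    return sum(w * a for w, a in zip(weights, activations))

def snn_neuron(weights, spikes):
    """SNN: synaptic Accumulation (AC) over binary spikes.
    Only weights at positions where a spike (1) arrived are added;
    silent inputs (0) trigger no operation at all."""
    return sum(w for w, s in zip(weights, spikes) if s == 1)

weights = [0.5, -0.2, 0.8, 0.1]
print(ann_neuron(weights, [0.9, 0.0, 0.3, 0.7]))  # 4 multiplies + 4 adds
print(snn_neuron(weights, [1, 0, 1, 0]))          # 2 adds, no multiplies
```

The sparser the spike train, the fewer accumulations the SNN performs, which is the source of the event-driven efficiency that neuromorphic hardware exploits.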