Fig. 1: Biomimetic signal encoding with an artificial spiking visual neuron. | Nature Communications

From: An artificial visual neuron with multiplexed rate and time-to-first-spike coding

a Schematic illustration of the retina, which is composed of photoreceptors, synapses, and neurons. Photoreceptors respond to external optical signals and convert them into graded potentials. In synapses, synaptic plasticity is responsible for learning and storing memories. Neurons (retinal ganglion cells) encode synapse-processed graded potentials as action potentials (electrical spikes) to be processed by the brain. b Encoding of different input stimuli and synaptic weights by time-to-first-spike (TTFS) coding and rate coding schemes in a biological spiking visual neuron. The frequency (F) of rate coding depends on the number of spikes (Nspikes) within the time window (Ttotal), while TTFS coding depends on the first spike latency (T). Low stimulus input (orange) and high stimulus input (blue), along with synaptic weights w1 (black) and w2 (purple), correspond to the neural spiking responses. c A schematic and an optical image of the artificial neuron device composed of the integrated In2O3 optoelectronic synaptic transistor and NbOx Mott neuron (1T1R). d The optoelectronic retina enables synaptic plasticity and rate-temporal fusion coding. Spiking sensory neurons are activated when EPSPs reach a certain threshold. The rate-temporal fusion encoding of spiking neurons represents the characteristics of the stimulus in real time through changes in the first spike latency and the spike frequency.
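The two coding schemes in panel b can be illustrated with a minimal sketch (not from the paper; the spike trains below are hypothetical): rate coding reads out the spike count over a time window, F = Nspikes / Ttotal, while TTFS coding reads out the latency T of the first spike.

```python
# Illustrative sketch of the two readouts described in panel b.
# Spike times are in seconds; the trains below are invented examples.

def rate_code(spike_times, t_total):
    """Rate coding: frequency F = N_spikes / T_total."""
    return len(spike_times) / t_total

def ttfs_code(spike_times):
    """Time-to-first-spike coding: latency T of the first spike."""
    return min(spike_times) if spike_times else None

# A stronger stimulus typically produces more spikes (higher F) and an
# earlier first spike (smaller T), as in the blue vs. orange traces.
weak = [0.040, 0.085, 0.130]                          # low stimulus input
strong = [0.012, 0.030, 0.048, 0.066, 0.084, 0.102]   # high stimulus input

t_total = 0.15
F_weak, F_strong = rate_code(weak, t_total), rate_code(strong, t_total)
T_weak, T_strong = ttfs_code(weak), ttfs_code(strong)
```

Fusing the two readouts, as the device in panels c and d does, lets a single spike train carry stimulus intensity in both its frequency and its first-spike latency.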
