Fig. 1: Reservoir computing architectures.
From: Rotating neurons for all-analog implementation of cyclic reservoir computing

a A conventional reservoir computing architecture with random connections. b A simplified version of a reservoir, also known as a cyclic reservoir, in which the randomly connected neurons are replaced with a ring structure. c Illustration of the working principle of the proposed rotating neuron reservoir (RNR), which can be physically implemented. The input weights are uniformly distributed in the range of [−1, 1], and a pre-neuron rotor sends the signal to different neuron channels at different time steps. After flowing through the dynamic neurons, the signal is sent to different state channels via another post-neuron rotor, and the final states are read out through a fully connected layer and used for training. d Sketch of the working principle for the case of three neurons, where R denotes the rotation matrix. The legend for all subfigures is provided at the bottom.
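The cyclic structure in panels b–d can be illustrated with a minimal numerical sketch. Here the ring connectivity is encoded as a cyclic permutation matrix R (the rotation matrix of panel d, for three neurons), the input weights are drawn uniformly from [−1, 1] as stated in the caption, and a standard tanh reservoir update is assumed; the coupling strength `r` and the update equation are illustrative choices, not the paper's exact physical model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3  # number of neurons, matching panel d

# Input weights uniformly distributed in [-1, 1], as in the caption
w_in = rng.uniform(-1.0, 1.0, size=n)

# Rotation (cyclic shift) matrix R: each neuron feeds only its ring neighbour
R = np.roll(np.eye(n), 1, axis=0)
r = 0.9  # ring coupling strength (hypothetical value)
W = r * R  # cyclic reservoir weight matrix

def step(x, u):
    """One reservoir update (illustrative echo-state-style dynamics)."""
    return np.tanh(W @ x + w_in * u)

# Drive the ring with a toy input and collect the state trajectory;
# a linear (fully connected) readout would then be trained on these states
x = np.zeros(n)
states = []
for u in np.sin(np.linspace(0.0, 2.0 * np.pi, 20)):
    x = step(x, u)
    states.append(x.copy())
states = np.asarray(states)  # shape (20, 3)
```

Note that for three neurons the rotation matrix satisfies R³ = I, so the rotor returns each signal to its original channel every three time steps, which is what makes the time-multiplexed physical implementation equivalent to a static cyclic reservoir.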