Fig. 4: MNIST pattern recognition simulation.

a Constituents of a single-layer network for a typical MNIST pattern recognition process, in which input neurons (yellow) and output neurons (green) are fully connected by synaptic weights (blue). b Diagram of a crossbar array mapped onto the ANN. The input voltages (x_i, grey) are applied to the input neurons of each column (yellow) and are scaled by their own synaptic weights (w_{i,j} = G_{i,j}^{+} − G_{i,j}^{−}). The output signals are integrated in the form of ∑_i w_{i,j} x_i at the output neurons of each row (green). c, d Diagrams of the reconstructed ANN architectures for two cases: (i) the case in which the undesired neural pathways (red lines) are not suppressed, and (ii) the fully suppressed case. e, g Reshaped 28 × 28 contour images of the final conductances (G_{i,j}^{+} and G_{i,j}^{−}) and synaptic weights (w_{i,j}) for the digit "3" after 15 training epochs, for S.C = 1 and S.C = 0, respectively. f, h Confusion matrices for a classification test on the 10,000 MNIST handwritten digit images after 15 training epochs, for S.C = 1 and S.C = 0, respectively.
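As a rough illustration of the mapping described in panel b, the sketch below (not part of the original figure) shows how a differential conductance pair can be read out as a signed synaptic weight and how each output line accumulates ∑_i w_{i,j} x_i. The array sizes and random conductance values are assumptions for illustration only, not data from the paper.

```python
import numpy as np

# Minimal sketch of the crossbar-to-ANN mapping in panel b (illustrative values only).
rng = np.random.default_rng(0)

n_inputs = 784   # 28 x 28 MNIST pixels (input neurons, yellow)
n_outputs = 10   # digit classes (output neurons, green)

# Hypothetical conductance matrices of the two sub-arrays (arbitrary units).
G_plus = rng.uniform(0.0, 1.0, size=(n_inputs, n_outputs))
G_minus = rng.uniform(0.0, 1.0, size=(n_inputs, n_outputs))

# Each signed synaptic weight is realized by the differential pair:
# w[i, j] = G_plus[i, j] - G_minus[i, j]
W = G_plus - G_minus

# Input voltages x_i applied to the input lines (here, one flattened MNIST-sized image).
x = rng.uniform(0.0, 1.0, size=n_inputs)

# Ohm's law and current summation: each output line integrates sum_i w[i, j] * x[i].
y = x @ W   # shape (10,), one accumulated signal per output neuron

print(y.argmax())  # index of the most strongly activated output neuron
```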