Fig. 4

Applications of multi-memristive synapses in neural networks. a An artificial neural network is trained using backpropagation to perform handwritten digit classification. Bias neurons are used for the input and hidden neuron layers (white). A multi-memristive synapse model based on the nonlinear conductance response of PCM devices is used to represent the synaptic weights in these simulations. Increasing the number of devices per multi-memristive synapse, in both the differential and the non-differential architecture, improves the test accuracy. Simulations are repeated for five different weight initializations. The error bars represent the standard deviation (1σ). The dotted line shows the test accuracy obtained from a double-precision floating-point software implementation. b A spiking neural network is trained using an STDP-based learning rule for handwritten digit classification. Here again, a multi-memristive synapse model is used to represent the synaptic weights in simulations where the devices are arranged in the differential or the non-differential architecture. The classification accuracy of the network increases with the number of devices per synapse. Simulations are repeated for five different weight initializations. The error bars represent the standard deviation (1σ). The dotted line shows the test accuracy obtained from a double-precision floating-point software implementation.
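The synapse model referenced in both panels can be sketched as follows. This is a minimal illustration, not the paper's exact device model: the weight is the summed conductance of N PCM-like devices with a saturating (nonlinear) response to programming pulses, a counter arbitrates which device receives each update, and in the differential architecture the weight is the difference of two such device sets (G+ and G−). All class and parameter names here are assumptions chosen for illustration.

```python
import numpy as np

class MultiMemristiveSynapse:
    """Illustrative sketch of a multi-memristive synapse (hypothetical model).

    The weight is the summed conductance of n_devices PCM-like devices.
    Each potentiation pulse increments one device, chosen in turn by a
    counter, with an increment that shrinks as the device saturates
    (a simple nonlinear conductance response)."""

    def __init__(self, n_devices, g_max=1.0, alpha=0.1,
                 differential=False, seed=0):
        rng = np.random.default_rng(seed)
        self.g_max = g_max            # saturation conductance (arb. units)
        self.alpha = alpha            # nonlinearity of the response
        self.differential = differential
        # Devices start near the low-conductance (RESET) state.
        self.g_pos = rng.uniform(0.0, 0.05, n_devices)
        self.g_neg = rng.uniform(0.0, 0.05, n_devices) if differential else None
        self.counter = 0              # arbitration: devices updated in turn

    def _potentiate(self, g):
        i = self.counter % len(g)
        # Nonlinear response: increment shrinks as conductance saturates.
        g[i] += self.alpha * (self.g_max - g[i])
        self.counter += 1

    def update(self, delta_sign):
        """Apply one programming event; delta_sign is +1 or -1.

        In the differential architecture a negative update potentiates
        the G- set. (Occasional RESET/refresh of saturated devices is
        part of the real scheme but omitted in this sketch.)"""
        if self.differential:
            self._potentiate(self.g_pos if delta_sign > 0 else self.g_neg)
        elif delta_sign > 0:
            self._potentiate(self.g_pos)

    @property
    def weight(self):
        if self.differential:
            return self.g_pos.sum() - self.g_neg.sum()
        return self.g_pos.sum()
```

With more devices per synapse, each individual pulse changes the total weight by a smaller fraction of its range, giving the finer weight granularity that the simulations in both panels associate with higher accuracy.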