Fig. 4: Hardware NSM performing image classification and exhibiting self-normalization.

From: Neural sampling machine with stochastic synapse allows brain-like learning and inference

a Network architecture of the NSM, consisting of an input layer, three hidden fully connected layers and an output layer. b Exact agreement between the measured switching probability of the stochastic selector device and the theoretically predicted probability for a Bernoulli distribution, showing that the stochastic selector device injects Bernoulli multiplicative noise. c Evolution of the test accuracy of the simulated hardware NSM using the FeFET-based analog weight cell and the stochastic selector as a function of training epochs. d Comparison of the performance of the simulated hardware NSM with a deterministic feedforward multilayer perceptron (MLP) and with the theoretical NSM model using full-precision synaptic weights and Bernoulli multiplicative noise for the stochastic synapses. e Evolution of the third-layer weights during learning for three different networks: an MLP without regularization, an MLP with additional regularization, and the simulated hardware NSM. f Evolution of the 15th, 50th and 85th percentiles of the input distributions to the last hidden layer during training for all three networks. Overall, the NSM exhibits tighter weight and activation distributions concentrated around their means, highlighting its inherent self-normalizing behavior. MLP multilayer perceptron, NSM neural sampling machine, Q quantile.
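
To make the stochastic synapses of panels a and b concrete, the sketch below implements a small fully connected network whose weights are gated by Bernoulli multiplicative ("blank-out") noise at every forward pass. This is a minimal illustration only: the layer widths, the noise probability p = 0.5, and the ReLU activation are assumptions for the sketch and are not values taken from the figure or the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def nsm_layer(x, W, b, p=0.5):
    """Fully connected layer with Bernoulli multiplicative noise on each synapse.

    Every weight is independently gated by a Bernoulli(p) mask on each forward
    pass, mimicking the stochastic selector device of panel b.
    (p = 0.5 is an illustrative choice, not a value from the figure.)
    """
    mask = rng.binomial(1, p, size=W.shape)   # Bernoulli "blank-out" noise
    pre = x @ (W * mask) + b                  # noisy synaptic summation
    return np.maximum(pre, 0.0)               # ReLU activation (assumed)

# Illustrative topology matching the caption: an input layer, three hidden
# fully connected layers, and an output layer. The widths are placeholders.
sizes = [784, 200, 200, 200, 10]
params = [(0.01 * rng.standard_normal((m, n)), np.zeros(n))
          for m, n in zip(sizes[:-1], sizes[1:])]

def forward(x):
    h = x
    for W, b in params[:-1]:
        h = nsm_layer(h, W, b)
    W_out, b_out = params[-1]
    return h @ W_out + b_out                  # output layer left noise-free here

logits = forward(rng.standard_normal((1, 784)))  # repeated calls draw new noise samples
print(logits.shape)                              # (1, 10)
```

Because a fresh Bernoulli mask is drawn on every call, repeated forward passes over the same input yield different samples of the network output, which is the sampling behavior the NSM exploits during learning and inference.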