Extended Data Fig. 7: Noise-resilient training of CNNs, LSTMs and RBMs. | Nature

From: A compute-in-memory chip based on resistive random-access memory

a, Change in CIFAR-10 test-set classification accuracy as a function of weight noise level during inference. Noise is expressed as a fraction of the maximum absolute weight value. Each curve corresponds to a model trained with a different level of noise injection. b, Change in voice-command-recognition accuracy with weight noise level. c, Change in MNIST image-reconstruction error with weight noise level. d, Decrease in image-reconstruction error with the number of Gibbs sampling steps during RBM inference. e, Weight distributions of models trained without and with noise injection.
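As a minimal sketch of the noise parameterization used in panels a-c, the snippet below perturbs a weight matrix with Gaussian noise whose standard deviation is a given fraction of the maximum absolute weight. The function name `inject_weight_noise` and the choice of Gaussian noise are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def inject_weight_noise(weights, noise_level, rng=None):
    """Perturb weights with Gaussian noise.

    noise_level: noise standard deviation expressed as a fraction of
    max(|weights|), matching how the figure reports noise levels.
    (Illustrative sketch; the paper's exact noise model may differ.)
    """
    rng = np.random.default_rng() if rng is None else rng
    sigma = noise_level * np.max(np.abs(weights))
    return weights + rng.normal(0.0, sigma, size=weights.shape)

# Example: perturb a small weight matrix at a 5% noise level
w = np.array([[0.8, -0.2], [0.1, -1.0]])
w_noisy = inject_weight_noise(w, 0.05, rng=np.random.default_rng(0))
```

Training with such noise injected into the weights at each step (and evaluating at several inference noise levels) would produce accuracy-versus-noise curves of the kind shown in panels a-c.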
