Table 2 Summary of the investigated neural network architectures

From: Layer ensemble averaging for fault tolerance in memristive neural networks

| Dataset | Network architecture | Biases | Activations | Quantization | Optimizer | Test accuracy (%) |
|---|---|---|---|---|---|---|
| Yin-Yang (2-task variant) | 4 × 12 × 6 × 3 | Y | Tanh | BRECQ [52] | Adam with elastic weight consolidation [49] | 72.85 ± 1.25 |
| MNIST | 784 × 150 × 10 | N | ReLU | WAGE [55] | Stochastic gradient descent | 94.49 ± 0.12 |

  1. The test accuracy is reported as mean ± standard deviation across 4 and 8 independent runs for the Yin-Yang and MNIST networks, respectively.
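For concreteness, the MNIST row describes a fully connected 784 × 150 × 10 network with ReLU activations and no bias terms. A minimal forward-pass sketch of that topology (weights here are randomly initialized for illustration only, not the trained or quantized model from the paper):

```python
import numpy as np


def relu(x):
    """Rectified linear unit, applied elementwise."""
    return np.maximum(0.0, x)


rng = np.random.default_rng(0)

# 784 -> 150 -> 10 fully connected layers, no biases (per Table 2).
# He-style scaling is an illustrative choice, not specified by the paper.
W1 = rng.standard_normal((784, 150)) * np.sqrt(2.0 / 784)
W2 = rng.standard_normal((150, 10)) * np.sqrt(2.0 / 150)


def forward(x):
    """Map a batch of flattened 28x28 images, shape (batch, 784),
    to output logits of shape (batch, 10)."""
    h = relu(x @ W1)  # hidden layer: 150 units, ReLU activation
    return h @ W2     # output layer: 10 class logits


logits = forward(rng.standard_normal((4, 784)))
print(logits.shape)  # (4, 10)
```

Training with plain stochastic gradient descent and applying WAGE-style quantization, as in the table, would be layered on top of this structure.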