Fig. 5: Hardware demonstration of layer ensemble averaging for the Yin-Yang classification problem. | Nature Communications


From: Layer ensemble averaging for fault tolerance in memristive neural networks

Fig. 5: Hardware demonstration of layer ensemble averaging for the Yin-Yang classification problem.

a Overview of the dataset and the architecture of the 3-layer perceptron network employed for multi-task classification. b Mean test accuracies for layer ensemble averaging at increasing values of the redundancy parameters \(\alpha \in [1,\,4]\) and \(\beta \in [1,\,\alpha ]\) under different combinations of the greedy and random mapping algorithms with the simple and reduced mapping error (RME) encoding algorithms. Here, \(\alpha\) denotes the total number of redundant mappings of each conductance matrix, and \(\beta\) denotes how many rows (out of \(\alpha\)) contribute to the current-averaging process for each output. Individual data points from 20 iterations are shown alongside summary boxes that highlight the underlying distribution. Each box follows Tukey’s rule: the middle line indicates the median, the box boundaries indicate the first and third quartiles, and the whiskers extend to the furthest points within \(\pm \,1.5\times\) the inter-quartile range. c A subset of the hardware mappings \({{\bf{G}}}_{{\bf{pos}}}\) and \({{\bf{G}}}_{{\bf{neg}}}\) of the first fully connected network layer (with dimensions \(4\times 12\)) from a single cycle of the inference process for each configuration. The kernel, column, and row labels correspond to physical locations on the chip. All experimental results were gathered using the Daffodil mixed-signal prototyping platform.
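The \(\alpha\)/\(\beta\) mechanism described in panel b can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's actual greedy or RME mapping algorithms: the differential \({{\bf{G}}}_{{\bf{pos}}}-{{\bf{G}}}_{{\bf{neg}}}\) encoding is simplified (no conductance scaling), the 10% stuck-at-zero fault rate is an assumed value, and the choice of which \(\beta\) of the \(\alpha\) redundant copies to average is reduced to taking the first \(\beta\) copies rather than a fault-aware selection.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical weights for a layer with the 4x12 shape shown in panel c.
W = rng.normal(size=(4, 12))

alpha, beta = 4, 2  # alpha redundant mappings; average beta of them per output

# Simplified differential encoding: W = G_pos - G_neg, all conductances >= 0.
G_pos = np.clip(W, 0.0, None)
G_neg = np.clip(-W, 0.0, None)

# alpha redundant copies of the effective conductance matrix,
# each with independent stuck-at-zero device faults (assumed 10% rate).
copies = np.stack([G_pos - G_neg] * alpha)        # shape (alpha, 4, 12)
faults = rng.random(copies.shape) < 0.10
copies[faults] = 0.0

x = rng.normal(size=4)                            # input (voltage) vector

# Each redundant mapping produces its own output currents ...
per_copy_outputs = np.einsum('i,kij->kj', x, copies)   # shape (alpha, 12)

# ... and the layer output averages beta of the alpha copies per column
# (a stand-in for the paper's fault-aware row selection).
y = per_copy_outputs[:beta].mean(axis=0)          # shape (12,)
```

With \(\beta = 1\) this degenerates to a single (possibly faulty) mapping; larger \(\beta\) dilutes the contribution of any one stuck device, which is the fault-tolerance effect the accuracy sweep in panel b measures.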
