Extended Data Fig. 10: Computational complexities of different training methods. | Nature Machine Intelligence


From: Dual adaptive training of photonic neural networks


The computational complexity, evaluated in FLOPs, of AT, PAT, and DAT with internal states (IS) for training DPNNs and MPNNs is compared under different input sizes (a) and numbers of PNN blocks (b). In the legend, 'AT' denotes adaptive training, and the suffix '(A)' indicates that the angular spectrum method is used to implement the diffractive weighted interconnections. The configurations used in the numerical or physical experiments are marked by dotted vertical lines. The curves are plotted based on Supplementary Tables 1 and 2; see Supplementary Section 14 for a detailed description.
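The angular spectrum method referenced by the '(A)' suffix propagates an optical field between diffractive layers via a pair of FFTs, which is why its cost scales as O(N log N) in the number of pixels N. The sketch below is a minimal, generic NumPy implementation of free-space angular spectrum propagation, not the authors' code; the grid size, pixel pitch, and wavelength are illustrative assumptions.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a 2-D complex field over distance z (all units in metres)
    using the angular spectrum method: FFT, multiply by the free-space
    transfer function, inverse FFT. Evanescent components are suppressed."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)                 # spatial frequencies (1/m)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    # Pure-phase transfer function for propagating waves; zero beyond cutoff.
    H = np.where(arg > 0,
                 np.exp(2j * np.pi / wavelength * z * np.sqrt(np.maximum(arg, 0.0))),
                 0.0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Illustrative example: propagate a Gaussian beam 5 cm and check that
# optical power is conserved (the sampled band is purely propagating).
n, dx, wl = 256, 10e-6, 532e-9                   # assumed grid and wavelength
x = (np.arange(n) - n // 2) * dx
X, Y = np.meshgrid(x, x)
u0 = np.exp(-(X ** 2 + Y ** 2) / (2 * (0.2e-3) ** 2)).astype(complex)
u1 = angular_spectrum_propagate(u0, wl, dx, 0.05)
power_ratio = np.sum(np.abs(u1) ** 2) / np.sum(np.abs(u0) ** 2)
```

Because the transfer function is a pure phase factor within the sampled band, the propagation is unitary and `power_ratio` stays at 1 to machine precision, which makes this a convenient sanity check for any FFT-based diffractive-interconnection layer.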
