Table 2 Experiments on various neural arrays

From: Optical neural network via loose neuron array and functional learning

(a)

|                  | Regular-2 | Regular-3 | Normal-3 | Uniform  | LFNN     | FNN      | Digital DNN |
|------------------|-----------|-----------|----------|----------|----------|----------|-------------|
| LC Neurons       | 2048 × 3  | 3072 × 3  | 3072 × 3 | 3072 × 3 | 2048 × 3 | 2048 × 3 | N.A.        |
| 1-layer MNIST    | 91.03%    | 92.07%    | 92.40%   | 92.45%   | 91.02%   | 91.39%   | 92.71%      |
| 2-layer MNIST    | 96.61%    | 97.30%    | 97.55%   | 97.65%   | 94.77%   | 95.45%   | 98.32%      |
| 3-layer CIFAR10  | 47.48%    | 50.61%    | 51.73%   | 52.53%   | 45.62%   | 46.19%   | 53.62%      |

(b)

|             | Bernoulli-0 | Bernoulli-20 | Bernoulli-40 | Bernoulli-60 |
|-------------|-------------|--------------|--------------|--------------|
| LC Neurons  | 2048 × 3    | 1638 × 3     | 1229 × 3     | 819 × 3      |
| Simulation  | 89.80%      | 82.07%       | 79.17%       | 77.29%       |
| LFNN        | 89.13%      | 81.83%       | 79.36%       | 75.65%       |

  1. (a) Classification accuracy of the four neuron-array simulations, the actual LFNN captured output, the FNN-predicted LFNN output, and the equal-layer digital dense neural network (DNN) reference. The LFNN has 12,288 trainable variables per layer: 6,144 controlling the LC neurons and 6,144 controlling the input/output intensity gains. The FNN has 28,438,144 trainable variables per layer. The digital DNN comprises dense layers connected by ReLU, with neuron sizes of (784, 10), (784, 784, 10), and (3072, 3072, 3072, 10) for the three tasks, respectively; a minimal sketch of these reference networks is given after these notes.
  2. (b) Physical assessment of random neuron arrays. The test is conducted in numerical simulation and on the actual LFNN device for 1-layer MNIST classification. The Bernoulli arrays (Bernoulli-n) are derived from the Regular-2 array by randomly disabling n% (0% to 60%) of the LC neurons as well as all input/output intensity gains; an illustrative sketch of this masking follows these notes. The figure on the right visualizes a Bernoulli-60 array, in which 60% of the LC neurons are randomly disabled.
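
The digital DNN references in (a) are fully specified by the layer sizes listed in note 1, although the paper's table does not give an implementation. The following is a minimal PyTorch sketch under that reading: plain dense layers with ReLU between consecutive layers, MNIST inputs flattened to 784 and CIFAR-10 inputs to 3072; the framework choice and the `dense_dnn` helper are assumptions for illustration.

```python
# Minimal sketch of the equal-layer digital DNN references (assumption:
# plain Linear layers with ReLU between them, as stated in note (a);
# framework and training details are not given in the table).
import torch.nn as nn

def dense_dnn(sizes):
    """Build a dense network from a size tuple such as (784, 784, 10),
    inserting ReLU between consecutive Linear layers."""
    layers = []
    for i in range(len(sizes) - 1):
        layers.append(nn.Linear(sizes[i], sizes[i + 1]))
        if i < len(sizes) - 2:          # no ReLU after the final layer
            layers.append(nn.ReLU())
    return nn.Sequential(*layers)

# The three references used in (a): 1-layer MNIST, 2-layer MNIST, 3-layer CIFAR10.
dnn_mnist_1 = dense_dnn((784, 10))                # 7,850 parameters
dnn_mnist_2 = dense_dnn((784, 784, 10))           # 623,290 parameters
dnn_cifar_3 = dense_dnn((3072, 3072, 3072, 10))   # ~18.9 M parameters

# Sanity check of the parameter counts quoted in the comments above.
for net in (dnn_mnist_1, dnn_mnist_2, dnn_cifar_3):
    print(sum(p.numel() for p in net.parameters()))
```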
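Note 2 describes the Bernoulli-n arrays as the Regular-2 array with n% of its LC neurons randomly disabled (with all input/output intensity gains disabled as well). As an illustration only, such a neuron mask might be drawn as below; the 2048 × 3 array shape follows the table, but the independent per-neuron sampling and the `bernoulli_mask` helper are assumptions, since the paper does not give the exact masking procedure.

```python
# Illustrative sketch (assumption): generate a Bernoulli-n mask for the
# 2048 x 3 Regular-2 array, disabling n% of the LC neurons at random.
import numpy as np

def bernoulli_mask(n_percent, shape=(2048, 3), seed=0):
    """Return a boolean mask where True marks an enabled LC neuron.
    Each neuron is kept independently with probability 1 - n_percent/100."""
    rng = np.random.default_rng(seed)
    keep_prob = 1.0 - n_percent / 100.0
    return rng.random(shape) < keep_prob

for n in (0, 20, 40, 60):
    mask = bernoulli_mask(n)
    # Expected enabled neurons per column, cf. the "LC Neurons" row in (b):
    # 2048, ~1638, ~1229, ~819 for each of the three columns.
    print(n, mask.sum(axis=0))
```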