Extended Data Fig. 2: Loss of plasticity in the Slowly-Changing Regression problem. | Nature

From: Loss of plasticity in deep continual learning

a, The target function and the input in the Slowly-Changing Regression problem. The input has m + 1 bits. The first f bits are the flipping bits: after every T time steps, one of them is chosen and its value is flipped. The next m − f bits are i.i.d. at every time step, and the last bit is always one. The target function is represented by a neural network with a single hidden layer of linear threshold units (LTUs). Each weight in the target network is −1 or 1. b, Loss of plasticity is robust across different activations. These results are averaged over 100 runs; the solid lines represent the mean and the shaded regions correspond to ±1 standard error.
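The input stream and target network described in panel a can be sketched as follows. This is a minimal illustration, not the paper's implementation: the sizes m, f, T, the number of hidden LTUs and the LTU thresholds are assumed values chosen here for demonstration, since the caption does not specify them.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes (assumed; the paper's experiments use their own settings).
m, f, T = 20, 15, 100
n_hidden = 50  # number of LTUs in the target network (assumed)

# Fixed target network: one hidden layer of linear threshold units (LTUs);
# every weight is -1 or +1, as stated in the caption.
W = rng.choice([-1.0, 1.0], size=(n_hidden, m + 1))
v = rng.choice([-1.0, 1.0], size=n_hidden)
theta = np.zeros(n_hidden)  # LTU thresholds (assumed; not given in the caption)

def target(x):
    """Output of the fixed target network for an input x of m + 1 bits."""
    h = (W @ x > theta).astype(float)  # each hidden unit is a linear threshold unit
    return float(v @ h)

# The first f bits change slowly: one is flipped after every T time steps.
flipping = rng.integers(0, 2, size=f).astype(float)

def step(t):
    """Produce the input and target at time step t."""
    if t > 0 and t % T == 0:
        i = rng.integers(f)              # choose one of the first f bits
        flipping[i] = 1.0 - flipping[i]  # and flip its value
    iid = rng.integers(0, 2, size=m - f).astype(float)  # i.i.d. middle bits
    x = np.concatenate([flipping, iid, [1.0]])          # last bit is always one
    return x, target(x)

x, y = step(0)
```

A continual-learning run would then draw (x, y) pairs from `step(t)` for increasing t and train a learner online; the slow drift of the first f bits is what changes the effective target over time.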
