Fig. 6: The architectures of the neural networks implemented. | Communications Physics


From: Artificial intelligence for improved fitting of trajectories of elementary particles in dense materials immersed in a magnetic field


a Recurrent neural network (RNN). In high-level terms, the RNN consists of five bi-directional GRU layers, followed by a linear layer that projects the sum of the GRU layers' outputs onto a vector of length three. b Transformer encoder. It consists of five encoder layers, followed by a linear layer that projects the sum of the encoder layers' outputs onto a vector of length three. For both models, the input hit position (xi, yi, zi) is added to the network's output, so the network learns only the 'residuals' of the reconstructed hits with respect to the true node states (\(\overrightarrow{S}_{\mathrm{in}}\to \overrightarrow{S}_{\mathrm{out}}\)).
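The RNN variant in panel a could be sketched as follows in PyTorch. This is a minimal illustration of the described topology only: the hidden size, and all other hyperparameters not stated in the caption, are assumptions, and the class name `ResidualGRUFitter` is hypothetical.

```python
import torch
import torch.nn as nn


class ResidualGRUFitter(nn.Module):
    """Sketch of panel a: five bi-directional GRU layers whose outputs are
    summed, projected to length three, and added to the input hit position.
    hidden=64 is an assumed hyperparameter, not from the paper."""

    def __init__(self, n_features: int = 3, hidden: int = 64, n_layers: int = 5):
        super().__init__()
        self.grus = nn.ModuleList()
        in_dim = n_features
        for _ in range(n_layers):
            self.grus.append(
                nn.GRU(in_dim, hidden, bidirectional=True, batch_first=True)
            )
            in_dim = 2 * hidden  # bi-directional: forward + backward states

        # Linear layer projecting the summed GRU outputs to a length-3 vector.
        self.proj = nn.Linear(2 * hidden, n_features)

    def forward(self, hits: torch.Tensor) -> torch.Tensor:
        # hits: (batch, n_hits, 3) measured positions (x_i, y_i, z_i)
        x = hits
        layer_sum = torch.zeros(0)
        for i, gru in enumerate(self.grus):
            x, _ = gru(x)
            layer_sum = x if i == 0 else layer_sum + x
        # Residual connection: the network only learns corrections to the
        # input hits (S_in -> S_out).
        return hits + self.proj(layer_sum)
```

The residual connection in `forward` mirrors the caption's design choice: instead of regressing absolute node states, the network outputs a small correction to each measured hit, which is generally easier to learn when hits are already close to the true trajectory.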
