Fig. 4: Historical evolution of artificial neural networks and deep learning, where the horizontal axis represents time, and the vertical axis represents research and development activities.

a Biological neuron model458. b The single-layer perceptron: an artificial neuron computes the weighted sum (∑) of its inputs (with weights θ1–θn) and maps it to the output through an activation function. c CNN: convolutional neural network, consisting of an input layer, a convolutional layer, a pooling layer, a fully connected layer, and an output layer. d RNN: recurrent neural network, in which the input to the hidden layer includes not only the output of the input layer but also the output of the hidden layer at the previous time step. e RBM: restricted Boltzmann machine, an undirected probabilistic graphical model with an input layer and a hidden layer. f DBM: deep Boltzmann machine, consisting of several stacked RBM units; the connections between all layers are undirected. g DBN: deep belief network, consisting of several stacked RBM units; the connection between the rightmost two layers is undirected, while the other connections are directed. h Residual block: two sets of ReLU-activated convolutional layers stacked one above the other, with a shortcut connection that adds the block's input to its output.
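To make the computations sketched in panels b and d concrete, the following minimal NumPy sketch implements a single-layer perceptron forward pass (weighted sum of the inputs followed by an activation) and one recurrent hidden-state update that combines the current input with the previous hidden state. The function names, weight shapes, and the choice of step and tanh activations are illustrative assumptions, not details taken from the figure.

```python
import numpy as np

def step(z):
    # Heaviside step activation used by the classic perceptron
    return np.where(z >= 0.0, 1.0, 0.0)

def perceptron_forward(x, theta, bias=0.0):
    # Panel b: weighted sum of the inputs (the ∑ in the figure),
    # mapped through an activation function to produce the output
    z = np.dot(theta, x) + bias
    return step(z)

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    # Panel d: the hidden layer receives the current input x_t and its own
    # output h_prev from the previous time step
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

# Toy usage with arbitrary values (illustrative only)
x = np.array([0.5, -1.0, 2.0])
theta = np.array([0.4, 0.3, -0.2])
print(perceptron_forward(x, theta))        # 0.0 or 1.0, depending on the weighted sum

rng = np.random.default_rng(0)
h = rnn_step(x, np.zeros(4),
             rng.normal(size=(4, 3)), rng.normal(size=(4, 4)), np.zeros(4))
print(h.shape)                             # (4,)
```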