Figure 3
From: Constructing neural networks with pre-specified dynamics

Graphic depiction of the main stages of gFTP. (a) Main steps to attain a consistent transition graph. gFTP takes a user-defined transition graph as input. An auxiliary graph D is constructed, which contains information about possible inconsistencies. Arc superpositions are detected (in this case, a superposition between the complete path connecting nodes 1, 2, and 3, and the arc between nodes 1 and 3; this is an antiparallel superposition, since we can go from 1 to 3 through green arcs, and from 3 to 1 through a cyan arc). Detected superpositions are applied to the traversable sets (all arcs in D are now green because they must be treated as if they had the same label; cyan arcs were inverted because the superposition was antiparallel). A Depth First Search (DFS) is conducted to look for cycles. The absence of cycles indicates that graph G is consistent. If a cycle is present, one node of G belonging to the cycle in D (after applying superpositions) must be expanded (node 2 was expanded in this case, but expanding node 1 was also possible). A new iteration of the algorithm is then executed starting from the expanded graph. (b) Main steps to obtain matrices \(\textbf{Z}_{s}\), \(\textbf{Z}_{t}\), \(\textbf{W}_{y}\), \(\textbf{W}_{r}\) from \(G_{cons}\). In each iteration, a row vector of ones and zeros is constructed (its elements are a neuron's output in each transition). This vector must differentiate at least two nodes of \(G_{cons}\) not already differentiated by previous row vectors in \(\textbf{Z}_{t}\), and it must also be delta consistent. These row vectors are generated and concatenated, forming \(\textbf{Z}_{t}\), until all nodes are differentiated. Matrix \(\textbf{Z}_{s}\) is constructed from the same vectors, sorted according to the transitions in \(G_{cons}\). An accelerated perceptron is then trained, where each row of [\(\textbf{Y}\);\(\textbf{Z}_{s}\)] is a sample to classify, and each column of \(\textbf{Z}_{t}\) defines a set of classes. The algorithm stops and outputs all matrices if the perceptron training error reaches zero in fewer than max_iter iterations. If the error is still above zero after max_iter iterations, a new round of vectors that differentiate all nodes is generated and concatenated to \(\textbf{Z}_{t}\) and \(\textbf{Z}_{s}\).
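
The DFS cycle test in (a) can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: it assumes the auxiliary graph D is given as a plain adjacency dictionary (after superpositions have been applied) and returns one directed cycle, whose nodes are the candidates for expansion in G.

```python
from typing import Dict, List, Optional

def find_cycle(adj: Dict[int, List[int]]) -> Optional[List[int]]:
    """Return the nodes of one directed cycle, or None if the graph is acyclic.

    Iterative DFS with three colors: 0 = unvisited, 1 = on the current
    DFS path, 2 = finished. A back edge to a color-1 node closes a cycle.
    """
    color = {v: 0 for v in adj}
    parent: Dict[int, Optional[int]] = {}
    for start in adj:
        if color[start]:
            continue
        color[start] = 1
        parent[start] = None
        stack = [(start, iter(adj[start]))]
        while stack:
            node, it = stack[-1]
            nxt = next(it, None)
            if nxt is None:
                color[node] = 2          # all successors explored
                stack.pop()
            elif color.get(nxt, 0) == 0:
                color[nxt] = 1
                parent[nxt] = node
                stack.append((nxt, iter(adj.get(nxt, []))))
            elif color[nxt] == 1:
                # Back edge: walk parents to reconstruct nxt -> ... -> node.
                cycle = [node]
                while cycle[-1] != nxt:
                    cycle.append(parent[cycle[-1]])
                return cycle[::-1]
    return None

# Toy auxiliary graph D after applying superpositions (hypothetical data).
D = {1: [2], 2: [3], 3: [1]}
cycle = find_cycle(D)
if cycle is None:
    print("D is acyclic: G is consistent")
else:
    print(f"cycle {cycle}: expand one of these nodes in G")
```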
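
The perceptron stage in (b) can be sketched similarly. This is a hedged illustration rather than the paper's accelerated perceptron: it uses the plain batch perceptron rule, assumes the samples X (the rows of [\(\textbf{Y}\);\(\textbf{Z}_{s}\)]) are arranged as (n_samples, n_features), and treats each column of \(\textbf{Z}_{t}\) as the binary targets for one unit.

```python
import numpy as np

def train_perceptrons(X: np.ndarray, Z_t: np.ndarray,
                      max_iter: int = 1000) -> tuple[np.ndarray, bool]:
    """Fit one linear threshold unit per column of Z_t on the rows of X.

    X   : (n_samples, n_features) inputs, assumed rows of [Y; Z_s].
    Z_t : (n_samples, n_units) binary targets in {0, 1}.
    Returns (W, converged); W is (n_features + 1, n_units), last row = biases.
    """
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])   # append bias input
    W = np.zeros((Xb.shape[1], Z_t.shape[1]))
    targets = 2 * Z_t - 1                           # map {0, 1} -> {-1, +1}
    for _ in range(max_iter):
        preds = np.where(Xb @ W >= 0, 1, -1)
        errors = preds != targets
        if not errors.any():
            return W, True                          # zero training error
        # Plain batch perceptron update on misclassified samples
        # (the paper's accelerated variant is not reproduced here).
        W += Xb.T @ (errors * targets)
    return W, False
```

In the same spirit as the caption's loop, a `converged == False` result would trigger the generation of a new round of node-differentiating vectors, which are concatenated to \(\textbf{Z}_{t}\) and \(\textbf{Z}_{s}\) before retraining.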