Extended Data Fig. 8: Difference between output and input entropies.
From: Neuronal parts list and wiring diagram for a visual system

The difference between output and input entropies (in nats) quantifies the degree of divergence or convergence of a cell type's connectivity. Because perplexity is the exponential of entropy, this difference equals the logarithm of the ratio of out-perplexity to in-perplexity. The connectivity of the top types (top left) is more divergent, as their output entropy exceeds their input entropy. The connectivity of the bottom types (bottom right) is more convergent, as their input entropy exceeds their output entropy.
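As a minimal sketch of the quantity described above, the snippet below computes output and input entropies from hypothetical synapse counts (the counts and the helper `entropy_nats` are illustrative, not from the paper) and checks that the entropy difference equals the log of the perplexity ratio.

```python
import numpy as np

def entropy_nats(weights):
    """Shannon entropy in nats of the distribution given by nonnegative weights."""
    w = np.asarray(weights, dtype=float)
    p = w[w > 0] / w.sum()
    return float(-(p * np.log(p)).sum())

# Hypothetical synapse counts for one cell type:
out_counts = [120, 80, 50, 30, 20]  # synapses onto 5 downstream types
in_counts = [200, 100]              # synapses from 2 upstream types

H_out = entropy_nats(out_counts)
H_in = entropy_nats(in_counts)
diff = H_out - H_in  # > 0: divergent connectivity; < 0: convergent

# Perplexity is exp(entropy), so the entropy difference equals
# log(out-perplexity / in-perplexity).
out_perp, in_perp = np.exp(H_out), np.exp(H_in)
print(diff, np.log(out_perp / in_perp))
```

Here the output distribution spreads over more partners, so `diff` is positive and the type counts as divergent by this measure.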