Fig. 9: Attention-invariant tuning. | Nature Communications


From: Modeling attention and binding in the brain through bidirectional recurrent gating


a Orientation tuning curves of four neurons from layers 1, 3, 5, and 7 (from left to right) of our 8-layer model trained on the curve-tracing task. For the analysis of attention-invariant tuning, we only consider neurons with Gaussian-like tuning curves (e.g., the two plots on the left) and reject the rest. b The fraction of neurons that respond to the target bar (i.e., the target bar is in their receptive field) increases in deeper layers, while their orientation tuning curves become more complex (i.e., less Gaussian-like). c Attention increases the response amplitude but has no significant effect on the width, asymptote, or preferred orientation of the tuning curves. c-i Our results from the penultimate feature layer of our model are compared with c-ii findings from V4 cortical neurons in macaques121. The red arrows and values indicate the median in each graph. Number of neurons used for the analysis: n = 883. The plots are styled similarly to those in McAdams & Maunsell (1999)121.
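The effect described in panel c corresponds to a multiplicative-gain account of attention: the tuning curve's amplitude is scaled while its width, asymptote (baseline), and preferred orientation stay fixed. The sketch below illustrates that model on synthetic data; it is not the authors' analysis code, and all parameter values (gain of 1.3, preferred orientation of 20°, etc.) are illustrative.

```python
import numpy as np

def gaussian_tuning(theta, amp, pref, width, baseline):
    """Gaussian orientation tuning curve; theta, pref, width in degrees."""
    return baseline + amp * np.exp(-0.5 * ((theta - pref) / width) ** 2)

# Orientations sampled in 1-degree steps
theta = np.linspace(-90, 90, 181)

# Unattended response, and attended response with a multiplicative gain
# applied to the amplitude only (gain value chosen for illustration)
unattended = gaussian_tuning(theta, amp=10.0, pref=20.0, width=25.0, baseline=2.0)
attended = gaussian_tuning(theta, amp=10.0 * 1.3, pref=20.0, width=25.0, baseline=2.0)

# Preferred orientation is unchanged; only the amplitude grows
pref_unattended = theta[np.argmax(unattended)]
pref_attended = theta[np.argmax(attended)]
amp_ratio = (attended.max() - 2.0) / (unattended.max() - 2.0)
```

Under this model, `pref_unattended` and `pref_attended` coincide and `amp_ratio` equals the applied gain, mirroring the finding that attention changes response amplitude without shifting the other tuning-curve parameters.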
