
Fig. 4: Hierarchical semantic relationship between word representations.

From: Semantic encoding during language comprehension at single-cell resolution


a, Left: the activity of each neuron was regressed onto 300-dimensional word embedding vectors. A PC analysis was then used to reduce the dimensionality of the concatenated set of model parameters such that the cosine distance between each projection reflected the semantic relationship between words as represented by the neural population. Right: PC space with arrows highlighting two representative word projections. The explained variance and correlation between cosine distances for word projections derived from the word embedding space versus neural data (n = 258,121 possible word pairs) are shown in Extended Data Fig. 7a,b. b, Left: activities of neurons for word pairs based on their vectoral cosine distance within the 300-dimensional embedding space (z-scored activity change over percentile cosine similarity, red regression line; Pearson’s correlation, r = 0.17). Right: correlation between vectoral cosine distances in the word embedding space and difference in neuronal activity across possible word pairs (orange) versus chance distribution (grey, n = 1,000, P = 0.02; Extended Data Fig. 7c). c, Left: scatter plot showing the correlation between population-averaged neuronal activity and the cophenetic distances between words (n = 100 bins) derived from the word embedding space (red regression line; Pearson’s correlation, r = 0.36). Right: distribution of correlations between cophenetic distances and neuronal activity across the different participants (n = 10).
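A minimal sketch of the analysis steps described above, under stated assumptions: the inputs `rates` (words × neurons, per-word firing rates) and `embeddings` (words × 300, pretrained word-embedding vectors) are hypothetical placeholders, and the variable names, toy sizes and preprocessing are illustrative rather than the authors' exact pipeline.

```python
# Illustrative sketch only; inputs are simulated stand-ins, not the study's data.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import pearsonr, zscore
from scipy.cluster.hierarchy import linkage, cophenet
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_words, n_neurons, n_dims = 500, 100, 300            # toy sizes (assumed)
embeddings = rng.standard_normal((n_words, n_dims))    # placeholder word embeddings
rates = rng.standard_normal((n_words, n_neurons))      # placeholder per-word firing rates

# a) Regress each neuron's activity onto the 300-dimensional embeddings, concatenate
#    the fitted coefficients, and apply PCA so each word gets a projection in a
#    reduced "neural semantic" space.
coefs = np.column_stack([
    LinearRegression().fit(embeddings, rates[:, i]).coef_   # 300 weights per neuron
    for i in range(n_neurons)
])                                                           # shape: 300 x n_neurons
pca = PCA(n_components=10).fit(coefs.T)                      # PCs over the concatenated weights
word_proj = embeddings @ pca.components_.T                   # project words into PC space

# Compare pairwise cosine distances in embedding space vs the neural projection space.
d_embed = pdist(embeddings, metric="cosine")
d_neural = pdist(word_proj, metric="cosine")
r_pairs, _ = pearsonr(d_embed, d_neural)

# b) Relate embedding-space cosine distance to differences in neuronal activity
#    across word pairs, and compare against a chance (shuffle) distribution if desired.
pop_rate = zscore(rates.mean(axis=1))                        # population-averaged activity
d_activity = pdist(pop_rate[:, None], metric="euclidean")    # |activity_i - activity_j|
r_activity, _ = pearsonr(d_embed, d_activity)

# c) Cophenetic distances from hierarchical clustering of the embeddings,
#    correlated with the activity differences across word pairs.
coph_dist = cophenet(linkage(embeddings, method="average"))
r_coph, _ = pearsonr(coph_dist, d_activity)

print(r_pairs, r_activity, r_coph)
```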
