Abstract
Emotions comprise multiple coordinated responses, including facial expressions and subjective experiences. Although functional magnetic resonance imaging (fMRI) studies have identified brain regions associated with facial or emotional responses, few have simultaneously assessed and statistically dissociated these components. Additionally, the functional networks associated with these emotional responses, and the dynamic interplay between such networks, remain uncertain. To investigate these issues, we measured fMRI while participants viewed emotional films, together with their facial videos and dynamic valence ratings. Regional activity analysis revealed that facial responses (lip-corner-pulling actions) were associated with activation in the limbic regions and somatosensory motor cortices. Subjective emotional responses (the absolute values of valence ratings) were associated with activity in the medial parietal and lateral temporoparietal cortices. Independent component analysis revealed that the independent components associated with facial and subjective responses included the abovementioned activated regions. Dynamic causal modeling of these independent components supported a model in which the visual/auditory processing component modulated the facial response component, which subsequently influenced the subjective response component. Our findings imply that, during emotional processing, facial responses are initially generated by the limbic and sensorimotor cortical networks; subsequently, these responses give rise to subjective experiences through activity in the medial parietal-lateral temporoparietal networks.
Introduction
Emotion is a significant yet complex psychological phenomenon, attributable in part to its multicomponent nature1,2. Extensive psychophysiological evidence demonstrates that emotional stimuli evoke a range of responses, including subjective feelings and bodily reactions, that are systematically coordinated3. Among these responses, the subjective experience of emotional valence and the corresponding facial expressions have garnered considerable attention. The subjective experience constitutes a core component of emotion, with emotional valence—defined as the continuum from positive to negative affective states—serving as a fundamental low-dimensional descriptor2,4,5. Facial expressions are among the most distinctive bodily responses associated with emotional states6. Numerous studies have demonstrated a systematic association between subjective emotional valence and facial expressions7,8,9,10,11,12,13,14,15,16,17,18,19,20,21. For example, a previous study found that participants exposed to emotionally positive films showed a positive association between their positively valenced subjective experiences and increased activation of the zygomatic major muscle, which is responsible for the lip-corner-pulling action19. However, the strength of the association was moderate; for example, the correlation coefficient between dynamic valence ratings and zygomatic major muscle activity was approximately 0.2 (ref. 19). These data imply that facial and subjective emotional responses may reflect related but different underlying processes. Previous studies have suggested a number of mechanistic relationships between these responses, which can largely be classified into two categories. First, some researchers, including Darwin22, proposed that facial expressions are readouts of inner emotional experiences, a concept referred to as the readout hypothesis23 and constituting the commonsense view24. Others, including James25, proposed that facial expressions are produced first and subsequently produce or modulate emotional experiences, which is referred to as the facial feedback hypothesis26,27,28,29. Although several empirical investigations have tested the facial feedback hypothesis, the results have been inconsistent, including both positive30,31,32 and negative33 findings, and it therefore remains a matter of debate.
To elucidate the neural mechanisms underlying emotion, numerous functional neuroimaging studies using functional magnetic resonance imaging (fMRI) and positron emission tomography have examined brain regions associated with facial expressions and subjective experiences. Several studies have investigated the neural substrates involved in the production of emotional facial expressions by concurrently recording facial electromyography34,35 or video data36,37. These studies identified activation in subcortical regions, including the amygdala34,35,37, basal ganglia36, thalamus37, and cerebellum37, and in cortical regions including the occipital and temporal cortices37, lateral posterior parietal cortices37, somatosensory motor cortices37, supplementary motor cortices36, and prefrontal cortices37. Numerous studies investigated the neural correlates of the intensity or arousal of positive and negative subjective emotional experiences. These studies reported the involvement of subcortical regions such as the amygdala38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53, basal ganglia43,46,47,48,49,54,55,56, and cerebellum46,48,49,51,57,58, as well as various neocortical areas including the occipital and temporal cortices38,47,48,50,51,53, medial parietal regions (e.g., posterior cingulate cortex and precuneus)47,50,51,54,59, lateral posterior parietal cortices46,48,51,58, insular cortex43,46,47,55,59,60,61, somatosensory motor cortices48, and prefrontal cortices39,42,46,47,48,49,50,52,54,57,58,59,60,62,63. Taken together, these findings imply that a broad network of brain regions, spanning subcortical and cortical structures, may be implicated in facial expression and subjective emotional processing.
However, few studies have simultaneously recorded facial and subjective emotional responses while statistically dissociating their effects. This represents a critical gap in the existing literature, as numerous psychological studies have reported moderate correlations among various components of emotional responses. These findings raise the possibility that the observed associations with facial expressions or subjective emotional experiences may, at least in part, reflect the influence of the other, correlated emotional processes. One notable exception is an fMRI study that directly addressed this issue37. In that study, the researchers recorded the facial reactions of 13 participants in response to humorous cartoons and subsequently collected humor perception ratings after the scanning session. Brain activity associated with smiling, after statistically controlling for humor perception, was observed in subcortical regions, including the amygdala, basal ganglia, and cerebellum, as well as in neocortical regions such as the temporal and prefrontal cortices. Conversely, activity related to humor perception, after removing the influence of facial expression, was predominantly found in neocortical areas, including the temporo-parieto-occipital junction, temporal cortex, and prefrontal cortex. These findings imply a functional dissociation between the neural substrates underlying facial expressions and those supporting subjective emotional experience. However, the study37 had several limitations. Measurement of humor perception may not fully reflect purely subjective emotional experiences, as it entails cognitive appraisal of the stimuli64. Moreover, the study37 used only positively valenced stimuli and included a relatively small sample size. Other studies have reported inconsistent findings regarding the neural correlates of facial and subjective responses. For instance, the relationship between amygdala activity and self-reported emotional experience has varied across investigations. Further studies are needed to clarify these associations. Our study was designed to address these concerns by testing the hypothesis that distinct neural regions are differentially associated with facial and subjective emotional responses, consistent with the results reported previously37.
Furthermore, it remains unclear whether the brain regions associated with facial and subjective emotional responses form distinct functional networks. To our knowledge, no prior study has investigated segregated patterns of functional connectivity corresponding to facial and subjective emotional responses. Considering that previous studies have identified functional network configurations during emotional processing65 and the execution of facial expressions66 involving multiple interconnected brain regions, we hypothesized that the brain areas exhibiting activity related to facial or subjective responses would constitute a functionally integrated neural network.
In addition, assuming the existence of distinct neural networks associated with facial and subjective emotional responses, the dynamic relationship between these networks remains uncertain. As noted above, several psychological theories have long proposed models of emotional responses that may correspond to such network dynamics. The readout hypothesis posits that the recognition of emotional stimuli initially elicits subjective emotional experiences, which subsequently generate facial emotional expressions23. In contrast, proponents of the facial feedback hypothesis have argued for the opposite sequence, suggesting that facial expressions themselves can influence and shape subjective emotional experiences25. Alternatively, some studies have proposed that stimulus recognition elicits both subjective and facial emotional responses67. Considering the relatively robust empirical support for the facial feedback hypothesis30,31,32, we hypothesized that the neural network involved in stimulus recognition would first activate the network associated with facial responses, which would subsequently modulate the network underlying subjective emotional experiences.
To test these hypotheses, we conducted fMRI while participants viewed emotionally evocative films, during which their facial reactions were simultaneously video recorded. Following image acquisition, participants provided cued-recall dynamic valence ratings. First, we performed an analysis using a general linear model (GLM) incorporating both facial responses, quantified using Action Unit (AU) 12 (lip corner pulling), as defined by the Facial Action Coding System (FACS)68,69, and subjective emotional responses, indexed by the absolute values of dynamic valence ratings as a measure of emotional intensity (regardless of valence polarity), as in previous studies54,70,71,72,73. Because the 32-channel head coil covered the upper half of the face, we limited our analysis to AU 12, a prototypical indicator of positive emotional expression (Fig. 1). We used the cued-recall procedure based on the proposal that online introspective monitoring could interfere with the authenticity of emotional responses74,75,76 and on data showing high positive correlations between online and cued-recall dynamic ratings in response to emotionally evocative films14,77. We analyzed the absolute values of valence ratings because meta-analyses of neuroimaging studies have reported that several emotion-related brain regions described above, such as the amygdala, can be consistently activated in response to both negative and positive emotions relative to neutral emotions78,79,80. Our analysis statistically dissociated neural activity associated with facial and subjective emotional responses. Next, to identify large-scale functional connectivity networks associated with each response type, we conducted a group-level independent component analysis (ICA) of the fMRI data81. Functional connectivity was defined as temporally synchronized activity among spatially distinct brain regions82. ICA is one of the two principal methods used to examine functional connectivity, the other being seed-based correlation analysis83. Although seed-based methods are suited for identifying connectivity between a predefined seed region and the rest of the brain, ICA is more appropriate for uncovering global connectivity networks across multiple brain regions84. Finally, to examine the dynamic coupling between the independent components (ICs) associated with facial and subjective emotional responses, we applied dynamic causal modeling (DCM)85 to the ICs. DCM for ICs has been proposed as a suitable method for investigating causal interactions between large-scale brain networks comprising multiple functionally distinct regions86,87,88.
Results
Subjective and facial emotional responses
Figure 2 shows the group mean second-by-second dynamic valence ratings (left) and AU 12 (right) during each film. The emotionally evocative film clips elicited facial and subjective emotional responses. For example, the valence ratings and AU 12 for the negative film showed a slight increase followed by a decline, reflecting the film’s content: a pleasant group gathering scene followed by a massacre. Repeated-measures analyses of variance (ANOVAs) with emotion as a factor showed significant main effects of emotion both for averaged valence ratings (mean ± standard error, −0.84 ± 0.11, −0.27 ± 0.12, 1.76 ± 0.15 for negative, neutral, and positive, respectively) and averaged AU 12 (mean ± standard error, 0.03 ± 0.03, 0.06 ± 0.04, 0.22 ± 0.06 for negative, neutral, and positive, respectively) (F[2, 64] = 106.99 and 5.99, p < 0.001 and p = 0.004, η²p = 0.77 and 0.16, respectively). Holm-corrected multiple comparisons revealed that the averaged valence ratings were significantly higher for the positive films than for the neutral and negative films, and for the neutral films than for the negative films (t > 3.01, p < 0.005). The averaged AU 12 responses were significantly higher for the positive films than for the neutral and negative films (t > 2.72, p < 0.02). When the absolute value of valence was calculated as the measure of emotional intensity, ANOVA for the averaged values (mean ± standard error, 0.91 ± 0.09, 0.48 ± 0.10, 1.76 ± 0.15 for negative, neutral, and positive, respectively) showed a significant main effect of emotion (F[2, 64] = 34.87, p < 0.001, η²p = 0.52). Multiple comparisons revealed significantly higher values for both the positive and negative films than for the neutral films, and for the positive than for the negative films (t > 2.75, p < 0.01).
Regional brain activity
For fMRI data analysis, a random-effects analysis89 was first conducted to identify regions showing significant activation associated with film observation, facial emotional responses, and subjective emotional responses.
Contrasts associated with the film observation revealed widespread activation across neocortical regions. Significant clusters were identified in the bilateral occipital and temporal lobes, left and right parietal lobes, bilateral dorsomedial frontal lobes, and left ventromedial frontal cortex (Table 1 and Fig. 3).
Contrasts associated with facial emotional responses (i.e., AU 12) revealed significant activation in the bilateral somatosensory and motor cortices, including the postcentral gyrus, insula, operculum, precentral gyrus, supplementary motor area, and middle cingulate cortex (Table 1 and Fig. 3). In addition, widespread activation was observed in the right limbic regions, with prominent foci in the amygdala and putamen.
Contrasts assessing positive associations with subjective emotional responses (indexed by the absolute values of dynamic valence ratings) demonstrated significant activity in the bilateral medial parietal regions, including the precuneus; the bilateral lateral temporal regions, including the middle temporal gyri; and the bilateral cerebellum (Table 1 and Fig. 3).
Independent component analysis
Next, group ICA of the fMRI data was conducted to analyze functional brain networks associated with film observation, facial emotional responses, and subjective emotional responses. The group ICA estimated 30 ICs, and the reconstructed time courses of the ICs were then evaluated by random-effects analyses, as in the regional activity analyses discussed above.
Contrasts associated with the film observation revealed three significant components: IC #5 in the early visual areas, IC #7 in the higher-order visual areas (e.g., the middle temporal gyrus), and IC #11 in the early auditory areas (Fig. 4). Further contrasts revealed one component to be significantly associated with facial emotional responses (IC #13) and another to be significantly related to subjective emotional responses (IC #15) (Fig. 4). These components encompassed brain regions previously identified in the regional brain activity analyses, indicating functional connectivity among these areas.
Dynamic causal modeling for independent components
Finally, we applied stochastic DCM90,91,92 to the ICs identified in the above ICA to examine the dynamic interactions between the functional brain networks associated with sensory processing (ICs #5, #7, and #11 for early visual, higher-order visual, and auditory processing, respectively), facial emotional responses (IC #13), and subjective emotional responses (IC #15). Based on ample psychological evidence showing associations between facial and subjective emotional responses (e.g., ref. 7), we assumed an interaction between the facial and subjective emotional response networks. In addition, based on anatomical evidence indicating that large-scale connections are generally reciprocal93,94, we assumed bidirectional connectivity across the network. We tested the relationships between facial and subjective emotional responses by constructing three model variants, in which the sensory processing networks interacted with both the facial and subjective emotional response networks independently, with the subjective response network only, or with the facial response network only. In addition, we explored relationships among the three sensory processing networks by constructing three variants: convergence of early visual information onto the higher-order visual processing network; convergence of both early visual and auditory information onto the higher-order visual processing network (i.e., multimodal sensory association95,96); and independent processing by the three networks. In total, we constructed nine models (Fig. 5). The models were systematically compared using random-effects Bayesian model selection97. In addition, we grouped the models into three families98 according to the relationships between the sensory and facial/subjective emotional response networks and conducted model-family comparisons.
Figure 5: The nine models, divided into three families (each containing the three models in dashed boxes) based on three alternative hypotheses, in which the sensory processing components interacted with both the facial and subjective emotional response components, the subjective response component only, or the facial response component only. The arrows indicate connections between components.
The exceedance probability in the model comparison revealed that the model in which the three sensory processing networks interacted independently with the facial emotional response system, which in turn interacted with the subjective emotional response system, was optimal (Fig. 6). Comparisons of model families confirmed that the models with connectivity between the sensory and facial response systems were superior to the other models positing connectivity between the sensory and subjective response systems or connections of the sensory processing networks with both emotional response networks (Fig. 6).
Discussion
The results of our regional brain activity analyses demonstrated that the production of emotional facial expressions, measured by AU 12, was associated with activity in several brain regions, including both subcortical and neocortical structures. In particular, subcortical activation was observed in the amygdala, putamen, and thalamus, whereas neocortical activation involved the somatosensory regions, insular cortex, supplementary motor area, and middle cingulate cortex. The activity observed in these regions is largely consistent with previous studies, which have specifically identified facial-expression-related activity in the amygdala34,35,37, basal ganglia36, thalamus37, somatosensory and motor cortices37, and supplementary motor cortices36. Several lesion studies have investigated the neural correlates of emotional facial expressions, reporting that damage to specific brain regions, including the thalamus, insular cortex, supplementary motor area, and middle cingulate cortex, can impair the production of these expressions, consistent with our findings (e.g., ref. 99; for a review, see ref. 100). Furthermore, prior clinical studies have indicated that patients with temporal lobe epilepsy exhibit unilateral weakness in emotional facial expressions contralateral to the side of mesial temporal sclerosis, consistent with our findings101,102.
Notably, amygdala activity was associated with facial, but not subjective, emotional responses. Consistent with prior studies, our findings revealed statistically dissociated facial and subjective emotional responses37. However, numerous previous studies have reported inconsistencies, with amygdala activity being linked to subjective emotional responses (e.g., ref. 38). These inconsistencies could be explained by the fact that such studies did not measure facial emotional responses: because facial responses are more directly associated with amygdala activity and are positively correlated with subjective responses, those studies may have detected spurious associations between amygdala activity and subjective emotional responses. Our findings are consistent with previous neuroimaging103,104,105,106,107,108,109,110,111,112,113,114,115,116,117,118,119,120,121,122, neurophysiological123,124, and lesion125,126,127 studies indicating that the amygdala is involved in nonconscious emotional processing prior to the production of subjective experiences. Some recent neuroimaging studies have also shown that the amygdala is activated in response to emotional stimuli independently of conscious evaluation of the stimuli128,129. In particular, the amygdala appears to contribute to the rapid, preconscious appraisal of the emotional salience of stimuli130,131, a process that may trigger immediate emotional facial expressions132,133, which corresponds to the role of the appraisal process proposed in psychological emotion theories1,4,134,135,136,137. The amygdala may enable such rapid and complex computation through its rich intra- and interregional connectivity138,139,140.
Conversely, our findings revealed that subjective emotional responses, statistically dissociated from facial emotional responses, were primarily associated with activity in the medial parietal cortex and lateral temporoparietal regions. This pattern of activation aligns with previous neuroimaging studies that have investigated the neural correlates of subjective emotional experiences (e.g., ref. 59), including studies that have explicitly aimed to dissociate facial and subjective emotional components37. Notably, activation in these regions is consistent with earlier findings implicating the medial parietal and temporoparietal areas in non-emotional, self-reflective subjective states (e.g., ref. 141; for reviews, see refs. 142,143). Our findings support psychological models suggesting that valenced subjective experiences function as a monitoring component, emerging from the integration and representation of changes in other affective and cognitive components4. Moreover, the medial parietal and lateral temporoparietal regions have also been implicated in processes related to the estimation of others’ mental states (e.g., refs. 144,145; for reviews, see refs. 146,147,148). This overlap aligns with the conceptualization of theory of mind, or mentalizing, as the cognitive capacity to attribute mental states to oneself and others149. It supports the proposal that subjective emotional experiences may be constructed through mechanisms similar to those involved in understanding the subjective experiences of others29. Taken together, these findings imply that subjective emotional experiences are likely constructed via mentalizing processes, emphasizing a convergence between self-referential emotional appraisal and social cognitive functions.
In addition, our group ICA revealed that the aforementioned brain regions associated with facial or subjective emotional responses form distinct ICs. Our findings are consistent with anatomical evidence from non-human primates demonstrating white matter (WM) connectivity among the amygdala, basal ganglia, and sensorimotor cortices (for reviews, see refs. 138,150,151), and between the medial parietal and lateral temporoparietal regions (for reviews, see refs. 152,153), as well as their connections with the visual cortices93,154. The results are also consistent with previous findings of functional or effective connectivity among these regions (e.g., ref. 66), although no studies have directly demonstrated associations between such connectivity and either facial or subjective emotional responses. Our findings provide novel evidence implying that these brain regions cooperate as integrated functional neural networks, each supporting the implementation of either facial or subjective components of emotional experience.
Furthermore, our DCM for ICs revealed dynamic interactions among the relevant neural networks. In particular, the sensory processing networks were functionally connected with the network underlying facial, but not subjective, emotional responses. Subsequently, the facial and subjective emotional response networks exhibited bidirectional interactions. Our findings have important theoretical implications for understanding the psychological mechanisms underlying emotional responses. Previously, Darwin22 proposed that facial expressions are readouts of inner emotional experiences. Several researchers subsequently supported this commonsense view of a direct pathway from subjective feelings to bodily changes23. In contrast, James25 proposed the inverse of this proposition: that stimuli evoke bodily responses, which subsequently give rise to subjective emotional experiences. Several subsequent theories have suggested that facial expressions play a formative role in constructing subjective emotional experiences (e.g., ref. 28; for a review, see ref. 29). There remains long-standing contention regarding these hypotheses32, partly due to the methodological limitations of behavioral data in elucidating the underlying mechanisms. Our findings regarding the neural dynamics provide empirical support for the foundational premise of James’s theory25 and the facial feedback hypothesis. In particular, our study revealed that the sensory processing of emotionally salient stimuli initiates activation of appraisal and facial response systems, which subsequently influence the emergence of subjective emotional experiences.
Our DCM for ICs also suggested that the early and higher-order visual processing networks and the auditory processing network interacted independently with the facial response network. These results are plausible if we assume that the facial response network relies on the amygdala for sensory input, because anatomical evidence indicates that the amygdala receives inputs from both visual and auditory areas138,139. In addition, the results are consistent with previous findings reporting that the amygdala modulates activity in widespread sensory areas during the perception of emotional stimuli155,156,157,158,159,160,161.
This study had several limitations. First, we assessed facial emotional responses solely in the lower face. This constraint arose from the use of a head coil that covered the upper face, limiting our ability to evaluate action units involving the upper facial region, such as brow lowering, an expression typically associated with negative emotional states. Future studies using half-head coils or other advanced imaging setups are needed to enable comprehensive assessment of upper and lower facial expressions.
Second, we assessed only valence ratings for subjective emotional responses. Although the dimensional emotional perspective generally posits that subjective emotional experience can be well represented by the two dimensions of valence and arousal3,162, we did not assess arousal ratings because several previous studies reported that the absolute values of valence ratings in response to emotional visual stimuli could produce overlapping information with arousal ratings163. Our preliminary analyses also showed a positive correlation between the absolute values of valence ratings and arousal ratings for the films used in this study (mean ± standard error, r = 0.55 ± 0.07; see “Methods”). However, there remains debate about the relationship between valence and arousal164,165, and it may be possible to elicit valence and arousal states independently using certain stimuli, such as surprising films166,167,168. Testing of subjective arousal remains a matter for future research.
Third, we assessed subjective emotional ratings using the cued-recall procedure. We selected this procedure based on the proposal that online introspective monitoring could interfere with the authenticity of emotional responses74,75,76, for example by reducing immersion75, and on data showing significant positive correlations between online and cued-recall ratings for the stimuli used in the present study (mean ± standard error, r = 0.66 ± 0.04; see “Methods”)14. However, these data also imply that online and cued-recall ratings could differ substantially (1 − 0.66² ≈ 56% of the variance is unshared), and our regressors of subjective emotional experience may therefore have provided an incomplete estimation of online subjective emotional responses. Future studies should test online ratings to confirm the generalizability of the present findings.
Fourth, the sample size was small, which may have resulted in a lack of statistical power169,170,171,172 to detect associations between facial or subjective emotional responses and other brain regions. For example, some previous studies have suggested that the posterior superior temporal sulcus could be involved in the production of facial expressions173,174,175,176. Several studies have shown that the dorsomedial prefrontal cortex is activated during tasks involving mentalizing (e.g., ref. 177; for reviews, see refs. 178,179,180). These data imply that additional brain regions may be involved in facial or subjective emotional responses, which should be investigated in future studies.
Finally, our fMRI measurement may have lacked the temporal resolution to dissociate functional networks. Electrophysiological measures, including scalp or intracranial electroencephalography, are needed to investigate neural activity with high temporal resolution. Some previous electrophysiological studies showed that the amygdala was activated as early as 100 ms in response to emotional photographs181,182. Other studies reported that the medial parietal cortex was activated while evaluating one’s own mental state at about 300 ms183. Several other studies also reported electrical activity in the posterior cortices while viewing emotionally evocative films at different temporal or frequency profiles184,185,186,187. These data imply that the brain regions and networks identified in this study may exhibit different temporal profiles during the production of facial and subjective emotional responses. Future electrophysiological studies are warranted to test this idea.
In conclusion, our fMRI study delineated the regional brain activities, functional networks, and dynamic interaction patterns specifically associated with facial and subjective emotional responses. Regional brain activity analyses revealed that facial responses measured by AU 12 were primarily associated with limbic, motor, and somatosensory regions, whereas subjective emotional experiences, reflected by absolute valence ratings, were associated with medial parietal and lateral temporoparietal regions. Group ICA identified that these regions formed distinct ICs. DCM further demonstrated that the IC associated with visual recognition interacted with the facial motor response IC, which subsequently influenced the IC underlying subjective emotional responses. These findings imply that emotional responses are implemented through a dynamic and hierarchical interaction from the limbic–motor network to the mentalizing network.
Methods
Participants
Our study included 33 healthy volunteers (11 females and 22 males; mean ± standard deviation age, 22.3 ± 2.9 years). The sample size was determined heuristically185, guided by prior fMRI studies investigating neural activity associated with facial and subjective emotional responses (n = 13 in ref. 37; n = 28 in ref. 35). All participants were right-handed, as determined using the Edinburgh Handedness Inventory188, and possessed normal or corrected-to-normal visual acuity. Before enrollment, the experimental procedures were thoroughly explained, and written informed consent was obtained from all participants. The Ethics Committee of the Unit for Advanced Studies of the Human Mind, Kyoto University, approved the study protocol. All ethical regulations relevant to human research participants were followed.
Experimental design
A within-subject, one-factor design was used, with emotion (negative, neutral, and positive) as the sole factor.
Stimuli
Film clips were used to induce negative, neutral, and positive emotional states. Negative affect was induced using scenes from “Cry Freedom” that portrayed acts of violence against vulnerable individuals. A neutral emotional state was evoked through the presentation of a standard screensaver display. These film stimuli were developed by Gross and Levenson189. To induce positive affect, a comedic dialogue between two individuals, excerpted from commercial films (M-1 Grand Prix The Best 2007–2009, Yoshimoto, Tokyo, Japan) and previously used by Sato et al.14, was shown. The durations of the negative, neutral, and positive stimuli were 156, 150, and 196 s, respectively. The efficacy of these stimuli in eliciting the intended subjective and facial emotional responses has been validated in multiple prior studies14,17,19,20,166. Specifically, the negative film we used was shown to be the most effective in eliciting negative emotional states (i.e., it had the highest ratio of mean to standard deviation of valence ratings) among the films developed by Gross and Levenson189 that are suitable for Japanese participants (i.e., those with Japanese dubbing or subtitles)166. Although the negative, neutral, and positive films were developed to induce anger, neutral, and amusement emotions14,189, we refer to the films by dimensional emotional states because previous studies assessing categorical emotion ratings showed that the films could elicit multiple categorical emotional states166,189; indeed, it is generally difficult to elicit a single emotional category using film stimuli190. In addition to the three films, a film clip from “Silence of the Lambs”189 was used for practice. All visual stimuli subtended a visual angle of 9.3° vertically and 7.0° horizontally.
Presentation apparatus
Experimental events were controlled using Presentation Software, version 14.8 (Neurobehavioral Systems, Albany, CA, USA), running on a Windows-based computer (Microsoft, Redmond, WA, USA). Visual stimuli were projected via a liquid crystal projector (DLA-F110; Victor, Yokohama, Japan) at a refresh rate of 60 Hz onto a mirror positioned within the MRI scanner in front of the participants.
Participants’ facial responses were recorded using the MRI Communication Relay System (Panasonic, Tokyo, Japan), which comprised an MRI-compatible video camera, frame synchronizer (FA-125; Panasonic), video timer (VTG-55D; Panasonic), digital mixer (WR-D01; Panasonic), memory card portable recorder (AG-HPD24; Panasonic), and workstation (HP Z800; Hewlett-Packard Laboratories, Palo Alto, CA, USA).
For off-line dynamic valence ratings, a Windows-based laptop computer (CF-SV8; Panasonic) was used.
Procedure
In the fMRI experiment, participants completed three film viewing trials in a block design paradigm. Each trial began with the presentation of a fixation point—a small white “+” displayed on a black background—for 2 s, followed by a plain white screen for 10 s. Subsequently, the film stimulus was presented. After each film, the plain white screen reappeared for an additional 10 s. Interspersed between trials were off-epochs lasting 18 s to allow for signal stabilization and baseline comparison. The order of film presentation was counterbalanced across participants to mitigate order effects. Participants were informed that they would view a series of films during which brain activity and facial video data would be recorded. Prior to fMRI data acquisition, participants completed a practice trial to familiarize themselves with the procedure.
Following the imaging session, participants completed a subjective rating task outside the scanner. During each trial, one of the three film stimuli was presented on a monitor, accompanied by a horizontal nine-point scale for assessing valence. Participants were instructed to recall their emotional experience from the initial viewing and continuously rate this recalled experience in terms of valence by manipulating a computer mouse. Mouse coordinates were recorded continuously at a sampling rate of 10 Hz and converted into second-by-second dynamic valence rating scores. This cued-recall paradigm was used to avoid inducing online introspective monitoring, which may interfere with the authenticity of emotional responses74,75,76. Notably, a previous study reported significant positive correlations between online and cued-recall ratings for the stimuli used in the present study (mean ± standard error, r = 0.66 ± 0.04)14.
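For illustration, the conversion from 10-Hz mouse coordinates to second-by-second valence scores could look like the following minimal Python sketch. The pixel-to-scale mapping, screen width, and all variable names are hypothetical stand-ins; the text specifies only the 10-Hz sampling and the second-by-second output.

```python
import numpy as np

fs = 10                                         # mouse sampling rate (Hz)
rng = np.random.default_rng(0)
mouse_x = rng.uniform(0, 1919, size=fs * 196)   # stand-in trace for the 196-s positive film

# Hypothetical mapping of horizontal pixel position onto the nine-point
# valence scale, coded here as -4..+4.
valence_10hz = (mouse_x / 1919) * 8 - 4

# Average each block of 10 samples into one second-by-second score.
n_sec = valence_10hz.size // fs
valence_1s = valence_10hz[:n_sec * fs].reshape(n_sec, fs).mean(axis=1)
print(valence_1s.shape)                         # (196,)
```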
MRI acquisition
Neuroimaging data were acquired using a 3-Tesla MRI system (MAGNETOM Verio; Siemens, Malvern, PA, USA) equipped with a 32-channel head coil. To minimize head motion, participants’ heads were stabilized using elastic padding. Functional images were acquired as 76 consecutive slices aligned parallel to the anterior–posterior commissure plane, providing whole-brain coverage. Imaging was performed using a T2*-weighted multiband gradient-echo echo-planar imaging sequence with the following parameters: repetition time (TR) = 2,000 ms; echo time (TE) = 41.2 ms; flip angle (FA) = 80°; multiband acceleration factor = 4; matrix size = 96 × 96; voxel size = 2 × 2 × 2 mm. At the beginning of each fMRI run, a gradient-echo field map was acquired to correct for geometric distortions (TR = 738 ms; TE1/TE2 = 4.92/7.38 ms [ΔTE = 2.46 ms]; FA = 60°; matrix size = 96 × 96; 76 slices with the same orientation and geometry as the functional echo-planar imaging). Following the functional image acquisition, a high-resolution T1-weighted anatomical image was obtained using a magnetization-prepared rapid acquisition gradient-echo sequence (TR = 2,250 ms; TE = 3.06 ms; FA = 9°; inversion time = 900 ms; field of view = 256 × 256 mm; matrix size = 256 × 256; voxel size = 1 × 1 × 1 mm).
Statistics and reproducibility: facial and subjective response analysis
Participants’ emotional facial expressions throughout the entire runs were coded from video recordings using FACS68,69. FACS is a comprehensive, anatomically based coding system that describes visible facial muscular movements in terms of AUs without attributing interpretive meaning. Although AU 4 (brow lowering), a prototypical indicator of negative emotional expression, and AU 12 (lip corner pulling), a prototypical indicator of positive emotional expression, are relevant for tracking dynamic valence changes20, the use of a 32-channel head coil (Fig. 1) precluded reliable analysis of AU 4. Therefore, only AU 12 was evaluated. A trained FACS coder, blinded to the study conditions, scored AU 12 on a second-by-second basis using a binary coding scheme. To assess inter-rater reliability, a second coder independently evaluated 10% of the data, comprising randomly selected 1-min segments from each participant. Inter-coder agreement was high, with a Cronbach’s alpha of 0.80. The binary time-series data for AU 12 were subjected to statistical tests for behavioral data, and then resampled to match the fMRI TR (2 s) and used as input for subsequent statistical analyses.
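As a minimal sketch of the inter-coder reliability computation, the standard Cronbach's alpha formula can be applied with the two coders treated as items (the binary codes below are synthetic; only the formula itself is assumed, not any detail of the actual coding data):

```python
import numpy as np

def cronbach_alpha(ratings):
    """Cronbach's alpha for a (n_coders, n_observations) array."""
    ratings = np.asarray(ratings, dtype=float)
    k = ratings.shape[0]
    item_vars = ratings.var(axis=1, ddof=1).sum()   # sum of per-coder variances
    total_var = ratings.sum(axis=0).var(ddof=1)     # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Two coders' binary AU 12 codes for a 60-s segment (stand-in data with
# roughly 90% agreement between coders).
rng = np.random.default_rng(1)
coder1 = rng.integers(0, 2, 60)
coder2 = np.where(rng.random(60) < 0.9, coder1, 1 - coder1)
print(round(cronbach_alpha(np.vstack([coder1, coder2])), 2))
```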
Participants’ subjective emotional responses during film viewing were assessed using second-by-second dynamic valence rating scores. The valence data were sampled at 1-s intervals for the statistical tests of behavioral data, and resampled at 2-s intervals to align with the imaging data and serve as input for subsequent statistical analyses. Based on meta-analysis findings of neuroimaging data showing that several emotion-related brain regions could be consistently activated in response to both negative and positive emotions relative to neutral emotions78,79,80, we calculated the absolute values of the valence ratings to quantify the intensity of positive or negative emotion, as in several previous studies70,71,72,73. As the absolute values of valence ratings in response to emotional visual stimuli can produce overlapping information with arousal ratings163, we did not perform an independent assessment of arousal ratings. To confirm this rationale, we conducted a preliminary analysis of a previous data set, assessing second-by-second dynamic ratings of valence and arousal while 20 participants watched the negative, neutral, and positive emotionally evocative films used in the present study14. The mean ± standard error intraindividual correlation coefficient between the absolute values of valence ratings and arousal ratings was 0.55 ± 0.07 (one-sample t test contrasting with zero [two-tailed], t[19] = 7.85, p < 0.001, d = 1.76), implying overlapping information regarding emotional intensity.
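A minimal sketch of this overlap check, on synthetic stand-in data with the same participant count, follows the analysis structure described in the text (intraindividual correlations between |valence| and arousal, then a two-tailed one-sample t test against zero):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_participants, n_sec = 20, 502     # ~156 + 150 + 196 s of concatenated ratings

rs = []
for _ in range(n_participants):
    valence = rng.normal(size=n_sec)                            # stand-in valence trace
    arousal = 0.6 * np.abs(valence) + rng.normal(scale=0.5, size=n_sec)
    rs.append(np.corrcoef(np.abs(valence), arousal)[0, 1])      # intraindividual r

t, p = stats.ttest_1samp(rs, 0.0)   # two-tailed one-sample t test against zero
print(f"mean r = {np.mean(rs):.2f}, t({n_participants - 1}) = {t:.2f}, p = {p:.2g}")
```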
For the statistical evaluation of AU 12, valence ratings, and absolute valence ratings, the average values during each film were calculated and subjected to one-way repeated-measures ANOVA with emotion (negative, neutral, and positive) as a factor, followed by multiple comparisons using the Holm method. Statistical significance was set at p < 0.05.
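The structure of this behavioral analysis could be sketched in Python as follows (statsmodels/scipy stand-ins for whatever statistics software was actually used, which the text does not name; the data are synthetic, with group means borrowed from the AU 12 results purely for illustration):

```python
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(3)
n = 33
data = pd.DataFrame({
    "subject": np.repeat(np.arange(n), 3),
    "emotion": np.tile(["negative", "neutral", "positive"], n),
    "au12": rng.normal([0.03, 0.06, 0.22] * n, 0.05),   # stand-in per-film averages
})

# One-way repeated-measures ANOVA with emotion as the factor.
print(AnovaRM(data, depvar="au12", subject="subject", within=["emotion"]).fit())

# Holm-corrected paired comparisons among the three conditions.
wide = data.pivot(index="subject", columns="emotion", values="au12")
pairs = [("positive", "neutral"), ("positive", "negative"), ("neutral", "negative")]
pvals = [stats.ttest_rel(wide[a], wide[b]).pvalue for a, b in pairs]
print(multipletests(pvals, alpha=0.05, method="holm")[1])   # corrected p-values
```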
Statistics and reproducibility: image analysis
Neuroimaging data were analyzed using the Statistical Parametric Mapping software package (SPM12; revision 7771; http://www.fil.ion.ucl.ac.uk/spm), implemented in MATLAB R2018a (MathWorks, Natick, MA, USA). The analysis pipeline comprised preprocessing, regional brain activity analysis, group ICA, and DCM for ICs (Fig. 7).
For preprocessing, all functional images were initially corrected for slice timing. Subsequently, images from each run were realigned to the first scan to correct for head motion and were unwarped to correct for geometric distortions and for the interaction of motion and distortion based on the field map, using the FieldMap Toolbox191,192. Realignment parameters indicated minimal motion (maximum translation < 3.1 mm; mean ± standard deviation translation = 0.47 ± 0.33 mm, 0.42 ± 0.28 mm, and 1.28 ± 0.69 mm in the x, y, and z directions, respectively). Notably, motion was limited to < 2 mm for 28 participants. Functional images were coregistered to the skull-stripped anatomical image. Subsequently, all anatomical and functional images were spatially normalized to Montreal Neurological Institute space using the unified segmentation–spatial normalization procedure based on the anatomical image191. The normalized functional images were resampled to a voxel size of 2 × 2 × 2 mm and smoothed using an 8 mm full-width at half-maximum (FWHM) isotropic Gaussian kernel to account for inter-individual anatomical variability. Previous methodological work has demonstrated that an 8-mm FWHM provides optimal sensitivity and inter-subject comparability for GLM group analyses, whereas other findings imply that variations in FWHM (0, 4, and 8 mm) exert minimal influence on group-level ICA results193.
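For illustration, the smoothing step could be reproduced in Python as follows (a nilearn analog of the SPM step described above, not the pipeline actually used; the volume is synthetic, with 2-mm voxels taken from the resampling step):

```python
import numpy as np
import nibabel as nib
from nilearn.image import smooth_img

# Build a synthetic 2-mm isotropic volume and apply an 8-mm-FWHM
# isotropic Gaussian kernel, mirroring the smoothing described above.
data = np.random.default_rng(4).normal(size=(48, 48, 38)).astype("float32")
img = nib.Nifti1Image(data, affine=np.diag([2.0, 2.0, 2.0, 1.0]))
smoothed = smooth_img(img, fwhm=8)
print(smoothed.shape)
```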
Random-effects analyses were conducted to identify significantly activated voxels at the population level89. Initially, single-subject analyses were performed using the GLM framework194. Task events, including film observation and facial reactions, were modeled using boxcar and delta functions, respectively. Subjective responses were incorporated as parametric modulators of the film observation. These three task-related regressors were convolved with a canonical hemodynamic response function. As multicollinearity among regressors can be problematic, we conducted preliminary analyses to evaluate the variance inflation factor (VIF). The results showed mean ± standard deviation (range) VIF = 1.1 ± 0.1 (1.0–1.6), implying no problematic multicollinearity relative to the commonly used rule of thumb of 10195. A high-pass filter comprising a discrete cosine basis set with a cutoff period of 384 s was applied to remove low-frequency signal drift. To mitigate motion-related artifacts and physiological confounds, such as respiratory, cardiac, or vascular fluctuations, additional nuisance regression was conducted using the PhysIO Toolbox (version 3.2.0), part of the Translational Algorithms for Psychiatry-Advancing Science software collection (https://www.tnu.ethz.ch/de/software/tapas)196. Nuisance regressors included six motion parameters derived from the realignment step, as well as six WM and six cerebrospinal fluid (CSF) components calculated using the CompCor approach197. To extract the WM and CSF components for each participant, the anatomical image was segmented to create WM and CSF masks. The time-series signals were then extracted from voxels within these masks and subjected to principal component analysis. The first five principal components and the mean signal were included as nuisance regressors for each of the WM and CSF signals. Serial autocorrelations were modeled assuming a first-order autoregressive process. These were estimated from the pool of active voxels using restricted maximum likelihood and used to whiten the data and the design matrix198. Contrast images corresponding to each task-related regressor from the first-level (single-subject) analysis were entered into a full factorial model for the second-level (random-effects) analysis. Corrections for non-sphericity, accounting for potential dependencies and unequal variance across factor levels, were implemented using the restricted maximum likelihood approach to ensure valid GLM assumptions of independent and identically distributed errors198.
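The following Python sketch illustrates the structure of such a first-level design and the VIF check (a nilearn/statsmodels analog of the SPM pipeline; onsets, durations, and modulator values are stand-ins, and the parametric modulator is simplified to one value per film, whereas the study used time-varying ratings):

```python
import numpy as np
import pandas as pd
from nilearn.glm.first_level import make_first_level_design_matrix
from statsmodels.stats.outliers_influence import variance_inflation_factor

tr, n_scans = 2.0, 300
frame_times = np.arange(n_scans) * tr
rng = np.random.default_rng(5)

pm = rng.uniform(0, 4, 3)
pm -= pm.mean()                      # mean-center the parametric modulator

events = pd.DataFrame({
    "trial_type": ["film"] * 3 + ["film_x_intensity"] * 3 + ["au12"] * 50,
    "onset": np.r_[[12, 200, 380], [12, 200, 380],
                   np.sort(rng.uniform(12, 560, 50))],
    "duration": np.r_[[156, 150, 196], [156, 150, 196],
                      np.zeros(50)],  # boxcars for films, deltas for AU 12 events
    "modulation": np.r_[np.ones(3), pm, np.ones(50)],
})

# HRF-convolved design with a cosine drift set (384-s cutoff).
X = make_first_level_design_matrix(
    frame_times, events, hrf_model="spm",
    drift_model="cosine", high_pass=1 / 384,
)

# VIF of each task-related regressor against the others.
task_cols = ["film", "film_x_intensity", "au12"]
exog = X[task_cols].to_numpy()
print({c: round(variance_inflation_factor(exog, i), 2)
       for i, c in enumerate(task_cols)})
```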
Initially, regional brain activity associated with film observation, emotional facial responses, and emotional subjective responses was investigated. Clusters were considered statistically significant if they exceeded an extent threshold of p < 0.05, family-wise error-corrected for the whole brain, with a cluster-forming threshold of p < 0.001 (uncorrected). Anatomical labeling and identification of brain structures were conducted using the Automated Anatomical Labeling atlas and Brodmann area maps (Brodmann.nii) available via the MRIcron software package (https://www.nitrc.org/projects/mricron)199,200.
We conducted preliminary analyses of the above-described GLM using the original valence ratings (−4 to 4) instead of the absolute valence ratings. We searched for brain regions whose activity increased or decreased monotonically with subjective emotional valence. It has been argued that the original and absolute valence ratings reflect valence-specific and valence-general responses, respectively80. The analyses revealed that either the positive or negative contrasts of the subjective ratings produced the significant activations reported in the Results section (e.g., increased activity in the bilateral lateral temporal regions with increasing valence, and increased activity in the bilateral medial parietal regions, lateral temporal regions, and cerebellum with decreasing valence; Supplementary Table 1 and Supplementary Fig. 1), as well as several other regions (e.g., activity in the left inferior frontal gyrus with increasing valence). Other effects, including those associated with film observation and facial emotional responses, showed almost identical patterns (e.g., activations in the bilateral somatosensory and motor cortices and the right limbic regions related to facial responses; Supplementary Table 1 and Supplementary Fig. 1). These results indicate that: (1) most brain regions associated with valence-general subjective responses (i.e., those identified using absolute valence ratings) were also detectable using the original valence ratings; (2) some regions may exhibit valence-specific patterns, either within these areas or in distinct regions; and (3) the choice between original and absolute valence ratings had minimal impact on the other effects related to film observation and facial emotional responses. Given these findings, and our focus on valence-general emotional responses, supported by prior meta-analyses showing that emotion-related brain activity is predominantly valence-general80, we reported the results for subjective emotional responses using the absolute values of valence ratings in the Results section.
To assess functional connectivity, group ICA was performed. Prior to ICA, spatially preprocessed data underwent further cleaning through nuisance regression, incorporating the same nuisance variables used in the GLM analysis: three discrete cosine transform basis functions (high-pass filter with a cutoff of 384 s), six head motion parameters, and six WM and six CSF-related time courses. Next, denoised data were analyzed using the Group ICA of fMRI Toolbox (GIFT, version 4.0b)81. Dimensionality reduction was conducted via principal component analysis at the individual and group levels. To identify ICs associated with film observation, emotional facial responses, and emotional subjective responses, temporal concatenation across all participants was used. The Infomax algorithm201 was used to estimate 30 ICs. Component stability was assessed using ICASSO (http://research.ics.aalto.fi/ica/icasso/) with 20 iterations. Finally, data were back-reconstructed using the default GIFT settings, with z-scoring applied to derive each participant’s time courses and spatial maps.
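A minimal sketch of temporal-concatenation group ICA on synthetic data follows (scikit-learn's FastICA stands in for GIFT's Infomax; dimensions are toy-sized, the data are pure noise so the recovered maps are arbitrary, and the back-reconstruction step is omitted):

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(6)
n_sub, n_scans, n_vox, n_ics = 5, 280, 1000, 10

# Subject-level PCA reduces the temporal dimension, then the reduced
# data are concatenated across subjects (temporal concatenation).
reduced = []
for _ in range(n_sub):
    data = rng.normal(size=(n_scans, n_vox))                # scans x voxels
    reduced.append(PCA(n_components=30).fit_transform(data.T).T)  # 30 x voxels
group = np.vstack(reduced)                                  # (n_sub*30) x voxels

# Group-level PCA, then spatial ICA: rows of `maps` are IC spatial maps.
group_pcs = PCA(n_components=n_ics).fit_transform(group.T).T      # n_ics x voxels
ica = FastICA(n_components=n_ics, max_iter=1000, random_state=0)
maps = ica.fit_transform(group_pcs.T).T
print(maps.shape)                                           # (10, 1000)
```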
For each of the 30 ICs, we assessed their involvement in the three task conditions using temporal and spatial sorting procedures. In the temporal sorting procedure, we extracted the time courses associated with each IC for all participants and performed two-level random-effects analyses similar to those conducted for regional activity. At the first level (corresponding to “Temporal Sorting” in GIFT), we regressed the back-reconstructed time course of the 30 components for each participant onto a first-level design matrix containing only task-related regressors (film observation, emotional facial responses, and emotional subjective responses), yielding one beta weight per regressor that reflected the strength and direction of the task-related activity for the time series of that IC. These beta weights were then entered into second-level random-effects analyses (corresponding to “Stats on Beta Weights” in GIFT) across participants to test whether each task-related regressor was significantly related to each IC. Statistical significance was assessed using a one-tailed threshold of p < 0.05, corrected for multiple comparisons across the 30 ICs using the false discovery rate (FDR) method. Following temporal sorting, spatial sorting was performed to select further task-relevant components. This process involved evaluating the spatial correspondence between the IC maps and the activation maps obtained from the prior regional activity analysis (i.e., SPM{T} maps for the contrast of film observation, emotional facial responses, or emotional subjective responses, thresholded at uncorrected p < 0.001). Specifically, we calculated the spatial correlation coefficient between each IC map and the SPM-derived template map to assess their degree of overlap. To identify components potentially related to, but not identical with, the task-related activations identified by SPM, we applied a spatial correlation threshold of 0.3. Although the threshold may appear modest, it aligns with multiple prior studies that have employed similar spatial correlation criteria (r = 0.26–0.3) for ICA spatial sorting202,203,204,205. These precedents demonstrate that a moderate threshold effectively balances sensitivity to partially overlapping neural networks with the exclusion of noise-driven or redundant components. In our analysis pipeline, spatial sorting was applied as a complementary step following rigorous temporal sorting with FDR correction. This two-stage approach ensured robust and reliable identification of task-related ICA components that extended beyond conventional SPM-based findings.
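The spatial-sorting criterion can be illustrated with a minimal sketch (one-dimensional synthetic stand-ins for the IC z-map and the thresholded SPM{T} template; the T cutoff of 3.1 is a stand-in for the voxel-level p < 0.001 threshold):

```python
import numpy as np

rng = np.random.default_rng(7)
n_vox = 5000
spm_t = rng.normal(size=n_vox)                    # stand-in SPM{T} map
ic_z = 0.4 * spm_t + rng.normal(size=n_vox)       # partially overlapping IC map

# Threshold the template at the cluster-forming level, then correlate
# the IC map with the thresholded template across voxels.
template = np.where(spm_t > 3.1, spm_t, 0.0)
r = np.corrcoef(ic_z, template)[0, 1]
print(f"spatial r = {r:.2f}; selected = {abs(r) >= 0.3}")
```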
To investigate the causal relationships among the ICs, DCM85 was applied using DCM12. This approach enables the evaluation of functional network patterns, including large-scale brain networks comprising multiple regions86,87,88. We investigated the intrinsic connectivity among the ICs associated with film observation, which were presumed to reflect sensory processing (early visual, higher-order visual, and auditory networks), facial expressions, and subjective emotional responses. For each participant, we extracted the time-series data corresponding to the ICs of interest and used them as inputs for DCM. Subsequently, individual hypothetical models were constructed, incorporating bidirectional (i.e., forward and backward) intrinsic connections (Fig. 5). Based on psychological and anatomical evidence, bidirectional connectivity was assumed between facial and subjective emotional response networks, as well as among sensory networks. Nine DCM models were constructed to test hypotheses about whether sensory networks interact with facial and subjective emotional response networks, and how sensory networks converge or process information. A detailed description of the network model is provided in the Results section.
We used the stochastic DCM approach, which enables modeling of both time-varying endogenous fluctuations and sensory perturbations elicited by continuous film viewing90,91,92. Model selection was conducted using a random-effects Bayesian model selection framework97, and exceedance probabilities were used as the evaluation metric, based on the premise that one model was more likely than the others to best explain the group-level data206,207. We then compared the families98 in the same way.
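For illustration, given the Dirichlet posterior over model frequencies that random-effects Bayesian model selection estimates (the alpha values below are stand-ins for the output of, e.g., SPM's spm_BMS applied to the nine models), exceedance probabilities can be approximated by Monte Carlo sampling: the exceedance probability of model k is the posterior probability that its frequency is the largest.

```python
import numpy as np

rng = np.random.default_rng(8)
# Stand-in posterior Dirichlet counts for the nine models (model 5 dominant).
alpha = np.array([1.2, 1.1, 1.3, 1.0, 9.5, 1.4, 1.2, 1.1, 1.2])

samples = rng.dirichlet(alpha, size=100_000)      # draws of model frequencies r
# Fraction of draws in which each model's frequency is the maximum.
xp = (samples == samples.max(axis=1, keepdims=True)).mean(axis=0)
print(np.round(xp, 3))                            # exceedance probability per model
```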
Reporting summary
Further information on research design is available in the Nature Portfolio Reporting Summary linked to this article.
Data availability
The source data behind Fig. 2 are provided in Supplementary Data 1. For the fMRI data, unthresholded statistical maps were uploaded to the NeuroVault.org database (http://neurovault.org/collections/4178/).
Code availability
This study used openly available software and codes. The SPM12 software used for fMRI data analysis is available at http://www.fil.ion.ucl.ac.uk/spm. The GIFT software used for group ICA is available at https://trendscenter.org/software/gift/.
References
Ekman, P. Biological and cultural contributions to body and facial movement. In Anthropology of the Body (ed. Blacking, J.) 34–84 (Academic Press, London, 1977).
Frijda, N. The psychologists’ point of view. In Handbook of Emotions 3rd ed. (eds Lewis, M., Haviland-Jones, J. M. & Barrett, L. F.) 68–87 (Guilford, New York, 2008).
Lang, P. J., Bradley, M. M. & Cuthbert, B. N. Emotion, motivation, and anxiety: brain mechanisms and psychophysiology. Biol. Psychiatry 44, 1248–1263 (1998).
Scherer, K. R. The dynamic architecture of emotion: evidence for the component process model. Cogn. Emot. 23, 1307–1351 (2009).
Izard, C. E. The many meanings/aspects of emotion: definitions, functions, activation, and regulation. Emot. Rev. 2, 363–370 (2010).
Ekman, P. An argument for basic emotions. Cogn. Emot. 6, 169–200 (1992).
Greenwald, M. K., Cook, E. W. & Lang, P. J. Affective judgment and psychophysiological response: dimensional covariation in the evaluation of pictorial stimuli. J. Psychophysiol. 3, 51–64 (1989).
Lang, P. J., Greenwald, M. K., Bradley, M. M. & Hamm, A. O. Looking at pictures: affective, facial, visceral, and behavioral reactions. Psychophysiology 30, 261–273 (1993).
Bradley, M. M. & Lang, P. J. Affective reactions to acoustic stimuli. Psychophysiology 37, 204–215 (2000).
Horio, T. EMG activities of facial and chewing muscles of human adults in response to taste stimuli. Percept. Mot. Skills 97, 289–298 (2003).
Larsen, J. T., Norris, C. J. & Cacioppo, J. T. Effects of positive and negative affect on electromyographic activity over zygomaticus major and corrugator supercilii. Psychophysiology 40, 776–785 (2003).
Tan, J.-W. et al. Repeatability of facial electromyography (EMG) activity over corrugator supercilii and zygomaticus major on differentiating various emotions. J. Ambient Intell. Humaniz. Comput. 3, 3–10 (2012).
Sato, W., Fujimura, T., Kochiyama, T. & Suzuki, N. Relationships among facial mimicry, emotional experience, and emotion recognition. PLoS ONE 8, e57889 (2013).
Sato, W., Kochiyama, T. & Yoshikawa, S. Physiological correlates of subjective emotional valence and arousal dynamics while viewing films. Biol. Psychol. 157, 107974 (2020).
Sato, W. et al. Facial EMG correlates of subjective hedonic responses during food consumption. Nutrients 12, 1174 (2020).
Sato, W. et al. Brow and masticatory muscle activity senses subjective hedonic experiences during food consumption. Nutrients 13, 4216 (2021).
Sato, W. et al. Emotional valence sensing using a wearable facial EMG device. Sci. Rep. 11, 5757 (2021).
Sato, W., Yoshikawa, S. & Fushiki, T. Facial EMG activity is associated with hedonic experiences but not nutritional values while viewing food images. Nutrients 13, 11 (2021).
Sato, W. & Kochiyama, T. Exploration of emotion dynamics sensing using trapezius EMG and fingertip temperature. Sensors 22, 6553 (2022).
Zhang, J. et al. Sensing emotional valence and arousal dynamics through automated facial action unit analysis. Sci. Rep. 14, 19563 (2024).
Sato, W. et al. Dynamic concordance between subjective and facial EMG hedonic responses during the consumption of gel-type food. Curr. Res. Food Sci. 9, 100263 (2025).
Darwin, C. The Expression of the Emotions in Man and Animals (Murray, London, 1872).
Dimberg, U. Facial electromyography and emotional reactions. Psychophysiology 27, 481–494 (1990).
Horstmann, G. What do facial expressions convey: feeling states, behavioral intentions, or action requests? Emotion 3, 150–166 (2003).
James, W. What is an emotion? Mind 9, 188–205 (1884).
Tomkins, S. Affect Imagery Consciousness, Vol. 1: The Positive Affects (Springer, New York, 1962).
Buck, R. Prime theory: an integrated approach to motivation and emotion. Psychol. Rev. 92, 389–413 (1985).
Zajonc, R. B. Emotion and facial efference: a theory reclaimed. Science 228, 15–21 (1985).
Laird, J. D. Feelings: The Perception of Self (Oxford University Press, New York, 2007).
Noah, T., Schul, Y. & Mayo, R. When both the original study and its failed replication are correct: feeling observed eliminates the facial feedback effect. J. Pers. Soc. Psychol. 114, 657–664 (2018).
Coles, N. A. et al. A multi-lab test of the facial feedback hypothesis by the Many Smiles Collaboration. Nat. Hum. Behav. 6, 1731–1742 (2022).
Crowley, J. S., Silverstein, M. L., Reghunathan, M. & Gosman, A. A. Glabellar botulinum toxin injection improves depression scores: a systematic review and meta-analysis. Plast. Reconstr. Surg. 150, 211e–220e (2022).
Wagenmakers, E. J. et al. Registered replication report: Strack, Martin, & Stepper (1988). Perspect. Psychol. Sci. 11, 917–928 (2016).
Heller, A. S., Greischar, L. L., Honor, A., Anderle, M. J. & Davidson, R. J. Simultaneous acquisition of corrugator electromyography and functional magnetic resonance imaging: a new method for objectively measuring affect and neural activity concurrently. Neuroimage 58, 930–934 (2011).
Heller, A. S., Lapate, R. C., Mayer, K. E. & Davidson, R. J. The face of negative affect: trial-by-trial corrugator responses to negative pictures are positively associated with amygdala and negatively associated with ventromedial prefrontal cortex activity. J. Cogn. Neurosci. 26, 2102–2110 (2014).
Iwase, M. et al. Neural substrates of human facial expression of pleasant emotion induced by comic films: a PET study. Neuroimage 17, 758–768 (2002).
Wild, B. et al. Humor and smiling: cortical regions selective for cognitive, affective, and volitional components. Neurology 66, 887–893 (2006).
Ketter, T. A. et al. Anterior paralimbic mediation of procaine-induced emotional and psychosensory experiences. Arch. Gen. Psychiatry 53, 59–69 (1996).
Zald, D. H. & Pardo, J. V. Emotion, olfaction, and the human amygdala: amygdala activation during aversive olfactory stimulation. Proc. Natl. Acad. Sci. USA 94, 4119–4124 (1997).
Canli, T., Zhao, Z., Brewer, J., Gabrieli, J. D. & Cahill, L. Event-related activation in the human amygdala associates with later memory for individual emotional response. J. Neurosci. 20, RC99 (2000).
Schneider, F., Habel, U., Kessler, C., Salloum, J. B. & Posse, S. Gender differences in regional cerebral activity during sadness. Hum. Brain Mapp. 9, 226–238 (2000).
Anderson, A. K. & Sobel, N. Dissociating intensity from valence as sensory inputs to emotion. Neuron 39, 581–583 (2003).
Phan, K. L. et al. Neural correlates of individual ratings of emotional salience: a trial-related fMRI study. Neuroimage 21, 768–780 (2004).
Sato, W., Yoshikawa, S., Kochiyama, T. & Matsumura, M. The amygdala processes the emotional significance of facial expressions: an fMRI investigation using the interaction between expression and face direction. Neuroimage 22, 1006–1013 (2004).
Habel, U., Klein, M., Kellermann, T., Shah, N. J. & Schneider, F. Same or different? Neural correlates of happy and sad mood in healthy males. Neuroimage 26, 206–214 (2005).
Grimm, S. et al. Segregated neural representation of distinct emotion dimensions in the prefrontal cortex-an fMRI study. Neuroimage 30, 325–340 (2006).
Stark, R. et al. Hemodynamic brain correlates of disgust and fear ratings. Neuroimage 37, 663–673 (2007).
Colibazzi, T. et al. Neural systems subserving valence and arousal during the experience of induced emotions. Emotion 10, 377–389 (2010).
Gerdes, A. B. M. et al. Brain activations to emotional pictures are differentially associated with valence and arousal ratings. Front. Hum. Neurosci. 4, 175 (2010).
Viinikainen, M., Katsyri, J. & Sams, M. Representation of perceived sound valence in the human brain. Hum. Brain Mapp. 33, 2295–2305 (2012).
Landa, A. et al. Distinct neural circuits subserve interpersonal and non-interpersonal emotions. Soc. Neurosci. 8, 474–488 (2013).
Wilson-Mendenhall, C. D., Barrett, L. F. & Barsalou, L. W. Neural evidence that human emotions share core affective properties. Psychol. Sci. 24, 947–956 (2013).
Haj-Ali, H., Anderson, A. K. & Kron, A. Comparing three models of arousal in the human brain. Soc. Cogn. Affect Neurosci. 15, 1–11 (2020).
Posner, J. et al. The neurophysiological bases of emotion: an fMRI study of the affective circumplex using emotion-denoting words. Hum. Brain Mapp. 30, 883–895 (2009).
Sescousse, G., Redoute, J. & Dreher, J. C. The architecture of reward value coding in the human orbitofrontal cortex. J. Neurosci. 30, 13095–13104 (2010).
Sescousse, G., Barbalat, G., Domenech, P. & Dreher, J. C. Imbalance in the sensitivity to different types of rewards in pathological gambling. Brain 136, 2527–2538 (2013).
Garrett, A. S. & Maddock, R. J. Separating subjective emotion from the perception of emotion-inducing stimuli: an fMRI study. Neuroimage 33, 263–274 (2006).
Brand, M., Snagowski, J., Laier, C. & Maderwald, S. Ventral striatum activity when watching preferred pornographic pictures is correlated with symptoms of internet pornography addiction. Neuroimage 129, 224–232 (2016).
Heinzel, A. et al. How do we modulate our emotions? Parametric fMRI reveals cortical midline structures as regions specifically involved in the processing of emotional valences. Brain Res. Cogn. Brain Res. 25, 348–358 (2005).
Levesque, J. et al. Neural circuitry underlying voluntary suppression of sadness. Biol. Psychiatry 53, 502–510 (2003).
Anders, S. et al. Brain activity underlying emotional valence and arousal: a response-related fMRI study. Hum. Brain Mapp. 23, 200–209 (2004).
Zald, D. H., Lee, J. T., Fluegel, K. W. & Pardo, J. V. Aversive gustatory stimulation activates limbic circuits in humans. Brain 121, 1143–1154 (1998).
Matsunaga, M. et al. Structural and functional associations of the rostral anterior cingulate cortex with subjective happiness. Neuroimage 134, 132–141 (2016).
Ruch, W. The perception of humor. In Emotion, Qualia, and Consciousness (ed. Kaszniak, A. W.) 410–425 (World Scientific Publisher, Tokyo, 2001).
Levy, H. C. et al. An examination of the association between subjective distress and functional connectivity during discarding decisions in hoarding disorder. Biol. Psychiatry Cogn. Neurosci. Neuroimaging 6, 1013–1022 (2021).
Sato, W., Kochiyama, T. & Yoshikawa, S. The widespread action observation/execution matching system for facial expression processing. Hum. Brain Mapp. 44, 3057–3071 (2023).
Laird, J. D. & Berglas, S. Individual differences in the effects of engaging in counter-attitudinal behavior. J. Pers. 43, 286–304 (1975).
Ekman, P. & Friesen, W. V. Facial Action Coding System (Consulting Psychologist, Palo Alto, CA, 1978).
Ekman, P., Friesen, W. V. & Hager, J. C. Facial Action Coding System (Research Nexus, Network Research Information, Salt Lake City, UT, 2002).
Gerber, A. J. et al. An affective circumplex model of neural systems subserving valence, arousal, and cognitive overlay during the appraisal of emotional faces. Neuropsychologia 46, 2129–2139 (2008).
Kreifelts, B. et al. Non-verbal emotion communication training induces specific changes in brain function and structure. Front. Hum. Neurosci. 7, 648 (2013).
Citron, F. M. M., Gray, M. A., Critchley, H. D., Weekes, B. S. & Ferstl, E. C. Emotional valence and arousal affect reading in an interactive way: neuroimaging evidence for an approach-withdrawal framework. Neuropsychologia 56, 79–89 (2014).
Wade-Bohleber, L. M., Thoma, R. & Gerber, A. J. Neural correlates of subjective arousal and valence in health and panic disorder. Psychiatry Res. Neuroimaging 305, 111186 (2020).
Rosenberg, E. L. & Ekman, P. Coherence between expressive and experiential systems in emotion. Cogn. Emot. 8, 201–229 (1994).
Lambie, J. A. & Marcel, A. J. Consciousness and the varieties of emotion experience: a theoretical framework. Psychol. Rev. 109, 219–259 (2002).
Heinzel, A., Moerth, S. & Northoff, G. The central role of anterior cortical midline structures in emotional feeling and consciousness. Psyche 16, 23–47 (2010).
Mauss, I. B., Levenson, R. W., McCarter, L., Wilhelm, F. H. & Gross, J. J. The tie that binds? Coherence among emotion experience, behavior, and physiology. Emotion 5, 175–190 (2005).
Kober, H. et al. Functional grouping and cortical-subcortical interactions in emotion: a meta-analysis of neuroimaging studies. Neuroimage 42, 998–1031 (2008).
Lindquist, K. A., Wager, T. D., Kober, H., Bliss-Moreau, E. & Barrett, L. F. The brain basis of emotion: a meta-analytic review. Behav. Brain Sci. 35, 121–143 (2012).
Lindquist, K. A., Satpute, A. B., Wager, T. D., Weber, J. & Barrett, L. F. The brain basis of positive and negative affect: evidence from a meta-analysis of the human neuroimaging literature. Cereb. Cortex 26, 1910–1922 (2016).
Calhoun, V. D., Adali, T., Pearlson, G. D. & Pekar, J. J. A method for making group inferences from functional MRI data using independent component analysis. Hum. Brain Mapp. 14, 140–151 (2001).
Engel, T. A., Schölvinck, M. L. & Lewis, C. M. The diversity and specificity of functional connectivity across spatial and temporal scales. Neuroimage 245, 118692 (2021).
Joel, S. E., Caffo, B. S., van Zijl, P. C. M. & Pekar, J. J. On the relationship between seed-based and ICA-based measures of functional connectivity. Magn. Reson. Med. 66, 644–657 (2011).
Wu, J. et al. Neurodevelopmental changes in the relationship between stress perception and prefrontal-amygdala functional circuitry. Neuroimage Clin. 20, 267–274 (2018).
Friston, K. J., Harrison, L. & Penny, W. Dynamic causal modelling. Neuroimage 19, 1273–1302 (2003).
Stevens, M. C., Kiehl, K. A., Pearlson, G. D. & Calhoun, V. D. Functional neural networks underlying response inhibition in adolescents and adults. Behav. Brain Res. 181, 12–22 (2007).
St Jacques, P. L., Kragel, P. A. & Rubin, D. C. Dynamic neural networks supporting memory retrieval. Neuroimage 57, 608–616 (2011).
Goulden, N. et al. The salience network is responsible for switching between the default mode network and the central executive network: replication from DCM. Neuroimage 99, 180–190 (2014).
Holmes, A. P. & Friston, K. J. Generalisability, random effects and population inference. Neuroimage 7, S754 (1998).
Daunizeau, J., Stephan, K. E. & Friston, K. J. Stochastic dynamic causal modelling of fMRI data: should we care about neural noise? Neuroimage 62, 464–481 (2012).
Friston, K. J., Li, B., Daunizeau, J. & Stephan, K. E. Network discovery with DCM. Neuroimage 56, 1202–1221 (2011).
Li, B. et al. Generalised filtering and stochastic DCM for fMRI. Neuroimage 58, 442–457 (2011).
Felleman, D. J. & Van Essen, D. C. Distributed hierarchical processing in the primate cerebral cortex. Cereb. Cortex 1, 1–47 (1991).
Rockland, K. S. About connections. Front. Neuroanat. 9, 61 (2015).
Mesulam, M. M. From sensation to cognition. Brain 121, 1013–1052 (1998).
Huntenburg, J. M., Bazin, P. L. & Margulies, D. S. Large-scale gradients in human cortical organization. Trends Cogn. Sci. 22, 21–31 (2018).
Stephan, K. E. et al. Dynamic causal models of neural system dynamics: current state and future extensions. J. Biosci. 32, 129–144 (2007).
Penny, W. D. et al. Comparing families of dynamic causal models. PLoS Comput. Biol. 6, e1000709 (2010).
Hopf, H. C., Mueller-Forell, W. & Hopf, N. J. Localization of emotional and volitional facial paresis. Neurology 42, 1918–1923 (1992).
Miele, G., Lavorgna, L., Marrapodi, M. M. & Abbadessa, G. Emotional facial palsy: an unusual and rarely explored neurological sign. Neurol. Sci. 43, 6305–6307 (2022).
Cascino, G. D., Luckstein, R. R., Sharbrough, F. W. & Jack, C. R. Jr. Facial asymmetry, hippocampal pathology, and remote symptomatic seizures: a temporal lobe epileptic syndrome. Neurology 43, 725–727 (1993).
Jacob, A., Cherian, P. J., Radhakrishnan, K. & Sarma, P. S. Emotional facial paresis in temporal lobe epilepsy: its prevalence and lateralizing value. Seizure 12, 60–64 (2003).
Morris, J. S., Ohman, A. & Dolan, R. J. Conscious and unconscious emotional learning in the human amygdala. Nature 393, 467–470 (1998).
Morris, J. S., de Gelder, B., Weiskrantz, L. & Dolan, R. J. Differential extrageniculostriate and amygdala responses to presentation of emotional faces in a cortically blind field. Brain 124, 1241–1252 (2001).
Sheline, Y. I. et al. Increased amygdala response to masked emotional faces in depressed subjects resolves with antidepressant treatment: an fMRI study. Biol. Psychiatry 50, 651–658 (2001).
Critchley, H. D., Mathias, C. J. & Dolan, R. J. Fear conditioning in humans: the influence of awareness and autonomic arousal on functional neuroanatomy. Neuron 33, 653–663 (2002).
Etkin, A. et al. Individual differences in trait anxiety predict the response of the basolateral amygdala to unconsciously processed fearful faces. Neuron 44, 1043–1055 (2004).
Killgore, W. D. S. & Yurgelun-Todd, D. A. Activation of the amygdala and anterior cingulate during nonconscious processing of sad versus happy faces. Neuroimage 21, 1215–1223 (2004).
Nomura, M. et al. Functional association of the amygdala and ventral prefrontal cortex during cognitive evaluation of facial expressions primed by masked angry faces: an event-related fMRI study. Neuroimage 21, 352–363 (2004).
Pasley, B. N., Mayes, L. C. & Schultz, R. T. Subcortical discrimination of unperceived objects during binocular rivalry. Neuron 42, 163–172 (2004).
Williams, M. A., Morris, A. P., McGlone, F., Abbott, D. F. & Mattingley, J. B. Amygdala responses to fearful and happy facial expressions under conditions of binocular suppression. J. Neurosci. 24, 2898–2904 (2004).
Liddell, B. J. et al. A direct brainstem-amygdala-cortical ‘alarm’ system for subliminal signals of fear. Neuroimage 24, 235–243 (2005).
Jiang, Y. & He, S. Cortical responses to invisible faces: dissociating subsystems for facial-information processing. Curr. Biol. 16, 2023–2029 (2006).
Dannlowski, U. et al. Amygdala reactivity to masked negative faces is associated with automatic judgmental bias in major depression: a 3 T fMRI study. J. Psychiatry Neurosci. 32, 423–429 (2007).
Suslow, T. et al. Attachment avoidance modulates neural response to masked facial emotion. Hum. Brain Mapp. 30, 3553–3562 (2009).
Duan, X., Dai, Q., Gong, Q. & Chen, H. Neural mechanism of unconscious perception of surprised facial expression. Neuroimage 52, 401–407 (2010).
Ottaviani, C. et al. Amygdala responses to masked and low spatial frequency fearful faces: a preliminary fMRI study in panic disorder. Psychiatry Res. 203, 159–165 (2012).
Yang, J., Cao, Z., Xu, X. & Chen, G. The amygdala is involved in affective priming effect for fearful faces. Brain Cogn. 80, 15–22 (2012).
Dannlowski, U. et al. Childhood maltreatment is associated with an automatic negative emotion processing bias in the amygdala. Hum. Brain Mapp. 34, 2899–2909 (2013).
Suslow, T. et al. Neural correlates of affective priming effects based on masked facial emotion: an fMRI study. Psychiatry Res. Neuroimaging 211, 239–245 (2013).
Chen, C., Hu, C. H. & Cheng, Y. Mismatch negativity (MMN) stands at the crossroads between explicit and implicit emotional processing. Hum. Brain Mapp. 38, 140–150 (2017).
Sato, W., Kochiyama, T., Minemoto, K., Sawada, R. & Fushiki, T. Amygdala activation during unconscious visual processing of food. Sci. Rep. 9, 7277 (2019).
Bayle, D. J., Henaff, M. A. & Krolak-Salmon, P. Unconsciously perceived fear in peripheral vision alerts the limbic system: a MEG study. PLoS ONE 4, e8207 (2009).
Wang, Y. et al. Rapid processing of invisible fearful faces in the human amygdala. J. Neurosci. 43, 1405–1413 (2023).
Kubota, Y. et al. Emotional cognition without awareness after unilateral temporal lobectomy in humans. J. Neurosci. 20, RC97 (2000).
Glascher, J. & Adolphs, R. Processing of the arousal of subliminal and supraliminal emotional stimuli by the human amygdala. J. Neurosci. 23, 10274–10282 (2003).
Sato, W. et al. Impairment of unconscious emotional processing after unilateral medial temporal structure resection. Sci. Rep. 14, 4269 (2024).
Pierce, J. E., Blair, R. J. R., Clark, K. R. & Neta, M. Reappraisal-related downregulation of amygdala BOLD activation occurs only during the late trial window. Cogn. Affect Behav. Neurosci. 22, 777–787 (2022).
Bo, K. et al. A systems identification approach using Bayes factors to deconstruct the brain bases of emotion regulation. Nat. Neurosci. 27, 975–987 (2024).
Vuilleumier, P. How brains beware: neural mechanisms of emotional attention. Trends Cogn. Sci. 9, 585–594 (2005).
Sato, W. et al. Rapid and multiple-stage activation of the human amygdala for processing facial signals. Commun. Integr. Biol. 6, e24562 (2013).
Tassinary, L., Orr, S., Wolford, G., Napps, S. & Lanzetta, J. The role of awareness in affective information processing: an exploration of the Zajonc hypothesis. Bull. Psychon. Soc. 22, 489–492 (1984).
van der Ploeg, M. M. Peripheral physiological responses to subliminally presented negative affective stimuli: a systematic review. Biol. Psychol. 129, 131–153 (2017).
Frijda, N. H. The Emotions (Cambridge University Press, Cambridge, 1986).
Smith, C. A. & Lazarus, R. S. Emotion and adaptation. In Handbook of Personality: Theory and Research (ed. Pervin, L. A.) 609–637 (Guilford Press, New York, 1990).
Ellsworth, P. C. Some implications of cognitive appraisal theories of emotion. In International Review of Studies on Emotion (ed. Strongman, K.) 143–161 (Wiley, New York, 1991).
Roseman, I. & Smith, C. Appraisal theory: overview, assumptions, varieties, controversies. In Appraisal Processes in Emotion: Theory, Methods, Research (eds Scherer, K.R., Schorr, A. & Johnstone, T.) 3–19 (Oxford University Press, Oxford, 2001).
Amaral, D. G., Price, J. L., Pitkanen, A. & Carmichael, S. T. Anatomical organization of the primate amygdaloid complex. In The Amygdala: Neurobiological Aspects of Emotion, Memory, and Mental Dysfunction (ed. Aggleton, J. P.) 1–66 (Wiley-Liss, New York, 1992).
Freese, J. L. & Amaral, D. G. Neuroanatomy of the primate amygdala. In The Human Amygdala (eds Whalen, P. J. & Phelps, E. A.) 3–42 (Guilford Press, New York, 2009).
Fox, A. S. & Shackman, A. J. An honest reckoning with the amygdala and mental illness. Am. J. Psychiatry 181, 1059–1075 (2024).
Kjaer, T. W., Nowak, M. & Lou, H. C. Reflective self-awareness and conscious states: PET evidence for a common midline parietofrontal core. Neuroimage 17, 1080–1086 (2002).
Pfeifer, J. H. & Peake, S. J. Self-development: integrating cognitive, socioemotional, and neuroimaging perspectives. Dev. Cogn. Neurosci. 2, 55–69 (2012).
Frewen, P., Schroeter, M. L. & Riva, G. Neuroimaging the consciousness of self: review, and conceptual-methodological framework. Neurosci. Biobehav. Rev. 112, 164–212 (2020).
Ochsner, K. N. Reflecting upon feelings: an fMRI study of neural systems supporting the attribution of emotion to self and other. J. Cogn. Neurosci. 16, 1746–1772 (2004).
Lombardo, M. V. et al. Shared neural circuits for mentalizing about the self and others. J. Cogn. Neurosci. 22, 1623–1635 (2010).
Spreng, R. N., Mar, R. A. & Kim, A. S. N. The common neural basis of autobiographical memory, prospection, navigation, theory of mind, and the default mode: a quantitative meta-analysis. J. Cogn. Neurosci. 21, 489–510 (2009).
Denny, B. T., Kober, H., Wager, T. D. & Ochsner, K. N. A meta-analysis of functional neuroimaging studies of self- and other judgments reveals a spatial gradient for mentalizing in medial prefrontal cortex. J. Cogn. Neurosci. 24, 1742–1752 (2012).
Murray, R. J., Schaer, M. & Debbane, M. Degrees of separation: a quantitative neuroimaging meta-analysis investigating self-specificity and shared neural activation between self- and other-reflection. Neurosci. Biobehav. Rev. 36, 1043–1059 (2012).
Premack, D. & Woodruff, G. Does the chimpanzee have a theory of mind? Behav. Brain Sci. 1, 515–526 (1978).
Aggleton, J. P. The amygdala: what’s happened in the last decade? In The Amygdala: A Functional Analysis (ed. Aggleton, J. P.) 1–30 (Oxford University Press, New York, 2000).
Nambu, A. Somatotopic organization of the primate basal ganglia. Front. Neuroanat. 5, 26 (2011).
Cavanna, A. E. & Trimble, M. R. The precuneus: a review of its functional anatomy and behavioural correlates. Brain 129, 564–583 (2006).
Mantini, D. et al. Default mode of brain function in monkeys. J. Neurosci. 31, 12954–12962 (2011).
Kravitz, D. J., Saleem, K. S., Baker, C. I., Ungerleider, L. G. & Mishkin, M. The ventral visual pathway: an expanded neural framework for the processing of object quality. Trends Cogn. Sci. 17, 26–49 (2013).
Foley, E., Rippon, G., Thai, N. J., Longe, O. & Senior, C. Dynamic facial expressions evoke distinct activation in the face perception network: a connectivity analysis study. J. Cogn. Neurosci. 24, 507–520 (2012).
Mukherjee, P. et al. Altered amygdala connectivity within the social brain in schizophrenia. Schizophr. Bull. 40, 152–160 (2014).
Sato, W., Kochiyama, T., Uono, S., Yoshikawa, S. & Toichi, M. Direction of amygdala-neocortex interaction during dynamic facial expression processing. Cereb. Cortex 27, 1878–1890 (2017).
Gard, A. M. et al. Amygdala functional connectivity during socioemotional processing prospectively predicts increases in internalizing symptoms in a sample of low-income, urban, young men. Neuroimage 178, 562–573 (2018).
Sato, W. et al. Atypical amygdala–neocortex interaction during dynamic facial expression processing in autism spectrum disorder. Front. Hum. Neurosci. 13, 351 (2019).
Bo, K. et al. Decoding neural representations of affective scenes in retinotopic visual cortex. Cereb. Cortex 31, 3047–3063 (2021).
Liu, T. T. et al. Layer-specific, retinotopically-diffuse modulation in human visual cortex in response to viewing emotionally expressive faces. Nat. Commun. 13, 6302 (2022).
Russell, J. A. A circumplex model of affect. J. Pers. Soc. Psychol. 39, 1161–1178 (1980).
Kron, A., Pilkiw, M., Banaei, J., Goldstein, A. & Anderson, A. K. Are valence and arousal separable in emotional experience? Emotion 15, 35–44 (2015).
Kuppens, P., Tuerlinckx, F., Russell, J. A. & Barrett, L. F. The relation between valence and arousal in subjective experience. Psychol. Bull. 139, 917–940 (2013).
Maffei, A. & Angrilli, A. E-MOVIE - experimental MOVies for induction of emotions in neuroscience: an innovative film database with normative data and sex differences. PLoS ONE 14, e0223124 (2019).
Sato, W., Noguchi, M. & Yoshikawa, S. Emotion elicitation effect of films in a Japanese sample. Soc. Behav. Pers. 35, 863–874 (2007).
Deng, Y., Yang, M. & Zhou, R. A new standardized emotional film database for Asian culture. Front Psychol. 8, 1941 (2017).
Zupan, B. & Eskritt, M. Eliciting emotion ratings for a set of film clips: a preliminary archive for research in emotion. J. Soc. Psychol. 160, 768–789 (2020).
Cremers, H. R., Wager, T. D. & Yarkoni, T. The relation between statistical power and inference in fMRI. PLoS ONE 12, e0184923 (2017).
Sambuco, N. fMRI replicability during emotional scene viewing: functional regions and sample size. Psychophysiology 59, e14000 (2022).
Marek, S., Tervo-Clemmens, B. & Calabro, F. J. Reproducible brain-wide association studies require thousands of individuals. Nature 603, 654–660 (2022).
Ma, L. et al. Effect of scanning duration and sample size on reliability in resting state fMRI dynamic causal modeling analysis. Neuroimage 292, 120604 (2024).
Hennenlotter, A. et al. A common neural basis for receptive and expressive communication of pleasant facial affect. Neuroimage 26, 581–591 (2005).
Anders, S. et al. Compensatory premotor activity during affective face processing in subclinical carriers of a single mutant Parkin allele. Brain 135, 1128–1140 (2012).
Kircher, T. et al. Affect-specific activation of shared networks for perception and execution of facial expressions. Soc. Cogn. Affect Neurosci. 8, 370–377 (2013).
Krautheim, J. T. et al. Emotion specific neural activation for the production and perception of facial expressions. Cortex 127, 17–28 (2020).
Gallagher, H. L. et al. Reading the mind in cartoons and stories: an fMRI study of ‘theory of mind’ in verbal and nonverbal tasks. Neuropsychologia 38, 11–21 (2000).
Van Overwalle, F. & Baetens, K. Understanding others’ actions and goals by mirror and mentalizing systems: a meta-analysis. Neuroimage 48, 564–584 (2009).
Schurz, M., Radua, J., Aichhorn, M., Richlan, F. & Perner, J. Fractionating theory of mind: a meta-analysis of functional brain imaging studies. Neurosci. Biobehav. Rev. 42, 9–34 (2014).
Molenberghs, P., Johnson, H., Henry, J. D. & Mattingley, J. B. Understanding the minds of others: a neuroimaging meta-analysis. Neurosci. Biobehav. Rev. 65, 276–291 (2016).
Oya, H., Kawasaki, H., Howard, M. A. III & Adolphs, R. Electrophysiological responses in the human amygdala discriminate emotion categories of complex visual stimuli. J. Neurosci. 22, 9502–9512 (2002).
Sato, W. et al. Rapid amygdala gamma oscillations in response to fearful facial expressions. Neuropsychologia 49, 612–617 (2011).
Tan, K. M. et al. Electrocorticographic evidence of a common neurocognitive sequence for mentalizing about the self and others. Nat. Commun. 13, 1919 (2022).
Dmochowski, J. P., Sajda, P., Dias, J. & Parra, L. C. Correlated components of ongoing EEG point to emotionally laden attention - a possible marker of engagement? Front. Hum. Neurosci. 6, 112 (2012).
Cohen, S. S. & Parra, L. C. Memorable audiovisual narratives synchronize sensory and supramodal neural responses. eNeuro 3, ENEURO.0203-16.2016 (2016).
Maffei, A. Spectrally resolved EEG intersubject correlation reveals distinct cortical oscillatory patterns during free-viewing of affective scenes. Psychophysiology 57, e13652 (2020).
Lakens, D. Sample size justification. Collabra Psychol. 8, 33267 (2022).
Oldfield, R. C. The assessment and analysis of handedness: the Edinburgh inventory. Neuropsychologia 9, 97–113 (1971).
Gross, J. J. & Levenson, R. W. Emotion elicitation using films. Cogn. Emot. 9, 87–108 (1995).
Gilman, T. L. et al. A film set for the elicitation of emotion in research: a comprehensive catalog derived from four decades of investigation. Behav. Res. Methods 49, 2061–2082 (2017).
Ashburner, J. & Friston, K. J. Unified segmentation. Neuroimage 26, 839–851 (2005).
Hutton, C. et al. Image distortion correction in fMRI: a quantitative evaluation. Neuroimage 16, 217–240 (2002).
Alahmadi, A. A. S. Effects of different smoothing on global and regional resting functional connectivity. Neuroradiology 63, 99–109 (2021).
Worsley, K. J. & Friston, K. J. Analysis of fMRI time-series revisited–again. Neuroimage 2, 173–181 (1995).
O’Brien, R. M. A caution regarding rules of thumb for variance inflation factors. Qual. Quant. 41, 673–690 (2007).
Kasper, L., Bollmann, S. & Diaconescu, A. O. The PhysIO toolbox for modeling physiological noise in fMRI data. J. Neurosci. Methods 276, 56–72 (2017).
Behzadi, Y., Restom, K., Liau, J. & Liu, T. T. A component based noise correction method (CompCor) for BOLD and perfusion based fMRI. Neuroimage 37, 90–101 (2007).
Friston, K. J. et al. Classical and Bayesian inference in neuroimaging: applications. Neuroimage 16, 484–512 (2002).
Tzourio-Mazoyer, N. et al. Automated anatomical labeling of activations in SPM using a macroscopic anatomical parcellation of the MNI MRI single-subject brain. Neuroimage 15, 273–289 (2002).
Rolls, E. T., Huang, C. C., Lin, C. P., Feng, J. & Joliot, M. Automated anatomical labelling atlas. Neuroimage 206, 116189 (2020).
Bell, A. J. & Sejnowski, T. J. An information-maximization approach to blind separation and blind deconvolution. Neural Comput. 7, 1129–1159 (1995).
Reineberg, A. E., Andrews-Hanna, J. R., Depue, B. E., Friedman, N. P. & Banich, M. T. Resting-state networks predict individual differences in common and specific aspects of executive function. Neuroimage 104, 69–78 (2015).
van Timmeren, T., Zhutovsky, P., van Holst, R. J. & Goudriaan, A. E. Connectivity networks in gambling disorder: a resting-state fMRI study. Int. Gambl. Stud. 18, 242–258 (2018).
Herman, A. M., Critchley, H. D. & Duka, T. Trait impulsivity associated with altered resting-state functional connectivity within the somatomotor network. Front. Behav. Neurosci. 14, 111 (2020).
Palmucci, M. & Tagliazucchi, E. Divergences between resting state networks and meta-analytic maps of task-evoked brain activity. Open Neuroimaging J. 15, 1–10 (2022).
Liu, L. et al. Children with reading disability show brain differences in effective connectivity for visual, but not auditory word comprehension. PLoS ONE 5, e13492 (2010).
Seghier, M. L., Josse, G., Leff, A. P. & Price, C. J. Lateralization is predicted by reduced coupling from the left to right prefrontal cortex during semantic decisions on written words. Cereb. Cortex 21, 1519–1531 (2011).
Acknowledgements
This project was supported by the Japan Science and Technology Agency (JST) Mirai Program (JPMJMI20D7) and JST CREST (JPMJCR17A5). The authors would like to thank Masaru Usami for technical support. This study was conducted using the MRI scanner and related facilities of the Institute for the Future of Human Society, Kyoto University.
Author information
Authors and Affiliations
Contributions
Conceived and designed the experiments: W.S. and T.K. Performed the experiments: W.S., N.A., and K.A. Analyzed the data: W.S. and T.K. Wrote the paper: W.S., T.K., N.A., K.A., and S.Y.
Corresponding author
Ethics declarations
Competing interests
The authors declare no competing interests.
Peer review
Peer review information
Communications Biology thanks the anonymous reviewers for their contribution to the peer review of this work. Primary Handling Editor: Benjamin Bessieres. A peer review file is available.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
About this article
Cite this article
Sato, W., Kochiyama, T., Abe, N. et al. Neural network dynamics associated with facial and subjective emotional responses. Commun Biol 9, 91 (2026). https://doi.org/10.1038/s42003-025-09361-5
Received:
Accepted:
Published:
Version of record:
DOI: https://doi.org/10.1038/s42003-025-09361-5