Abstract
While fully autonomous vehicles are expected to radically change our daily lives, they are not yet available in most parts of the world, so we have only sporadic results on passenger reactions. Furthermore, we have very limited insight into how passengers react to unexpected events during a ride. Previous physiological research has shown that passengers experience lower levels of anxiety in a human-driven condition than in a self-driving condition. The aim of our current study was to investigate these differences during unexpected road events in real-life passenger experiences. All subjects were driven through a closed test track in human-driven and self-driving modes. During the journey, unforeseen obstacles were encountered on the path (deer- and human-shaped dummies appeared). Using physiological measurements (EEG, eye movements, head movements and blinking frequencies), our results suggest that passengers had moderate affective preferences for the human-driven condition. Furthermore, multifractal spectra of eye and head movements were wider and blinking frequencies decreased during unexpected events. Our findings further establish real-world physiological measurements as a source of information in researching the acceptance and usage of self-driving technologies.
Introduction
It has been argued that the most challenging roadblocks towards widespread acceptance of self-driving vehicles could be more social than technical in nature1,2,3,4. Perceptions of usefulness5,6, trust7,8, and safety9,10 have all been identified as important factors that influence acceptance. Investigating the perceptual factors of technology acceptance, however, has its own challenges. Perception is something of an umbrella term, as it may refer to the user's opinions, feelings, ideas, and attitudes colored by personality traits, previous experiences, desires, etc.11,12. To this day, these factors are commonly measured using questionnaires or interviews13,14. However, from a more biological or ecological perspective, perception refers to awareness of factors of the environment that have significance with respect to our goals, actions, and well-being. Most importantly, perceptual processes involve the visual and auditory systems and the central nervous system (CNS), and they are closely coupled to the cardiovascular, respiratory, and motor systems. This strongly suggests that an effective research method investigating factors of perception and acceptance of AV technologies must include physiological measurements.
In a previous pilot study, electroencephalography (EEG) and eye-movement data were collected in a 3-minute-long real-world driving situation15. Participants in the front passenger seat experienced both self-driving and human driving modes. In the analyses, both measurement types signaled a difference between traditional and self-driving modes: participants’ physiological signals, such as frontal alpha asymmetry and the multifractal spectrum, suggested a higher preference for human drivers over the autonomous driving mode. However, even for a road trip this short, the experience is not homogeneous. During a road trip, passengers experience regular or expected events such as acceleration, deceleration, and changing lanes following the curvature of the pavement. Irregular or less expected events may also occur when sudden deceleration, acceleration or quick path alterations are necessary to avoid collisions. A passenger, being affected by the movement of the vehicle, may form conscious or unconscious expectations about the dynamics of these events. The extent to which these expectations are met could be a factor in the perception of safety and usefulness of any given mode of transportation16,17.
Our aim in the current investigation was to test human-driven and self-driving scenarios with the addition of unexpected events to the passenger experience. Our main question was how a combination of complex physiological measures could be used to identify differences in reactions to unexpected situations when comparing self-driving with human-driven scenarios. For this experiment we utilized a car whose self-driving system was modeled more closely on human driving than the one we used in our pilot15. Even though an experimental assessment of physiological and psychological responses to different types of self-driving technologies is outside the scope of the current study, our analyses may serve as starting points towards such investigations. To add unexpected events, life-size dummies made of plastic and other relatively collision-safe materials were placed on the side of the road, prompting quick evasion maneuvers in both human-driven and self-driving conditions. We expected these events to either override the effect of driving conditions or interact with them in ways that allow us to draw conclusions about the possible moderating effect of the unexpected events on the physiology of the passengers. If measured differences between self-driving and human-driven conditions are moderated or eliminated by unexpected events, it may send an important message to developers and users alike: firsthand experiences of safety and the ability to handle critical situations on the road may be an essential factor in technology acceptance. In the following, we enumerate the main characteristics of the measurements we utilized during the data collection as well as the most important results from the literature, and articulate our hypotheses.
EEG
In the past decade, there has been a rapid growth in the availability of portable sensors, making it possible to design ecologically valid field studies to measure physiological signals such as eye movements or the electric activity of the brain18. EEG signals consist of different frequency ranges, and their ratio allows one to draw conclusions about changes in one’s mental and emotional state over the recording19.
Numerous studies have demonstrated that higher-frequency oscillations reflect more alert and aroused states when the power ratios of higher (beta: 13–30 Hz, gamma: 30–45 Hz) and lower frequencies (alpha: 8–12 Hz, theta: 4–8 Hz) are compared20,21. In contrast, lower frequencies dominate when the participant is relaxed. For example, concentrated attention was correlated with higher relative gamma activity21, and enhanced stress levels were reflected by increased beta/alpha ratios21,22, decreased alpha/beta and theta/beta ratios23, or an increased relative gamma ratio24.
In addition to arousal, EEG band signatures are also well studied as markers of emotional and motivational states. Based on the hemispheric (left-minus-right) differences in alpha power at frontal electrodes (frontal alpha asymmetry)25,26, higher values indicate more positive or approach-related attitudes, while lower values indicate more negative or withdrawal-related attitudes26,27,28. Frontal alpha asymmetry was found to be not only a good marker of depression26 but also an indicative tool in applied sciences29.
Both arousal and frontal alpha asymmetry were used in studies investigating drivers’ or passengers’ reactions to unexpected road events, mostly in simulated tasks. For example, when the driver detected a hazard cue, the power of the alpha band immediately decreased, and the beta power increased approximately 300 ms after the cue appeared30. Similarly, when a ship driver could not avoid a collision, higher frequencies dominated in comparison to successful collision avoidance31. Explicitly focusing on self-driving vehicles, in a case study32, participants were presented with positive (smooth highway driving) and negative (erratic driving and violating common rules of the road) driving situations. The beta-to-alpha power ratio increased in the negative scenario, suggesting elevated stress levels when being exposed to hazardous driving. Similarly, when participants were facing take-over situations in an autonomous vehicle simulator, they subjectively rated the multi-modal warning signals as the most effective on a Likert-scale. Moreover, these warning signals were predominantly accompanied by the presence of higher frequencies in EEG21.
Regarding emotional valence33, when malfunctions appeared during a simulated drive in an autonomous vehicle, a frontal alpha power reduction was present in the right but not in the left hemisphere in the fully automated condition, compared to when participants were able to control the vehicle. The authors interpreted this effect as an enhanced motivation of the driver towards controlling the vehicle, which was in line with participants’ explicitly verbalized preferences33. In a further study utilizing narrow vehicles, higher arousal and higher frontal alpha asymmetry values were measured when the vehicle was able to tilt in curved sections of the road, in line with subjective evaluations reflecting user satisfaction34. Comparable results were found when, instead of a traffic situation, wheelchair users were sitting in an autonomous wheelchair driving across a narrow/constrained or a wide/open path. Narrow paths resulted in a frontal alpha asymmetry pattern related to avoidance during these hazardous situations35.
Eye movements and head movements
Recently, more focused analyses have targeted eye and head movements in the context of developing driver assistance systems and self-driving navigation technologies. Drivers and passengers are surrounded by a highly dynamic, rapidly changing environment in which the visual information that is relevant for navigation must be sampled in specific ways, often including a wide variety of eye and head movements36. It has been pointed out that despite being linked to distinct neuromuscular systems, eye and head movements are highly correlated37 and are a source of an extensive amount of information about the driver or the passenger. In a more general context, fluctuations in movements of the eyes, head or hands could be harnessed as an information “substrate” spreading across the perceptual/motor system linked to coordination and cognition38. In this line of research, multifractal analysis of movement data (more precisely, high rate sampling of posture, head, hands or eye displacement) has been repeatedly shown to signal cognitive transitions and processes, including problem-solving39, magnitude perception40, perceptual intent41, visual recognition42, comprehension43, and memory44. In our previous study15, these measures also indicated a connection to anxiety and heightened awareness among passengers traveling in self-driving modes. Focusing on fluctuations and displacement data also facilitates non-invasive, real-world research methods that allow for experimentation in close to real-life circumstances. Besides complex movement patterns, relatively simple changes in spontaneous blink frequency (SBF) have been reported to show correlation with anxiety and novel stimulation45,46.
Hypotheses
The goal of the present study was to investigate changes in the fluctuations of a passenger’s physiological signals when unexpected road events appear, and the potential differences between situations in which a human drives the car (Human condition) versus when the vehicle requires no manual input from a human driver to complete the track (Self-driving condition).
For the EEG measurement, we hypothesized lower frontal alpha asymmetry values in the self-driving condition. We also hypothesized higher arousal and lower frontal alpha asymmetry values in response to unexpected road events. For the eye- and head-tracking data, we expected the effect of novelty and anxiety to manifest in a narrower multifractal spectrum for the self-driving condition, countered by the contribution of unexpected events, where a heightened need for visual information may result in broader spectra. For the same reason, blinking frequencies were expected to show lower values for the novel and unexpected stimulation.
Methods
Participants and procedure
41 healthy adult volunteers (mean age: 39.175 years, SD = 11.200 years, from 21 to 65 years, 18 females, 3 left-handed) participated in the present study. All of them reported normal or corrected-to-normal vision and hearing and no psychiatric or neurological problems. Participants were recruited via social media, and they received no monetary compensation. They gave written informed consent prior to the study. The experiment was conducted in accordance with the Declaration of Helsinki and the protocol was approved by the United Ethical Review Committee for Research in Psychology (EPKEB), Hungary under ref. number SZTE-PI 2022-71(2021-70, 2020-89). The size of the sample used in this study was based on research resources and the number of participants in similar experiments15,33,47.
Before and after the experiment, participants were required to fill in personality and demographic questionnaires via an online form. As the main scope of the present study is to address differences in the physiological signals during the ride, the data from the surveys will be presented elsewhere.
The experiment took place at the ZalaZONE vehicle testing environment in Zalaegerszeg, Zala county, Hungary, on the 30th and 31st of May 2022. ZalaZONE48 proved to be the most suitable environment in Hungary for the two-day test series with self-driving vehicles. The ZalaZONE proving ground features a high-speed handling course, 2000 m long and 12 m wide, with a soft gravel/asphalt run-off area supported by an 80 cm basalt foundation. This module is ideal for testing and developing vehicles under different driving conditions due to its varied topography and ability to simulate real-world driving scenarios. It is particularly suitable for chassis development, steering, damping, braking, and acceleration maneuvers required in the vehicle and tire development industry, as well as for advanced driver assistance systems and automated driving test scenarios. The course can also be used for inter-module testing, enabling various traffic situations to be simulated. The vehicle was a KIA Negra test car, modified by ZalaZONE to navigate the entire track without any help or manual input from the driver. In short, the modified vehicle was operated through its electronic actuators, which executed the pre-recorded movements of a human driver.
To establish unexpected driving situations, two dummies were placed on the road that the vehicle had to avoid. The two blue dots in Fig. 1. indicate the position of the two dummies. These dummies were devices used during official ISO 19206-2, Euro NCAP, CNCAP, JNCAP tests, suitable for testing under rough conditions. The deer weighed 7.5 kg and measured 1490 × 1210 × 270 mm. The child dummy weighed approx. 4 kg and was 1154 mm tall. The unexpected event or “encounter” happened as the vehicle approached the dummies and made a quick path modification to avoid collision. The maneuver was safe for a participant wearing a seatbelt. It was an imitation of an event that did not prompt emergency actions but required path correction.
When participants arrived at the test field, they were briefed about the goal and the schedule of the experiment. They gave informed consent for the experiment and completed a short questionnaire; then the EEG, eye-tracking glasses and a smartwatch were mounted on them.
After that, the participant and two experimenters holding separate portable laptops for the EEG and eye-tracking signals were seated in the vehicle. The driver sat in the driver’s seat (front left) and the participant in the passenger’s seat (front right). The two experimenters sat in the back seat, operating the recording equipment. Participants were instructed to behave as they generally would as a passenger, while sitting relaxed and minimizing head and body movement. The route was taken twice: first, a professional driver was driving (Human condition), and in the second round, the self-driving mode was switched on and the driver released the steering wheel (Self-driving condition), or vice versa.
In the self-driving condition, the self-driving mode worked under the supervision of the operator. The vehicle stayed in self-driving mode as long as the operator was holding the Deadman switch. When the operator deemed a situation risky, he could take control of the vehicle by releasing this switch. In addition, there was a red Emergency button in the vehicle, which, when pressed, also deactivated self-driving. The vehicle travelled at a maximum speed of 60 km/h.
The order of human and self-driving conditions was counterbalanced between participants: 21 participants started with the human, and 20 participants started with the self-driving condition. Both types of blocks lasted about 3 min and were recorded separately. Before and after the two runs, participants completed a set of questionnaires. The whole session at the testing environment took about an hour per participant.
Measures and data analysis
EEG
Continuous EEG data was recorded with a portable OpenBCI 4-channel Ganglion board utilizing Lab Streaming Layer (LSL) from OpenBCI GUI at a 200 Hz sampling rate. Four gold cup electrodes were attached to participants’ scalp with conductive paste (Ten20) in accordance with the 10–20 system49 to F3, F4, FPz and Oz. Two additional electrodes were attached to the left and right mastoids serving as reference and ground electrodes, respectively. Impedances were kept below 30 kΩ.
Preprocessing data
On the continuous EEG data, independent component analysis (ICA) was applied to detect and correct artefacts such as eye movement, muscle, or cardiac signals, using the FastICA algorithm of the scikit-learn Python module (version 1.1.0;50). The artefact-corrected EEG was filtered offline; a bandpass filter (7–48 Hz, 9th-order Butterworth) was applied, as this range covered the frequencies of interest. After filtering, the continuous data was segmented into 2-second-long epochs with 1-second overlaps. Epochs with a signal range exceeding ± 100 µV (typically due to movement or blink artefacts) were excluded from further analysis.
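As an illustrative sketch, the filtering and epoching steps described above can be written as follows (a minimal SciPy example; the function name is ours, and reading the ±100 µV criterion as a 200 µV peak-to-peak range is our assumption):

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 200  # EEG sampling rate in Hz, as in the recording setup

def preprocess(raw, fs=FS):
    """Band-pass filter continuous EEG and cut it into overlapping epochs.

    raw: array (n_channels, n_samples) of artefact-corrected EEG in microvolts.
    Returns 2-s epochs with 1-s overlap; epochs whose peak-to-peak range
    exceeds 200 uV (i.e. +/-100 uV) are rejected.
    """
    # 9th-order Butterworth band-pass, 7-48 Hz (second-order sections for stability)
    sos = butter(9, [7, 48], btype="bandpass", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, raw, axis=-1)

    win, hop = 2 * fs, fs  # 2-s window, 1-s step -> 1-s overlap
    epochs = []
    for start in range(0, filtered.shape[-1] - win + 1, hop):
        epoch = filtered[:, start:start + win]
        # reject epochs exceeding the +/-100 uV range on any channel
        if np.ptp(epoch, axis=-1).max() <= 200:
            epochs.append(epoch)
    return np.stack(epochs) if epochs else np.empty((0, raw.shape[0], win))
```

Note that zero-phase filtering (`sosfiltfilt`) is used here to avoid shifting events in time; the original pipeline may have used a different (causal) filter implementation.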
Power spectral density (PSD) values were calculated for alpha (8–12 Hz), beta (13–30 Hz) and gamma (30–45 Hz) frequency bands by applying the Welch method. Two indices were calculated for each epoch: first, affectivity as indexed by frontal alpha asymmetry was computed as the difference between log10 transformed values of F4 and F325,26. Second, arousal was defined as the ratio of PSD in the beta and gamma range to the alpha range at the averaged F3 and F4 electrodes. Higher values represent more positive emotional valence24,25,26 and higher arousal20,21. Outlier epochs that deviated from the participant’s individual mean by at least 3 standard deviations (SD) were removed before averaging data for both affectivity and arousal. Data was averaged into four segments based on the position of the dummies. The deer dummy was placed at about 50% of the route and the human pedestrian was placed at about 90% of the route. These spots were used as anchors for segmenting the data into different road events. The first segment started at the beginning of the route and lasted until the deer dummy minus 5 s (1000 time points). The second segment started 5 s before the deer dummy and lasted until 60% of the block. The third segment started at 60% of the block and lasted until the human pedestrian dummy minus 5 s. The fourth segment started at 5 s before the human pedestrian dummy and lasted until the end of the route. The first and third segments were considered as “smooth” while the second and fourth segments were considered as “unexpected” parts of the ride. The interval of 1000 time points was chosen because the dummies appeared in the visual field approximately at this time.
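The two indices above can be sketched as follows (a minimal SciPy example; the function names are illustrative, and we assume "the averaged F3 and F4 electrodes" refers to averaging the band powers of the two channels rather than the raw signals):

```python
import numpy as np
from scipy.signal import welch

def band_power(epoch, fs, lo, hi):
    """Mean Welch PSD of a single-channel epoch within [lo, hi) Hz."""
    freqs, psd = welch(epoch, fs=fs, nperseg=min(len(epoch), fs))
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].mean()

def affect_and_arousal(f3, f4, fs=200):
    """Frontal alpha asymmetry and arousal indices for one epoch.

    f3, f4: 1-D arrays with the F3 and F4 channel samples of the epoch.
    """
    alpha3 = band_power(f3, fs, 8, 12)
    alpha4 = band_power(f4, fs, 8, 12)
    # affectivity: log10 alpha power at F4 minus F3 (higher = more approach)
    faa = np.log10(alpha4) - np.log10(alpha3)
    # arousal: (beta + gamma) power over alpha power, band powers averaged
    # across F3 and F4 (assumption about the order of averaging)
    beta = (band_power(f3, fs, 13, 30) + band_power(f4, fs, 13, 30)) / 2
    gamma = (band_power(f3, fs, 30, 45) + band_power(f4, fs, 30, 45)) / 2
    alpha = (alpha3 + alpha4) / 2
    return faa, (beta + gamma) / alpha
```

With identical F3 and F4 inputs the asymmetry index is exactly zero, which provides a simple sanity check for the sign convention.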
Eye tracking
Eye movements were recorded using the Core System manufactured by Pupil Labs (Berlin, Germany) connected to a portable laptop computer (ROG Zephyrus GA410QE). This system has two types of cameras: one positioned forward, recording the field of view of the wearer in HD video at 30 Hz, and a second infrared camera recording the participant’s eye movements. The Pupil Player software developed by the manufacturer (version 3.5, https://pupil-labs.com/products/core/) was used to export the eye-movement data for further analysis.
Preprocessing data
Real-life eye-tracking data contains more noise than data recorded in controlled laboratory environments. Therefore, we improved accuracy using convolution algorithms to detect the pupil area in the raw video data of the eye cameras. The segmentation procedure was set to be as sensitive as possible, i.e., the area that best matched the pupil was selected as the output of the function. This also caused the system to indicate a match scattered over the entire area of the image, even when no pupil was present51.
The center, x-y size, and rotation angle of the ellipse that best fit the segmented areas were determined. Next, the eye coordinates were converted to degrees of rotation by determining the center and extent of the eyeball (bulbus oculi) and the horizontal rotation of the camera using a line connecting the horizontal eye muscles (m. rectus medialis and m. rectus lateralis). Using these values, the xy pixel coordinates of the eye were converted to YX degrees of rotation52. To ensure the validity of the measurement, we ignored coordinates that were outside the area of the bulbus oculi and also ignored points between which the speed of the eyeball exceeded 500 degrees/s53. The omitted points were filled in as 20 ms gaps (ca. 2 frames at a 120 Hz sample rate) using linear interpolation in the timeframe. Gaps greater than 20 ms were marked as measurement errors. Where the time intervals considered as measurement errors were equal for both eyes, they were labeled as blinks. To suppress the sampling frequency harmonics and ensure proper representation of the signal waveforms, we used a 60 Hz second-order Butterworth low-pass filter at the end of the signal processing pipeline.
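The velocity-threshold rejection and gap-interpolation steps can be sketched as follows (a minimal NumPy example; the function name and the handling of gaps at the end of the recording are our assumptions):

```python
import numpy as np

def clean_gaze(angles_deg, fs=120.0, vmax=500.0, max_gap_s=0.02):
    """Reject implausible gaze samples and interpolate short gaps.

    angles_deg: 1-D array of gaze angle in degrees; NaN marks missing samples.
    Samples whose sample-to-sample velocity exceeds vmax deg/s are dropped;
    gaps up to max_gap_s are filled by linear interpolation, longer gaps
    remain NaN (measurement error).
    """
    x = np.asarray(angles_deg, dtype=float).copy()
    vel = np.abs(np.diff(x)) * fs            # deg/s between adjacent samples
    bad = np.zeros_like(x, dtype=bool)
    bad[1:] |= vel > vmax                    # drop physiologically impossible jumps
    x[bad] = np.nan

    max_gap = int(round(max_gap_s * fs))     # maximum gap length in samples
    isnan = np.isnan(x)
    idx = np.arange(len(x))
    # linearly interpolate across all gaps first ...
    filled = np.interp(idx, idx[~isnan], x[~isnan])
    # ... then restore NaN in gaps longer than max_gap
    out, start = filled.copy(), None
    for i, missing in enumerate(isnan):
        if missing and start is None:
            start = i
        elif not missing and start is not None:
            if i - start > max_gap:
                out[start:i] = np.nan
            start = None
    if start is not None:                    # trailing gap: treat as error
        out[start:] = np.nan
    return out
```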
Head movement and blinking
To calculate head movement, the 30 Hz video from the forward-pointing camera was used. Each image from the camera was compared with the previous one, looking for common details, using OpenCV point feature matching. From the displacement of the matched points, we mapped the points that moved in the same direction and at the same speed, thus determining the total camera movement in the x (horizontal), y (vertical) and z (rotation) directions. We ignored the effect of pincushion distortion caused by the forward tilt. We also ignored points which were outside the windscreen area, as well as points that did not follow the movement of the majority. The resulting displacement coordinates were converted to degrees using the approximate distance of the car interior and the camera’s angle of view54. Although the procedure is not perfectly accurate, it is a close approximation of head displacement.
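The conversion from pixel displacement to degrees can be sketched with a pinhole-camera model (the image width and field-of-view values below are hypothetical placeholders, not the actual camera parameters):

```python
import math

def pixels_to_degrees(dx_px, image_width_px=1280, fov_deg=99.0):
    """Convert a horizontal pixel displacement to degrees of rotation.

    Uses a pinhole-camera model: the focal length in pixels follows from
    the horizontal field of view, and the displacement angle is the
    arctangent of displacement over focal length. image_width_px and
    fov_deg are illustrative assumptions, not the real camera specs.
    """
    f_px = (image_width_px / 2) / math.tan(math.radians(fov_deg / 2))
    return math.degrees(math.atan(dx_px / f_px))
```

By construction, a displacement of half the image width maps exactly to half the field of view, which is a convenient consistency check.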
The recorded pupil and head movement data were segmented according to the events experienced on the handling track. More specifically, three segments were separated: one for normal travel with no unexpected event, one for encountering the deer dummy and one for encountering the human dummy. These segments were equal in length to the collected time series. For each segment, the width of the multifractal spectrum was calculated using the Chhabra and Jensen estimation method55 for both eyes, and for head movements in the x and y planes. Blinking frequencies were also summed for each segment.
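A minimal sketch of the Chhabra–Jensen spectrum-width estimate follows (the choice of absolute increments as the underlying measure, the q range, and the box sizes are illustrative assumptions):

```python
import numpy as np

def chhabra_jensen_width(series, qs=np.linspace(-5, 5, 21),
                         scales=(4, 8, 16, 32, 64)):
    """Width of the multifractal spectrum via the Chhabra-Jensen estimator.

    The series is turned into a positive measure using absolute increments;
    alpha(q) is the slope of the mu-weighted log box probabilities against
    log box size, and the spectrum width is max(alpha) - min(alpha).
    """
    inc = np.abs(np.diff(np.asarray(series, dtype=float))) + 1e-12
    alphas = []
    for q in qs:
        xs, ys = [], []
        for s in scales:
            n = len(inc) // s
            boxes = inc[:n * s].reshape(n, s).sum(axis=1)
            P = boxes / boxes.sum()          # box probabilities at scale s
            Pq = P ** q
            mu = Pq / Pq.sum()               # q-deformed (escort) measure
            xs.append(np.log(s))
            ys.append(np.sum(mu * np.log(P)))
        alphas.append(np.polyfit(xs, ys, 1)[0])  # alpha(q): scaling slope
    alphas = np.asarray(alphas)
    return alphas.max() - alphas.min()
```

For a monofractal (uniform) measure the estimated width collapses towards zero, while a binomial multiplicative cascade yields a clearly positive width, which is the usual sanity check for this estimator.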
Statistical analysis
For EEG data, we compared overall differences in arousal and affectivity between human and self-driving conditions, and the type of the route events. Separately for arousal and frontal alpha asymmetry values, a two-way repeated measures ANOVA was run: Condition (Human/Self-driving) × Route event (Smooth/Deer/Kid). Similar tests were applied to the multifractal spectrum of eye and head movements and for blinking frequencies. Statistical analyses were conducted in R Statistical language56. Generalized eta-square (η2G) effect sizes57,58 are also reported.
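For illustration, the two-way repeated-measures ANOVA with generalized eta-squared can be sketched for a fully within-subjects design (a minimal NumPy example; the actual analyses were run in R56, so this sketch only illustrates the sums-of-squares decomposition and the η2G definition for within designs):

```python
import numpy as np

def rm_anova_2way(Y):
    """Two-way fully repeated-measures ANOVA with generalized eta-squared.

    Y: array (n_subjects, n_A, n_B), e.g. Condition (Human/Self-driving)
       x Route event (Smooth/Deer/Kid). Returns F, df and eta2G per effect.
    """
    n, A, B = Y.shape
    gm = Y.mean()
    m_s = Y.mean(axis=(1, 2))            # subject means
    m_a = Y.mean(axis=(0, 2))            # factor A means
    m_b = Y.mean(axis=(0, 1))            # factor B means
    m_ab = Y.mean(axis=0)                # cell means
    m_sa = Y.mean(axis=2)                # subject x A means
    m_sb = Y.mean(axis=1)                # subject x B means

    ss_a = n * B * ((m_a - gm) ** 2).sum()
    ss_b = n * A * ((m_b - gm) ** 2).sum()
    ss_ab = n * ((m_ab - m_a[:, None] - m_b[None, :] + gm) ** 2).sum()
    ss_s = A * B * ((m_s - gm) ** 2).sum()
    ss_as = B * ((m_sa - m_s[:, None] - m_a[None, :] + gm) ** 2).sum()
    ss_bs = A * ((m_sb - m_s[:, None] - m_b[None, :] + gm) ** 2).sum()
    ss_tot = ((Y - gm) ** 2).sum()
    ss_abs = ss_tot - (ss_a + ss_b + ss_ab + ss_s + ss_as + ss_bs)

    # eta2G denominator: effect SS plus all subject-related (error) variance
    ss_noise = ss_s + ss_as + ss_bs + ss_abs
    out = {}
    for name, ss_eff, df_eff, ss_err, df_err in [
        ("A", ss_a, A - 1, ss_as, (A - 1) * (n - 1)),
        ("B", ss_b, B - 1, ss_bs, (B - 1) * (n - 1)),
        ("AxB", ss_ab, (A - 1) * (B - 1), ss_abs, (A - 1) * (B - 1) * (n - 1)),
    ]:
        F = (ss_eff / df_eff) / (ss_err / df_err)
        out[name] = {"F": F, "df": (df_eff, df_err),
                     "eta2G": ss_eff / (ss_eff + ss_noise)}
    return out
```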
Results
Because of excessive movement artifacts or data recording issues during the ride (for example, lost connection between the EEG amplifier and the laptop), the data of 9 participants had to be excluded from further analysis of the EEG data. We excluded the data of 7 participants from the eye movement analysis and of 6 participants from the head movement analysis due to low signal confidence.
EEG affectivity (frontal alpha asymmetry)
Regarding frontal alpha asymmetry, a significant main effect of Condition was present (F(1, 31) = 5.800, p = 0.022, η2G = 0.052), suggesting lower values in the Self-driving condition in general. The main effect of Route event (F(2, 62) = 0.521, p = 0.597, η2G < 0.001) and the Condition × Route event interaction (F(2, 62) = 0.004, p = 0.996, η2G < 0.001) were not significant. Frontal alpha asymmetry values are plotted in Fig. 2a.
EEG arousal
No significant effects were detected in arousal levels. The main effects of Condition (F(1, 31) = 0.004, p = 0.950, η2G < 0.001) and Route event (F(2, 62) = 1.457, p = 0.241, η2G < 0.001) were not significant. The Condition × Route event interaction (F(2, 62) = 0.121, p = 0.886, η2G < 0.001) was not significant either. Arousal values are presented in Fig. 2b.
Eye movements
For the investigated segments, the main effect of Condition was not significant. There was a significant main effect of Route event (smooth, deer, kid): F(2, 68) = 12.416, p < 0.001, η2G = 0.111 for the left eye and F(2, 66) = 5.9, p = 0.004, η2G = 0.055 for the right eye. For both eyes, post hoc Tukey tests revealed that the encounter with the deer was linked to significantly broader spectra than either of the two other event types (left eye: smooth vs. deer t(33) = -4.38, ptukey < 0.001; deer vs. kid t(33) = 3.84, ptukey < 0.001; right eye: smooth vs. deer t(33) = -2.606, ptukey = 0.035; deer vs. kid t(33) = 2.98, ptukey = 0.014). Figure 3 illustrates the overall pattern in multifractal spectrum width in the two Conditions of travel.
Head movements
Multifractal spectrum width in head movements differed significantly between event types in both the horizontal (x) plane, F(2, 68) = 11.057, p < 0.001, η2G = 0.086, and the vertical (y) plane, F(2, 68) = 9.09, p < 0.001, η2G = 0.068. Post hoc Tukey tests revealed that both unexpected Route events (deer, kid) were linked to significantly broader spectra than normal traveling in the horizontal plane (smooth vs. deer t(34) = -4.071, ptukey < 0.001; smooth vs. kid t(34) = -3.869, ptukey < 0.001). In the vertical plane, the difference between smooth vs. deer approached significance, t(34) = -2.23, ptukey = 0.08, while smooth vs. kid, t(34) = -3.92, ptukey = 0.001, and deer vs. kid, t(34) = -2.88, ptukey = 0.018, were both significant. Mode of transportation generated broader head movement spectra in the Self-driving condition, but this effect only approached significance in the horizontal plane, F(1, 34) = 4.04, p = 0.052, η2G = 0.019; there was no significant main effect of Condition. Figure 4 illustrates the overall pattern across the three event types.
Blinking frequency
Significantly lower blinking frequencies were recorded during the unexpected events (main effect of Route event: F(2, 128) = 33.34, p < 0.001, η2G = 0.154). The interaction between Route event and travel Condition was also significant (F(2, 128) = 4.49, p = 0.013, η2G = 0.024). This interaction was due to the difference between smooth and eventful driving conditions. Figure 5 shows the differences in blinking frequency in the two Conditions of travel.
Discussion
In this study, we aimed to compare reactions to unexpected situations in self-driving versus human-driven scenarios using a combination of complex physiological measures. We recorded and analyzed EEG, eye movement, head movement and blinking frequency in relation to driving conditions and road events. Despite some contrasts, our results were comparable to earlier findings15. The expanded design allowed us to see possible interactions between the effect of driving modes and the experienced events.
The most notable result of the EEG analyses was a significant difference in frontal alpha asymmetry between the human-driven and self-driving conditions, suggesting lower affectivity values during the novel experience of the vehicle navigating by itself. This replicates previous findings that participants preferred the situation in which the vehicle was driven by a human driver over the self-driving mode15. As hypothesized earlier, a lack of perceived control over the movement of the vehicle32,33 is a possible explanation for the observed disparity here.
Multifractal analysis of both eye movements and head movements indicated a strong effect of the events that demanded quick path corrections from both human and self-driving systems. The sudden need for heightened awareness and more detailed information-seeking behavior may be among the reasons for the measured differences, along with the need for visual stabilization during the quick evading maneuver. The fact that the spectral structure of the eye movements and the head movements was not identical but indicated a very similar susceptibility to events is a rather promising finding. The previously reported15 overall narrower spectrum in eye movements for the self-driving condition was reduced or missing from these results. The absence of significant differences in the multifractal structure of the eye movements may index, among other causes, differences in the navigation mechanics of the utilized self-driving system or the moderating effect of the included events. In the current study, the self-driving vehicle’s steering, accelerating and decelerating behavior was modeled very closely after human driving. A comparison of the velocity profiles of the two driving modes for each event revealed very similar acceleration patterns (the average peak difference was 0.02 g). This finding highlights the possibility of using eye-tracking or head-tracking to evaluate self-driving technologies based on how close they “feel” to human driving. Another factor behind the similarity between the modes of transportation could have been the events that prompted evading action in both human and automated driving, diminishing the importance of navigation type in comparison to concerns of safe travel.
Broader multifractal spectra recorded during the encounter with the deer dummy may indicate the heightened need for visual information in a critical situation. The second encounter was arguably less surprising due to the placement of the kid dummy on the side of the road on the last straight portion of the track, where it was clearly visible ahead (as demonstrated in Fig. 1). Very similarly to eye movements, recorded head movements also showed wider multifractal spectra during the events, indicating either a heightened need for visual information and/or a preparedness to counterbalance the forces of the quick maneuvers to avoid collision. One measure that clearly indexed the difference between human-driven and self-driving under normal circumstances was blinking frequency. Passengers blinked with significantly lower frequencies in the self-driving condition before encountering the dummies. During these novel experiences, the blinking rate in both human and self-driving conditions got even lower, diminishing the difference between modes of transportation. The pattern of the data suggests that under smooth driving conditions the novelty of self-driving induced a somewhat stronger inhibition of blinking. The unexpected events introduced an even higher effect in this direction, further lowering the blinking rate similarly in both driving modes.
The study presented here has certain limitations. We collected a large amount of data, but due to time and resource limitations we could not include a wider range of analyses. For example, multifractal analyses of the EEG channels and fixation/saccade-based analyses of the eye-tracking data remain to be conducted in follow-up reports. Another limitation is that we reported aggregated and, therefore, simplified values for the events. While the analyses preserve the complexity of the movements, some aggregation is required to demonstrate differences or changes over time as a function of events or conditions. One might also wonder whether the duration of the trials was long enough to measure reliable differences, as the trials were relatively short. This, again, was due to resource limitations: running the experiment requires managing a large infrastructure, which is time-consuming. To a certain degree, we compensated for the brevity of the experience with high sample rates that allowed a fine-scale analysis of the events. On the other hand, it is important to note that some physiological responses do not require extended exposure or long timescales. Events that are of importance in vehicle navigation are generally fast. In this study we attempted to bring these two relatively fast timescales into correspondence. To conclude, our findings further establish real-world physiological measurements as a source of information in researching the acceptance and usage of self-navigating technologies. The answer to our research question is that a combination of complex physiological measures can be applied to identify differences in the response to an unexpected event between self-driving and human driving. Our main results suggest that participants demonstrated a stronger withdrawal-related attitude in the self-driving condition, and that critical road situations induce an enhanced need for visual information and preparedness.
Incorporating physiological data into technological development is still in its infancy. Human-machine interaction59, stress and fatigue indicators21,30,31,32,33,34,35,60, cognitive load47 and wearable medical monitoring61 are among the more salient examples of feedback loops in development and utilization informed by continuous measurement of users’ physiology. Widespread acceptance of new modes of transportation will likely benefit from these interactive loops, especially in situations where developers need to monitor whether user expectations are being met.
Data availability
The datasets generated during and/or analyzed during the current study are available in the OSF repository, [https://osf.io/dhcaf/?view_only=a12e150a53a94585a32597c129ee5d4f]. Codes used in the analyses of the datasets are available from the corresponding author on reasonable request.
Change history
10 April 2025
A Correction to this paper has been published: https://doi.org/10.1038/s41598-025-97000-8
References
Cohen, T. et al. A constructive role for social science in the development of automated vehicles. Transp. Res. Interdiscipl Perspect. 6, 100133 (2020).
Grindsted, T. S., Christensen, T. H., Freudendal-Pedersen, M., Friis, F. & Hartmann-Petersen, K. The urban governance of autonomous vehicles – in love with AVs or critical sustainability risks to future mobility transitions. Cities 120, 103504 (2022).
Hőgye-Nagy, Á., Kovács, G. & Gy, K. Acceptance of self-driving cars among the university community: effects of gender, previous experience, technology adoption propensity, and attitudes toward autonomous vehicles. Transp. Res. Part. F: Traffic Psychol. Behav. 94, 353–361 (2023).
Patel, K. et al. Identifying individuals’ perceptions, attitudes, preferences, and concerns of shared autonomous vehicles: during- and post-implementation evidence. Transp. Res. Interdiscip. Perspect. 18, 100785 (2023).
Leicht, T., Chtourou, A. & Youssef, K. B. Consumer innovativeness and intentioned autonomous car adoption. J. High. Technol. Manag. Res. 29 (1), 1–1 (2018).
Kaye, S. A., Li, X., Oviedo-Trespalacios, O. & Afghari, A. P. Getting in the path of the robot: Pedestrians’ acceptance of crossing roads near fully automated vehicles. Travel Behav. Soc. 26, 1–8 (2022).
Wang, H., Feng, J., Li, K. & Chen, L. Deep understanding of big geospatial data for self-driving: data, technologies, and systems. Future Generation Comput. Syst. 137, 146–163 (2022).
Kenesei, Z. et al. Trust and perceived risk: how different manifestations affect the adoption of autonomous vehicles. Transp. Res. Part. A: Policy Pract. 164, 379–393 (2022).
Cho, Y., Park, J., Park, S. & Jung, E. S. Technology acceptance modeling based on user experience for autonomous vehicles. Journal of the Korean Ergonomics Society 36(2), 87–108 (2017).
Acharya, S. & Mekker, M. Importance of the reputation of the data manager in the acceptance of connected vehicles. Commun. Transp. Res. 2, 100053 (2022).
Xiao, J. & Goulias, K. G. Perceived usefulness and intentions to adopt autonomous vehicles. Transp. Res. Part. A: Policy Pract. 161, 170–185 (2022).
Tan, H., Zhao, X. & Yang, J. Exploring the influence of anxiety, pleasure, and subjective knowledge on public acceptance of fully autonomous vehicles. Comput. Hum. Behav. 131, 107187 (2022).
Keszey, T. Behavioral intention to use autonomous vehicles: Systematic review and empirical extension. Transp. Res. Part C. 119, 1–16 (2020).
Lukovics, M. et al. Combining survey-based and neuroscience measurements in customer acceptance of self-driving technology. Transp. Res. Part. F: Traffic Psychol. Behav. 95, 46–58 (2023).
Palatinus, Z. et al. Physiological measurements in social acceptance of self-driving technologies. Sci. Rep. 12, 13312 (2022).
Gilpin, L. H. Anticipatory thinking: A testing and representation challenge for self-driving cars. In 55th Annual Conference on Information Sciences and Systems (CISS), 1–2 (2021).
Wirthmüller, F., Schlechtriemen, J., Hipp, J. & Reichert, M. Teaching vehicles to anticipate: a systematic study on probabilistic behavior prediction using large data sets. IEEE Trans. Intell. Transp. Syst. 22, 7129–7144 (2019).
Ladouce, S., Donaldson, D. I., Dudchenko, P. A. & Ietswaart, M. Understanding minds in real-world environments: Toward a mobile cognition approach. Front. Hum. Neurosci. 10 (2017).
Luck, S. J. An introduction to the event-related potential technique. MIT Press (2014).
Kim, T. Y., Ko, H. & Kim, S. H. Data analysis for emotion classification based on bio-information in self-driving vehicles. J. Adv. Transp. 2020, 1–11. https://doi.org/10.1155/2020/8167295 (2020).
Lee, J. & Yang, J. H. Analysis of driver’s EEG given take-over alarm in SAE Level 3 automated driving in a simulated environment. Int. J. Automot. Technol. 21(3), 719–728. https://doi.org/10.1007/s12239-020-0070-3 (2020).
Jun, G. & Smitha, K. G. EEG based stress level identification. In IEEE International Conference on Systems, Man, and Cybernetics (SMC), 3270–3274. https://doi.org/10.1109/SMC.2016.7844738 (2016).
Yi Wen, T. Y. & Aris, S. Electroencephalogram (EEG) stress analysis on alpha/beta ratio and theta/beta ratio. Indones J. Elect. Eng. Comput. Sci. 17(1), 175–182. https://doi.org/10.11591/ijeecs.v17.i1.pp175-182 (2020).
Minguillon, J., Lopez-Gordo, M. A. & Pelayo, F. Stress assessment by prefrontal relative gamma. Front. Comput. Neurosci. 10, 101. https://doi.org/10.3389/fncom.2016.00101 (2016).
Davidson, R. D., Ekman, P., Saron, C. D., Senulis, J. A. & Friesen, W. V. Approach-withdrawal and cerebral asymmetry: Emotional expression and brain physiology I. J. Pers. Soc. Psychol. 58(2), 330–341 (1990).
Harmon-Jones, E. & Gable, P. A. On the role of asymmetric frontal cortical activity in approach and withdrawal motivation: an updated review of the evidence. Psychophysiology 55(1). https://doi.org/10.1111/psyp.12879 (2018).
Hartikainen, K. M. Emotion-attention Interaction in the right hemisphere. Brain Sci. 11(8), 1006. https://doi.org/10.3390/brainsci11081006 (2021).
Sun, L., Peräkylä, J. & Hartikainen, K. M. Frontal alpha asymmetry, a potential biomarker for the effect of neuromodulation on brain’s affective circuitry—preliminary evidence from a deep brain stimulation study. Front. Hum. Neurosci. 11, 584. https://doi.org/10.3389/fnhum.2017.00584 (2017).
Briesemeister, B. B., Tamm, S., Heine, A. & Jacobs, A. M. Approach the good, withdraw from the bad—a review on frontal alpha asymmetry measures in applied psychological research. Psychology 4(03), 261–267. https://doi.org/10.4236/psych.2013.43A039 (2013).
Cao, R. et al. Hemispheric asymmetry of functional brain networks under different emotions using EEG data. Entropy 22, 939 (2020).
Wang, Z. et al. Emotional state evaluation during collision avoidance operations of seafarers using ship bridge simulator and wearable EEG. In 6th International Conference on Transportation Information and Safety (ICTIS), 415–422 (IEEE, 2021).
Park, C., Shahrdar, S. & Nojoumian, M. EEG-based classification of emotional state using an autonomous vehicle simulator. In 2018 IEEE 10th Sensor Array and Multichannel Signal Processing Workshop (SAM), 297–300 (IEEE, 2018).
Seet, M. et al. Differential impact of autonomous vehicle malfunctions on human trust. IEEE Trans. Intell. Transp. Syst. 23(1), 548–557. https://doi.org/10.1109/TITS.2020.3013278 (2022).
Gwak, J. et al. Effects of Tilting mechanism of narrow vehicle on psychophysiological states of driver. Int. J. Automot. Eng. 11(3), 124–128. https://doi.org/10.20485/jsaeijae.11.3_124 (2020).
Abdur-Rahim, J. et al. Multi-sensor based state prediction for personal mobility vehicles. PloS One 11(10), e0162593 (2016).
Martin, S., Tawari, A. & Trivedi, M. M. Monitoring head dynamics for driver assistance systems: A multi-perspective approach. In 16th International IEEE Conference on Intelligent Transportation Systems (ITSC), 2286–2291. https://doi.org/10.1109/ITSC.2013.6728568 (2013).
Watanabe, Y. et al. Dynamic analysis of head movements by means of a three-dimensional position measurement system. Graefe’s Arch. Clin. Exp. Ophthalmol. 226, 418–424. https://doi.org/10.1007/BF02170000 (1988).
Carver, N. S., Bojovic, D. & Kelty-Stephen, D. G. Multifractal foundations of visually-guided aiming and adaptation to prismatic perturbation. Hum. Mov. Sci. 55, 61–72 (2017).
Dixon, J. A., Holden, J. G., Mirman, D. & Stephen, D. G. Multifractal dynamics in the emergence of cognitive structure. Top. Cogn. Sci. 4, 51–62 (2012).
Palatinus, Z., Dixon, J. A. & Kelty-Stephen, D. G. Fractal fluctuations in quiet standing predict the use of mechanical information for haptic perception. Ann. Biomed. Eng. 41, 1625–1634 (2013).
Palatinus, Z., Kelty-Stephen, D. G., Kinsella-Shaw, J., Carello, C. & Turvey, M. T. Haptic perceptual intent in quiet standing affects multifractal scaling of postural fluctuations. J. Exp. Psychol. Hum. Percept. Perform. 40, 1808 (2014).
Freije, M. et al. Multifractal Detrended Fluctuation Analysis of Eye-Tracking Data. Lecture Notes in Computational Vision and Biomechanics vol. 27 484 (2018).
Wallot, S., O’Brien, B., Coey, C. A. & Kelty-Stephen, D. Power-law fluctuations in eye movements predict text comprehension during connected text reading. In CogSci (2015).
Fetterhoff, D. Multifractal Complexity of Hippocampal Memory Processing (Wake Forest University, 2015).
Maffei, A. & Angrilli, A. Spontaneous blink rate as an index of attention and emotion during film clips viewing. Physiol. Behav. 204, 256–263 (2019).
Bacher, L. F. & Smotherman, W. P. Systematic temporal variation in the rate of spontaneous eye blinking in human infants. Dev. Psychobiol. 44, 140–145. https://doi.org/10.1002/dev.10159 (2004).
Figalová, N. et al. From driver to supervisor: Comparing cognitive load and EEG-Based attentional resource allocation across automation levels. Int. J. Hum. Comput. Stud. 182, 103169 (2024).
Szalay, Z., Hamar, Z. & Nyerges, A. Novel design concept for an automotive proving ground supporting multilevel CAV development. Int. J. Veh. Des. 80(1), 1–22 (2020).
Nuwer, M. R. et al. IFCN standards for digital recording of clinical EEG. Electroencephalogr. Clin. Neurophysiol. 106(3), 259–261 (1998).
Pedregosa, F. et al. Scikit-learn: machine learning in Python. J. Mach. Learn. Res. 12, 2825–2830 (2011).
Kothari, R. S., Chaudhary, A. K., Bailey, R. J., Pelz, J. B. & Diaz, G. J. Ellseg: An ellipse segmentation framework for robust gaze tracking. IEEE Trans. Vis. Comput. Graph. 27(5), 2757–2767 (2021).
Krafka, K. et al. Eye tracking for everyone. In Proceedings of the IEEE conference on computer vision and pattern recognition 2176–2184. (2016).
Bahill, A. T., Clark, M. R. & Stark, L. The main sequence, a tool for studying human eye movements. Math. Biosci. 24(3–4), 191–204 (1975).
Bradski, G. The OpenCV library. Dr Dobb’s Journal: Softw. Tools Prof. Program. 25(11), 120–123 (2000).
Chhabra, A. & Jensen, R. V. Direct determination of the f(α) singularity spectrum. Phys. Rev. Lett. 62, 1327–1330 (1989).
R Core Team. R: A Language and Environment for Statistical Computing. R Foundation for Statistical Computing, Vienna, Austria (2021). https://www.R-project.org/
Bakeman, R. Recommended effect size statistics for repeated measures designs. Behav. Res. Methods, 37(3), 379–384. https://doi.org/10.3758/BF03192707 (2005).
Olejnik, S. & Algina, J. Generalized eta and omega squared statistics: measures of effect size for some common research designs. Psychol. Methods 8(4), 434–447. https://doi.org/10.1037/1082-989X.8.4.434 (2003).
Silva, H., Fairclough, S. H., Holzinger, A., Jacob, R. J. K. & Tan, D. S. Introduction to the special issue on physiological computing for human-computer interaction. ACM Trans. Comput.-Hum. Interact. 21, 1–4 (2015).
Majid, N. A. et al. Development of body stress analyzer based on physiological signal. J. Phys. Conf. Ser. 1529 (2020).
Orphanidou, C. A review of big data applications of physiological signal data. Biophys. rev. 11, 83–87 (2019).
Acknowledgements
We thank Tamás Králik, Júlia Simon and Balázs Szabó from Mindtech Ltd. for their contribution in data collection and for providing the mobile EEG devices.
Funding
This research was funded by National Research, Development and Innovation Office – NKFIH, OTKA K137571.
Author information
Authors and Affiliations
Contributions
Z.P. oversaw data collection and analysis, and participated in the design and in writing the manuscript. M.V. analyzed the EEG data and participated in designing the study, data collection, and writing the manuscript. Z.D. worked out the data cleaning and data transformation methods and participated in the analyses. M.L. managed interdisciplinary connections and infrastructure, and participated in writing the manuscript. Z.M-P. managed research resources and participated in designing the study. S.P. participated in organizing the primary research and in data collection. All authors approved the final version of the manuscript for submission and are responsible for the content.
Corresponding authors
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
The original online version of this Article was revised: The Funding section in the original version of this Article was omitted. The Funding section now reads: “This research was funded by National Research, Development and Innovation Office – NKFIH, OTKA K137571.”
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Palatinus, Z., Lukovics, M., Volosin, M. et al. Passenger physiology in self-driving vehicles during unexpected events. Sci Rep 15, 7899 (2025). https://doi.org/10.1038/s41598-024-81960-4