Academic interests
- Processing and perception of human and AI-generated voices
- Social and emotional neural systems
- Neuroscience of personality and individual differences
PhD Project
Neural dynamics of processing natural and digital emotions: How does the human brain process and perceive the difference between emotional human and AI-generated voices?
We are actively recruiting participants for this project.
Research Group
The Cognitive and Affective Neuroscience lab (CANlab)
Teaching
I have taught lectures and led seminars, and I have advised or am currently co-advising students.
Background
- 2021 - M.Phil. Cognitive Neuroscience, UiO
- 2018 - M.Sc. Neuroscience, NTNU
- 2016 - B.A. Psychology, UH Mānoa, Hawaiʻi
Tags: fMRI, social cognition, cognitive neuroscience, neuroscience, personality psychology
Publications
- Skjegstad, Christine Leilani & Frühholz, Sascha (2024). Neural Dynamics of Processing Natural and Digital Emotional Vocalizations.
Summary:
AI-generated voices have a high level of auditory quality and resemble human voices, such that they might mimic human emotions to a high degree. In our study, we investigated how the human brain distinguishes between emotional natural (human) and synthetic (AI-generated) voices. 35 participants (23 females, mean age = 32.23, SD = 7.66) listened to vocal pseudo-speech emotional expressions (neutral, angry, fear, happy, pleasure) from both human and AI-generated sources. The experiment consisted of three parts: an fMRI study with a classification task to identify the voices as synthetic or natural, a perceptual task where participants rated characteristics of the voice stimuli, and an assessment of personality traits. Neutral AI voices were identified with 74.9% accuracy compared to only 23% accuracy for neutral human voices. Happy human voices were correctly identified 77% of the time while happy AI voices were identified with 34.5% accuracy. In general, human voices were rated as more natural, trustworthy, and authentic, especially the happy and pleasure expressions. Neuroimaging results revealed significant differences in brain activity in response to the AI and human voices. AI voices activated the right anterior midcingulate cortex, right dorsolateral prefrontal cortex and left thalamus, which may indicate increased vigilance and cognitive regulation. In contrast, human voices elicited stronger responses in the right hippocampus as well as regions associated with emotional processing and empathy such as the right inferior frontal gyrus, anterior cingulate cortex and angular gyrus. Human voices may elicit a sense of relatedness and AI voices may elicit heightened alertness.
Published Nov. 8, 2022 11:37 AM
Last modified Feb. 27, 2024 10:36 PM