Chuanji Gao

Date of Award

Summer 2020

Document Type

Open Access Dissertation

First Advisor

Svetlana V. Shinkareva

In everyday life, we receive affective information from a multisensory environment. What we see and what we hear jointly influence how we feel, think, and act. Outstanding questions remain about the behavioral and neural mechanisms underlying how we combine visual and auditory affective signals. In this dissertation, I report a series of behavioral, EEG, and fMRI experiments addressing this question. Behaviorally, I found congruency, visual dominance, and negativity dominance effects. Using ERPs, I showed that these behavioral effects map onto different time courses in audiovisual affective processing. Time-frequency analyses of EEG data revealed early sub-additive evoked theta activity and long-lasting supra-additive induced delta and beta activities. A meta-analysis of previous neuroimaging studies revealed the roles of the superior temporal cortex, amygdala, and thalamus in audiovisual affective processing. In an fMRI study, brain areas associated with audiovisual affective congruence and valence processing were identified; the superior temporal and anterior cingulate cortices contribute to both processes. Representational similarity analyses revealed modality-general brain areas that are sensitive to valence from both visual and auditory modalities, as well as modality-specific brain areas that are sensitive to either visual or auditory emotions. Together, these convergent findings advance our understanding of the behavioral and neural effects of audiovisual affective processing.