Date of Award

Fall 2023

Document Type

Open Access Thesis



First Advisor

Jessica Green


Successfully integrating multiple streams of sensory information is associated with a variety of behavioral enhancements. Multisensory research over the past few decades has demonstrated the critical roles of spatial proximity and stimulus effectiveness. Here, we focused on the temporal aspects and underlying neural activity of frequency-varying audiovisual stimuli using scalp-recorded electroencephalography (EEG). Our stimulus set was chosen because humans are most sensitive to the auditory frequencies and visual spatial frequencies that make up speech. Recent work has shown that audiovisual integration effects for low-level, simple stimuli are largest when both modalities fall within, rather than outside of, their respective sensitivity ranges. At the electrophysiological level, we found evidence for bilateral visual modality effects at each stimulus onset asynchrony (SOA) condition, reflected in faster visual N1 latencies and enhanced P2 amplitudes in the event-related potentials (ERPs). Additionally, when the stimuli were separated in time, we observed differences in sensory ERPs between stimuli perceived as simultaneous and those that were not integrated, as well as variations in the ERP effects depending on the frequency range of the stimuli. These findings provide neural-level evidence that human sensitivity to low-level stimulus features likely plays an important role in the enhanced integration of audiovisual speech.


© 2024, Jonathan Conrady