Award Abstract # 1029084
Neural Basis of Cross-modal Influences on Perception

NSF Org: BCS
Division of Behavioral and Cognitive Sciences
Recipient: UNIVERSITY OF CALIFORNIA, SAN DIEGO
Initial Amendment Date: August 26, 2010
Latest Amendment Date: July 10, 2013
Award Number: 1029084
Award Instrument: Continuing Grant
Program Manager: Alumit Ishai
BCS Division of Behavioral and Cognitive Sciences
SBE Directorate for Social, Behavioral and Economic Sciences
Start Date: September 1, 2010
End Date: August 31, 2015 (Estimated)
Total Intended Award Amount: $625,694.00
Total Awarded Amount to Date: $625,694.00
Funds Obligated to Date: FY 2010 = $149,553.00
FY 2011 = $154,104.00
FY 2012 = $158,282.00
FY 2013 = $163,755.00
History of Investigator:
  • Steven Hillyard (Principal Investigator)
    shillyard@ucsd.edu
Recipient Sponsored Research Office: University of California-San Diego
9500 GILMAN DR
LA JOLLA
CA  US  92093-0021
(858)534-4896
Sponsor Congressional District: 50
Primary Place of Performance: University of California-San Diego
9500 GILMAN DR
LA JOLLA
CA  US  92093-0021
Primary Place of Performance Congressional District: 50
Unique Entity Identifier (UEI): UYTTZT6G9DT1
Parent UEI:
NSF Program(s): Cognitive Neuroscience
Primary Program Source: 01001011DB NSF RESEARCH & RELATED ACTIVIT
01001112DB NSF RESEARCH & RELATED ACTIVIT
01001213DB NSF RESEARCH & RELATED ACTIVIT
01001314DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s): 1699
Program Element Code(s): 169900
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.075

ABSTRACT

Many of the objects and events that we encounter in everyday life, such as a barking dog or a honking car, are both seen and heard. A basic task our brains must carry out is to bring together the sensory information that is received concurrently by our eyes and ears, so that we perceive a world of unitary objects having both auditory and visual properties. With funding from the National Science Foundation, Dr. Steven A. Hillyard and colleagues, of the University of California, San Diego, are investigating when and where in the brain the visual and auditory signals are combined and integrated to produce coherent, multi-dimensional perceptions of objects in the environment. The sensory inputs from the eyes and ears are projected initially to separate regions of the brain specialized for perceiving the visual and auditory modalities, respectively. The timing and anatomical localization of neural interactions between auditory and visual inputs are being analyzed by means of scalp recordings of brain potentials that are triggered by these sensory events, together with magnetic resonance imaging of stimulus-induced brain activity patterns. A major aim is to analyze the brain interactions that cause a stimulus in one modality (auditory or visual) to alter the perception of a stimulus in the other modality. Three types of such auditory-visual interactions are being studied: (1) the brightness enhancement of a visual event when accompanied by a sound, (2) the ventriloquist illusion, which is a shift in perceived sound location towards the position of a concurrent visual event, and (3) the double-flash illusion that is induced when a single flash is interposed between two pulsed sounds. In each case, the precisely timed sequence of neural interactions in the brain that underlie the perceptual experience will be identified, and the influence of selective attention on these interactions will be determined. The overall aim is to understand the neural mechanisms by which stimuli in different sensory modalities are integrated in the brain to achieve unified perceptions of multi-modal events in the world.

Because much of our everyday experience involves recognizing and reacting to the sights and sounds of our surroundings, understanding the principles by which these auditory and visual inputs are synthesized in the brain is important. The ability to combine auditory and visual signals effectively is particularly important in teaching and learning situations, where spoken words must be put together with a variety of pictorial, graphic, and written information in order to understand the material. This research program can contribute to the development of more efficient learning environments and teaching techniques, and lead to improved designs for communication media such as audio-visual teaching tools, information displays, and critical warning signals. These studies are exploring the role of selective attention in synthesizing auditory and visual signals, research that can lead to improved teaching techniques that emphasize the training of attention. By studying the brain systems that enable auditory and visual inputs to be combined into perceptual wholes, this research can also help to understand what goes wrong in the brains of patients who suffer from abnormalities of perception, including those with learning disabilities, attention deficit disorder, autism, and schizophrenia.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH

(Showing: 1 - 10 of 19)
Andersen, S., S.A. Hillyard and M.M. Müller. "Global facilitation of attended features is obligatory and restricts divided attention." Journal of Neuroscience, v.33, 2013, p.18247
Bonath, B., T. Noesselt, K. Krauel, S. Tyll, C. Tempelmann and S.A. Hillyard. "Audio-visual synchrony modulates the ventriloquist illusion and its neural/spatial representation in the auditory cortex." Neuroimage, v.98, 2014, p.425
Brang, D., V.L. Towle, S. Suzuki, S.A. Hillyard, S. DiTusa, Z. Dai, J. Tao, S. Wu and M. Grabowecky. "Peripheral sounds rapidly activate visual cortex: evidence from electrocorticography." Journal of Neurophysiology, 2015, Epub ahead of print
Brang, D., Z. Taich, S.A. Hillyard, M. Grabowecky and V.S. Ramachandran. "Parietal connectivity mediates multisensory facilitation." NeuroImage, v.78, 2013, p.396-401
Ding, Y., A. Martinez, Z. Qu and S.A. Hillyard. "The earliest stages of visual cortical processing are not modified by attentional load." Human Brain Mapping, v.35, 2014, p.3008
Feng, W., A. Martinez, M. Pitts, Y.-J. Luo and S.A. Hillyard. "Spatial attention modulates early face processing." Neuropsychologia, v.50, 2012, p.3461-3468
Feng, W., V.S. Störmer, A. Martinez, J.J. McDonald and S.A. Hillyard. "Sounds activate visual cortex and improve visual discrimination." Journal of Neuroscience, v.34, 2014, p.9817
Flevaris, A.V., A. Martinez and S.A. Hillyard. "Attending to global versus local stimulus features modulates neural processing of low versus high spatial frequencies: an analysis with event-related potentials." Frontiers in Psychology, v.5, 2014, p.277
Flevaris, A.V., A. Martinez and S.A. Hillyard. "Neural substrates of perceptual integration during bistable object perception." Journal of Vision, v.13, 2013, p.13(13):1
Giuliano, R.J., C.M. Karns, H.J. Neville and S.A. Hillyard. "Early auditory evoked potential is modulated by selective attention and related to individual differences in visual working memory capacity." Journal of Cognitive Neuroscience, v.26, 2014, Epub July
Hillyard, S.A., V.S. Störmer, W. Feng, A. Martinez and J.J. McDonald. "Cross-modal orienting of visual attention." Neuropsychologia, 2015, Epub ahead of print

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

Events in the natural world are perceived through multiple sensory modalities, and multi-modal interactions in the brain are critical for generating coherent perceptual experiences and adaptive behavioral responses. Our NSF-sponsored research has been concerned with understanding how cross-modal interactions take place in the human brain to achieve perceptual integration. The major research theme over the past four years has been to study how the visual and auditory systems of the brain interact, and in particular to show how sounds influence our visual perceptions. Our principal finding has been that a salient sound automatically attracts attention to its location so that the perception of a subsequent visual stimulus that occurs at the same location is enhanced. Thus, when a visual stimulus follows a salient sound at the same location it is perceived more accurately, enters awareness more rapidly, appears brighter, and elicits a larger evoked potential in the visual cortex than does a visual stimulus presented at a different location (e.g., in the opposite visual field).

 

We investigated the neural basis of this cross-modal facilitation of visual perception by recording changes in the electroencephalogram (EEG) elicited by sounds. We discovered that presenting a salient sound to one side elicits a positive potential shift in the contralateral visual cortex, which was termed the “Auditory Contralateral Occipital Positivity” (ACOP). We hypothesized that this ACOP was the neural sign of the perceptual priming of the visual cortex by the sound. In support of this hypothesis, we found that letters that were flashed at the location of an immediately preceding sound were discriminated more accurately than letters flashed in the opposite visual field. Most importantly, the ACOP triggered by the sound was larger when the subsequent letter discrimination was correct as opposed to when it was incorrect. We concluded that the spatially non-predictive sound automatically captured visual attention to the sound’s location, which resulted in a lateralized facilitation of visual processing that was indexed by the ACOP.
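To illustrate the kind of lateralized ERP measure behind the ACOP, the short Python sketch below averages occipital EEG activity contralateral versus ipsilateral to each sound and takes their difference over a post-stimulus window. The array layout, electrode indices, and 250-350 ms window are assumptions made for illustration only, not the montage or parameters used in the published analyses.

import numpy as np

def lateralized_erp(epochs, sound_side, left_occ, right_occ, times, win=(0.25, 0.35)):
    # epochs: (n_trials, n_channels, n_times) baseline-corrected EEG epochs
    # sound_side: per-trial label, "left" or "right", giving the sound's location
    # left_occ / right_occ: channel indices over left and right occipital scalp
    contra, ipsi = [], []
    for trial, side in zip(epochs, sound_side):
        if side == "left":   # left sound -> right hemisphere is contralateral
            contra.append(trial[right_occ].mean(axis=0))
            ipsi.append(trial[left_occ].mean(axis=0))
        else:                # right sound -> left hemisphere is contralateral
            contra.append(trial[left_occ].mean(axis=0))
            ipsi.append(trial[right_occ].mean(axis=0))
    diff = np.mean(contra, axis=0) - np.mean(ipsi, axis=0)  # difference waveform over time
    mask = (times >= win[0]) & (times <= win[1])             # mean amplitude in the window
    return diff, diff[mask].mean()

In this framing, a larger positive windowed mean corresponds to a larger contralateral occipital positivity, which in these experiments served as the index of cross-modal priming of visual cortex.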

 

Over the past year we have extended these studies of visual cortex activity elicited by sounds into the frequency domain, using wavelet analyses. We found that the sounds provoked a robust desynchronization (blocking) of the ongoing alpha rhythm of the EEG, and that the time course of this alpha blocking closely paralleled that of the ACOP slow potential. In support of the close connection between the ACOP and alpha blocking, we reanalyzed the data from the letter discrimination experiment and found that trials with correct discriminations had stronger alpha blocking than trials with incorrect discriminations, thereby closely paralleling the findings with the ACOP. These studies demonstrate that the blocking of the alpha rhythm provoked by a sound represents a fundamental neural mechanism by which cross-modal attention facilitates visual perception.
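To make the time-frequency step concrete, here is a minimal Python sketch of a Morlet-wavelet estimate of alpha-band (about 10 Hz) power and its percent change from a pre-stimulus baseline, the quantity typically reported as event-related desynchronization ("alpha blocking"). The sampling rate, wavelet parameters, and baseline window are illustrative assumptions, not the settings used in the actual analyses.

import numpy as np

def morlet_power(signal, sfreq, freq=10.0, n_cycles=7):
    # Power at `freq` over time from convolution with a complex Morlet wavelet.
    sigma = n_cycles / (2 * np.pi * freq)               # wavelet width in seconds
    t = np.arange(-5 * sigma, 5 * sigma, 1.0 / sfreq)
    wavelet = np.exp(2j * np.pi * freq * t) * np.exp(-t**2 / (2 * sigma**2))
    wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))    # unit-energy normalization
    analytic = np.convolve(signal, wavelet, mode="same")
    return np.abs(analytic) ** 2

def alpha_erd(signal, sfreq, times, baseline=(-0.3, 0.0)):
    # Percent change in alpha power relative to the pre-stimulus baseline window;
    # negative post-stimulus values correspond to desynchronization (alpha blocking).
    power = morlet_power(signal, sfreq)
    base = power[(times >= baseline[0]) & (times < baseline[1])].mean()
    return 100.0 * (power - base) / base

Under these assumptions, stronger alpha blocking on correct trials would appear as a more negative post-stimulus deflection of the returned time course.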

 

 

These studies have advanced our understanding of the brain mechanisms by which information from different sensory modalities is combined in the brain to achieve perceptual coherence in a multi-sensory world. This research also has broader implications for the fields of communication, education, and mental health. Because much of our everyday perceptual experience involves multi-sensory integration, co-ordination between sensory inputs in different modalities helps us localize and re...
