Award Abstract # 1734744
NCS-FO: Active Listening and Attention in 3D Natural Scenes

NSF Org: DUE, Division Of Undergraduate Education
Recipient: THE JOHNS HOPKINS UNIVERSITY
Initial Amendment Date: August 7, 2017
Latest Amendment Date: December 20, 2022
Award Number: 1734744
Award Instrument: Standard Grant
Program Manager: Ellen Carpenter
DUE, Division Of Undergraduate Education
EDU, Directorate for STEM Education
Start Date: August 1, 2017
End Date: July 31, 2023 (Estimated)
Total Intended Award Amount: $948,067.00
Total Awarded Amount to Date: $1,155,675.00
Funds Obligated to Date: FY 2017 = $948,067.00
FY 2020 = $189,608.00
FY 2023 = $18,000.00
History of Investigator:
  • Cynthia Moss (Principal Investigator)
    cynthia.moss@jhu.edu
  • Rajat Mittal (Co-Principal Investigator)
  • Mounya Elhilali (Co-Principal Investigator)
  • Susanne Sterbing-D'Angelo (Co-Principal Investigator)
Recipient Sponsored Research Office: Johns Hopkins University
3400 N CHARLES ST
BALTIMORE
MD  US  21218-2608
(443)997-1898
Sponsor Congressional District: 07
Primary Place of Performance: Johns Hopkins University
3400 N. Charles Street
Baltimore
MD  US  21218-2686
Primary Place of Performance Congressional District: 07
Unique Entity Identifier (UEI): FTMTDMBR29C7
Parent UEI: GS4PNKTRNKL3
NSF Program(s): ECR-EDU Core Research, IntgStrat Undst Neurl&Cogn Sys
Primary Program Source: 04002324DB NSF STEM Education
01001718DB NSF RESEARCH & RELATED ACTIVIT
04001718DB NSF Education & Human Resource
04002021DB NSF Education & Human Resource
Program Reference Code(s): 8089, 8091, 8551, 8817, 9251
Program Element Code(s): 798000, 862400
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.076

ABSTRACT

As humans and other animals move around, the distance and direction between their bodies and objects in their environment are constantly changing. Judging the position of objects, and readjusting body movements to steer around the objects, requires a constantly updated map of three-dimensional space in the brain. Generating this map, and keeping it updated during movement, requires dynamic interaction between visual or auditory cues, attention, and behavioral output. An understanding of how spatial perception is generated in the brain comes from decades of research using visual or auditory stimuli under restricted conditions. Far less is known about the dynamics of how natural scenes are represented in freely moving animals. This project will bridge this gap by studying how freely flying bats navigate through their environment using echolocation. Specifically, a team of engineers and neuroscientists will investigate how the bat brain processes information associated with flight navigation. The project team will provide education and training in engineering and science to public school, undergraduate and graduate students, and to postdoctoral researchers. This research will also contribute to a rich library of materials, including videos and a website, which will be available to educators and scientists working in both the private and public sectors.

This project leverages innovative engineering tools, cutting-edge neuroscience methods and neuroethological modeling to pursue a multidisciplinary investigation of dynamic feedback between 3D scene representation, attention and action-selection in freely moving animals engaged in natural tasks. The echolocating bat, the subject of the project's research, actively produces the acoustic signals that it uses to represent natural scenes and therefore provides direct access to the sensory information that guides behavior. The specific goals of the project are to test the hypotheses that 1) natural scene representation operates through the interplay between sensory processing, adaptive motor behaviors, and attentional feedback, 2) spatio-temporal responses to sensory streams across ensembles of neurons sharpen when an animal adapts its behavior to attend to selected targets, and 3) spatio-temporal sharpening of neural responses enables figure-ground segregation in the natural environment. The project integrates 1) novel acoustic measurements and computational analyses to represent the sonar scene based on reconstructions of the bat's sonar transmitter and receiver characteristics, combined with a 3D acoustic model of the environment, 2) quantitative analysis of the echolocating bat's adaptive echolocation and flight behaviors as it negotiates complex environments, 3) multichannel neural telemetry recordings from the midbrain of the free-flying bat as it attends to targets, obstacles and other acoustic signals in its surroundings, and 4) computational modeling of auditory system architecture, attention and working memory mechanisms. Collectively, this research will deepen the understanding of behavioral modulation of natural scene representation.

This project is funded by Integrative Strategies for Understanding Neural and Cognitive Systems (NSF-NCS), a multidisciplinary program jointly supported by the Directorates for Computer and Information Science and Engineering (CISE), Education and Human Resources (EHR), Engineering (ENG), and Social, Behavioral, and Economic Sciences (SBE).

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH


(Showing: 1 - 10 of 35)
Allen, K.M. "Communication with self, friends and foes by active sensing animals" Journal of Experimental Biology, v.224, 2021. https://doi.org/10.1242/jeb.242637
Allen, K.M. "Effect of background clutter on neural discrimination in the bat auditory midbrain" Journal of Neurophysiology, 2021. https://doi.org/10.1152/jn.00109.2021
Allen, K.M. "Orienting our view of the superior colliculus: Specializations and general functions" Current Opinion in Neurobiology, v.71, 2021. https://doi.org/10.1016/j.conb.2021.10.005
Bellur, Ashwin and Elhilali, Mounya "Bio-Mimetic Attentional Feedback in Music Source Separation" ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2020. https://doi.org/10.1109/ICASSP40776.2020.9054552
Diebold, Clarice Anna and Salles, Angeles and Moss, Cynthia F. "Adaptive Echolocation and Flight Behaviors in Bats Can Inspire Technology Innovations for Sonar Tracking and Interception" Sensors, v.20, 2020. https://doi.org/10.3390/s20102958
Huang, N. and Elhilali, M. "Push-pull competition between bottom-up and top-down auditory attention to natural soundscapes" eLife, v.9, 2020. https://doi.org/10.7554/eLife.52984
Jones, Te K. and Moss, Cynthia F. "Visual cues enhance obstacle avoidance in echolocating bats" Journal of Experimental Biology, v.224, 2021. https://doi.org/10.1242/jeb.241968
Kaya, Emine Merve and Huang, Nicolas and Elhilali, Mounya "Pitch, Timbre and Intensity Interdependently Modulate Neural Responses to Salient Sounds" Neuroscience, v.440, 2020. https://doi.org/10.1016/j.neuroscience.2020.05.018
Kothari, N.B. and Wohlgemuth, M.J. and Moss, C.F. "Dynamic representation of 3D auditory space in the midbrain of the free-flying echolocating bat" eLife, v.7, 2018. https://doi.org/10.7554/eLife.29053
Kothari, Ninad B. and Wohlgemuth, Melville J. and Moss, Cynthia F. "Adaptive sonar call timing supports target tracking in echolocating bats" The Journal of Experimental Biology, v.221, 2018. https://doi.org/10.1242/jeb.176537
Kothinti, Sandeep and Skerritt-Davis, Benjamin and Nair, Aditya and Elhilali, Mounya "Synthesizing Engaging Music Using Dynamic Models of Statistical Surprisal" ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2020. https://doi.org/10.1109/ICASSP40776.2020.9054500

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

We completed a successful interdisciplinary research project, Active listening and attention in 3D natural scenes, that advances understanding of cognitive and neural processes in realistic, complex environments. This project leveraged innovative engineering tools, cutting-edge neuroscience methods and a neuroethological model system to pursue a multidisciplinary investigation of dynamic feedback between 3D scene representation, attention, and action-selection in freely moving animals engaged in natural tasks. We conducted research that shed light on 1) the interplay between sensory processing, adaptive motor behaviors, and spatial attention in building natural scene perception and 2) attention-evoked changes in the brain's response to natural sounds.

Intellectual Merit

The success of our project emerged from the team's broad expertise in systems neuroscience, behavioral biology, acoustics and engineering, which we applied to investigate the neural underpinnings of natural scene representation and attention-guided action.

1. Our research yielded scientific advances by probing the active sensing system of echolocating bats, animals that perceive the 3D world by processing echo returns from their own sonar calls. The directional aim and temporal patterning of the bat's echolocation calls provide a quantifiable metric of the animal's attention to objects in the environment.

2.  Echolocating bats engaged in a variety of natural tasks in complex environments.  Our interdisciplinary work revealed the dynamics of neural processing in the context of natural behaviors.

3.  The computational acoustics tools that were developed through this project have direct applications for the analysis of human speech and hearing, bioacoustics, aeroacoustics, architectural acoustics, acoustic device design, environmental noise and even musical acoustics.  

Broader impacts

1.  Interdisciplinary Research Training.  Our project provided an interdisciplinary training platform for undergraduate and graduate students, as well as postdoctoral researchers. We involved a diverse group of graduate students and postdocs in the project, which allowed them to acquire new knowledge and a variety of research tools. The graduate students and postdocs also worked together in teams with undergraduate NSF REU students and Amgen Scholars who participated in research over the summer.  Many of the NSF REU students were inspired by their research experiences to apply to graduate school in biology, psychology, neuroscience and engineering, veterinary school, and medical school.  In addition, three Johns Hopkins students received REU stipends during the academic year to conduct research on one of the following projects: 1) Sonar target tracking in cluttered environments, 2) Target distance representation in the bat hippocampus, and 3) DREADD midbrain inactivation in bats performing a sonar-guided navigation task. These projects harnessed sophisticated behavioral assays, neural recordings and chemogenetic manipulations in bats tracking moving objects and steering around obstacles.  Not only did the students become deeply involved in research and learn new skills, but their discoveries advance knowledge of the mechanisms supporting real-world sensorimotor behaviors and can guide development of assistive devices for the elderly and disabled. 

2. Education and Outreach

a.  This project's research activities contributed to our library of multimedia materials, which we make available to educators and scientists through our Johns Hopkins University Bat Lab website.   

b.  We hosted a Johns Hopkins outreach event in the spring of 2023 that featured a public lecture by Daniel Kish, who has been blind since infancy and uses echolocation to navigate. Daniel Kish provided valuable feedback to graduate students and postdocs who participated in a weekend hackathon to develop new devices and other technologies for visually impaired people. Discoveries emerging from this project have advanced knowledge of the mechanisms supporting real-world sensorimotor behaviors and can guide the development of assistive devices for the blind.

Last Modified: 09/16/2023
Modified by: Cynthia F Moss
