Award Abstract # 2223793
EFRI BRAID: Unsupervised Continual Learning with Hierarchical Timescales and Plasticity Mechanisms

NSF Org: EFMA, Office of Emerging Frontiers in Research and Innovation (EFRI)
Recipient: WEST VIRGINIA UNIVERSITY RESEARCH CORPORATION
Initial Amendment Date: September 16, 2022
Latest Amendment Date: September 16, 2022
Award Number: 2223793
Award Instrument: Standard Grant
Program Manager: Steve Zehnder
szehnder@nsf.gov
(703) 292-7014
EFMA, Office of Emerging Frontiers in Research and Innovation (EFRI)
ENG, Directorate for Engineering
Start Date: October 1, 2022
End Date: September 30, 2026 (Estimated)
Total Intended Award Amount: $2,000,000.00
Total Awarded Amount to Date: $2,000,000.00
Funds Obligated to Date: FY 2022 = $2,000,000.00
History of Investigator:
  • Gianfranco Doretto (Principal Investigator)
    gianfranco.doretto@mail.wvu.edu
  • Donald Adjeroh (Co-Principal Investigator)
  • Gary Marsat (Co-Principal Investigator)
  • Ngan Le (Co-Principal Investigator)
  • Nicholas Szczecinski (Co-Principal Investigator)
Recipient Sponsored Research Office: West Virginia University Research Corporation
886 CHESTNUT RIDGE ROAD
MORGANTOWN
WV  US  26505-2742
(304)293-3998
Sponsor Congressional District: 02
Primary Place of Performance: West Virginia University
1374 Evansdale Drive
Morgantown
WV  US  26506-6070
Primary Place of Performance Congressional District: 02
Unique Entity Identifier (UEI): M7PNRH24BBM8
Parent UEI:
NSF Program(s): EPSCoR Co-Funding, EFRI Research Projects
Primary Program Source: 01002223DB NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s): 8091, 9150
Program Element Code(s): 915000, 763300
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.041, 47.083

ABSTRACT

Humans and animals can easily adapt to their environment with limited information. They sense the world around them and continuously adapt their behavior to the current situation by changing the "configuration" of their nervous system, a phenomenon called plasticity. Though this ability seems natural to humans, it is very difficult to achieve in software or hardware systems. In addition, current continual learning methods are trained under unrealistic conditions and require supervision. This project aims to understand how to endow autonomous agents, such as robots, with the adaptability and resiliency of biology. Biological plasticity in weakly electric fish will guide the engineering of new machine learning algorithms. These algorithms will enable autonomous agents to continuously sense and adapt to their environment without interrupting operations for manual training. This interdisciplinary project is integrated with a range of outreach activities involving local high schools and undergraduate students. Workshops and demonstrations on biology-inspired machine learning will be organized, aimed at spurring the interest of rural students in coding and robotics.

A grand challenge in artificial intelligence (AI) is how to achieve unsupervised continual learning in the open world. Current methods in AI and machine learning operate on single-modality data, collected and consumed under controlled conditions, typically in a supervised manner. Biological systems, by contrast, achieve lifelong learning by processing streams of multisensory data that continuously shape their neural networks (plasticity) while retaining previous knowledge (stability). This dynamic adaptation operates unsupervised, over a range of timescales and rules. The project will study these principles as observed in the cerebellar feedback pathways of the electric fish, which drive plasticity, allowing the circuit to adapt its function at different timescales and to learn and forget at multiple speeds. This will enable the translational development of novel continual learning paradigms that support new levels of resiliency and lifelong learning in real-time autonomous systems operating in the open world. To achieve this goal, the project will overcome several key technical hurdles: 1) data efficiency in processing inputs continuously as time-variant, potentially correlated data streams in a fully unsupervised manner; 2) flexibility to learn and forget at different speeds; and 3) generation of suitable internal representations from multiple modalities to improve autonomous resilience. A toy sketch of the multi-timescale idea appears below.
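
To make the multi-timescale notion concrete, the following minimal Python/NumPy sketch maintains a "fast" weight matrix that adapts and decays quickly alongside a "slow" matrix that consolidates and forgets gradually, both driven by the same unsupervised Hebbian update. It is an illustrative assumption, not the project's actual algorithm; all names and constants (W_fast, W_slow, ETA_FAST, DECAY_SLOW, etc.) are hypothetical.

    # Toy sketch (illustrative only): unsupervised plasticity on two timescales.
    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_out = 16, 4

    W_fast = np.zeros((n_out, n_in))    # rapid adaptation, rapid forgetting
    W_slow = np.zeros((n_out, n_in))    # gradual consolidation, slow forgetting

    ETA_FAST, DECAY_FAST = 0.5, 0.10    # fast-timescale learning rate and decay
    ETA_SLOW, DECAY_SLOW = 0.01, 0.001  # slow-timescale learning rate and decay

    def step(x):
        """One unsupervised update: a Hebbian outer-product rule applied
        to both matrices, each with its own learning rate and weight
        decay (decay plays the role of forgetting)."""
        y = np.tanh((W_fast + W_slow) @ x)   # combined response to the input
        hebb = np.outer(y, x)                # Hebbian coactivation term
        W_fast[:] = (1 - DECAY_FAST) * W_fast + ETA_FAST * hebb
        W_slow[:] = (1 - DECAY_SLOW) * W_slow + ETA_SLOW * hebb
        return y

    # Stream of correlated, unlabeled inputs whose "context" switches over
    # time: fast weights track the current context, slow weights retain
    # structure that recurs across contexts.
    for t in range(1000):
        context = (t // 250) % 2
        x = rng.normal(size=n_in) + (2.0 if context else -2.0)
        step(x)

    print("fast-weight norm:", np.linalg.norm(W_fast))
    print("slow-weight norm:", np.linalg.norm(W_slow))

Because the fast weights decay quickly, they adapt to the current input statistics and "forget" them once the context changes, while the slow weights accumulate what is stable across contexts. Pairing a plastic fast component with a stable slow one is a common motif for balancing plasticity against stability in continual learning.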

This project is jointly funded by the Emerging Frontiers in Research and Innovation Brain-Inspired Dynamics for Engineering Energy-Efficient Circuits and Artificial Intelligence Program (BRAID) and the Established Program to Stimulate Competitive Research (EPSCoR).

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH


(Showing: 1 - 10 of 18)
Duan, Tiehang and Wang, Zhenyi and Shen, Li and Doretto, Gianfranco and Adjeroh, Donald A. and Li, Fang and Tao, Cui "Retain and Adapt: Online Sequential EEG Classification With Subject Shift" IEEE Transactions on Artificial Intelligence, v.5, 2024. https://doi.org/10.1109/TAI.2024.3385390
Zaveri, Ram J. and Brume, Voke and Doretto, Gianfranco "Few-Shot Adaptation for Morphology-Independent Cell Instance Segmentation" 2024. https://doi.org/10.1109/ISBI56570.2024.10635320
Yamazaki, Kashu and Hanyu, Taisei and Vo, Khoa and Pham, Thang and Tran, Minh and Doretto, Gianfranco and Nguyen, Anh and Le, Ngan "Open-Fusion: Real-time Open-Vocabulary 3D Mapping and Queryable Scene Representation" 2024. https://doi.org/10.1109/ICRA57147.2024.10610193
Yamazaki, K. and Vo, K. and Truong, Q. S. and Raj, B. and Le, N. "VLTinT: Visual-Linguistic Transformer-in-Transformer for Coherent Video Paragraph Captioning" v.37, 2023.
Vo, Khoa and Pham, Trong-Thang and Yamazaki, Kashu and Tran, Minh and Le, Ngan "DNA: Deformable Neural Articulations Network for Template-free Dynamic 3D Human Reconstruction from Monocular RGB-D Video" 2023. https://doi.org/10.1109/CVPRW59228.2023.00375
Szczecinski, Nicholas S. and Goldsmith, C. A. and Nourse, William R. P. and Quinn, Roger D. "A perspective on the neuromorphic control of legged locomotion in past, present, and future insect-like robots" Neuromorphic Computing and Engineering, v.3, 2023. https://doi.org/10.1088/2634-4386/acc04f
Phan, Thinh and Vo, Khoa and Le, Duy and Doretto, Gianfranco and Adjeroh, Donald and Le, Ngan "ZEETAD: Adapting Pretrained Vision-Language Model for Zero-Shot End-to-End Temporal Action Detection" 2024. https://doi.org/10.1109/WACV57701.2024.00689
Pham, Trong Thang and Brecheisen, Jacob and Nguyen, Anh and Nguyen, Hien and Le, Ngan "I-AI: A Controllable & Interpretable AI System for Decoding Radiologists' Intense Focus for Accurate CXR Diagnoses" 2024. https://doi.org/10.1109/WACV57701.2024.00767
Nguyen, Toan and Vu, Minh Nhat and Vuong, An and Nguyen, Dzung and Vo, Thieu and Le, Ngan and Nguyen, Anh "Open-Vocabulary Affordance Detection in 3D Point Clouds" 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2023. https://doi.org/10.1109/iros55552.2023.10341553
Mohamadi, Salman and Doretto, Gianfranco and Adjeroh, Donald A. "More Synergy, Less Redundancy: Exploiting Joint Mutual Information for Self-Supervised Learning" 2023 IEEE International Conference on Image Processing (ICIP), 2023. https://doi.org/10.1109/ICIP49359.2023.10222547
Joo, Hyekang Kevin and Vo, Khoa and Yamazaki, Kashu and Le, Ngan "CLIP-TSA: Clip-Assisted Temporal Self-Attention for Weakly-Supervised Video Anomaly Detection" 2023. https://doi.org/10.1109/ICIP49359.2023.10222289

