NSF Org: IIS, Division of Information & Intelligent Systems
Recipient:
Initial Amendment Date: July 28, 2021
Latest Amendment Date: September 9, 2024
Award Number: 2112633
Award Instrument: Cooperative Agreement
Program Manager: Todd Leen, tleen@nsf.gov, (703) 292-7215, IIS Division of Information & Intelligent Systems, CSE Directorate for Computer and Information Science and Engineering
Start Date: October 1, 2021
End Date: September 30, 2026 (Estimated)
Total Intended Award Amount: $19,995,808.00
Total Awarded Amount to Date: $16,187,212.00
Funds Obligated to Date: FY 2022 = $4,175,000.00; FY 2023 = $3,590,702.00; FY 2024 = $4,616,064.00
History of Investigator:
Recipient Sponsored Research Office: 926 Dalney St NW, Atlanta, GA, US 30318-6395, (404) 894-4819
Sponsor Congressional District:
Primary Place of Performance: 225 North Avenue, NW, Atlanta, GA, US 30332-0002
Primary Place of Performance Congressional District:
Unique Entity Identifier (UEI):
Parent UEI:
NSF Program(s): GVF - Global Venture Fund; AI Research Institutes; AI Institutes-Amazon Donation; AI Institutes-Google Donations
Primary Program Source:
01002122DB NSF RESEARCH & RELATED ACTIVIT
01002223DB NSF RESEARCH & RELATED ACTIVIT
01002324DB NSF RESEARCH & RELATED ACTIVIT
01002425DB NSF RESEARCH & RELATED ACTIVIT
01002526DB NSF RESEARCH & RELATED ACTIVIT
04002122DB NSF Education & Human Resource
04002223DB NSF Education & Human Resource
04002324DB NSF STEM Education
04002425DB NSF STEM Education
04002526DB NSF STEM Education
4082CYXXDB NSF TRUST FUND
4082PYXXDB NSF TRUST FUND
4082XXXXDB NSF TRUST FUND
Program Reference Code(s):
Program Element Code(s):
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070, 47.075, 47.076, 47.079
ABSTRACT
People collaborate with one another in work, home, and social settings, and these interactions change over time based on the capabilities, roles, responsibilities, norms, and interpersonal relationships of those in the group. Human-AI Interaction (HAI) systems can provide assistance in managing group collaborations by providing timely information about the status, context, and needs of group members, and by interacting on their behalf with other such AI systems. The area of home care for aging adults is a prime example of a complex assistive setting inspiring this research. Older adults, family caregivers, medical professionals, friends and neighbors often collaborate to respond to changing needs. To assist in such settings, HAI systems need to: (a) model the physical, mental, and social capabilities and needs of people by integrating data across many sensory modalities; (b) detect physical, cognitive, social and psychological changes in user capabilities and needs; (c) understand the dynamic relationships and capabilities across the support network; and (d) adapt interactive behaviors in order to assist the user most effectively. This project will develop approaches in human-AI interaction that learn personalized models of human behavior and how they change over time, and use that knowledge to better collaborate, communicate, and assist the user. To drive these innovations, the Institute will serve as a nexus point for collaborative efforts across academia and industry. In addition to advanced research, these collaborations will actively build the next generation of talent for a diverse, well-trained workforce through a wide range of workforce development, education, outreach, broadening participation, and knowledge transfer programs designed to disseminate knowledge about, and enthusiasm for, the development of interactive AI systems.
The AI Institute for Collaborative Assistance and Responsive Interaction for Networked Groups (AI-CARING) will develop a discipline focused on personalized, longitudinal, collaborative AI -- characterized by the design, development, and deployment of interactive, intelligent HAI systems embedded within communities of users over extended periods of time (months and years). Envisioned HAI systems will take the form of virtual assistants embedded in common consumer devices (e.g., cell phones, smart speakers) that will interact with users via speech, gesture, visual, auditory, and mixed reality interfaces. HAI systems will establish personalized longitudinal models of user abilities, goals, values, and interpersonal relationships based on aggregated sensor observations and the history of past interactions. Building on such models, networked teams of agents will provide coordinated assistance through personalized and value-driven interactions that operate in accordance with users' personal and social norms. Researchers in computing, social sciences, and healthcare will collaborate to design, develop, and deploy HAI systems that include sample-efficient techniques for user modeling and personalization, robust methods for longitudinal human-AI teaming, socially-conscious and dignity-preserving AI methodologies, explainable systems, novel guidelines for experimental design, and novel benchmarks and metrics for these areas. Co-design approaches, research demonstrations, and long-term field evaluations will involve households (instrumented with different types of sensors) that include older adults with cognitive and physical impairments, their family, informal caregivers, professional health providers, and community partners.
AI-CARING systems will reinforce daily routines, recognize changes in behavior, provide team support for caregivers, scaffold planning for interactions with professionals, and provide ethical encouragement and feedback regarding an individual's varying abilities. These fundamental capabilities will scaffold responsive and personalized Human-AI Interaction that will transform our day-to-day experiences with AI systems. The long-term impact of this work will go beyond caregiving, extending to any application that includes long-term Human-AI Interaction through speech, gesture, visual and mixed reality interfaces.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.