
NSF Org: IIS Division of Information & Intelligent Systems
Recipient:
Initial Amendment Date: February 23, 2023
Latest Amendment Date: August 1, 2024
Award Number: 2238313
Award Instrument: Continuing Grant
Program Manager: Cindy Bethel, cbethel@nsf.gov, (703) 292-4420, IIS Division of Information & Intelligent Systems, CSE Directorate for Computer and Information Science and Engineering
Start Date: June 15, 2023
End Date: November 30, 2028 (Estimated)
Total Intended Award Amount: $600,039.00
Total Awarded Amount to Date: $294,983.00
Funds Obligated to Date: FY 2024 = $159,510.00
History of Investigator:
Recipient Sponsored Research Office: 601 S Howes St, Fort Collins, CO, US 80521-2807, (970) 491-6355
Sponsor Congressional District:
Primary Place of Performance: 200 W. Lake St., Fort Collins, CO, US 80521-4593
Primary Place of Performance Congressional District:
Unique Entity Identifier (UEI):
Parent UEI:
NSF Program(s): HCC-Human-Centered Computing
Primary Program Source: 01002425DB NSF RESEARCH & RELATED ACTIVIT; 01002526DB NSF RESEARCH & RELATED ACTIVIT; 01002627DB NSF RESEARCH & RELATED ACTIVIT; 01002728DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s):
Program Element Code(s):
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070
ABSTRACT
Augmented reality (AR) glasses are transforming the workplace by allowing users to manipulate virtual objects in a real environment. From offices to manufacturing to service, AR technology offers far-reaching benefits, but improving how users interact with virtual worlds through AR glasses is essential to this transformation. AR systems must be designed to understand the subtleties and complexities of human communication, which come in a broad combination of modes: gestures, speech, text, gaze, and more. Current AR head-mounted systems cannot process multiple communication modes, and they recognize only coarse gestures that are awkward and tiring for users to produce. One approach to reducing fatigue and promoting intuitive interaction is for the system to recognize and process microgestures. In this project, these simple, small finger movements, akin to those needed to control the volume on a car radio, are studied to build the foundational knowledge needed to make human-AR interaction systems more intuitive and broadly accessible. Reducing the barriers to interaction with AR technology will allow people to concentrate on their tasks rather than the technology itself. This research impacts fields such as meteorology, finance, healthcare, and scientific visualization, facilitating rich data exploration and improving data understanding and decision making. Implementing this research in massive open online courses (MOOCs) will benefit students learning independently and in educational institutions, allowing access to anyone interested in this topic.
This project advances AR multimodal interaction research by studying intuitive microgesture interaction techniques and additional communication modalities. The proposed research plan for these studies involves microgesture inputs combined with speech, a six-degrees-of-freedom (6DoF) pen, a controller, and midair gestures. Bimanual interaction will also be evaluated. The results of these studies will answer an important question, "What should microgestures and multimodal interaction look like in AR?", leading to a unified framework of multimodal interactions and the creation of a Multimodal Interaction ToolKit (MITK). Immersive analytics (that is, 3D data visualizations) will be used as a domain for performing system validations. Further, the investigator will evaluate and extend the findings from AR to virtual reality (VR). This research advances the state of the art in AR and VR interaction by: (1) researching and developing microgestures for selection, manipulation, and navigation; (2) designing, developing, and evaluating novel unimodal and multimodal interaction techniques, including virtual widgets; and (3) expanding fundamental knowledge of microgestures and their use in joint tasks. The research team will develop a new multimodal interaction textbook and an open online course on multimodal interaction using immersive analytics.
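To make the idea of combining a microgesture with another modality more concrete, the following is a minimal, hypothetical sketch of multimodal fusion: a small finger movement supplies a continuous parameter while a spoken command supplies the action and target, and the two are paired only if they occur close together in time. All names here (MicrogestureEvent, SpeechEvent, fuse_events) are illustrative assumptions and do not represent the MITK design or any actual system from this project.

```python
# Hypothetical sketch of pairing a microgesture with a speech command.
# None of these types or functions come from the award; they only
# illustrate the general notion of time-windowed multimodal fusion.
from dataclasses import dataclass
from typing import Optional


@dataclass
class MicrogestureEvent:
    kind: str         # e.g. "thumb_dial", "pinch_slide"
    value: float      # normalized magnitude, e.g. amount of dial rotation
    timestamp: float  # seconds


@dataclass
class SpeechEvent:
    command: str      # e.g. "select", "rotate", "filter"
    target: str       # e.g. "scatter_plot_1"
    timestamp: float  # seconds


def fuse_events(gesture: MicrogestureEvent,
                speech: SpeechEvent,
                window_s: float = 0.5) -> Optional[dict]:
    """Pair a microgesture with a spoken command if they co-occur."""
    if abs(gesture.timestamp - speech.timestamp) > window_s:
        return None  # too far apart in time to treat as one intent
    return {
        "action": speech.command,    # speech supplies the verb and target
        "target": speech.target,
        "parameter": gesture.value,  # microgesture supplies the magnitude
        "gesture": gesture.kind,
    }


# Example: saying "rotate the scatter plot" while making a small thumb-dial gesture.
g = MicrogestureEvent("thumb_dial", 0.25, timestamp=10.02)
s = SpeechEvent("rotate", "scatter_plot_1", timestamp=10.11)
print(fuse_events(g, s))
```

The design choice sketched here, using speech for discrete commands and microgestures for continuous adjustment, is one plausible division of labor; the project's studies are what would determine how such modalities should actually be combined.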
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH
Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full-text articles may not yet be available without a charge during the embargo (administrative interval). Some links on this page may take you to non-federal websites, whose policies may differ from this site.