Award Abstract # 2238313
CAREER: HCC: Microgesture and Multimodal Interaction Techniques for Augmented Reality

NSF Org: IIS (Division of Information & Intelligent Systems)
Recipient: COLORADO STATE UNIVERSITY
Initial Amendment Date: February 23, 2023
Latest Amendment Date: August 1, 2024
Award Number: 2238313
Award Instrument: Continuing Grant
Program Manager: Cindy Bethel
cbethel@nsf.gov
(703) 292-4420
IIS (Division of Information & Intelligent Systems)
CSE (Directorate for Computer and Information Science and Engineering)
Start Date: June 15, 2023
End Date: November 30, 2028 (Estimated)
Total Intended Award Amount: $600,039.00
Total Awarded Amount to Date: $294,983.00
Funds Obligated to Date: FY 2023 = $135,473.00
FY 2024 = $159,510.00
History of Investigator:
  • Francisco Ortega (Principal Investigator)
    fortega@colostate.edu
Recipient Sponsored Research Office: Colorado State University
601 S HOWES ST
FORT COLLINS
CO  US  80521-2807
(970)491-6355
Sponsor Congressional District: 02
Primary Place of Performance: Colorado State University
200 W. Lake St.
FORT COLLINS
CO  US  80521-4593
Primary Place of Performance Congressional District: 02
Unique Entity Identifier (UEI): LT9CXX8L19G1
Parent UEI:
NSF Program(s): HCC-Human-Centered Computing
Primary Program Source: 01002324DB NSF RESEARCH & RELATED ACTIVITIES
01002425DB NSF RESEARCH & RELATED ACTIVITIES
01002526DB NSF RESEARCH & RELATED ACTIVITIES
01002627DB NSF RESEARCH & RELATED ACTIVITIES
01002728DB NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s): 1045, 7367, 9251
Program Element Code(s): 736700
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

Augmented reality (AR) glasses are transforming the workplace by allowing users to manipulate virtual objects in a real environment. From offices to manufacturing to service, AR technology offers far-reaching benefits, but improving how users interact with virtual worlds through AR glasses is essential to this transformation. AR systems must be designed to understand the subtleties and complexities of human communication, which come in a broad combination of modes: gestures, speech, text, gaze, and more. Current AR head-mounted systems cannot process multiple communication modes, and they recognize only coarse gestures that are awkward and tiring for users to produce. One approach to reducing fatigue and promoting intuitive interaction is for the system to recognize and process microgestures. In this project, these simple, small finger movements, akin to those needed to control the volume on a car radio, are studied to build foundational knowledge needed to make human-AR interaction systems more intuitive and broadly accessible. Reducing the barriers to interaction with AR technology will allow people to concentrate on their tasks rather than the technology itself. This research impacts fields such as meteorology, finance, healthcare, and scientific visualization, facilitating rich data exploration and improving data understanding and decision making. Implementing this research in massive open online courses (MOOCs) will benefit students learning independently and in educational institutions, making the material accessible to anyone interested in the topic.

This project advances AR multimodal interaction research by studying intuitive microgesture interaction techniques and additional communication modalities. The proposed studies combine microgesture input with speech, a 6 degrees-of-freedom (6DoF) pen, a controller, and midair gestures. Bimanual interaction will also be evaluated. The results of these studies will answer an important question, "What should microgestures and multimodal interaction look like in AR?", leading to a unified framework of multimodal interactions and the creation of a Multimodal Interaction ToolKit (MITK). Immersive analytics (that is, 3D data visualizations) will be used as a domain for performing system validations. Further, the investigator will evaluate and extend the findings from AR to virtual reality (VR). This research advances the state of the art in AR and VR interaction by: (1) researching and developing microgestures for selection, manipulation, and navigation; (2) designing, developing, and evaluating novel unimodal and multimodal interaction techniques, including virtual widgets; and (3) expanding fundamental knowledge of microgestures and their use in joint tasks. The research team will develop a new multimodal interaction textbook and an open online course on multimodal interaction using immersive analytics.
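
To make the notion of multimodal interaction concrete, the short Python sketch below pairs a detected microgesture with a concurrent speech command using a simple temporal co-occurrence rule. It is a minimal, hypothetical illustration: the GestureEvent and SpeechEvent types, the fuse function, and the 0.5-second window are assumptions made here for exposition and are not part of the project's MITK design.

from dataclasses import dataclass

# Hypothetical event types for illustration only; the award does not describe MITK's API.
@dataclass
class GestureEvent:
    name: str          # e.g., "pinch" or "thumb_swipe"
    timestamp: float   # seconds since session start

@dataclass
class SpeechEvent:
    utterance: str     # e.g., "select" or "move that here"
    timestamp: float

def fuse(gesture: GestureEvent, speech: SpeechEvent, window: float = 0.5):
    """Pair a microgesture with a speech command if they occur within `window` seconds.

    Temporal co-occurrence is one common multimodal-fusion heuristic; it is used
    here only to show how two input modes can be combined into a single command.
    """
    if abs(gesture.timestamp - speech.timestamp) <= window:
        return {"action": speech.utterance, "trigger": gesture.name}
    return None

# Example: a pinch at t = 1.02 s paired with "select" spoken at t = 1.20 s
command = fuse(GestureEvent("pinch", 1.02), SpeechEvent("select", 1.20))
print(command)   # -> {'action': 'select', 'trigger': 'pinch'}

In practice, fusion in AR systems usually considers more than timing (for example, gaze target and hand pose), but the co-occurrence rule captures the core idea of interpreting one input mode in the context of another.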

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full-text articles may not yet be available without charge during the embargo (administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.

Plabst, Lucas and Raikwar, Aditya and Oberdörfer, Sebastian and Ortega, Francisco Raul and Niebling, Florian. "Exploring Unimodal Notification Interaction and Display Methods in Augmented Reality," 2023. https://doi.org/10.1145/3611659.3615683
Rodriguez, Richard and Sullivan, Brian T. and Barrera Machuca, Mayra Donaji and Batmaz, Anil Ufuk and Tornatzky, Cyane and Ortega, Francisco R. "An Artists' Perspectives on Natural Interactions for Virtual Reality 3D Sketching," CHI Conference on Human Factors in Computing Systems (CHI '24), 2024. https://doi.org/10.1145/3613904.3642758
Williams, Adam and Zhou, Xiaoyan and Batmaz, Anil Ufuk and Pahud, Michel and Ortega, Francisco. "A Pilot Study Comparing User Interactions Between Augmented and Virtual Reality," in Bebis, G., et al. (eds.), Advances in Visual Computing (ISVC 2023), Lecture Notes in Computer Science, 2023. https://doi.org/10.1007/978-3-031-47966-3_1
Zhou, Xiaoyan and Batmaz, Anil Ufuk and Williams, Adam Sinclair and Schreiber, Dylan and Ortega, Francisco Raul. "I Did Not Notice: A Comparison of Immersive Analytics with Augmented and Virtual Reality," in Extended Abstracts of the CHI Conference on Human Factors in Computing Systems (CHI EA '24), 2024. https://doi.org/10.1145/3613905.3651085

Please report errors in award information by writing to: awardsearch@nsf.gov.
