Award Abstract # 1619273
CHS: Small: Wearable Interfaces to Direct Agent Teams with Adaptive Autonomy

NSF Org: IIS
Division of Information & Intelligent Systems
Recipient: NEW MEXICO STATE UNIVERSITY
Initial Amendment Date: July 8, 2016
Latest Amendment Date: June 14, 2019
Award Number: 1619273
Award Instrument: Standard Grant
Program Manager: Dan Cosley
dcosley@nsf.gov
 (703)292-8832
IIS
 Division of Information & Intelligent Systems
CSE
 Directorate for Computer and Information Science and Engineering
Start Date: August 1, 2016
End Date: July 31, 2020 (Estimated)
Total Intended Award Amount: $495,628.00
Total Awarded Amount to Date: $495,628.00
Funds Obligated to Date: FY 2016 = $495,628.00
History of Investigator:
  • Phoebe Toups Dugas (Principal Investigator)
    phoebe.toupsdugas@monash.edu
  • Marlena Fraune (Co-Principal Investigator)
  • Son Tran (Co-Principal Investigator)
  • Igor Dolgov (Former Co-Principal Investigator)
Recipient Sponsored Research Office: New Mexico State University
1050 STEWART ST.
LAS CRUCES
NM  US  88003
(575)646-1590
Sponsor Congressional District: 02
Primary Place of Performance: New Mexico State University
NM  US  88003-8002
Primary Place of Performance Congressional District: 02
Unique Entity Identifier (UEI): J3M5GZAT8N85
Parent UEI:
NSF Program(s): HCC-Human-Centered Computing
Primary Program Source: 01001617DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s): 7923, 7367, 9150
Program Element Code(s): 736700
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

Unmanned robotic systems are set to revolutionize a number of vital human activities, including disaster response, public safety, citizen science, and agriculture, yet such systems are complex and require multiple pilots. As algorithms take over, and controls are simplified, workers benefit from directing, rather than controlling, these systems. Such simplifications could enable workers to use their hands and focus their perception in the physical world, relying on wearable interfaces (e.g., chording keyboards, gesture inputs) to manage teams of unmanned vehicles. Adaptive autonomy, in which unmanned systems alter their need for human attention in response to complexities in the environment, offers a solution in which workers can use minimal input to enact change. The present research combines wearable interfaces with adaptive autonomy to direct teams of software agents, which simulate unmanned robotic systems. The outcomes will support next-generation unmanned robotic system interfaces.

The objective of this project is to develop wearable interfaces for the direction of a team of software agents that make use of adaptive autonomy and ascertain the effectiveness of interface designs to direct agents. This research develops a testbed for wearable cyber-human system designs that uses software agents as unmanned robotic system simulations and uses adaptive-autonomy algorithms to drive the agents. The research develops a framework connecting wearable interface modalities to the activities they best support. Developed systems will be validated through mixed reality environments in which participants will direct software agents while acting in the physical world. The principal hypothesis is that a set of interconnected interfaces can be developed that, through appropriate control algorithms, maximizes an operator's control span over a team of agents and optimizes the operator's physical workload, mental workload, and situation awareness.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH


(Showing: 1 - 10 of 26)
Khalaf, Ahmed S. and Alharthi, Sultan A. and Alshehri, Ali and Dolgov, Igor and Toups Dugas, Phoebe O. "A Comparative Study of Hand-Gesture Recognition Devices for Games" Lecture Notes in Computer Science, v.12182, 2020. https://doi.org/10.1007/978-3-030-49062-1_4
Khalaf, Ahmed S. and Alharthi, Sultan A. and Hamilton, Bill and Dolgov, Igor and Tran, Son and Toups Dugas, Phoebe O. "A Framework of Input Devices to Support Designing Composite Wearable Computers" Lecture Notes in Computer Science, v.12182, 2020. https://doi.org/10.1007/978-3-030-49062-1_28
Khalaf, Ahmed S. and Alharthi, Sultan A. and Tran, Son and Dolgov, Igor and Toups Dugas, Phoebe O. "A Taxonomy for Selecting Wearable Input Devices for Mixed Reality" Proceedings of the 2019 ACM International Conference on Interactive Surfaces and Spaces, 2019. https://doi.org/10.1145/3343055.3360759
Khalaf, Ahmed S. and Pianpak, Poom and Alharthi, Sultan A. and NaminiMianji, Zahra and Torres, Ruth and Tran, Son and Dolgov, Igor and Toups Dugas, Phoebe O. "An Architecture for Simulating Drones in Mixed Reality Games to Explore Future Search and Rescue Scenarios", v.2169, 2018.
Márquez Segura, Elena and Spiel, Katta and Johansson, Karin and Back, Jon and Toups Dugas, Phoebe O. and Hammer, Jessica and Waern, Annika and Tanenbaum, Theresa Jean and Isbister, Katherine "Larping (Live Action Role Playing) as an Embodied Design Research Method" Companion Publication of the 2019 on Designing Interactive Systems Conference (DIS '19 Companion), 2019. https://doi.org/10.1145/3301019.3320002
Gebser, Martin and Obermeier, Philipp and Otto, Thomas and Schaub, Torsten and Sabuncu, Orkunt and Nguyen, Van and Son, Tran Cao "Experimenting with robotic intra-logistics domains" Theory and Practice of Logic Programming, v.18, 2018, p.502-519. https://doi.org/10.1017/S1471068418000200
Pianpak, Poom and Son, Tran Cao and Toups Dugas, Phoebe O. and Yeoh, William "A distributed solver for multi-agent path finding problems" DAI '19: Proceedings of the First International Conference on Distributed Artificial Intelligence, 2019. https://doi.org/10.1145/3356464.3357702
Nguyen, Thanh Hai and Pontelli, Enrico and Son, Tran Cao "Phylotastic: An Experiment in Creating, Manipulating, and Evolving Phylogenetic Biology Workflows Using Logic Programming" Theory and Practice of Logic Programming, v.18, 2018, p.656-672. https://doi.org/10.1017/S1471068418000236
Le, Tiep and Son, Tran Cao and Pontelli, Enrico "Multi-Context Systems with Preferences" Fundamenta Informaticae, v.158, 2018, p.171-216. https://doi.org/10.3233/FI-2018-1646
Toups Dugas, Phoebe O. and Lalone, Nicolas and Alharthi, Sultan A. and Sharma, Hitesh Nidhi and Webb, Andrew M. "Making Maps Available for Play: Analyzing the Design of Game Cartography Interfaces" ACM Transactions on Computer-Human Interaction, v.26, 2019. https://doi.org/10.1145/3336144

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

This is the outcomes report for project 1619273: CHS: Small: Wearable Interfaces to Direct Agent Teams with Adaptive Autonomy. This work was aimed at supporting future disaster response scenarios in which operatives would benefit from multiple drones providing information. We explored how to direct those drones through wearable technology that did not inhibit movement or awareness. Specifically, we were interested in developing more intelligent multi-agent systems (i.e., software simulations of drones) and better user interfaces for disaster contexts (e.g., wearable computer configurations). The work made use of a simulation environment to enable human participants to don wearable computers, move and act outdoors, and interact with virtual drones.


The grant has produced a set of reusable software and hardware configurations to investigate the value of wearable computers for supporting human-robot teams in the field (https://pixllab.github.io/URSDocumentation/). The system connects multiple pieces of software to enable a mixed reality experience of working with virtual drones. In our scenarios to date, we use a game that is an analog of urban search and rescue. The player moves around in the real world while the system tracks their location. The player seeks out virtual goals. To assist the player, multiple drones can be deployed to find goals the player cannot reach (e.g., on top of buildings).
To provide game logic and a first-person gameworld user interface through an avatar, we use an engine built on Unity. A drone simulation platform, Gazebo, tracks and simulates the virtual drones, including obstacle avoidance. A customized planner, taking inputs described in the Planning Domain Definition Language (PDDL), provides the drones' intelligence and communicates with Gazebo via the Robot Operating System (ROS). Finally, a set of hardware and software interfaces connects with these components to create a user interface that provides information about the gameworld and the ability to interact with the virtual drones. In our first studies, we use NASA WorldWind for a map visualization, displayed on a wrist-worn touchscreen, and provide drone data on a head-mounted display (HMD).
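To give a flavor of the kind of decision the planner makes, the sketch below shows a minimal, hypothetical drone-to-goal assignment in Python. It is not the project's actual PDDL planner or its code; the function name and greedy nearest-goal strategy are illustrative assumptions only.

```python
from math import hypot

# Hypothetical sketch (not the project's planner): greedily send each
# virtual drone to the nearest goal the player cannot reach.

def assign_drones(drones, goals):
    """Pair each drone with its nearest remaining goal.

    drones, goals: dicts mapping an id to an (x, y) position.
    Returns a dict {drone_id: goal_id}.
    """
    remaining = dict(goals)
    assignment = {}
    for drone_id, (dx, dy) in drones.items():
        if not remaining:
            break  # more drones than goals
        # Pick the goal closest to this drone by Euclidean distance.
        goal_id = min(
            remaining,
            key=lambda g: hypot(remaining[g][0] - dx, remaining[g][1] - dy),
        )
        assignment[drone_id] = goal_id
        del remaining[goal_id]  # each goal is claimed once
    return assignment

drones = {"d1": (0.0, 0.0), "d2": (10.0, 10.0)}
goals = {"roof_a": (1.0, 1.0), "roof_b": (9.0, 9.0)}
print(assign_drones(drones, goals))  # → {'d1': 'roof_a', 'd2': 'roof_b'}
```

In the real system this role is filled by the PDDL-driven planner, which also reasons about obstacles and communicates with Gazebo over ROS; the greedy pairing above only illustrates the assignment step.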


The project produced a design framework to guide building composite wearable computers. Constructing the framework involved qualitatively analyzing over 100 sources for information about components that could be used to build wearable computers. For each device identified for study, the framework captures four dimensions essential to device selection: the type of interactivity provided, the associated output modalities, mobility, and body location. The framework helps designers ensure that a resulting composite computer supports the types of interaction necessary for the designed tasks and that the combination of devices will be compatible on the human body. We expect the framework to support researchers and designers in building new wearable computers across a number of domains, enabling the selection of the right devices for various contexts.


The simulated environment for controlling teams of drones provides solutions for issues that arise in hybrid human-drone team coordination and planning. To the best of our knowledge, this is the first simulation environment that allows a single human to simultaneously control a team of robots through wearable devices. We undertook a number of user studies, starting with early prototypes and concluding with a test of multiple wearable configurations identified through the framework. Overall, participants were able to use our mixed reality system as planned. We identified needs for further training, and our studies show ways in which future wearable user interfaces might be improved.

Last Modified: 11/23/2020
Modified by: Zachary O Toups

