Award Abstract # 1619273
CHS: Small: Wearable Interfaces to Direct Agent Teams with Adaptive Autonomy

NSF Org: IIS
Division of Information & Intelligent Systems
Recipient: NEW MEXICO STATE UNIVERSITY
Initial Amendment Date: July 8, 2016
Latest Amendment Date: June 14, 2019
Award Number: 1619273
Award Instrument: Standard Grant
Program Manager: Dan Cosley
dcosley@nsf.gov
 (703)292-8832
IIS Division of Information & Intelligent Systems
CSE Directorate for Computer and Information Science and Engineering
Start Date: August 1, 2016
End Date: July 31, 2020 (Estimated)
Total Intended Award Amount: $495,628.00
Total Awarded Amount to Date: $495,628.00
Funds Obligated to Date: FY 2016 = $495,628.00
History of Investigator:
  • Phoebe Toups Dugas (Principal Investigator)
    phoebe.toupsdugas@monash.edu
  • Son Tran (Co-Principal Investigator)
  • Marlena Fraune (Co-Principal Investigator)
  • Igor Dolgov (Former Co-Principal Investigator)
Recipient Sponsored Research Office: New Mexico State University
1050 STEWART ST.
LAS CRUCES
NM  US  88003
(575)646-1590
Sponsor Congressional District: 02
Primary Place of Performance: New Mexico State University
NM  US  88003-8002
Primary Place of Performance Congressional District: 02
Unique Entity Identifier (UEI): J3M5GZAT8N85
Parent UEI:
NSF Program(s): HCC-Human-Centered Computing
Primary Program Source: 01001617DB NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s): 7367, 9150, 7923
Program Element Code(s): 736700
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

Unmanned robotic systems are set to revolutionize a number of vital human activities, including disaster response, public safety, citizen science, and agriculture, yet such systems are complex and require multiple pilots. As algorithms take over and controls are simplified, workers benefit from directing, rather than controlling, these systems. Such simplifications could free workers to use their hands and focus their perception in the physical world, relying on wearable interfaces (e.g., chording keyboards, gesture inputs) to manage teams of unmanned vehicles. Adaptive autonomy, in which unmanned systems alter their need for human attention in response to complexities in the environment, offers a solution in which workers can use minimal input to enact change. The present research combines wearable interfaces with adaptive autonomy to direct teams of software agents that simulate unmanned robotic systems. The outcomes will support next-generation unmanned robotic system interfaces.

The objective of this project is to develop wearable interfaces for directing a team of software agents that make use of adaptive autonomy, and to ascertain how effectively those interface designs direct agents. The research develops a testbed for wearable cyber-human system designs that uses software agents to simulate unmanned robotic systems and adaptive-autonomy algorithms to drive the agents. It also develops a framework connecting wearable interface modalities to the activities they best support. Developed systems will be validated in mixed reality environments in which participants direct software agents while acting in the physical world. The principal hypothesis is that a set of interconnected interfaces can be developed that, through appropriate control algorithms, maximizes an operator's span of control over a team of agents while optimizing the operator's physical workload, mental workload, and situation awareness.
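As a concrete illustration of the adaptive-autonomy idea, the sketch below shows an agent that requests more or less operator attention as its local environment becomes more complex. The complexity measure, thresholds, and level names are illustrative assumptions, not the project's algorithm.

    # Toy sketch of adaptive autonomy: the agent adjusts how much operator
    # attention it requests as local conditions change. All inputs, thresholds,
    # and level names here are hypothetical.
    def autonomy_level(obstacle_density: float, comms_quality: float) -> str:
        """Return the agent's current autonomy level.

        obstacle_density: 0.0 (open space) to 1.0 (heavily cluttered)
        comms_quality:    0.0 (no link) to 1.0 (perfect link)
        """
        complexity = obstacle_density * (1.0 - comms_quality)
        if complexity < 0.2:
            return "full-auto"    # agent acts alone; no operator input needed
        if complexity < 0.6:
            return "supervised"   # agent proposes actions; operator confirms
        return "manual"           # agent halts and requests direction

    # Example: a cluttered area with a degraded link pushes the agent
    # toward requesting operator attention.
    print(autonomy_level(obstacle_density=0.9, comms_quality=0.4))  # "supervised"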

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH


(Showing: 1 - 10 of 26)
Alharthi, Sultan A. and Hamilton, William A. and Dolgov, Igor and Toups Dugas, Phoebe O. "Mapping in the Wild: Toward Designing to Train Search & Rescue Planning", 2018. https://doi.org/10.1145/3272973.3274039
Alharthi, Sultan A. and LaLone, Nick and Khalaf, Ahmed S. and Torres, Ruth and Nacke, Lennart and Dolgov, Igor and Toups Dugas, Phoebe O. "Practical Insights into the Design of Future Disaster Response Training Simulations", 2018.
Alharthi, Sultan A. and Raptis, George E. and Katsini, Christina and Dolgov, Igor and Nacke, Lennart E. and Toups Dugas, Phoebe O. "Toward Understanding the Effects of Cognitive Styles on Collaboration in Multiplayer Games", 2018. https://doi.org/10.1145/3272973.3274047
Alharthi, Sultan A. and Raptis, George E. and Katsini, Christina and Dolgov, Igor and Nacke, Lennart E. and Toups Dugas, Phoebe O. "Investigating the Effects of Individual Cognitive Styles on Collaborative Gameplay" ACM Transactions on Computer-Human Interaction, v.28, 2021. https://doi.org/10.1145/3445792
Alharthi, Sultan A. and Sharma, Hitesh Nidhi and Sunka, Sachin and Dolgov, Igor and Toups Dugas, Phoebe O. "Designing Future Disaster Response Team Wearables from a Grounding in Practice", 2018. https://doi.org/10.1145/3183654.3183662
Dolgov, Igor and Sabic, Edin and White, Bryan L. "Activity Theory as a Framework for Integrating UAS Into the NAS: A Field Study of Crew Member Activity During UAS Operations Near a Non-Towered Airport" Proceedings of the Human Factors and Ergonomics Society Annual Meeting, v.62, 2018, p.39-43. https://doi.org/10.1177/1541931218621009
Fraune, Marlena R. and Khalaf, Ahmed S. and Zemedie, Mahlet and Pianpak, Poom and NaminiMianji, Zahra and Alharthi, Sultan A. and Dolgov, Igor and Hamilton, Bill and Tran, Son and Toups Dugas, Phoebe O. "Developing Future Wearable Interfaces for Human-Drone Teams through a Virtual Drone Search Game" International Journal of Human-Computer Studies, v.147, 2021. https://doi.org/10.1016/j.ijhcs.2020.102573
Gebser, Martin and Obermeier, Philipp and Otto, Thomas and Schaub, Torsten and Sabuncu, Orkunt and Nguyen, Van and Son, Tran Cao "Experimenting with robotic intra-logistics domains" Theory and Practice of Logic Programming, v.18, 2018, p.502-519. https://doi.org/10.1017/S1471068418000200

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

This is the outcomes report for project 1619273: CHS: Small: Wearable Interfaces to Direct Agent Teams with Adaptive Autonomy. This work aimed to support future disaster response scenarios in which operatives would benefit from multiple drones providing information. We explored how to direct those drones through wearable technology that does not inhibit movement or awareness. Specifically, we were interested in developing more intelligent multi-agent systems (i.e., software simulations of drones) and better user interfaces for disaster contexts (e.g., wearable computer configurations). The work made use of a simulation environment that enabled human participants to don wearable computers, move and act outdoors, and interact with virtual drones.


The grant has produced a set of reusable software and hardware configurations for investigating the value of wearable computers in supporting human-robot teams in the field (https://pixllab.github.io/URSDocumentation/). The system connects multiple pieces of software to enable a mixed reality experience of working with virtual drones. In our scenarios to date, we use a game that is an analog of urban search and rescue. The player moves around in the real world while the system tracks their location. The player seeks out virtual goals. To assist the player, multiple drones can be deployed to find goals the player cannot reach (e.g., on top of buildings).
To provide game logic and a first-person gameworld user interface through an avatar, we use an engine built on Unity. A drone simulation platform, Gazebo, tracks and simulates the virtual drones, including obstacle avoidance. A customized planner, taking inputs described in the Planning Domain Definition Language (PDDL), provides the drones' intelligence and communicates with Gazebo through the Robot Operating System (ROS). Finally, a set of hardware and software interfaces connects with these components to create a user interface that provides information about the gameworld and the ability to interact with the virtual drones. In our first studies, we use NASA WorldWind for a map visualization, displayed on a wrist-worn touchscreen, and provide drone data on a head-mounted display (HMD).
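To make the planner-to-simulator hand-off concrete, here is a minimal sketch of the kind of glue code such a pipeline needs: a symbolic plan step is translated into a ROS pose goal that a simulated drone controller can consume. The topic name, waypoint table, and action format are illustrative assumptions, not the project's actual interfaces.

    # Hypothetical glue between a symbolic planner and ROS (not the project's
    # actual code): each PDDL-style action is turned into a pose goal message.
    import rospy
    from geometry_msgs.msg import PoseStamped

    # Assumed mapping from symbolic plan locations to gameworld coordinates.
    WAYPOINTS = {
        "rooftop-a": (12.0, 4.5, 30.0),
        "courtyard": (-3.0, 18.0, 15.0),
    }

    def dispatch_action(pub, action):
        """Translate one plan step, e.g. ('fly', 'drone1', 'courtyard'),
        into a pose goal published for the drone's controller."""
        _verb, _drone, target = action
        x, y, z = WAYPOINTS[target]
        goal = PoseStamped()
        goal.header.stamp = rospy.Time.now()
        goal.header.frame_id = "map"
        goal.pose.position.x = x
        goal.pose.position.y = y
        goal.pose.position.z = z
        pub.publish(goal)

    if __name__ == "__main__":
        rospy.init_node("plan_dispatcher")
        # Topic name is illustrative only.
        pub = rospy.Publisher("/drone1/goal_pose", PoseStamped, queue_size=1)
        rospy.sleep(1.0)  # allow the publisher to connect before sending
        dispatch_action(pub, ("fly", "drone1", "courtyard"))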


The project produced a design framework to guide the construction of composite wearable computers. Constructing the framework involved qualitatively analyzing over 100 sources for information about components that could be used to build wearable computers. The framework identifies four dimensions essential to choosing devices: the type of interactivity provided, the associated output modalities, mobility, and body location. It helps designers ensure that a resulting composite computer affords the types of interaction the designed tasks require and that the combination of devices is compatible on the human body. We expect the framework to support researchers and designers in building new wearable computers in a number of domains, enabling the selection of the right devices for various contexts.
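One way to picture the framework's four dimensions is as a record attached to each candidate device, plus a simple compatibility check over a candidate combination. The field names, example values, and the body-location rule below are our own illustrative simplification, not the framework's published vocabulary.

    # Hypothetical encoding of the framework's four dimensions; names and
    # values are illustrative, not taken verbatim from the framework.
    from dataclasses import dataclass
    from typing import List, Set

    @dataclass
    class WearableComponent:
        name: str
        interactivity: Set[str]       # e.g., {"touch", "chording", "gesture"}
        output_modalities: Set[str]   # e.g., {"visual", "haptic", "audio"}
        mobility: str                 # e.g., "fully mobile", "tethered"
        body_location: str            # e.g., "wrist", "head", "torso"

    def compatible(components: List[WearableComponent]) -> bool:
        """A composite is wearable (in this toy rule) only if no two
        components compete for the same body location."""
        locations = [c.body_location for c in components]
        return len(locations) == len(set(locations))

    touchscreen = WearableComponent(
        "wrist-worn touchscreen", {"touch"}, {"visual"}, "fully mobile", "wrist")
    hmd = WearableComponent(
        "head-mounted display", set(), {"visual"}, "fully mobile", "head")
    assert compatible([touchscreen, hmd])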


The simulated environment for controlling teams of drones provides solutions to issues that arise in hybrid human-drone team coordination and planning. To the best of our knowledge, this is the first simulation environment that allows a single human to control a team of robots simultaneously through wearable devices. We undertook a number of user studies, starting with early prototypes and concluding with a test of multiple wearable configurations identified through the framework. Overall, participants were able to use our mixed reality system as planned. We identified needs for further training, and our studies show ways in which future wearable user interfaces might be improved.

Last Modified: 11/23/2020
Modified by: Zachary O Toups

