Award Abstract # 1730033
II-New: Flexible User Interaction Instrumentation for Ubiquitous and Immersive Computing Environments

NSF Org: CNS
Division Of Computer and Network Systems
Recipient: UNIVERSITY OF MARYLAND BALTIMORE COUNTY
Initial Amendment Date: June 27, 2017
Latest Amendment Date: November 30, 2018
Award Number: 1730033
Award Instrument: Standard Grant
Program Manager: Wendy Nilsen
wnilsen@nsf.gov
 (703)292-2568
CNS
 Division Of Computer and Network Systems
CSE
 Directorate for Computer and Information Science and Engineering
Start Date: July 1, 2017
End Date: June 30, 2020 (Estimated)
Total Intended Award Amount: $356,657.00
Total Awarded Amount to Date: $356,657.00
Funds Obligated to Date: FY 2017 = $356,657.00
History of Investigator:
  • Andrea Kleinsmith (Principal Investigator)
    andreak@umbc.edu
  • Anita Komlodi (Co-Principal Investigator)
  • Ravi Kuber (Co-Principal Investigator)
  • Helena Mentis (Co-Principal Investigator)
  • Wayne Lutters (Former Principal Investigator)
  • Andrea Kleinsmith (Former Co-Principal Investigator)
Recipient Sponsored Research Office: University of Maryland Baltimore County
1000 HILLTOP CIR
BALTIMORE
MD  US  21250-0001
(410)455-3140
Sponsor Congressional District: 07
Primary Place of Performance: University of Maryland Baltimore County
1000 Hilltop Circle
Baltimore
MD  US  21250-0002
Primary Place of Performance Congressional District: 07
Unique Entity Identifier (UEI): RNKYWXURFRL5
Parent UEI:
NSF Program(s): CCRI-CISE Cmnty Rsrch Infrstrc
Primary Program Source: 01001718DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s): 7359
Program Element Code(s): 735900
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

Smart phones, networked devices, and commercially available augmented and virtual reality kits are making ubiquitous and immersive interactive systems increasingly commonplace. Designing these systems, and the applications that use them, requires the ability to study how system features and human behavior affect each other in natural contexts. This award will provide an interdisciplinary research team with equipment for capturing both individual and small group behavioral data in situ, including body motion, eye gaze, mental workload, and emotion. The infrastructure will support projects at the team's institution in a number of domains, including autonomous vehicle use by visually impaired users, augmented reality training for emergency medical responders, and collaborative scientific discovery in virtual visualization environments. A doctoral student with related research interests will coordinate management of and training on the infrastructure, developing both technical and research skills. The team will also use the equipment to provide enhanced research opportunities for undergraduates from a number of disciplines, institutions, and backgrounds.

Much of the planned infrastructure uses next-generation versions of tools with which the team already has expertise. This reduces deployment risks and allows the team to augment existing projects with new capabilities while enabling new directions for research. In most cases, this is a transformation from fixed lab-based sensing to unconstrained, mobile data collection in the field. The new capabilities include five main data sources. One is body motion data, which will be collected using an industry-standard infrared motion capture system that can flexibly capture the movements of individuals or dyads. A second is location and gait capture, for both individuals and groups, through an unobtrusive, configurable system of floor-mounted force plates. A third is eye gaze data, collected through a portable headset that captures eye fixations and pupil dilation to support monitoring visual attention. A fourth is electroencephalogram (EEG) data, collected through a portable dry-sensor headset, which can monitor workload, affect, and facial features as well as support prototyping of brain-computer interfaces (BCIs). A fifth is physiological data, including temperature, pulse, arm motion, and arousal, collected through an inconspicuous wristband that includes photoplethysmography, accelerometer, and electrodermal sensors. Each individual data stream comes with accompanying analytic software; collectively, the data will be managed through a commercially available tool for analyzing events synchronized across multiple parallel data streams. Key innovations of this award are its novel data integration strategies and cross-disciplinary application areas.
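As an illustration of the kind of multi-stream synchronization described above, the following minimal sketch (not part of the awarded infrastructure; the file and column names are hypothetical) aligns gaze, EEG-derived workload, and wristband recordings captured at different sampling rates onto a common timeline using pandas:

# Minimal sketch: align parallel sensor streams recorded at different rates.
# File and column names below are hypothetical placeholders.
import pandas as pd

def load_stream(path, value_cols):
    """Load one sensor stream with a numeric 'timestamp' column (Unix seconds)."""
    df = pd.read_csv(path, usecols=["timestamp"] + value_cols)
    df["timestamp"] = pd.to_datetime(df["timestamp"], unit="s")
    return df.sort_values("timestamp")

gaze = load_stream("gaze.csv", ["fixation_x", "fixation_y", "pupil_diameter"])
eeg = load_stream("eeg.csv", ["workload_index"])
eda = load_stream("wristband.csv", ["eda_microsiemens"])

# Align the lower-rate streams to the gaze timeline, taking the nearest sample
# within a 100 ms tolerance so gaps remain visible as NaNs rather than being filled.
merged = pd.merge_asof(gaze, eeg, on="timestamp",
                       direction="nearest", tolerance=pd.Timedelta("100ms"))
merged = pd.merge_asof(merged, eda, on="timestamp",
                       direction="nearest", tolerance=pd.Timedelta("100ms"))

merged.to_csv("synchronized_streams.csv", index=False)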

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH


Feng, Yuanyuan; McGowan, Hannah; Semsar, Azin; Zahiri, Hamid R.; George, Ivan M.; Park, Adrian; Kleinsmith, Andrea; Mentis, Helena. "Virtual pointer for gaze guidance in laparoscopic surgery." Surgical Endoscopy, v.34, 2020. doi:10.1007/s00464-019-07141-x
Feng, Yuanyuan; McGowan, Hannah; Semsar, Azin; Zahiri, Hamid R.; George, Ivan M.; Turner, Timothy; Park, Adrian; Kleinsmith, Andrea; Mentis, Helena M. "A virtual pointer to support the adoption of professional vision in laparoscopic training." International Journal of Computer Assisted Radiology and Surgery, v.13, 2018. doi:10.1007/s11548-018-1792-9
Feng, Yuanyuan; Mentis, Helena M.; Li, Katie; Semsar, Azin; McGowan, Hannah; Mun, Jacqueline; Zahiri, H. Reza; George, Ivan; Park, Adrian; Kleinsmith, Andrea. "Communication Cost of Single-user Gesturing Tool in Laparoscopic Surgical Training." Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 2019. doi:10.1145/3290605.3300841
Lee, H. "Distinguishing Anxiety Subtypes of English Language Learners Towards Augmented Emotional Clarity." International Conference on Artificial Intelligence in Education, 2020. doi:10.1007/978-3-030-52240-7_29
Semsar, Azin; McGowan, Hannah; Feng, Yuanyuan; Zahiri, H. Reza; Park, Adrian; Kleinsmith, Andrea; Mentis, Helena. "How Trainees Use the Information from Telepointers in Remote Instruction." Proceedings of the ACM on Human-Computer Interaction, v.3, 2019. doi:10.1145/3359195

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

The technological advances of the last several years have opened the door to exciting new research on the ways in which interactive systems are used by, and are useful for, people. This changing technological landscape demands new research into how individuals interact with these technologies in the wild, given that the most powerful and elegant computing solution is of little worth if it cannot be meaningfully integrated into one's daily life. Given the rapid rise of innovative, richly immersive, ubiquitous computing, a new focus is viewing the person and their myriad digital devices as a coupled complex system: a cyber-human system.

The aim of this project was to acquire a suite of new, flexible, and dynamic instrumentation to engage the big questions facing human-centered computing within new interactive spaces. The project therefore focused on instrumentation that captures users' interactions with technology, enabling researchers to pursue new lines of research on ubiquitous and immersive interactive environments while blending laboratory work, for requisite precision, with field work, for enhanced ecological validity.

The new infrastructure is maintained by the Interactive Systems Research Center (ISRC) at the University of Maryland, Baltimore County (UMBC). Expanding our existing research infrastructure allows researchers to pursue new research and collaboration opportunities within and outside UMBC. The newly acquired instrumentation has already benefitted existing projects and collaborations. Multidisciplinary efforts are underway exploring the effectiveness of augmented and virtual reality training in diverse situations, including surgical training and mentoring, one-to-many emergency medical provider training to improve access to healthcare, and graduate student preparation for community-engaged service. Other projects focus on psychophysiological understanding of the stress experienced by paramedic trainees during in-situ simulations, and of the social and speaking anxiety experienced by students learning English as a foreign language.

In addition to these research opportunities, the instrumentation provides hands-on educational opportunities for students at all levels through integration into graduate-level courses in UMBC's Human-Centered Computing MS and PhD programs. The Tobii Pro X3-120 eye trackers are employed in the User Interface Design core course to teach students how to collect and analyze eye tracking data for evaluating user interfaces. The new technology is also incorporated into the HCC graduate electives Computer Supported Cooperative Work and Affective Human-Computer Interaction. In Affective HCI, students receive hands-on training and tutorials on acquiring, cleaning, and analyzing data with the Empatica E4 wristbands. Armed with this experience, students are introduced to other new instrumentation, such as the Vicon motion capture system, the HTC Vive Pro Eye virtual reality headset, and the Magic Leap augmented reality headset, which they can use for their semester-long projects.
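As a small illustration of the kind of exercise described above (not course material from the award), the following sketch loads and lightly smooths an Empatica E4 electrodermal activity recording. It assumes the common E4 export layout, in which the EDA CSV's first row is the session start time in Unix seconds, the second row is the sampling rate in Hz, and each following row is one reading; verify against your own export.

# Minimal sketch: load and smooth an Empatica E4 EDA export (assumed layout, see above).
import pandas as pd

def load_e4_eda(path):
    raw = pd.read_csv(path, header=None)
    start = pd.to_datetime(raw.iloc[0, 0], unit="s")     # assumed: session start, Unix seconds
    rate_hz = float(raw.iloc[1, 0])                       # assumed: sampling rate in Hz
    values = raw.iloc[2:, 0].astype(float).reset_index(drop=True)
    index = start + pd.to_timedelta(values.index / rate_hz, unit="s")
    return pd.Series(values.values, index=index, name="eda_microsiemens")

eda = load_e4_eda("EDA.csv")
# Smooth with a 1-second rolling median to suppress brief motion artifacts before analysis.
eda_clean = eda.rolling("1s").median()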

With the new infrastructure in place, in January 2020 the Co-PIs held an event to showcase the new technology to current and potential collaborators across UMBC, with a view toward attracting future project partners. The well-attended event comprised short talks by doctoral students detailing their research and how they have employed the new equipment, a tour of the user studies labs with technology demonstrations by graduate students, and a dance performance by a UMBC undergraduate dance major to demonstrate the Vicon motion capture system and its capabilities.

 


Last Modified: 10/29/2020
Modified by: Andrea Kleinsmith
