
NSF Org: CNS Division Of Computer and Network Systems
Recipient: University of Maryland Baltimore County
Initial Amendment Date: June 27, 2017
Latest Amendment Date: November 30, 2018
Award Number: 1730033
Award Instrument: Standard Grant
Program Manager: Wendy Nilsen, wnilsen@nsf.gov, (703) 292-2568, CNS Division Of Computer and Network Systems, CSE Directorate for Computer and Information Science and Engineering
Start Date: July 1, 2017
End Date: June 30, 2020 (Estimated)
Total Intended Award Amount: $356,657.00
Total Awarded Amount to Date: $356,657.00
Recipient Sponsored Research Office: 1000 Hilltop Cir, Baltimore, MD 21250-0001, US, (410) 455-3140
Primary Place of Performance: 1000 Hilltop Circle, Baltimore, MD 21250-0002, US
NSF Program(s): CCRI-CISE Cmnty Rsrch Infrstrc
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070
ABSTRACT
Smart phones, networked devices, and commercially available augmented and virtual reality kits are making ubiquitous and immersive interactive systems increasingly commonplace. Designing these systems, and the applications that use them, requires the ability to study how system features and human behavior affect each other in natural contexts. This award will provide an interdisciplinary research team with equipment for capturing both individual and small group behavioral data in situ, including body motion, eye gaze, mental workload, and emotion. The infrastructure will support projects at the team's institution in a number of domains, including autonomous vehicle use by visually impaired users, augmented reality training for emergency medical responders, and collaborative scientific discovery in virtual visualization environments. A doctoral student with related research interests will coordinate management of and training on the infrastructure, developing both technical and research skills. The team will also use the equipment to provide enhanced research opportunities for undergraduates from a number of disciplines, institutions, and backgrounds.
Much of the planned infrastructure uses next-generation versions of tools with which the team already has expertise. This reduces deployment risk and allows the team to augment existing projects with new capabilities while enabling new directions for research. In most cases, this is a transformation from fixed lab-based sensing to unconstrained, mobile data collection in the field. The new capabilities include five main data sources. One is body motion data, which will be collected using an industry-standard infrared-based motion capture system that can flexibly capture the movements of individuals or dyads. A second is location and gait capture, for both individuals and groups, through an unobtrusive, configurable system of floor-mounted force plates. A third is eye gaze data, collected through a portable headset that captures eye fixations and pupil dilation to support monitoring visual attention. A fourth is electroencephalogram (EEG) data, collected through a portable dry-sensor headset, which can monitor workload, affect, and facial features as well as support prototyping of brain-computer interfaces (BCIs). A fifth is physiological data, including temperature, pulse, arm motion, and arousal, collected through an inconspicuous wristband that includes photoplethysmography, accelerometer, and electrodermal sensors. Each individual data stream comes with accompanying analytic software; collectively, data will be managed through a commercially available tool for analyzing events synchronized across multiple parallel data streams. Key innovations of this award are its novel data integration strategies and cross-disciplinary application areas.
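As a rough illustration of the cross-stream synchronization this implies, the Python sketch below resamples several independently clocked streams onto a shared time grid. It is a minimal sketch only: the file names, column names, and sample rates are hypothetical placeholders, and the award's commercial analysis tool is not modeled here.

    # Minimal sketch: aligning parallel sensor streams on a common clock.
    # File names, columns, and rates below are hypothetical placeholders,
    # not the project's actual data formats.
    import pandas as pd

    def load_stream(path, value_cols):
        """Load one stream and index it by elapsed time in seconds."""
        df = pd.read_csv(path)
        # Assume each file carries a 'timestamp' column in seconds.
        df.index = pd.to_timedelta(df["timestamp"], unit="s")
        return df[value_cols]

    # Each stream arrives at its own native rate (e.g., motion capture
    # ~120 Hz, EEG ~256 Hz, wristband electrodermal activity ~4 Hz).
    mocap = load_stream("mocap.csv", ["x", "y", "z"])
    eeg = load_stream("eeg.csv", ["workload"])
    eda = load_stream("eda.csv", ["eda"])

    # Resample everything onto a shared 10 ms grid so events can be
    # compared across streams; forward-fill the slower streams.
    merged = pd.concat(
        [s.resample("10ms").mean().ffill() for s in (mocap, eeg, eda)],
        axis=1,
    )
    print(merged.head())

Forward-filling is only one possible alignment policy; interpolation or event-windowing may suit some analyses better.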
PROJECT OUTCOMES REPORT
Disclaimer
This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.
The technological advances of the last several years have opened the door for exciting new research into the ways interactive systems are used by and are useful for people. This changing technological landscape demands new research into how individuals interact with these technologies in the wild, given that the most powerful and elegant computing solution is of little worth if it cannot be meaningfully integrated into one's daily life. Given the rapid rise of innovative, richly immersive, ubiquitous computing, a new focus is viewing the person and their myriad digital devices as a coupled complex system: a cyber-human system.
The aim of this project was to acquire a suite of new, flexible, and dynamic instrumentation to engage the big questions facing human-centered computing within new interactive spaces. Thus, this project focused on instrumentation to capture users' interactions with technology and to facilitate researchers' pursuit of new lines of research on ubiquitous and immersive interactive environments, blending laboratory work, for requisite precision, with field work, for enhanced ecological validity.
The new infrastructure is maintained by the Interactive Systems Research Center (ISRC) at the University of Maryland, Baltimore County (UMBC). Expanding our existing research infrastructure allows researchers to pursue new research and collaboration opportunities within and outside UMBC. The newly acquired instrumentation has already benefited existing projects and collaborations. Multidisciplinary efforts are underway exploring the effectiveness of augmented and virtual reality training in diverse situations, including surgical training and mentoring, one-to-many emergency medical provider training to improve access to healthcare, and graduate student preparation for community-engaged service. Other projects focus on psychophysiological understanding of stress experienced by paramedic trainees during in-situ simulations, and of social and speaking anxiety experienced by students learning English as a foreign language.
In addition to the research opportunities, the instrumentation provides hands-on educational opportunities for students at all levels through integration into graduate-level courses in UMBC's Human-Centered Computing MS and PhD programs. The Tobii Pro X3-120 eye trackers are employed in the User Interface Design core course to teach students how to collect and analyze eye tracking data for evaluating user interfaces. New technology is also incorporated into the HCC graduate electives Computer Supported Cooperative Work and Affective Human Computer Interaction. In Affective HCI, students receive hands-on training and tutorials on acquiring, cleaning, and analyzing data with the Empatica E4 wristbands. Armed with this experience, students are introduced to other new instrumentation, such as the Vicon motion capture system, the HTC Vive Pro Eye virtual reality headset, and the Magic Leap augmented reality headset, which they can use for their semester-long projects.
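As a rough illustration of that cleaning exercise, the sketch below loads one E4 skin conductance export and smooths it with a rolling median. It assumes the E4's standard CSV layout for EDA.csv (first row: session start in Unix time; second row: sample rate in Hz; remaining rows: values in microsiemens); the file name and window size are illustrative choices, not part of the course materials.

    # Minimal sketch: loading and lightly cleaning an Empatica E4 EDA file.
    # Assumes the standard E4 export layout described above; the smoothing
    # window is an illustrative choice.
    import pandas as pd

    def load_e4_eda(path="EDA.csv"):
        raw = pd.read_csv(path, header=None)
        start = float(raw.iloc[0, 0])   # session start, Unix seconds
        rate = float(raw.iloc[1, 0])    # samples per second (4 Hz for EDA)
        values = raw.iloc[2:, 0].astype(float).reset_index(drop=True)
        # Reconstruct a wall-clock timestamp for every sample.
        idx = pd.to_datetime(start + values.index / rate, unit="s")
        return pd.Series(values.values, index=idx, name="eda_uS")

    eda = load_e4_eda()
    # A rolling median suppresses brief motion-artifact spikes while
    # preserving slower skin conductance responses.
    clean = eda.rolling(window=5, center=True, min_periods=1).median()
    print(clean.describe())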
With the new infrastructure in place, in January 2020 the Co-PIs held an event to showcase the new technology to current and potential collaborators across UMBC, with a view toward attracting future project partners. The well-attended event comprised short talks by doctoral students detailing their research and how they have employed the new equipment, a tour of the user studies labs with technology demonstrations by graduate students, and a dance performance by a UMBC undergraduate dance major to demonstrate the Vicon motion capture system and its capabilities.
Last Modified: 10/29/2020
Modified by: Andrea Kleinsmith