
NSF Org: | IIS Division of Information & Intelligent Systems |
Initial Amendment Date: | August 30, 2016 |
Latest Amendment Date: | December 3, 2021 |
Award Number: | 1618283 |
Award Instrument: | Standard Grant |
Program Manager: | Dan Cosley, dcosley@nsf.gov, (703) 292-8832, IIS Division of Information & Intelligent Systems, CSE Directorate for Computer and Information Science and Engineering |
Start Date: | January 15, 2017 |
End Date: | December 31, 2022 (Estimated) |
Total Intended Award Amount: | $219,646.00 |
Total Awarded Amount to Date: | $235,646.00 |
Funds Obligated to Date: | FY 2018 = $8,000.00; FY 2019 = $8,000.00 |
Recipient Sponsored Research Office: | #5 Hairpin Drive, Edwardsville, IL, US 62026-0001, (618) 650-3010 |
Primary Place of Performance: | IL, US 62026-1000 |
NSF Program(s): | HCC-Human-Centered Computing |
Primary Program Source: | 01001819DB NSF RESEARCH & RELATED ACTIVIT; 01001920DB NSF RESEARCH & RELATED ACTIVIT |
Award Agency Code: | 4900 |
Fund Agency Code: | 4900 |
Assistance Listing Number(s): | 47.070 |
ABSTRACT
This project will study how making common telepresence robots more expressive and interactive affects people's willingness to use them and their opinions of remote collaborators. Many telepresence robots take the form of a screen on a mobile platform, giving remote attendees a physical body that can increase feelings of presence and interaction compared to a normal videoconference. However, the limited non-verbal expressiveness of these platforms is a major barrier to their use.

Thus, the research goal is to increase the embodiment and social interaction capabilities of these platforms by adding a hand and arm to support common non-verbal interactions such as pointing, gesturing, and touch. To do this, the researchers will first build a simple hand and arm to support these non-verbal interactions and add it to an existing telepresence robot. They will then develop software to run on the robot to execute the gestures and ensure the safety of people nearby, as well as a user interface that maps gestures by the remote user onto gestures the hand and arm are able to execute. They will then test the usability of the system and its effects on social presence through studies that include both one-on-one and small-group icebreaking conversations.

This work will lead toward more natural interfaces for telepresence robots and a better understanding of how people accept them and interact with them. This, in turn, should lead to social benefits by making remote interaction more effective, saving time, effort, and fuel costs around travel while supporting not just remote meetings but other remote services such as medical diagnosis and caregiving. The team will also use the research both for their own classes and for outreach at events designed to encourage children to explore science careers.
The work sits at the intersection of telerobotics, haptics, and social psychology. Because gestures, pointing, handshakes, and other non-verbal communication are an important part of human interaction that current telepresence platforms do not support, the work focuses on the development of a lightweight arm that can implement those gestures without the complexity, fragility, and expense of arms that fully mimic human motion. To make this tradeoff, the research team will develop a 3D-printed, 5-fingered hand with 3 degrees of freedom and simple connections that allow the fingers to bend naturally enough to recreate the intended gestures. The control software will use a forward and inverse kinematics approach to model hand configurations and an open-loop controller that works along with the human operator to implement the gestures; bump, force, and optical sensors will be used to address safety concerns. For the human remote operator, the team will develop interfaces that (a) add cameras to the robot to provide a fuller view of the remote environment needed for effective gestures, and (b) use motion-tracking hardware to detect the remote user's arm motion and translate it into the space of possible motions of the robot arm, focusing on the specific targeted social gestures. They will evaluate the system through a series of between-subjects user studies in which participants act as either the remote or local user, interacting with a version of the robot with or without the arm. Participants will interact with trained experimental confederates both to reduce variability and to ensure that the target non-verbal gestures are experienced in both the with- and without-arm conditions.
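The mapping step described above, translating the remote user's tracked arm motion into the robot arm's limited space of motions via forward and inverse kinematics, can be illustrated with a minimal sketch. This is a hypothetical two-link planar approximation, not the project's actual control code; the link lengths, joint conventions, and elbow-down solution are all assumptions for illustration. Unreachable tracked positions are projected onto the robot's reachable workspace, mirroring the idea of translating operator motion into the space of possible robot motions.

```python
import math

# Illustrative link lengths (metres); the actual arm geometry is not
# given in the award text, so these values are assumptions.
L1, L2 = 0.3, 0.25

def ik_2link(x, y, l1=L1, l2=L2):
    """Inverse kinematics for a planar 2-link arm (elbow-down solution).

    Targets outside the reachable annulus are first projected onto it,
    so any tracked operator position yields a feasible joint command.
    """
    r = math.hypot(x, y)
    r_clamped = max(abs(l1 - l2) + 1e-9, min(l1 + l2 - 1e-9, r))
    if r > 0:
        x, y = x * r_clamped / r, y * r_clamped / r
    cos_elbow = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def fk_2link(shoulder, elbow, l1=L1, l2=L2):
    """Forward kinematics: joint angles -> end-effector position."""
    x = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    y = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return x, y
```

Running the forward model on the inverse solution recovers a reachable target exactly, while an out-of-reach target is clamped to the arm's maximum extension.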
The team will measure perceived social connectedness with conversation partners and acceptability of the robot using standard scales, as well as asking questions about the particulars of the experience both to gain deeper insight into the reasons why the arm is effective (if it is) and to guide the design of future systems.
PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH
PROJECT OUTCOMES REPORT
Disclaimer
This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.
With the rapid advancement of technologies like video conferencing, we are experiencing new ways to be present and to communicate with one another without having to physically be in one another's company. Telerobots - platforms that not only enable video conferencing, but also enable mobility within a remote environment - are an example of a new communication technology that enables a user to be present, to communicate, and to navigate within a remote setting. In addition to videoconferencing, telerobots enable a user to turn and face other users while communicating, to explore one's surroundings, and to travel through settings such as workplaces or schools, all while being controlled from a remote setting. Despite these increased capabilities, the social experience of using a telerobot often falls short - it's still difficult to capture the nonverbal elements - such as gestures, referencing, and talking with one's hands - that are critical to supporting social connection between individuals.

The objective of this project was to understand if and how enabling individuals to talk with their hands, to engage in social interactions such as handshakes, and to use gestures while in conversation through a telerobot enhanced the social connection between the communication partners. Specifically, we developed a human-like arm and hand prototype and new remote interfaces that were integrated onto an existing telerobot. Instead of just videoconferencing, user gestures can now be replicated in near real-time, so that the user controlling the telerobot can point, shake hands, or shrug shoulders, and these same gestures are mimicked on the telerobot for interaction with a remote individual.
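The near-real-time gesture replication described above can be thought of as mapping a recognized operator gesture onto the robot's limited gesture vocabulary, with joint limits enforced for safety. The sketch below is a toy illustration: the gesture names, joint limits, and keyframe angles are all assumptions, since the project's actual gesture set and control interface are not specified in this report.

```python
# Hypothetical discrete gesture vocabulary as (shoulder, elbow) keyframes
# in radians; real gesture trajectories would come from motion tracking.
GESTURE_LIBRARY = {
    "wave":      [(1.4, 0.5), (1.4, -0.5), (1.4, 0.5)],
    "point":     [(0.8, 0.0)],
    "handshake": [(0.6, 0.3), (0.5, 0.3), (0.6, 0.3)],
}

# Assumed joint limits (radians) standing in for the arm's safety envelope.
JOINT_LIMITS = {"shoulder": (-0.2, 1.6), "elbow": (-1.2, 1.2)}

def clamp(value, lo, hi):
    return max(lo, min(hi, value))

def plan_gesture(name):
    """Translate a recognized operator gesture into a safe keyframe sequence.

    Unknown gestures are ignored rather than guessed, and every keyframe
    is clamped into the joint limits before being sent to the arm.
    """
    if name not in GESTURE_LIBRARY:
        return []
    return [
        (clamp(s, *JOINT_LIMITS["shoulder"]), clamp(e, *JOINT_LIMITS["elbow"]))
        for s, e in GESTURE_LIBRARY[name]
    ]
```

A recognized "point" yields its single clamped keyframe, while an unrecognized gesture produces no motion at all, which is a conservative choice when the robot shares space with people.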
Through human user studies with 310 individuals, we demonstrated that enhanced gesturing capabilities increase the social connection between telerobot communication partners, while also uncovering areas where additional information and support are needed to enhance social connection and user experience.
Intellectual Merit: This project has produced (1) a comprehensive, published overview of the hardware, software, and control schemes developed to add gesturing capabilities to a telerobot platform, along with human user studies with 310 individuals illustrating the affordances and limitations of the current capabilities; (2) a modular hardware design of a robot arm-hand assembly that can be ported across different telerobot platforms; and (3) wearable haptic interface designs and prototypes that provide touch feedback to telerobot users attempting to execute social contact interactions. The system design framework established in this work, along with the insights shared around the trade-offs of balancing complexity with realism and functionality, can be extended to different telerobot platforms, informing both future designs and use cases. The modular and portable hardware designs developed in this work can also be extended by other research teams and designers to incorporate gesturing capabilities into their own robot platforms. The wearable haptic devices developed in this work illustrate the critical role of touch in supporting remote social contact interactions, and the designs presented in this work can be implemented and extended across a variety of interfaces and remote communication contexts. Taken together, this work combines advancements in robotics, haptics, virtual reality, human-robot interaction, and psychology to reimagine how remote communication can be extended to our physical world beyond just videoconferencing.
Broader Impacts: This project has helped address a growing gap in remote communication - how to support social connection and capture important nonverbal cues that are often missed using today's technologies. This research provides a framework for putting human connection and communication at the forefront of the design of new technologies like telerobots, while also pushing the boundaries of what is possible in communicating remotely - opening up new possibilities for remote work, learning, and living. This work has resulted in 4 peer-reviewed publications and several presentations and demonstrations to academic, industry, and community audiences, and has supported 8 engineering and computer science graduate students, 2 psychology graduate students, 5 engineering and computer science undergraduates, 30 psychology undergraduates, and 2 high school students. This project also directly involved 349 individuals in the research studies and was infused into an undergraduate/graduate course on Social Robotics. This work has also been disseminated across numerous national and regional events, including a live demonstration at an international conference. Through various outreach opportunities, including Introduce a Girl to Engineering Days, local science and robotics camps, and presentations to Scout organizations, local schools, and an early childhood center, we have reached over 500 individuals in the K-12 community. Finally, this work has also been disseminated to industry partners, which has fostered new collaborations on extensions of this work. The outcomes of this project have laid the foundation for ongoing investigations in enhancing remote communication through technology, with a focus on bringing back the very human elements that connect us in conversation.
Last Modified: 04/18/2023
Modified by: Jerry Weinberg