Award Abstract # 1830242
NRI: FND: Communicating Physical Interactions

NSF Org: IIS, Division of Information & Intelligent Systems
Recipient: UNIVERSITY OF WISCONSIN SYSTEM
Initial Amendment Date: August 17, 2018
Latest Amendment Date: May 23, 2022
Award Number: 1830242
Award Instrument: Standard Grant
Program Manager: Juan Wachs
IIS, Division of Information & Intelligent Systems
CSE, Directorate for Computer and Information Science and Engineering
Start Date: September 1, 2018
End Date: August 31, 2023 (Estimated)
Total Intended Award Amount: $749,986.00
Total Awarded Amount to Date: $797,986.00
Funds Obligated to Date: FY 2018 = $749,986.00
FY 2020 = $16,000.00
FY 2021 = $16,000.00
FY 2022 = $16,000.00
History of Investigator:
  • Michael Gleicher (Principal Investigator)
    gleicher@cs.wisc.edu
  • Michael Zinn (Co-Principal Investigator)
  • Bilge Mutlu (Co-Principal Investigator)
Recipient Sponsored Research Office: University of Wisconsin-Madison
21 N PARK ST STE 6301
MADISON
WI  US  53715-1218
(608)262-3822
Sponsor Congressional District: 02
Primary Place of Performance: University of Wisconsin-Madison
1210 West Dayton St
Madison
WI  US  53706-1613
Primary Place of Performance Congressional District: 02
Unique Entity Identifier (UEI): LCLSJAGTNZQ7
Parent UEI:
NSF Program(s): IIS Special Projects,
NRI-National Robotics Initiative
Primary Program Source: 01002223DB NSF RESEARCH & RELATED ACTIVITIES
01001819DB NSF RESEARCH & RELATED ACTIVITIES
01002021DB NSF RESEARCH & RELATED ACTIVITIES
01002122DB NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s): 063Z, 8086, 9251
Program Element Code(s): 748400, 801300
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

For robots to become ubiquitous collaborators, they must interact physically with objects in unstructured human environments. Robots must adeptly grasp, push, squeeze, snap, balance, stabilize, and hand off objects, and must communicate effectively about these actions with their human collaborators. People must not only be able to specify to a robot what to do and how to do it, but also be able to interpret what a robot intends to do (and how). Effective communication about physical interactions will be essential if robots are to perform such tasks as delivering care to older adults, responsively helping technicians perform repairs, or being trained by non-experts to perform repetitive assembly tasks. This project will enable a new generation of robot applications and advance the vision of ubiquitous, collaborative robots by developing better methods for robots to communicate more effectively with people.

The project will address key challenges in communication about physical interactions: conveying invisible and unfamiliar quantities (e.g., forces and compliances), communicating plans and contingencies, and communicating about what did not (or should not) happen. These challenges fall into three areas: specification, interpretation, and monitoring. To address them, the project will 1) perform formative studies to gain insight into how people communicate about physical interactions and interpret displays; 2) develop methods for specifying physical actions based on the idea of augmented demonstrations, methods for interpreting physical actions based on the idea of interpretable representations, and methods for monitoring physical actions based on multimodal communication; and 3) deploy these ideas in prototype systems in contextualized scenarios for evaluation.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH


(Showing: 1 - 10 of 23)
Chamzas, Constantinos and Quintero-Pena, Carlos and Kingston, Zachary and Orthey, Andreas and Rakita, Daniel and Gleicher, Michael and Toussaint, Marc and Kavraki, Lydia E. "MotionBenchMaker: A Tool to Generate and Benchmark Motion Planning Datasets" IEEE Robotics and Automation Letters, v.7, 2022. https://doi.org/10.1109/LRA.2021.3133603
Hagenow, Michael and Zhang, Bolun and Mutlu, Bilge and Zinn, Michael and Gleicher, Michael "Recognizing Orientation Slip in Human Demonstrations" IEEE International Conference on Robotics and Automation (ICRA), 2021. https://doi.org/10.1109/ICRA48506.2021.9561856
Praveena, P. and Molina, L. and Wang, Y. and Senft, E. and Mutlu, B. and Gleicher, M. "Understanding Control Frames in Multi-Camera Robot Telemanipulation" Proceedings of the 2022 ACM/IEEE International Conference on Human-Robot Interaction (HRI), 2022. https://doi.org/10.5555/3523760.3523818
Praveena, Pragathi and Cagiltay, Bengisu and Gleicher, Michael and Mutlu, Bilge "Exploring the Use of Collaborative Robots in Cinematography" CHI EA '23: Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, 2023. https://doi.org/10.1145/3544549.3585715
Praveena, Pragathi and Rakita, Daniel and Mutlu, Bilge and Gleicher, Michael "Supporting Perception of Weight through Motion-induced Sensory Conflicts in Robot Teleoperation" HRI '20: Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, 2020. https://doi.org/10.1145/3319502.3374841
Praveena, Pragathi and Rakita, Daniel and Mutlu, Bilge and Gleicher, Michael "User-Guided Offline Synthesis of Robot Arm Motion from 6-DoF Paths" 2019 International Conference on Robotics and Automation (ICRA), 2019.
Praveena, Pragathi and Subramani, Guru and Mutlu, Bilge and Gleicher, Michael "Characterizing Input Methods for Human-to-Robot Demonstrations" 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), 2019. https://doi.org/10.1109/HRI.2019.8673310
Praveena, Pragathi and Wang, Yeping and Senft, Emmanuel and Gleicher, Michael and Mutlu, Bilge "Periscope: A Robotic Camera System to Support Remote Physical Collaboration" Proceedings of the ACM on Human-Computer Interaction, v.7, 2023. https://doi.org/10.1145/3610199
Rakita, Daniel and Mutlu, Bilge "Stampede: A Discrete-Optimization Method for Solving Pathwise-Inverse Kinematics" 2019 International Conference on Robotics and Automation (ICRA), 2019.
Rakita, Daniel and Mutlu, Bilge and Gleicher, Michael "An analysis of RelaxedIK: an optimization-based framework for generating accurate and feasible robot arm motions" Autonomous Robots, v.44, 2020. https://doi.org/10.1007/s10514-020-09918-9
Rakita, Daniel and Mutlu, Bilge and Gleicher, Michael "Effects of Onset Latency and Robot Speed Delays on Mimicry-Control Teleoperation" HRI '20: Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, 2020. https://doi.org/10.1145/3319502.3374838

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

This project developed a better understanding of, and methods for, communicating with robots about physical interactions, with the goal of enabling more effective robotic collaborators across human-robot interaction paradigms. The project's efforts produced designs for devices, algorithms, and systems that allow people to communicate desired actions to robots and to maintain awareness of robot behavior.

Broader Impact: The results of this project will improve systems in which people interact with robots performing physical tasks. These advances support the use of robots in human environments and contribute to efforts toward more widespread use of robots in applications such as light manufacturing, personal assistance, and healthcare.

Intellectual Merit: This project developed algorithmic approaches to creating robot motions that meet the needs of interactive applications. It provided experimental evidence for how people sense and specify physical behaviors and how they respond to different types of interactive robot controls. It also produced methods that use dynamic cameras to help people monitor robot behavior, showed how new sensor types can be incorporated into interactive robotics systems, and demonstrated how physical aspects of manipulation can be inferred from observations.


Last Modified: 11/29/2023
Modified by: Michael L Gleicher
