Award Abstract # 1726524
MRI: Acquisition of a Mobile Manipulation Robot Platform for Research and Education

NSF Org: CNS
Division Of Computer and Network Systems
Recipient: REGENTS OF THE UNIVERSITY OF MICHIGAN
Initial Amendment Date: September 20, 2017
Latest Amendment Date: October 16, 2018
Award Number: 1726524
Award Instrument: Standard Grant
Program Manager: Rita Rodriguez
CNS, Division Of Computer and Network Systems
CSE, Directorate for Computer and Information Science and Engineering
Start Date: October 1, 2017
End Date: September 30, 2020 (Estimated)
Total Intended Award Amount: $250,000.00
Total Awarded Amount to Date: $250,000.00
Funds Obligated to Date: FY 2017 = $250,000.00
History of Investigator:
  • Samir Rawashdeh (Principal Investigator)
    srawa@umich.edu
  • Yubao Chen (Co-Principal Investigator)
  • Alex Yi (Co-Principal Investigator)
  • Stanley Baek (Co-Principal Investigator)
  • Yu Zheng (Former Principal Investigator)
  • Samir Rawashdeh (Former Co-Principal Investigator)
Recipient Sponsored Research Office: Regents of the University of Michigan - Ann Arbor
1109 GEDDES AVE STE 3300
ANN ARBOR
MI  US  48109-1015
(734)763-6438
Sponsor Congressional District: 06
Primary Place of Performance: University of Michigan - Dearborn
4901 Evergreen Rd
Dearborn
MI  US  48128-2406
Primary Place of Performance Congressional District: 12
Unique Entity Identifier (UEI): GNJ7BBP73WE9
Parent UEI:
NSF Program(s): Information Technology Research
Primary Program Source: 01001718DB NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s): 1189
Program Element Code(s): 164000
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

This project, acquiring a mobile manipulation robot platform, aims to address several major challenges in robotics technology, boosting robot functionality and applications by integrating manipulation and locomotion capabilities so that a robot can operate and make changes in a broader environment without territorial restrictions. Specifically, the project aims to enable:
- Vision-based mobile robot navigation;
- Vision-guided autonomous object manipulation and application to assembly;
- Unified motion generation and control for locomotion and manipulation; and
- Robot-human interaction and collaborations.
This work addresses applications in many areas, such as manufacturing, agriculture, health care, homeland security, public service, and entertainment.

Current robots, particularly in manufacturing, are mainly designed to be bolted to a production line and to execute pre-defined actions with high accuracy and repeatability, but with limited sensing and no autonomy or intelligence, which significantly restricts their broader application. To enhance the functionality of robots and extend their scope of application, an essential but often missing element is the integration of manipulation and locomotion capabilities, so that the robot can operate and make changes in a broader environment without territorial restrictions. Moreover, robots need a certain level of autonomy and intelligence so they can decide on their own what to do and how to do it. The investigators aim to maximize the usefulness of the robot platform by providing this integration.

Broader Impacts:
The outcomes of the proposed research should significantly enhance the university's research and education infrastructure and facilitate close interaction and collaboration between local industries, nearby institutions, and the university. Enthusiasm for and expectations of robots to further advance science and technology create a large demand for new technological and educational developments, as well as for training qualified researchers and engineers in robotics and related fields. The outcomes of this work offer not only research results but also educational experiences. Research problems will be adapted as senior design projects and thesis topics for undergraduate and graduate programs, and some of this work will be integrated into core courses as special lectures and lab projects. Furthermore, industrial involvement from Omron and General Motors will strengthen existing university-industry relations and create collaborative opportunities.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH

Cofield, Aaron, Zaid El-Shair, and Samir A. Rawashdeh. "A Humanoid Robot Object Perception Approach Using Depth Images." Proceedings of the IEEE National Aerospace and Electronics Conference (NAECON), 2019. https://doi.org/10.1109/NAECON46414.2019.9057808

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

This project supported the acquisition of a mobile manipulator platform, consisting of a mobile base, two robot arms with multi-fingered hands as end-effectors, and a vision system, to support various educational and research activities at the University of Michigan-Dearborn (UM-D). Research topics included autonomous mobile manipulation, vision-guided robot control, robot-human interaction and collaboration, and mobile robot localization, mapping, and navigation. In parallel with the research activities, the platform provides students at UM-D with a state-of-the-art robot system to work on and gain hands-on research experience in robotics. The robot supports several undergraduate and graduate courses in robotics, which serve a range of degree programs as core courses and electives. These include a new master's program in robotics engineering that was created during the project period. The project has substantially enhanced UM-D's capability to conduct robotics research and provide high-quality education and research training.

We named the robot R2ED (Robot for Research and Education at Dearborn). It was designed to be modular and modifiable; for example, the face is 3D printed, and it is relatively simple to redesign it for new target applications or sensor form factors. Figure 1 shows a photo of the robot as well as a 3D model.

The robot served as a basis for several research and educational projects. Example projects include (i) developing an object perception approach using a depth camera, (ii) robot grasp planning and control, (iii) face detection and social gestures recognition, and (iv) robot arm and hand teleoperation using wearable motion capture sensors.
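As a rough illustration of project (i), the sketch below shows one simple way a depth frame might be segmented into candidate objects by range thresholding and connected-component analysis. It is not the approach of the NAECON 2019 paper listed above; the working depth range, millimeter units, and minimum blob size are assumptions made purely for the example.

import numpy as np
import cv2

def segment_objects(depth_mm, near_mm=300, far_mm=1500, min_area=500):
    """Return bounding boxes of blobs inside an assumed working depth range."""
    # Keep only pixels inside the assumed tabletop working range (millimeters).
    mask = ((depth_mm > near_mm) & (depth_mm < far_mm)).astype(np.uint8) * 255
    # Remove speckle noise typical of consumer depth cameras.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    # Label connected components and keep the sufficiently large ones.
    num, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    boxes = []
    for i in range(1, num):  # label 0 is the background
        x, y, w, h, area = stats[i]
        if area >= min_area:
            boxes.append((int(x), int(y), int(w), int(h)))
    return boxes

In practice, such 2D detections would be mapped back into 3D using the camera intrinsics before grasp planning; that step is omitted here.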

A simulated model of the robot has been developed and made publicly available for the broader robotics community. The COVID-19 pandemic caused lab closures and access restrictions. During this time, the team shifted efforts to developing a simulation model of the robot, which is in any case necessary for testing algorithms and mitigating risk before running them on the real robot. Figure 2 shows a screenshot of the simulated robot in the ROS/Gazebo environment.
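As a minimal sketch of how the simulated robot can be exercised from ROS (the node and topic names below are assumptions and would need to match the actual R2ED simulation packages), the following Python node simply echoes the joint positions published by Gazebo:

import rospy
from sensor_msgs.msg import JointState

def on_joint_states(msg):
    # Print the joint positions reported by the Gazebo simulation.
    rospy.loginfo("joints: %s", dict(zip(msg.name, msg.position)))

if __name__ == "__main__":
    rospy.init_node("r2ed_sim_probe")  # hypothetical node name
    rospy.Subscriber("/joint_states", JointState, on_joint_states)
    rospy.spin()

Running a probe like this against the simulation is a low-risk way to confirm that controllers and perception nodes see the expected topics before moving to the physical robot.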

 


Last Modified: 03/02/2021
Modified by: Samir Rawashdeh
