
NSF Org: IIS Division of Information & Intelligent Systems
Recipient:
Initial Amendment Date: September 9, 2013
Latest Amendment Date: May 11, 2016
Award Number: 1327597
Award Instrument: Continuing Grant
Program Manager: David Miller, IIS Division of Information & Intelligent Systems, CSE Directorate for Computer and Information Science and Engineering
Start Date: October 1, 2013
End Date: September 30, 2020 (Estimated)
Total Intended Award Amount: $1,262,883.00
Total Awarded Amount to Date: $1,286,883.00
Funds Obligated to Date: FY 2014 = $8,000.00; FY 2015 = $8,000.00; FY 2016 = $8,000.00
History of Investigator:
Recipient Sponsored Research Office: 5000 FORBES AVE, PITTSBURGH, PA, US 15213-3815, (412) 268-8746
Sponsor Congressional District:
Primary Place of Performance: 5000 Forbes Ave, Pittsburgh, PA, US 15213-3815
Primary Place of Performance Congressional District:
Unique Entity Identifier (UEI):
Parent UEI:
NSF Program(s): NRI-National Robotics Initiati
Primary Program Source: 01001415DB NSF RESEARCH & RELATED ACTIVIT; 01001516DB NSF RESEARCH & RELATED ACTIVIT; 01001617DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s):
Program Element Code(s):
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070
ABSTRACT
This work will advance human-robot partnerships by establishing a new concept called complementary situational awareness (CSA): the simultaneous perception and use of the environment and of operational constraints during task execution. CSA is transformative because it ushers in a new era of human-robot partnership in which robots act as our partners not only in manipulation but also in perception and control. This research will establish the foundations for CSA to enable multifaceted human-robot partnerships. Three main research objectives guide this effort:
1) Real-time sensing during task execution: design low-level control algorithms that provide wire-actuated or flexible continuum robots with sensory awareness by supporting force sensing, exploration, and modulated force interaction in flexible, unstructured environments.
2) Situational awareness modeling: prescribe information-fusion and simultaneous localization and mapping (SLAM) algorithms suitable for surgical planning and in-vivo surgical plan adaptation.
3) Telemanipulation based on CSA: design, construct, and integrate robotic testbeds with telemanipulation algorithms that use SLAM and exploration data for online adaptation of assistive telemanipulation virtual fixtures.
This research also investigates previously unaddressed questions of how sensory exploration and palpation data can enable online adaptation of assistive virtual fixtures based on force and stiffness data, while also taking into account preoperative data and intraoperative correction of registration parameters.
The proposed work will restore to minimally invasive surgery the situational awareness readily available in open surgery. This will benefit patients by enabling core technologies for effective and safe natural-orifice surgery and single-port access surgery. The societal impact of the proposed work on these two surgical paradigms is reduced pain for patients, shorter hospital stays, improved cosmesis and patient self-image, and lower costs. We also believe that CSA will impact manufacturing, whose future will require people and robots working together on collaborative tasks in a shared space. The same concepts of CSA apply to telemanipulation in constrained and unstructured environments, so the proposed research has direct relevance to robot-human partnerships for space exploration. To ensure this broader impact is achieved, an advisory board has been assembled with experts from medicine, manufacturing, and aerospace. Finally, the PIs will facilitate collaboration in the medical robotics research community by making our software and hardware designs available online and by using commercial-grade hardware available at multiple institutions.
PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH
Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full-text articles may not yet be available without charge during the embargo (administrative interval). Some links on this page may take you to non-federal websites, whose policies may differ from those of this site.
PROJECT OUTCOMES REPORT
Disclaimer
This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.
When operators telemanipulate robotic devices, they are hampered by perception barriers that limit their situational awareness. A surgeon telemanipulating a surgical robot has a limited understanding of the surgical scene and of the robot's interaction with the anatomy. The robotics research community has addressed these challenges by focusing on ways of providing force feedback to surgeons and by providing assistive control laws (called virtual fixtures) that superimpose a safety barrier or help the surgeon follow a desired path while avoiding critical anatomy. These solutions are limited by their reliance on medical image registration (a process by which pre-operative anatomy images are related to the intraoperative scene). Moreover, they have relied predominantly on pre-operative geometric definitions of a surgical plan to construct the assistive virtual fixtures.
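To make the virtual-fixture idea concrete, here is a minimal sketch of a forbidden-region fixture that attenuates commanded tool velocity as the tip approaches a planar safety barrier. The function name, the planar barrier, the linear attenuation law, and all parameter values are illustrative assumptions, not this project's implementation.

```python
import numpy as np

def apply_forbidden_region_vf(tip_pos, cmd_vel, plane_point, plane_normal,
                              safety_margin=5.0):
    """Attenuate the velocity component driving the tool tip toward a
    forbidden half-space bounded by a plane (units: mm, mm/s).

    The component of cmd_vel along the barrier normal is scaled linearly
    from 1 (at safety_margin from the plane) down to 0 (at the plane), so
    the tip can slide along the barrier but not cross it.
    """
    n = plane_normal / np.linalg.norm(plane_normal)  # unit normal, pointing away from forbidden region
    dist = np.dot(tip_pos - plane_point, n)          # signed distance from tip to barrier
    v_n = np.dot(cmd_vel, n)                         # normal component of commanded velocity
    if v_n < 0 and dist < safety_margin:             # moving toward the barrier, inside the margin
        scale = max(dist, 0.0) / safety_margin       # 1 at margin edge -> 0 at the plane
        cmd_vel = cmd_vel + (scale - 1.0) * v_n * n  # attenuate only the normal component
    return cmd_vel

# Tip 2 mm above the barrier, commanded straight down at 10 mm/s:
v = apply_forbidden_region_vf(np.array([0.0, 0.0, 2.0]), np.array([0.0, 0.0, -10.0]),
                              np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]))
# normal speed scaled by 2/5 -> [0, 0, -4]
```

A guidance (path-following) fixture works the same way in reverse: it attenuates the velocity components orthogonal to a desired path instead of those normal to a barrier.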
Figure 1 shows the approach followed in this collaborative research, along with some scenarios where surgical perception is lacking. In our approach, the robot is used for manipulation augmentation and for perception augmentation by fusing intraoperative sensory data and imaging (e.g., tissue stiffness, computer vision) with preoperative models and images of the anatomy. Figures 1b and 1c show scenarios where such perception and situational awareness augmentation would be critical for safe operation. In both images, the robot can have multiple contacts outside the surgeon's visual field of view, so there is a need for a robot that can discern such contacts and for a high-level controller that uses this information to adapt the telemanipulation behaviors and enable safe operation.
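One common way to discern unseen contacts, sketched below, is to compare measured actuation torques against a model prediction and flag contact when the residual is large. This simplified rigid-arm, joint-torque version is an illustrative assumption; the project's flexible continuum robots require the dedicated transmission-loss models discussed later in this report.

```python
import numpy as np

def detect_contact(tau_measured, tau_model, threshold=0.5):
    """Flag a likely unmodeled contact from joint-torque residuals (N*m).

    tau_model is the torque predicted by the robot's dynamic/static model
    for the current configuration; a residual above threshold at any joint
    suggests an external contact somewhere along the arm.
    """
    residual = np.abs(np.asarray(tau_measured) - np.asarray(tau_model))
    return bool((residual > threshold).any()), residual

# Joint 2 carries 1.9 N*m more than the no-contact model predicts:
contact, r = detect_contact([0.1, 2.3, 0.0], [0.1, 0.4, 0.05])
```

The pattern of which joints show a residual also hints at where along the arm the contact occurred, which is what a high-level controller needs in order to adapt the telemanipulation behavior.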
The concept of robot situational awareness was investigated as part of a paradigm in which intraoperative sensory information was used to inform the update of a surgical plan and its corresponding virtual fixtures. In addition to using geometry, the use of force-controlled palpation and exploration of the anatomy was explored and demonstrated to allow adaptive surgical plans. Figure 2 shows a robot using a force-controlled scan of a mock organ to account for organ deformation relative to a pre-operative model of the organ. Using such intraoperative information, advanced statistical methods were used to combine intraoperative sensing with preoperative information to improve the computer's model of the patient's anatomy and the surgical plan. Figure 3 shows a result of force-controlled exploration in which an organ stiffness map is generated and used together with geometry to inform the process of updating the model of the anatomy. Figure 4 shows steps in an efficient method for real-time stiffness mapping during telemanipulation of the robot. The method allows interactive-rate annotation of the anatomy model with stiffness information, which can be used for identifying possible locations of tumors, as in Figure 3, or for identifying a hidden artery, as in Figure 4.
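The core of such a stiffness map can be sketched in a few lines: fit a local stiffness (force per unit indentation) at each palpation site, then annotate a grid over the organ surface. The linear Hookean fit and the nearest-site binning below are deliberately crude stand-ins, stated as assumptions, for the advanced statistical fusion methods the project actually used.

```python
import numpy as np

def estimate_stiffness(depths_mm, forces_n):
    """Least-squares slope of force vs. indentation depth (N/mm) for one
    palpation site, assuming a locally linear (Hookean) tissue response."""
    d = np.asarray(depths_mm, float)
    f = np.asarray(forces_n, float)
    A = np.vstack([d, np.ones_like(d)]).T       # fit f = k*d + b
    (k, _b), *_ = np.linalg.lstsq(A, f, rcond=None)
    return k

def stiffness_map(sites_xy, stiffnesses, grid_x, grid_y):
    """Annotate each grid cell with the stiffness of the nearest palpation
    site (a simple stand-in for statistical interpolation/fusion)."""
    sites = np.asarray(sites_xy, float)
    k = np.asarray(stiffnesses, float)
    gx, gy = np.meshgrid(grid_x, grid_y)
    pts = np.stack([gx.ravel(), gy.ravel()], axis=1)
    nearest = np.argmin(np.linalg.norm(pts[:, None, :] - sites[None, :, :], axis=2), axis=1)
    return k[nearest].reshape(gx.shape)

# Stiff inclusion near (1, 1), softer tissue near (0, 0):
k_soft = estimate_stiffness([0.0, 1.0, 2.0], [0.0, 0.3, 0.6])   # ~0.3 N/mm
k_hard = estimate_stiffness([0.0, 1.0, 2.0], [0.0, 1.5, 3.0])   # ~1.5 N/mm
kmap = stiffness_map([(0, 0), (1, 1)], [k_soft, k_hard],
                     np.linspace(0, 1, 3), np.linspace(0, 1, 3))
```

Regions of the map whose stiffness stands out from their surroundings are the candidates for hidden structures such as tumors or arteries.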
These tools enabled a rigorous exploration of new hybrid assistive telemanipulation frameworks in which the robot's high-level controller specifies behaviors where the robot controls motion or regulates force while allowing the user to telemanipulate the robot tip for remote palpation. Figure 5 shows a subset of these conditions. Automated and semi-automated telemanipulation with superimposed end-effector excitations was developed to allow the high-level controller to discern information comparable to that obtained during manual palpation. Ways of relaying this information to users were also explored through a user study aimed at determining the potential benefits of these approaches.
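The hybrid split of authority between user and controller can be illustrated with a projection: the user's commanded velocity passes through in the tangent plane of the tissue surface, while the controller adds a normal correction that regulates contact force. The proportional force law, the sign convention (normal pointing out of the tissue), and the gain value are illustrative assumptions, not the project's control law.

```python
import numpy as np

def hybrid_command(user_vel, f_measured, f_desired, normal, kf=0.8):
    """Split a telemanipulation command into a user-controlled tangential
    motion and a controller-regulated normal force.

    normal points out of the tissue toward the robot; forces are contact
    force magnitudes along that normal. Returns the commanded tip
    velocity: the user's velocity projected onto the tangent plane, plus
    a normal correction that presses in when force is below f_desired.
    """
    n = np.asarray(normal, float)
    n = n / np.linalg.norm(n)
    P_t = np.eye(3) - np.outer(n, n)              # projector onto the tangent plane
    v_tangent = P_t @ np.asarray(user_vel, float)
    f_err = f_desired - f_measured                # positive -> pressing too lightly
    v_normal = -kf * f_err * n                    # move into the tissue (-n) to raise force
    return v_tangent + v_normal

# User pushes sideways and slightly down; controller maintains 1 N contact:
v_cmd = hybrid_command([1.0, 0.0, 0.5], f_measured=0.5, f_desired=1.0,
                       normal=[0.0, 0.0, 1.0])
```

Because the two channels are orthogonal by construction, the user's palpation motion and the controller's force regulation cannot fight each other.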
We also focused on sensing using continuum robots to assist with surgical perception. To achieve this, a new approach was developed to model force and motion transmission losses. We showed that the high-level controller can use these modeling techniques to provide assistive palpation behaviors, to support force-controlled virtual fixture model updates, and to regulate force while carrying out tasks such as ablation and knot tying. Figure 6 shows one of our robotic platforms used to test our new sensing and control approaches. The figure shows a new approach for hybrid force/motion control using estimation of tip forces, along with two sample use scenarios (force-regulated knot tying and ablation).
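The intuition behind sensing through the transmission can be reduced to a toy one-degree-of-freedom example: the actuation force in excess of the model's no-load prediction is attributed to the tip. The constant efficiency factor and the scalar formulation are illustrative assumptions; the project's calibrated loss models for wire-actuated continuum robots are configuration-dependent and multidimensional.

```python
def estimate_tip_force(f_actuation, f_noload, efficiency=0.85):
    """Toy estimate of the force delivered at a wire-actuated tool tip (N).

    f_noload is the actuation force the transmission model predicts with
    no tip load (capturing friction and bending losses along the tendon
    path); the excess actuation force, scaled by a transmission
    efficiency, is attributed to the tip.
    """
    excess = max(f_actuation - f_noload, 0.0)
    return efficiency * excess

# 6 N at the actuator with 2 N of predicted no-load losses:
f_tip = estimate_tip_force(6.0, 2.0)
```

An estimate like this is what lets the high-level controller close a force-regulation loop (for knot tying or ablation) without any sensor mounted at the tip.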
These contributions will facilitate future development of human-robot cooperative systems with applications for surgery, space robotics, search and rescue and potentially robot-worker collaboration in manufacturing.
Other broader impacts of this award include the training of two post-docs, eight Ph.D. students, and six undergraduate students. Thirty-nine archival publications were presented in national and international conferences and journals. Four Ph.D. dissertations were published. In addition, 37 female high school students received STEM and robotics training in three winter classes, each spanning three weeks. Seven Ph.D.s and two post-docs trained on this program have joined industry research groups in medical robotics and research in human-robot collaboration. One of the Ph.D. students trained on this award started a tenure-track faculty position in the U.S.
The project's public page is http://nri-csa.vuse.vanderbilt.edu/joomla/, where public data sets and computer code related to this project are also available.
Last Modified: 12/22/2020
Modified by: Howard M Choset