
NSF Org: IIS Division of Information & Intelligent Systems
Recipient:
Initial Amendment Date: June 30, 2016
Latest Amendment Date: May 26, 2021
Award Number: 1564065
Award Instrument: Standard Grant
Program Manager: Ephraim Glinert, IIS Division of Information & Intelligent Systems, CSE Directorate for Computer and Information Science and Engineering
Start Date: August 1, 2016
End Date: July 31, 2022 (Estimated)
Total Intended Award Amount: $894,431.00
Total Awarded Amount to Date: $918,931.00
Funds Obligated to Date: FY 2020 = $24,500.00
History of Investigator:
Recipient Sponsored Research Office: 4000 CENTRAL FLORIDA BLVD, ORLANDO, FL, US 32816-8005, (407) 823-0387
Sponsor Congressional District:
Primary Place of Performance: Orlando, FL, US 32826-3281
Primary Place of Performance Congressional District:
Unique Entity Identifier (UEI):
Parent UEI:
NSF Program(s): HCC-Human-Centered Computing
Primary Program Source: 01002021DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s):
Program Element Code(s):
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070
ABSTRACT
Flight simulators offer pilots a chance to safely practice flying a wide range of aircraft in a variety of scenarios. Similarly, human patient simulators offer nurses and physicians safe opportunities to practice healthcare on a wide range of patients and scenarios. Virtual patient simulators use computer graphics to render humans with a range of visual characteristics, including medical symptoms, personality, race, and gender. However, they are inherently virtual: practitioners cannot manipulate them with their hands. Manikin-based patient simulators, on the other hand, are inherently physical, comprising human-sized bodies with realistic skin and electro-mechanical simulation of physiological symptoms. They afford a "hands-on" experience but are very limited in their ability to present visual characteristics. Furthermore, medical educators are increasingly focusing on interpersonal skills and cultural competency, as these impact provider-patient relationships, diagnoses, and treatments. Manikins do not afford the associated humanistic traits.
The researchers on this project are developing a Physical-Virtual Patient Bed (PVPB) that combines the flexibility of virtual patients with the physicality of manikins. The PVPB will employ dynamic computer graphics rear-projected onto a body-shaped shell mounted in a real hospital bed, along with various sensors and actuators, to create a patient simulator that talks; appears to sweat, breathe, and squirm; exhibits a pulse; feels warm/cold on various body parts; and responds to touch by humans or medical instruments. It will be able to change race, gender, and visually-apparent symptoms on the fly, and will exhibit real human emotional complexity via real human agency. The researchers will assess the effectiveness of the PVPB in simulating certain conditions, and use it to develop new knowledge about the relative importance of various patient cues, and provider biases arising from patient demographics.
PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH
PROJECT OUTCOMES REPORT
Disclaimer
This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.
This project focused primarily on the development and assessment of a new type of patient simulator that could someday be widely used for the training and assessment of nurses and physicians. Specifically, the researchers developed what they call a Physical-Virtual Patient Bed (PVPB), which combines the flexibility of computer-graphics "virtual" patients with the physicality of conventional medical training manikins. See the included images of the pediatric PVPB prototype system alone and in use.

The researchers created multiple PVPB prototypes. Each employed dynamic computer graphics imagery projected from underneath onto the inside of a human body-shaped shell mounted in a metal frame resembling a hospital bed. The resulting system displayed a dynamic patient, such as a child, on the body-shaped surface, so that a practitioner could stand next to the "bed" and talk with the patient while touching the patient's body when and where needed for diagnosis or comfort.

The prototypes also included various sensors and actuators. For example, they used sound and projected imagery to simulate patient talking and breathing; electronic tactile transducers to create a pulse that practitioners could feel with their own hands; and forced-air heating and cooling from underneath to create a feeling of warmth on the patient's forehead (indicating a fever) or cold on the patient's hands (indicating shock). Some prototypes incorporated cameras, infrared lighting, and novel computer algorithms to detect when a practitioner touched the simulated patient and to respond by changing the patient imagery, e.g., allowing the practitioner to pull down on the patient's eyelid to check the sclera. The prototypes were able to change race, gender, and various visually apparent symptoms on the fly. The simulated behavior was created by a remote human who observed the scene and controlled the patient's responses via a special computer interface, producing very realistic human emotional intelligence and complexity.

The researchers also carried out human subject research designed to evaluate the value and effectiveness of the PVPB paradigm and of specific prototypes. The studies employed nursing and medical students who used the PVPB under various circumstances, e.g., stroke diagnosis, measles cases, and certain pediatric (children's) needs.
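The report above does not specify how the tactile pulse was generated. As a rough illustration only, the following minimal Python sketch shows one plausible way to drive an audio-style tactile transducer with a synthesized "lub-dub" heartbeat waveform; the 40 Hz carrier, the envelope shapes, and the use of the numpy and sounddevice libraries are all illustrative assumptions, not the researchers' actual implementation.

# Hypothetical sketch: synthesize a palpable heartbeat waveform and play it
# through an audio-driven tactile transducer. All parameters are assumptions.
import numpy as np
import sounddevice as sd

SAMPLE_RATE = 44100  # samples per second

def heartbeat_waveform(bpm: float, seconds: float) -> np.ndarray:
    """Low-frequency 'lub-dub' pulse train at the given heart rate."""
    t = np.arange(int(seconds * SAMPLE_RATE)) / SAMPLE_RATE
    beat_period = 60.0 / bpm                 # seconds per beat
    phase = (t % beat_period) / beat_period  # position 0..1 within each beat
    carrier = np.sin(2 * np.pi * 40.0 * t)   # 40 Hz is felt more than heard
    # Two short Gaussian envelopes per beat approximate the "lub" and "dub".
    lub = np.exp(-((phase - 0.05) ** 2) / (2 * 0.02 ** 2))
    dub = 0.6 * np.exp(-((phase - 0.30) ** 2) / (2 * 0.02 ** 2))
    return (carrier * (lub + dub)).astype(np.float32)

# Example: a 72 bpm pulse for ten seconds on the transducer's audio channel.
sd.play(heartbeat_waveform(bpm=72, seconds=10.0), SAMPLE_RATE)
sd.wait()

Under these assumptions, a practitioner pressing two fingers over the transducer would feel a periodic vibration much like a real radial pulse, and changing the bpm value on the fly would let a remote operator simulate, say, tachycardia.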
During the COVID-19 pandemic the researchers were unable to access the specialized PVPB equipment or to meet with nursing and medical students in person. Under these circumstances they conceived of a new pandemic-related technology to provide an isolated patient and their remote visitors with visual interaction augmented by touch: a perception of being touched for the isolated patient, and a perception of touching for the remote visitors. For example, a loved one might virtually stroke the patient's arm or head, or even squeeze the patient's hand. The researchers called the approach Tactile Telepresence for Isolated Patients (TTIP).

The researchers developed a complete functioning TTIP prototype system employing a "smart tablet" for the family member to hold and a tactile headband for the patient. See the included images of the TTIP headband electronics and the headband in use. Imagery of the patient was shown on the tablet, along with indications of areas where one could touch the patient image; the touch sensations were transmitted to the patient's forehead via the TTIP headband, which included an array of small vibration devices. The researchers carried out preliminary (pilot) experiments to evaluate how well the TTIP prototype functioned and how users perceived the remote touching. The experiments demonstrated relatively high recognition rates for geometric shapes traced on a remote person's forehead, and a feeling of not being alone when tested under conditions that simulated real isolation.
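The report likewise does not detail how tablet touches were mapped to the headband's vibration array. As a minimal sketch only, assuming a hypothetical 2 x 8 motor grid across the forehead and normalized touch coordinates, one plausible mapping looks like this in Python:

# Hypothetical sketch: map a touch on the forehead region of the tablet's
# patient image to the nearest vibration motor in the headband array.
# The 2x8 grid, coordinate convention, and names are illustrative assumptions.
from dataclasses import dataclass

ROWS, COLS = 2, 8  # assumed layout of vibration motors across the forehead

@dataclass
class MotorCommand:
    motor_index: int   # which motor in the headband array to drive
    intensity: float   # 0.0 (off) .. 1.0 (full vibration)

def touch_to_motor(x: float, y: float, pressure: float = 1.0) -> MotorCommand:
    """Map a normalized touch point (x, y each in 0..1) on the forehead image
    to the nearest motor, scaling vibration intensity by touch pressure."""
    col = min(int(x * COLS), COLS - 1)
    row = min(int(y * ROWS), ROWS - 1)
    return MotorCommand(motor_index=row * COLS + col,
                        intensity=max(0.0, min(pressure, 1.0)))

# Example: a firm touch near the center of the forehead image.
print(touch_to_motor(0.5, 0.4, pressure=0.8))
# -> MotorCommand(motor_index=4, intensity=0.8)

Tracing a shape on the tablet would then produce a time-ordered stream of such commands, activating successive motors so the patient feels the shape being drawn across their forehead.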
Over the life of the project the researchers published and/or presented more than 25 relevant articles and several conference posters. Twelve faculty members were involved at various times, along with four graduate students (yielding four PhD degrees) and nine undergraduate students (general undergraduates and NSF REU students).
Last Modified: 11/23/2022
Modified by: Gregory F Welch