Award Abstract # 1564065
CHS: Medium: Physical-Virtual Patient Bed for Healthcare Training and Assessment

NSF Org: IIS
Division of Information & Intelligent Systems
Recipient: THE UNIVERSITY OF CENTRAL FLORIDA BOARD OF TRUSTEES
Initial Amendment Date: June 30, 2016
Latest Amendment Date: May 26, 2021
Award Number: 1564065
Award Instrument: Standard Grant
Program Manager: Ephraim Glinert
IIS
 Division of Information & Intelligent Systems
CSE
 Directorate for Computer and Information Science and Engineering
Start Date: August 1, 2016
End Date: July 31, 2022 (Estimated)
Total Intended Award Amount: $894,431.00
Total Awarded Amount to Date: $918,931.00
Funds Obligated to Date: FY 2016 = $894,431.00
FY 2020 = $24,500.00
History of Investigator:
  • Gregory Welch (Principal Investigator)
    welch@ucf.edu
  • Juan Cendan (Co-Principal Investigator)
  • Laura Gonzalez (Co-Principal Investigator)
Recipient Sponsored Research Office: The University of Central Florida Board of Trustees
4000 CENTRAL FLORIDA BLVD
ORLANDO
FL  US  32816-8005
(407)823-0387
Sponsor Congressional District: 10
Primary Place of Performance: University of Central Florida
Orlando
FL  US  32826-3281
Primary Place of Performance Congressional District: 10
Unique Entity Identifier (UEI): RD7MXJV7DKT9
Parent UEI:
NSF Program(s): HCC-Human-Centered Computing
Primary Program Source: 01001617DB NSF RESEARCH & RELATED ACTIVIT
01002021DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s): 7924, 7367, 9251
Program Element Code(s): 736700
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

Flight simulators offer pilots a chance to safely practice flying a wide range of aircraft in a variety of scenarios. Similarly, human patient simulators offer nurses and physicians safe opportunities to practice healthcare on a wide range of patients and scenarios. Virtual patient simulators use computer graphics to render humans with a range of visual characteristics including medical symptoms, personality, race, and gender. However, they are inherently virtual: practitioners cannot manipulate them with their hands. Manikin-based patient simulators, on the other hand, are inherently physical, comprising human-sized bodies with realistic skin and electro-mechanical simulation of physiological symptoms. They afford a "hands-on" experience but are very limited in their ability to present visual characteristics. Furthermore, medical educators are increasingly focusing on interpersonal skills and cultural competency, as these impact provider-patient relationships, diagnoses, and treatments. Manikins do not afford the associated humanistic traits.

The researchers on this project are developing a Physical-Virtual Patient Bed (PVPB) that combines the flexibility of virtual patients with the physicality of manikins. The PVPB will employ dynamic computer graphics rear-projected onto a body-shaped shell mounted in a real hospital bed, along with various sensors and actuators, to create a patient simulator that talks; appears to sweat, breathe, and squirm; exhibits a pulse; feels warm/cold on various body parts; and responds to touch by humans or medical instruments. It will be able to change race, gender, and visually-apparent symptoms on the fly, and will exhibit real human emotional complexity via real human agency. The researchers will assess the effectiveness of the PVPB in simulating certain conditions, and use it to develop new knowledge about the relative importance of various patient cues, and provider biases arising from patient demographics.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH

(Showing: 1 - 10 of 41)
Jason Hochreiter, Salam Daher, Gerd Bruder, Gregory Welch, "Cognitive and Touch Performance Effects of Mismatched 3D Physical and Visual Perceptions," IEEE VR 2018, 2018.
Anderson, M., Diaz, D., Losekamp, T., & Welch, G., "Parental behaviors in episodic medical visits: A qualitative study to inform augmented reality design," International Nursing Association for Clinical Simulation and Learning (INACSL) 2021 Annual Conference, Denver, CO, 2021.
Daher, Salam; Hochreiter, Jason; Norouzi, Nahal; Gonzalez, Laura; Bruder, Gerd; Welch, Greg, "Physical-Virtual Agents for Healthcare Simulation," International Conference on Intelligent Virtual Agents, 2018. doi:10.1145/3267851.3267876
Daher, Salam; Hochreiter, Jason; Norouzi, Nahal; Schubert, Ryan; Bruder, Gerd; Gonzalez, Laura; Anderson, Mindi; Diaz, Desiree; Cendan, Juan; Welch, Greg, "[POSTER] Matching vs. Non-Matching Visuals and Shape for Embodied Virtual Healthcare Agents," IEEE Virtual Reality, 2019.
Daher, Salam; Hochreiter, Jason; Schubert, Ryan; Gonzalez, Laura; Cendan, Juan; Anderson, Mindi; Díaz, Desiree; Welch, Gregory, "The Physical-Virtual Patient Simulator: A Physical Human Form with Virtual Appearance and Behavior," Simulation in Healthcare, 2020. doi:10.1097/SIH.0000000000000409
Eike Langbehn, Frank Steinicke, Markus Lappe, Gregory F. Welch, and Gerd Bruder, "In the Blink of an Eye: Leveraging Blink-Induced Suppression for Imperceptible Position and Orientation Redirection in Virtual Reality," ACM Transactions on Graphics (TOG), Special Issue on ACM SIGGRAPH 2018, v.37, 2018, p.1. doi:10.1145/3197517.3201335
Erickson, Austin; Bruder, Gerd; Wisniewski, Pamela J.; Welch, Gregory F., "Examining Whether Secondary Effects of Temperature-Associated Virtual Stimuli Influence Subjective Perception of Duration," 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 2020. doi:10.1109/VR46266.2020.00070
Erickson, Austin; Reiners, Dirk; Bruder, Gerd; Welch, Gregory F., "Augmenting Human Perception: Mediation of Extrasensory Signals in Head-Worn Augmented Reality," 2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 2021. doi:10.1109/ISMAR-Adjunct54149.2021.00085
Gonzalez, Laura; Daher, Salam; Welch, Gregory, "Neurological Assessment Using a Physical-Virtual Patient," Simulation and Gaming, 2020. doi:10.1177/1046878120947462
Gottsacker, Matt; Norouzi, Nahal; Kim, Kangsoo; Bruder, Gerd; Welch, Greg, "Diegetic Representations for Seamless Cross-Reality Interruptions," 2021 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2021. doi:10.1109/ISMAR52148.2021.00047
Guido-Sanz, F., Anderson, M., Díaz, D., Welch, G., & Gonzalez, L., "Using XR technology to innovate healthcare education," International Nursing Association for Clinical Simulation and Learning (INACSL) 2021 Annual Conference, 2021.

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

This project was primarily focused on the development and assessment of a new type of patient simulator that could someday be widely used for the training and assessment of nurses and physicians. Specifically, the researchers developed what they call a Physical-Virtual Patient Bed (PVPB) that combines the flexibility of computer graphics "virtual" patients with the physicality of conventional medical training manikins. See the included images of their pediatric PVPB prototype system alone and in use.

The researchers created multiple PVPB prototypes. The prototypes employed dynamic computer graphics imagery projected from underneath onto the inside of a human body-shaped shell mounted in a metal frame resembling a hospital bed. The resulting system displayed a dynamic patient, such as a child, on the body-shaped surface, so that the practitioner could stand next to the "bed" and talk with the patient while touching their body when and where needed for diagnosis or comfort. The prototypes also included various sensors and actuators. For example, they used sound and projected imagery to simulate patient talking and breathing; electronic tactile transducers to create a pulse that practitioners could feel with their own hands; and forced-air heating and cooling from underneath to create a feeling of warmth on the forehead (indicating a fever) or cold on the hands (indicating shock). Some prototypes incorporated cameras, infrared lighting, and novel computer algorithms to detect when a practitioner touched the simulated patient, and to respond by changing the patient imagery, e.g., to allow the practitioner to pull down on the patient's eyelid to check their sclera. The prototypes were able to change race, gender, and various visually apparent symptoms on the fly. The simulated behavior was created by a remote human who observed the scene and controlled the patient response via a special computer interface. The result was very realistic human emotional intelligence and complexity.

The researchers also carried out human subject research designed to evaluate the value and effectiveness of the PVPB paradigm and specific prototypes. The studies employed nursing and medical students who used the PVPB under various circumstances, e.g., stroke diagnosis, cases of the measles, and certain pediatric (children's) needs.
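As a rough illustration of how camera-based touch sensing of this kind can work, the sketch below thresholds a simulated infrared camera frame, finds the centroid of the bright "touch" region, and maps it to normalized coordinates on the projected patient imagery. The threshold value, frame size, and function names are illustrative assumptions, not the project's actual algorithm.

```python
def detect_touch(ir_frame, threshold=200):
    """Return the (row, col) centroid of pixels at or above the
    threshold (a fingertip lit by infrared illumination), or None
    when nothing is touching. `ir_frame` is a list of pixel rows."""
    hits = [(r, c) for r, row in enumerate(ir_frame)
                   for c, px in enumerate(row) if px >= threshold]
    if not hits:
        return None
    n = len(hits)
    return (sum(r for r, _ in hits) / n, sum(c for _, c in hits) / n)

def to_surface_coords(centroid, height, width):
    """Map a pixel centroid to normalized (u, v) coordinates on the
    rear-projected patient imagery, so the renderer can react at
    the touched spot (e.g., move the eyelid being pulled down)."""
    r, c = centroid
    return c / (width - 1), r / (height - 1)

# Synthetic 8x8 IR frame with one bright pixel standing in for a touch
frame = [[0] * 8 for _ in range(8)]
frame[2][5] = 255
centroid = detect_touch(frame)             # (2.0, 5.0)
u, v = to_surface_coords(centroid, 8, 8)   # (5/7, 2/7)
```

A real pipeline would also need camera-to-projector calibration for the curved shell, which this flat normalization deliberately glosses over.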

During the COVID-19 pandemic the researchers were unable to access the specialized PVPB equipment or to meet with nursing and medical students in person. Under these circumstances the researchers conceived of a new pandemic-related technology to provide an isolated patient and their remote visitors with a visual interaction augmented by touch: a perception of being touched for the isolated patient, and a perception of touching for the remote visitors. For example, a loved one might be able to virtually stroke the patient's arm or head, or even squeeze the patient's hand. The researchers called the approach Tactile Telepresence for Isolated Patients (TTIP). The researchers developed a complete functioning TTIP prototype system employing a "smart tablet" for the family member to hold and a tactile headband for the patient. See the included images of their TTIP "headband" electronics and the headband in use. Imagery of the patient was shown on the tablet, along with indications of areas where one could touch the patient image such that the sensations were transmitted to the patient's forehead via the TTIP headband, which included an array of small vibration devices. The researchers carried out some preliminary (pilot) experiments to evaluate how well the TTIP prototype functioned and how users perceived the remote touching. The experiments demonstrated a relatively high rate of recognition when tracing out geometric shapes on a remote person's forehead, and a feeling of not being alone when tested under conditions that simulated real isolation.
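The mapping from a touch on the tablet image to the headband's vibration array can be sketched as follows. The motor count, the even spacing, and the helper names are illustrative assumptions about such an interface, not details of the actual TTIP hardware.

```python
# Hypothetical layout: 8 vibration motors spaced evenly across the
# forehead band, at normalized horizontal positions 0.0 .. 1.0.
N_MOTORS = 8
MOTOR_POSITIONS = [i / (N_MOTORS - 1) for i in range(N_MOTORS)]

def touch_to_motor(x_norm):
    """Return the index of the motor nearest a normalized tablet
    touch x-coordinate (0.0 = left edge of the touchable region,
    1.0 = right edge)."""
    return min(range(N_MOTORS), key=lambda i: abs(MOTOR_POSITIONS[i] - x_norm))

def trace_to_pattern(xs):
    """Turn a swipe (a sequence of x-coordinates sampled from the
    tablet) into the ordered list of motors to pulse, collapsing
    consecutive samples that land on the same motor."""
    pattern = []
    for x in xs:
        m = touch_to_motor(x)
        if not pattern or pattern[-1] != m:
            pattern.append(m)
    return pattern

# A left-to-right stroke activates motors in order across the band
print(trace_to_pattern([0.0, 0.1, 0.9, 1.0]))   # [0, 1, 6, 7]
```

Tracing a shape on the tablet would then reduce to streaming such motor indices (plus a vertical coordinate, if the array were two-dimensional) to the headband in real time.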

Over the life of the project the researchers published and/or presented over 25 relevant articles and several conference posters. There were 12 faculty members involved at various times, along with four graduate students (all four earning PhD degrees) and nine undergraduate students (general undergraduates and NSF REU students).

Last Modified: 11/23/2022
Modified by: Gregory F Welch

