Award Abstract # 1522125
SCH: INT: Collaborative Research: Replicating Clinic Physical Therapy at Home: Touch, Depth, and Epidermal Electronics in an Interactive Avatar System

NSF Org: IIS (Division of Information & Intelligent Systems)
Recipient: UNIVERSITY OF CALIFORNIA, SAN DIEGO
Initial Amendment Date: August 19, 2015
Latest Amendment Date: August 19, 2015
Award Number: 1522125
Award Instrument: Standard Grant
Program Manager: Wendy Nilsen, wnilsen@nsf.gov, (703)292-2568
IIS, Division of Information & Intelligent Systems
CSE, Directorate for Computer and Information Science and Engineering
Start Date: September 1, 2015
End Date: August 31, 2020 (Estimated)
Total Intended Award Amount: $1,775,020.00
Total Awarded Amount to Date: $1,775,020.00
Funds Obligated to Date: FY 2015 = $1,775,020.00
History of Investigator:
  • Pamela Cosman (Principal Investigator)
    pcosman@ucsd.edu
  • Todd Coleman (Co-Principal Investigator)
  • Truong Nguyen (Co-Principal Investigator)
  • Sujit Dey (Co-Principal Investigator)
Recipient Sponsored Research Office: University of California-San Diego
9500 GILMAN DR
LA JOLLA
CA  US  92093-0021
(858)534-4896
Sponsor Congressional District: 50
Primary Place of Performance: University of California-San Diego
La Jolla
CA  US  92093-0934
Primary Place of Performance Congressional District: 50
Unique Entity Identifier (UEI): UYTTZT6G9DT1
Parent UEI:
NSF Program(s): Smart and Connected Health
Primary Program Source: 01001516DB NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s): 8062, 8018
Program Element Code(s): 801800
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

Physical therapy is often hampered by lack of access to therapists and poor adherence to home therapy regimens. This research develops a physical therapy assistance system for home use, with an emphasis on stroke rehabilitation. As a person exercises, inexpensive cameras observe color and depth, and unobtrusive tattoo sensors monitor detailed muscle activity. The 3D movement trajectory is derived and compared against the same exercise as performed with an expert therapist. The patient watches an on-screen avatar, where arrows and color coding guide the patient to move correctly. In addition to advancing fields such as movement tracking, skin sensors, and assistive systems, the project has the potential for broad impact: health-related engineering coursework and projects can attract women and under-represented minorities to engineering, and home physical therapy assistance can especially help rural and under-served populations.

This project uses bio-electronics, computer vision, computer gaming, high-dimensional machine learning, and human factors to develop a home physical therapy assistance system. During home exercises, patient kinematics and physiology are monitored with a Kinect color/depth camera and wireless epidermal electronics transferred to the skin like a temporary tattoo. The project involves optimizing electrode design and wireless signaling for epidermal electronics that monitor spatiotemporal aspects of muscle recruitment; developing hand and body pose estimation and tracking algorithms robust to rapid motion and occlusion; and developing machine learning and avatar rendering algorithms for multi-modal sensor fusion and expert-trained optimal control guidance logic, for both cloud and local use. The system aims to provide real-time feedback that makes home sessions as effective as office visits with an expert therapist, reducing the time and money required for full recovery.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH


(Showing: 1 - 10 of 50)
Amr Haj-Omar, Willie L. Thompson, Yun-Soung Kim, and Todd P. Coleman, "Adaptive Flexible Antennas for Wireless Biomedical Applications," 2016 IEEE 17th Annual Wireless and Microwave Technology Conference (WAMICON), 2016, p. 1. 10.1109/WAMICON.2016.7483836
B. Kang, Y. Lee, and T. Q. Nguyen, "Depth Adaptive Deep Neural Network for Semantic Segmentation," IEEE Transactions on Multimedia, 2018.
B. Kang, S. Tripathi, and T. Q. Nguyen, "Real-Time Sign Language Fingerspelling Recognition Using Convolutional Neural Networks from Depth Map," 2015 3rd IAPR Asian Conference on Pattern Recognition (ACPR), 2015. 10.1109/ACPR.2015.7486481
B. Kang, S. Tripathi, and T. Q. Nguyen, "Generating Images in Compressed Domain Using Generative Adversarial Networks," IEEE Access, 2020. 10.1109/ACCESS.2020.3027800
B. Kang, K.-H. Tan, H.-S. Tai, D. Tretter, and T. Q. Nguyen, "Hand Segmentation for Hand-Object Interaction from Depth Map," 2017 IEEE Global Conference on Signal and Information Processing (GlobalSIP), 2017.
B. Kang and T. Q. Nguyen, "Random Forest with Learned Representations for Semantic Segmentation," IEEE Transactions on Image Processing, v. 28, 2019, p. 3542. 10.1109/TIP.2019.2905081
Amr Haj-Omar, Yun-Soung Kim, Paul Glick, Mike Tolley, Willie Thompson II, and Todd Coleman, "Stretchable and Flexible Adhesive-Integrated Antennas for Biomedical Applications," IEEE International Symposium on Antennas and Propagation, 2016. 10.1109/APS.2016.7695938
A. A. Gharibans, B. L. Smarr, D. C. Kunkel, L. J. Kriegsfeld, H. M. Mousa, and T. P. Coleman, "Artifact Rejection Methodology Enables Continuous, Noninvasive Measurement of Gastric Myoelectric Activity in Ambulatory Subjects," Scientific Reports, 2018.

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

In this project, we developed a physical therapy monitoring and guidance system. To use the system, one first records the movement of a physical therapist who demonstrates a motion. The physical therapist can then be shown as an avatar or model on a screen, performing the exercise. At home, a patient can be trained or guided by this motion model on a tablet or laptop over a wireless network. A camera watches the patient’s motion at home and compares it against the motion model.

The motion sequences of the physical therapist avatar and the patient might be misaligned due to human reaction delays or network delays, so we applied a Dynamic Time Warping (DTW) algorithm to the sequences to find their optimal alignment. To enable real-time evaluation and guidance, we developed a gesture-based DTW algorithm that separates out different gestures in the user’s motion in real time. Experimental results show that the algorithm outperforms other alignment methods and enables real-time evaluation with low computational complexity.
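
To make the alignment step concrete, below is a minimal sketch of classic DTW between a therapist sequence and a patient sequence, written in plain NumPy. The frame representation, distance metric, and all names are illustrative, and the gesture-based real-time variant described above is not shown.

    import numpy as np

    def dtw_align(reference, patient):
        """Align two motion sequences (frames x stacked joint coordinates)
        with classic dynamic time warping; returns the total alignment
        cost and the optimal warp path."""
        n, m = len(reference), len(patient)
        # Pairwise frame distances (Euclidean over joint coordinates).
        dist = np.linalg.norm(reference[:, None, :] - patient[None, :, :], axis=2)
        # Cumulative cost table with an infinite border.
        acc = np.full((n + 1, m + 1), np.inf)
        acc[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                acc[i, j] = dist[i - 1, j - 1] + min(
                    acc[i - 1, j - 1],  # match both frames
                    acc[i - 1, j],      # reference advances (patient frame repeats)
                    acc[i, j - 1])      # patient advances (reference frame repeats)
        # Backtrack to recover the frame-to-frame correspondence.
        path, i, j = [], n, m
        while i > 0 and j > 0:
            path.append((i - 1, j - 1))
            step = np.argmin([acc[i - 1, j - 1], acc[i - 1, j], acc[i, j - 1]])
            if step == 0:
                i, j = i - 1, j - 1
            elif step == 1:
                i -= 1
            else:
                j -= 1
        return acc[n, m], path[::-1]

The returned path pairs each patient frame with its best-matching therapist frame, which per-gesture evaluation and guidance can then consume.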

After consultations with physical therapist colleagues, we developed a system that provides guidance after each gesture, depending on the user's performance. We found that combined visual and text guidance is most effective in helping the user improve performance accuracy.
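
As a sketch of what such per-gesture guidance logic might look like: after alignment, joints whose angles deviate beyond a tolerance produce a directional cue for the visual and text feedback. The joint names, the 15-degree tolerance, and the message wording below are hypothetical, not the study's calibrated rules.

    def gesture_feedback(patient_deg, reference_deg, tol=15.0):
        """Compare per-joint angles (degrees) averaged over one gesture
        and return text cues for joints outside the tolerance."""
        cues = []
        for joint, want in reference_deg.items():
            got = patient_deg[joint]
            if abs(got - want) > tol:
                direction = "raise" if got < want else "lower"
                cues.append(f"{direction} your {joint} by about {abs(got - want):.0f} degrees")
        return cues or ["Gesture performed within tolerance."]

    print(gesture_feedback({"elbow": 40.0, "shoulder": 88.0},
                           {"elbow": 70.0, "shoulder": 90.0}))
    # -> ['raise your elbow by about 30 degrees']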

Real-time tracking of hand articulations is important for accurate recording and guidance of physical therapy. We proposed an efficient hand tracking system that uses an adaptive hand model and depth maps. In our system, we track hand articulations by minimizing the discrepancy between the depth map from a sensor and a computer-generated hand model. We also re-initialize the hand pose at each frame using finger detection and classification. Our system achieves both automatic hand model adjustment and real-time tracking with low complexity.
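
The fitting idea can be sketched as generic model-to-depth optimization. Below, the articulated hand model is reduced to a toy four-parameter point model, and the optimizer, parameterization, and function names are illustrative rather than the published method.

    import numpy as np
    from scipy.optimize import minimize

    def hand_model_points(pose):
        """Toy stand-in for the articulated hand model: pose is
        (x, y, z, spread), and the 'hand' is five fingertip points.
        The real system uses a full kinematic hand model."""
        x, y, z, spread = pose
        base = np.array([[i - 2.0, 0.0, 0.0] for i in range(5)])
        return base * spread + np.array([x, y, z])

    def discrepancy(pose, observed):
        """One-sided Chamfer distance: sum of distances from each
        observed depth point to its nearest model point."""
        model = hand_model_points(pose)
        d = np.linalg.norm(observed[:, None, :] - model[None, :, :], axis=2)
        return d.min(axis=1).sum()

    def track_frame(observed, prev_pose, reinit_pose=None):
        """Fit the pose to one depth frame, warm-starting from the
        previous frame; reinit_pose would come from the per-frame
        finger detection and classification described above."""
        start = prev_pose if reinit_pose is None else reinit_pose
        return minimize(discrepancy, start, args=(observed,),
                        method="Nelder-Mead").x

    # Synthetic check: recover a pose from points generated by the model.
    observed = hand_model_points(np.array([0.1, 0.0, 0.5, 1.2]))
    pose = track_frame(observed, prev_pose=np.array([0.0, 0.0, 0.4, 1.0]))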

In a related problem, we considered sign language recognition. We trained a convolutional neural network on depth maps to classify 31 fingerspelled letters and digits, using a subset of depth data collected from multiple subjects. We achieved 99.99% accuracy for signers seen during training and 84% accuracy for new signers; accuracy improves as we include data from more subjects during training. With a processing time of 3 ms to predict a single image, the proposed system achieves both high accuracy and high speed.
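
A minimal depth-map classifier in this spirit might look as follows. PyTorch is an assumption here, and the input resolution and layer sizes are illustrative, not the published architecture; only the 31-class output matches the text.

    import torch
    import torch.nn as nn

    class FingerspellingNet(nn.Module):
        """Small CNN over single-channel depth crops of the hand."""
        def __init__(self, num_classes=31):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
                nn.MaxPool2d(2),   # 64x64 -> 32x32
                nn.Conv2d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
                nn.MaxPool2d(2),   # 32x32 -> 16x16
                nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),   # 16x16 -> 8x8
            )
            self.classifier = nn.Sequential(
                nn.Flatten(),
                nn.Linear(64 * 8 * 8, 128), nn.ReLU(),
                nn.Linear(128, num_classes),
            )

        def forward(self, depth):  # depth: (N, 1, 64, 64), normalized
            return self.classifier(self.features(depth))

    # One normalized 64x64 depth crop -> predicted letter/digit index.
    logits = FingerspellingNet()(torch.randn(1, 1, 64, 64))
    predicted = logits.argmax(dim=1)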

We also proposed an automated system for balance evaluation that uses multiple sensors to enable on-demand balance evaluation at home. The system provides a quantified balance level consistent with a physical therapist’s assessments in traditional balance evaluation tests. We collected real patient clinic data to train our model, and experimental results show the high accuracy of the proposed system. By using inexpensive sensors and artificial intelligence, the proposed virtual physical therapist and balance evaluation system has the potential to enable on-demand virtual care and to significantly reduce costs for both patients and care providers.
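
As an illustration of the approach of training against therapist labels, the sketch below fits a generic regressor from session-level sensor features to expert balance scores. The feature set, the model family, and the data shapes are assumptions, and random placeholders stand in for the clinic data.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    # Placeholder training data: each row summarizes one balance test
    # (e.g., sway range from the depth camera, trunk-angle variance,
    # weight-shift statistics); labels are the therapist's scores.
    features = np.random.rand(200, 6)
    therapist_scores = np.random.rand(200)

    model = RandomForestRegressor(n_estimators=100, random_state=0)
    model.fit(features, therapist_scores)

    # At home, the same features computed from a new session yield a
    # quantified balance level without a clinic visit.
    balance_level = model.predict(np.random.rand(1, 6))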

In addition to the core physical therapy system and algorithms, the project made a number of fundamental advances in video processing and electronics. To improve the accuracy of image pixel classification by exploiting depth information, we developed a depth-adaptive deep neural network that adapts the receptive field not only for each layer but also for each neuron at each spatial location. To adjust the receptive field, we proposed a depth-adaptive multiscale convolution layer. On a publicly available RGB-D dataset for multi-class per-pixel classification and a novel hand segmentation dataset for hand-object interaction, the proposed method outperformed state-of-the-art methods.
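
One simplified reading of such a layer: apply shared convolution weights at several dilation rates and blend the results per pixel according to depth, so the effective receptive field tracks apparent object scale. The sketch below (PyTorch) follows that reading; the depth-to-scale mapping and the blending scheme are assumptions, not the published design.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class DepthAdaptiveMultiscaleConv(nn.Module):
        """Shared 3x3 weights applied at several dilation rates; each
        pixel softly selects a rate from its depth value."""
        def __init__(self, in_ch, out_ch, dilations=(1, 2, 4)):
            super().__init__()
            self.dilations = dilations
            self.weight = nn.Parameter(torch.randn(out_ch, in_ch, 3, 3) * 0.1)
            self.bias = nn.Parameter(torch.zeros(out_ch))

        def forward(self, x, depth):  # x: (N,C,H,W); depth: (N,1,H,W), meters
            # One response per dilation rate, all sharing the same weights.
            scales = [F.conv2d(x, self.weight, self.bias, padding=d, dilation=d)
                      for d in self.dilations]
            # Placeholder depth-to-scale index: closer objects appear
            # larger, so smaller depth selects a larger dilation.
            idx = (1.0 / depth.clamp(min=1e-3)).clamp(0.0, len(scales) - 1.0)
            # Triangular per-pixel weights around the fractional index.
            w = torch.stack([(1.0 - (idx - s).abs().clamp(max=1.0))
                             for s in range(len(scales))], dim=0)
            w = w / w.sum(dim=0, keepdim=True).clamp(min=1e-6)
            return (torch.stack(scales, dim=0) * w).sum(dim=0)

    layer = DepthAdaptiveMultiscaleConv(3, 8)
    out = layer(torch.randn(1, 3, 64, 64), torch.rand(1, 1, 64, 64) + 0.5)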

In terms of electronics for health monitoring, we developed fully functional systems containing sensors and integrated circuits for monitoring strain, fabricated on a flexible substrate embedded in an adhesive. We also fabricated a flexible antenna embedded within 3M Tegaderm adhesive. We demonstrated methods for microfabrication of solderable and stretchable sensing systems (S4s). S4s are versatile and modular, and can be produced with an integrated adhesive, allowing them to be attached to the skin like a temporary tattoo. S4s’ excellent solderability is achieved by sputter-deposited nickel-vanadium and gold pad metal layers and copper interconnection. The feasibility of S4-based health monitoring was demonstrated by developing an S4 integrated with a strain gauge and an onboard optical indication circuit. S4 respiration sensors were tested for robustness to cyclic deformation, maximum stretchability, durability, and biocompatibility over multiday wear. The test results and the respiration-sensing demonstration indicate that adhesive-integrated S4s can provide users with a way to unobtrusively monitor their health.
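
For reference, converting a strain-gauge resistance reading into strain uses the standard gauge-factor relation; the gauge factor and resistance values below are illustrative, not the S4 device's actual parameters.

    def strain_from_resistance(r_measured, r_nominal, gauge_factor=2.0):
        """Strain = (dR / R0) / GF for a metallic strain gauge."""
        return (r_measured - r_nominal) / (r_nominal * gauge_factor)

    # A 0.1% resistance increase at GF = 2 is 500 microstrain:
    eps = strain_from_resistance(r_measured=350.35, r_nominal=350.0)
    print(f"{eps * 1e6:.0f} microstrain")  # -> 500 microstrain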

Beyond the many healthcare applications, the broader impacts of this work include the development of several public databases. The investigators have also been active in K-12 outreach to stimulate interest in engineering, including the Splash high school program, campus lab tours, Girl Scout events, school science nights, summer camps, and a book on gender in STEM for high school girls.


Last Modified: 11/09/2020
Modified by: Pamela C Cosman
