Award Abstract # 1337866
MRI Collaborative: Development of iRehab, an Intelligent Closed-Loop Instrument for Adaptive Rehabilitation

NSF Org: CNS
Division Of Computer and Network Systems
Recipient: TRUSTEES OF BOSTON UNIVERSITY
Initial Amendment Date: September 16, 2013
Latest Amendment Date: September 16, 2013
Award Number: 1337866
Award Instrument: Standard Grant
Program Manager: Rita Rodriguez
CNS
 Division Of Computer and Network Systems
CSE
 Directorate for Computer and Information Science and Engineering
Start Date: October 1, 2013
End Date: September 30, 2017 (Estimated)
Total Intended Award Amount: $200,109.00
Total Awarded Amount to Date: $200,109.00
Funds Obligated to Date: FY 2013 = $200,109.00
History of Investigator:
  • Margrit Betke (Principal Investigator)
    betke@cs.bu.edu
Recipient Sponsored Research Office: Trustees of Boston University
1 SILBER WAY
BOSTON
MA  US  02215-1703
(617)353-4365
Sponsor Congressional District: 07
Primary Place of Performance: CS Dept, Boston University
111 Cummington Mall
Boston
MA  US  02215-1300
Primary Place of Performance Congressional District: 07
Unique Entity Identifier (UEI): THL6A6JLE1S7
Parent UEI:
NSF Program(s): Information Technology Research
Primary Program Source: 01001314DB NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s): 1189
Program Element Code(s): 164000
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

Proposal #: 13-38118 Collaborative Proposal #: 13-37866
PI(s): Makedon, Fillia S., in collaboration with PI(s): Betke, Margrit
Athitsos, Vassilis; Gatchel, Robert J; Huang, Heng; Romero-Ortega, Mario I
Institution: University of Texas-Arlington, in collaboration with Institution: Boston University
Title: MRI/Dev: Collaborative Development of iRehab, an Intelligent Closed-Loop Instrument for Adaptive Rehabilitation
Project Proposed:
This project develops an instrument, referred to as iRehab, that aims to enable personalized rehabilitation therapy for individuals suffering from brain injury, motor disabilities, cognitive impairments, and/or psychosocial symptoms. The instrument, a modular rehabilitation device, in its simplest form consists of a computer, a camera, and adaptive software for the assessment and training of cognitive functions. In its final, most complex form, the instrument will integrate data from a 4-degree-of-freedom robotic arm with gimbals and torque sensing, a Kinect sensor, multiple cameras, an eye-tracking device, a touch screen, a microphone, and an fNIRS brain imaging sensor.
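To make the modular design concrete, the following is a minimal sketch of how such a configuration might be represented in software; the module names and the IRehabConfig class are purely illustrative assumptions, as the award does not specify an implementation:

    # Hypothetical sketch of iRehab's modular configuration; all names
    # are illustrative, not from the project's actual software.
    from dataclasses import dataclass, field

    @dataclass
    class IRehabConfig:
        # Simplest form: a computer, a camera, and adaptive cognitive software.
        modules: list = field(
            default_factory=lambda: ["camera", "adaptive_cognitive_software"])

        def add_module(self, name: str) -> None:
            if name not in self.modules:
                self.modules.append(name)

    # Final, most complex form integrates the remaining sensors.
    config = IRehabConfig()
    for extra in ["robotic_arm_4dof", "kinect", "multi_camera",
                  "eye_tracker", "touch_screen", "microphone", "fnirs"]:
        config.add_module(extra)
    print(config.modules)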
The instrument will be developed in two phases. In the first phase, the investigators build the instrument around a Barrett robotic arm. In the second phase, the instrument will be extended with a Kinect sensor, multiple cameras, an eye-tracking device, and related low-cost components, along with software for assessing motor function and cognitive, emotional, and personality functioning.
iRehab integrates multidisciplinary methodologies and sensors to assess and assist the cognitive and physical rehabilitation of persons affected by various impairments. This highly interdisciplinary work follows a cyber-physical approach. It provides new research opportunities across the fields of human-centered computing, computer vision, assistive technology, robotics, machine learning, and neuroimaging, and it advances research in human brain activity mapping, personalized medicine, and big data.
Broader Impacts:
The proposed instrument has the potential for large broader impact, as it directly contributes to future healthcare and human wellbeing by improving access to affordable rehabilitation for a broad range of patients. The instrument is likely to accelerate recovery from a large spectrum of injuries and diseases, including those causing motor, neurological, and cognitive disorders. An education plan includes course development, internships, workshops and tutorials, and an online resource center. Beyond these educational impacts, the project will advance fundamental research in the areas addressed.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH

(Showing: 1 - 10 of 14)
A. Joshi, C. Monnier, M. Betke, and S. Sclaroff "Comparing random forest approaches to segmenting and classifying gestures" Image and Vision Computing , v.58 , 2017 , p.86 10.1016/j.imavis.2016.06.001
A. Joshi, S. Ghosh, M. Betke, S. Sclaroff, and H. Pfister "Personalizing Gesture Recognition using Hierarchical Bayesian Neural Networks" IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Waikiki Beach, Hawaii, July 2017, 10 pages. , 2017
Christopher W. Kwan, Isaac Paquette, John J. Magee, and Margrit Betke "Adaptive Sliding Menubars Make Existing Software More Accessible to People with Severe Motion Impairments" Universal Access in the Information Society , v.13 , 2014 , p.5-2
E. Saraee, A. Joshi, and M. Betke "A therapeutic robotic system for the upper body based on the Proficio robotic arm" International Conference on Virtual Rehabilitation (ICVR 2017), Montreal, Canada, June 2017. , 2017 10.1109/ICVR.2017.8007498
E. Saraee, S. Singh, A. Joshi, and M. Betke "PostureCheck: Posture modeling for exercise assessment using the Microsoft Kinect" International Conference on Virtual Rehabilitation (ICVR 2017), Montreal, Canada, June 2017. , 2017 10.1109/ICVR.2017.8007497
E. Saraee, S. Singh, K. Hendron, M. Zheng, A. Joshi, T. Ellis, and M. Betke. "ExerciseCheck: Remote Monitoring and Evaluation Platform for Home Based Physical Therapy" 10th Annual International Conference on Pervasive Technologies Related to Assistive Environments (PETRA'17), Rhodes, Greece, June 2017. , 2017 10.1145/3056540.3064958
Joshi, A., L. Tickle-Degnen, S. Gunnery, T. Ellis, and M. Betke "Predicting Active Facial Expressivity in People with Parkinson's Disease" 9th Annual International Conference on Pervasive Technologies Related to Assistive Environments (PETRA'16), Corfu, Greece, June 2016. 4 pages , 2016
J. Zhang, S. Ma, M. Sameki, S. Sclaroff, M. Betke, Z. Lin, X. Shen, B. Price, and R. Mech "Salient Object Subitizing" International Journal of Computer Vision , v.124 , 2017 , p.169 10.1007/s11263-017-1011-0
Kurauchi, A., W. Feng, A. Joshi, C. Morimoto, and M. Betke "EyeSwipe: Dwell-free Text Entry Using Gaze Paths" CHI 2016, the Annual ACM Conference of the Special Interest Group on Computer-Human Interaction (SIGCHI), San Jose, California, May 2016. 4 pages , 2016
Le, H., A. Joshi and M. Betke "b3.js: A Library for Interactive Web Data Visualizations in Virtual Reality" IEEE Virtual Reality 2016 Conference in Greenville, South Carolina, March 19-23, 2016. , 2016
Saad, R. S. M., R. I. Elanwar, N. S. Abdel Kader, S. Mashali, and M. Betke "BCE-Arabic-v1 dataset: A step towards interpreting Arabic document images for people with visual impairments" 9th Annual International Conference on Pervasive Technologies Related to Assistive Environments (PETRA'16), Corfu, Greece, June 2016. 8 pages. , 2016

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

The project built iRehab, a modular, adaptive, easy-to-use intelligent instrument that supports personalized rehabilitation therapy for individuals suffering from brain injury, motor disabilities, or cognitive impairments. iRehab uses the Proficio robotic arm with accurate torque-sensing output. The robotic arm can be instructed to move along a trajectory in 3D space, and the user, while holding onto the robot's hand, experiences this motion trajectory. The research team paired the robotic arm with a Kinect sensor and an Oculus Rift virtual reality headset. Experiments showed that the Kinect interface can monitor upper-body exercises with an appropriate level of accuracy. This result was obtained by comparing the trajectories measured with the Kinect interface to the ground-truth trajectories obtained with the Proficio robotic arm. The margin of error depended on the relative position between the Kinect and the Proficio and on the direction of the exercise motion.
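As an illustration of the accuracy comparison described above, the following sketch computes a root-mean-square error between a Kinect-measured trajectory and a ground-truth trajectory from the robotic arm; the function name, data layout, and noise model are assumptions, not the team's published evaluation code:

    # Hypothetical sketch: RMSE between a Kinect-measured trajectory and
    # the ground-truth trajectory from the Proficio arm, both given as
    # time-aligned N x 3 arrays of 3D points.
    import numpy as np

    def trajectory_rmse(kinect_traj: np.ndarray, robot_traj: np.ndarray) -> float:
        assert kinect_traj.shape == robot_traj.shape
        # Euclidean error at each time step, then RMS over the whole motion.
        per_step = np.linalg.norm(kinect_traj - robot_traj, axis=1)
        return float(np.sqrt(np.mean(per_step ** 2)))

    # Example: 100 time-aligned samples of a simple 3D motion.
    t = np.linspace(0, 2 * np.pi, 100)
    robot = np.stack([np.cos(t), np.sin(t), 0.1 * t], axis=1)
    kinect = robot + np.random.normal(scale=0.01, size=robot.shape)  # simulated noise
    print(f"RMSE: {trajectory_rmse(kinect, robot):.4f}")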

The team developed the DyAd approach (short for "Dynamic Adjustment" approach), which recommends adjusted rehabilitation exercise configurations based on a person's performance. The system can compare the performed movement trajectory with a desired trajectory and measure differences. Based on these differences, the system can make adjustments, for example, in the difficulty level of the exercise.
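A minimal sketch of this kind of closed-loop adjustment is given below; the thresholds and the one-step update rule are illustrative assumptions, not the published DyAd algorithm:

    # Hypothetical sketch of performance-based difficulty adjustment:
    # measure the mean deviation between performed and desired
    # trajectories and step the difficulty level up or down.
    import numpy as np

    def adjust_difficulty(performed: np.ndarray, desired: np.ndarray,
                          level: int, low: float = 0.02, high: float = 0.10) -> int:
        error = float(np.mean(np.linalg.norm(performed - desired, axis=1)))
        if error < low:              # user tracks the target closely: make it harder
            return level + 1
        if error > high:             # user struggles: make it easier
            return max(1, level - 1)
        return level                 # within band: keep the current difficulty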

For users with severe motion disabilities, who cannot use a traditional keyboard and mouse to access a computer, iRehab provides a novel interaction mechanism that uses a virtual keyboard. The mechanism is called EyeSwipe and requires a gaze detection device to be connected to iRehab. EyeSwipe is a dwell-time-free gaze-typing method. With EyeSwipe, the user gaze-types the first and last characters of a word using the novel selection strategy “reverse crossing.” To gaze-type the characters in the middle of the word, the user only needs to glance at the vicinity of the respective keys.
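The sketch below illustrates the word-lookup idea behind this design: the first and last characters, selected explicitly, constrain the candidate set, and the gaze path over the middle keys would rank the remaining candidates. The lexicon and the ranking step are illustrative assumptions, not the published EyeSwipe implementation:

    # Hypothetical sketch of EyeSwipe-style candidate filtering.
    def candidates(lexicon: list, first: str, last: str) -> list:
        # Only the explicitly selected first and last letters filter words.
        return [w for w in lexicon if w and w[0] == first and w[-1] == last]

    lexicon = ["hello", "halo", "hippo", "help", "hero"]
    print(candidates(lexicon, "h", "o"))  # ['hello', 'halo', 'hippo', 'hero']
    # A real system would rank these candidates by the similarity between
    # the recorded gaze path and each word's ideal path on the keyboard.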

The project supported the professional preparation of numerous researchers and doctoral, master's, and undergraduate students.

Last Modified: 12/23/2017
Modified by: Margrit Betke
