
NSF Org: CNS Division Of Computer and Network Systems
Initial Amendment Date: September 16, 2013
Latest Amendment Date: September 16, 2013
Award Number: 1337866
Award Instrument: Standard Grant
Program Manager: Rita Rodriguez, CNS Division Of Computer and Network Systems, CSE Directorate for Computer and Information Science and Engineering
Start Date: October 1, 2013
End Date: September 30, 2017 (Estimated)
Total Intended Award Amount: $200,109.00
Total Awarded Amount to Date: $200,109.00
Recipient Sponsored Research Office: 1 Silber Way, Boston, MA, US 02215-1703, (617) 353-4365
Primary Place of Performance: 111 Cummington Mall, Boston, MA, US 02215-1300
NSF Program(s): Information Technology Research
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070
ABSTRACT
Proposal #: 13-38118 Collaborative Proposal #: 13-37866
PI(s): Makedon, Fillia S, in collaboration with PI(s): Betke, Margrit
Athitsos, Vassilis; Gatchel, Robert J; Huang, Heng; Romero-Ortega, Mario I
Institution: University of Texas-Arlington, in collaboration with Institution: Boston University
Title: MRI/Dev: Collab. Dev. of iRehab, an Intelligent Closed-loop Instrument for Adaptive Rehabilitation
Project Proposed:
This project develops an instrument referred to as iRehab, which aims to enable personalized rehabilitation therapy for individuals suffering from brain injury, motor disabilities, cognitive impairments, and/or psychosocial symptoms. The instrument, a modular rehabilitation device, in its simplest form consists of a computer, a camera, and adaptive software for the assessment and training of cognitive functions. In its final, most complex form, the instrument will integrate data from a 4-degree-of-freedom robotic arm with gimbals and torque sensing, a Kinect sensor, multiple cameras, an eye-tracking device, a touch screen, a microphone, and an fNIRS brain imaging sensor.
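As a rough illustration of the modular design described above, the sketch below shows one way such a sensor configuration could be represented in software; the module names, sample rates, and class layout are hypothetical and are not taken from the project.

```python
# Hypothetical sketch of a modular sensor configuration for an instrument
# like iRehab; names and fields are illustrative, not from the project.
from dataclasses import dataclass, field
from typing import List

@dataclass
class SensorModule:
    name: str              # e.g. "camera", "kinect", "fnirs"
    sample_rate_hz: float
    enabled: bool = True

@dataclass
class InstrumentConfig:
    modules: List[SensorModule] = field(default_factory=list)

    def active_modules(self) -> List[str]:
        return [m.name for m in self.modules if m.enabled]

# Simplest form described in the abstract: a computer, a camera, and software.
minimal = InstrumentConfig([SensorModule("camera", 30.0)])

# Final form: robotic arm, Kinect, cameras, eye tracker, touch screen,
# microphone, and fNIRS brain imaging sensor.
full = InstrumentConfig([
    SensorModule("robotic_arm_torque", 500.0),
    SensorModule("kinect", 30.0),
    SensorModule("camera", 30.0),
    SensorModule("eye_tracker", 60.0),
    SensorModule("touch_screen", 60.0),
    SensorModule("microphone", 16000.0),
    SensorModule("fnirs", 10.0),
])
print(full.active_modules())
```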
The instrument will be developed in two phases. In the first phase, the investigators build the instrument around a Barrett robotic arm. In the second phase, the instrument will be extended with a Kinect sensor, multiple cameras, an eye-tracking device, and related low-cost components, along with software for assessing motor function and cognitive, emotional, and personality functioning.
iRehab integrates multidisciplinary methodologies and sensors to assess and assist in the cognitive and physical rehabilitation of persons affected by various impairments. This highly interdisciplinary work follows a cyber-physical approach and provides new research opportunities across the fields of human-centered computing, computer vision, assistive technology, robotics, machine learning, and neuroimaging. It also advances research in human brain activity mapping, personalized medicine, and big data.
Broader Impacts:
The proposed instrument has the potential for broad impact, as it directly contributes to future healthcare and human wellbeing by improving access to affordable rehabilitation for a wide range of patients. The instrument is likely to accelerate recovery from a broad spectrum of injuries and diseases, including those causing motor, neurological, and cognitive disorders. An education plan includes course development, internships, workshops and tutorials, and an online resource center. Beyond these educational impacts, the project will also influence fundamental research in the areas addressed.
PROJECT OUTCOMES REPORT
Disclaimer
This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.
The project built iRehab, a modular, adaptive, easy-to-use intelligent instrument that supports personalized rehabilitation therapy for individuals suffering from brain injury, motor disabilities, or cognitive impairments. iRehab uses the Proficio robotic arm, which provides accurate torque-sensing output. The robotic arm can be instructed to move along a trajectory in 3D space, and the user, while holding onto the robot's hand, experiences this motion trajectory. The research team paired the robotic arm with a Kinect sensor and an Oculus Rift virtual reality headset. Experiments showed that the Kinect interface can monitor upper-body exercises with an appropriate level of accuracy. This result was obtained by comparing the trajectories measured with the Kinect interface to the ground-truth trajectories obtained with the Proficio robotic arm. The margin of error depended on the relative position between the Kinect and the Proficio and on the direction of the exercise motion.
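The trajectory comparison described above can be illustrated with a small sketch: a root-mean-square error between Kinect-measured positions and ground-truth positions from the robotic arm, assuming both trajectories are expressed in the same coordinate frame and sampled at corresponding time steps. The function names and the simulated data are illustrative assumptions, not the project's evaluation code.

```python
# Minimal sketch of comparing a Kinect-measured trajectory to a ground-truth
# trajectory from a robotic arm; names and data are illustrative.
import numpy as np

def trajectory_rmse(kinect_traj: np.ndarray, robot_traj: np.ndarray) -> float:
    """Root-mean-square error between two (N, 3) trajectories sampled at
    corresponding time steps, in the same coordinate frame."""
    assert kinect_traj.shape == robot_traj.shape
    per_point_err = np.linalg.norm(kinect_traj - robot_traj, axis=1)
    return float(np.sqrt(np.mean(per_point_err ** 2)))

# Example: a straight reaching motion with small simulated Kinect noise.
t = np.linspace(0.0, 1.0, 100)
robot = np.stack([t, 0.2 * t, np.zeros_like(t)], axis=1)        # ground truth (m)
kinect = robot + np.random.normal(scale=0.01, size=robot.shape)  # noisy estimate
print(f"RMSE: {trajectory_rmse(kinect, robot):.3f} m")
```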
The team developed DyAd (short for "Dynamic Adjustment"), an approach that recommends adjusted rehabilitation exercise configurations based on a person's performance. The system compares the performed movement trajectory with a desired trajectory and measures the differences. Based on these differences, it can make adjustments, for example to the difficulty level of the exercise.
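The adjustment logic can be sketched as follows, under the assumption that difficulty is a discrete level raised or lowered according to how far the performed trajectory deviates from the desired one; the thresholds and update rule are hypothetical and are not DyAd's published algorithm.

```python
# Illustrative sketch of a dynamic-adjustment rule: measure the deviation
# between performed and desired trajectories, then adjust difficulty.
# Thresholds and the update rule are assumptions for illustration.
import numpy as np

def mean_deviation(performed: np.ndarray, desired: np.ndarray) -> float:
    """Mean Euclidean deviation between two (N, 3) trajectories."""
    return float(np.mean(np.linalg.norm(performed - desired, axis=1)))

def adjust_difficulty(level: int, deviation_m: float,
                      lower: float = 0.02, upper: float = 0.08) -> int:
    """Raise difficulty when the user tracks the target closely,
    lower it when the deviation is large, otherwise keep it."""
    if deviation_m < lower:
        return min(level + 1, 10)
    if deviation_m > upper:
        return max(level - 1, 1)
    return level

level = 5
dev = mean_deviation(np.zeros((50, 3)), np.full((50, 3), 0.01))
print(adjust_difficulty(level, dev))  # close tracking -> level goes up
```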
For users with severe motion disabilities who cannot use a traditional keyboard and mouse to access a computer, iRehab provides a novel interaction mechanism that uses a virtual keyboard. The mechanism, called EyeSwipe, requires a gaze-detection device to be connected to iRehab. EyeSwipe is a dwell-time-free gaze-typing method. With EyeSwipe, the user gaze-types the first and last characters of a word using the novel selection strategy "reverse crossing." To gaze-type the characters in the middle of the word, the user only needs to glance at the vicinity of the respective keys.
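A minimal sketch of the word-selection idea is shown below: candidate words are constrained by the explicitly typed first and last letters and then ranked by how closely their middle letters lie along the recorded gaze path over the virtual keyboard. The keyboard coordinates, lexicon, and scoring are assumptions for illustration, not the published EyeSwipe method.

```python
# Hypothetical gaze-path word ranking: keep words matching the first and last
# letters, then score by distance of each middle letter's key to the gaze path.
from typing import Dict, List, Tuple

def rank_candidates(first: str, last: str,
                    gaze_path: List[Tuple[float, float]],
                    key_centers: Dict[str, Tuple[float, float]],
                    lexicon: List[str]) -> List[str]:
    def path_cost(word: str) -> float:
        # Sum of squared distances from each middle letter's key center to
        # the nearest gaze sample; lower cost means a better match.
        cost = 0.0
        for ch in word[1:-1]:
            kx, ky = key_centers[ch]
            cost += min((kx - gx) ** 2 + (ky - gy) ** 2 for gx, gy in gaze_path)
        return cost

    candidates = [w for w in lexicon
                  if len(w) >= 2 and w[0] == first and w[-1] == last]
    return sorted(candidates, key=path_cost)

# Toy example with a tiny lexicon and made-up key coordinates.
keys = {"h": (0.0, 0.0), "e": (1.0, 0.0), "l": (2.0, 0.0), "o": (3.0, 0.0),
        "a": (0.0, 1.0), "t": (1.0, 1.0)}
print(rank_candidates("h", "o", [(1.0, 0.1), (2.0, 0.1)],
                      keys, ["hello", "halo", "hato"]))
```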
The project supported the professional preparation of numerous researchers and doctoral, master's, and undergraduate students.
Last Modified: 12/23/2017
Modified by: Margrit Betke