
NSF Org: IIS Division of Information & Intelligent Systems
Recipient:
Initial Amendment Date: September 11, 2007
Latest Amendment Date: August 20, 2008
Award Number: 0713229
Award Instrument: Continuing Grant
Program Manager: Ephraim Glinert, IIS Division of Information & Intelligent Systems, CSE Directorate for Computer and Information Science and Engineering
Start Date: September 15, 2007
End Date: August 31, 2011 (Estimated)
Total Intended Award Amount: $0.00
Total Awarded Amount to Date: $385,840.00
Funds Obligated to Date: FY 2008 = $261,888.00
History of Investigator:
Recipient Sponsored Research Office: 1 SILBER WAY BOSTON MA US 02215-1703 (617)353-4365
Sponsor Congressional District:
Primary Place of Performance: 1 SILBER WAY BOSTON MA US 02215-1703
Primary Place of Performance Congressional District:
Unique Entity Identifier (UEI):
Parent UEI:
NSF Program(s): HCC-Human-Centered Computing
Primary Program Source: 01000809DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s):
Program Element Code(s):
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070
ABSTRACT
For individuals who suffer from extreme paralysis, the ability to communicate is often limited to yes/no responses made with small head, hand, or eye movements. Providing these people with the ability to participate in the Information Society (for example, by browsing the Web, posting messages, or emailing friends) would greatly enhance their emotional well-being by alleviating the frustration caused by having an active mind trapped in a paralyzed body. Previously, the PI has developed camera-based interfaces that have afforded partial attainment of such goals. In the current project, the PI will build upon her prior achievements and push forward on several fronts concurrently. She will develop a mouse-replacement interface that computes pointer coordinates from the user's head movements, which are detected by a multi-camera computer vision system. She will explore context-aware approaches for gesture detection that can distinguish a user's communicative motions from involuntary movements or social interactions. She will design and implement a novel tool that allows users to develop their own semaphore-based communication interfaces. She will build an application mediator that serves as an intercessor between several separate software components. And she will develop innovative assistive software for a variety of common applications including text entry, web browsing, and animation, the last of which enables users who can only control a mouse cursor with limited precision to create and interact with 3D objects.
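As a rough, illustrative sketch of the mouse-replacement idea, the Python snippet below maps tracked head-feature coordinates to screen pointer coordinates using a joystick-style control law with a gain and a dead zone. The tracker front end, screen dimensions, class names, and parameter values are assumptions made for illustration only; they are not details of the project's actual multi-camera system.

    # Illustrative sketch only (not the project's implementation): mapping a
    # tracked head-feature position to screen pointer coordinates with a
    # relative, joystick-style control law that has a dead zone and a gain.
    # The "tracked feature position" inputs stand in for whatever a
    # multi-camera vision front end would supply each frame.

    from dataclasses import dataclass

    SCREEN_W, SCREEN_H = 1920, 1080  # assumed display size

    @dataclass
    class PointerMapper:
        neutral_x: float          # feature position when the head is at rest
        neutral_y: float
        gain: float = 4.0         # pointer pixels moved per pixel of head motion
        dead_zone: float = 3.0    # ignore tremor / involuntary micro-movements
        x: float = SCREEN_W / 2   # current pointer position
        y: float = SCREEN_H / 2

        def update(self, feat_x: float, feat_y: float) -> tuple[int, int]:
            """Advance the pointer given the latest tracked feature position."""
            dx = feat_x - self.neutral_x
            dy = feat_y - self.neutral_y
            if abs(dx) > self.dead_zone:
                self.x += self.gain * dx
            if abs(dy) > self.dead_zone:
                self.y += self.gain * dy
            # Clamp the pointer to the screen bounds.
            self.x = min(max(self.x, 0), SCREEN_W - 1)
            self.y = min(max(self.y, 0), SCREEN_H - 1)
            return int(self.x), int(self.y)

    if __name__ == "__main__":
        # Simulated tracker output: the user turns slightly right and down.
        mapper = PointerMapper(neutral_x=320, neutral_y=240)
        for feat in [(320, 240), (326, 241), (330, 248), (325, 252)]:
            print(mapper.update(*feat))

The dead-zone threshold in this toy mapping hints at why context matters: small, involuntary movements should not drive the pointer while deliberate head motions should, which is the kind of distinction the proposed context-aware gesture detection would make in a more principled way.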
Broader Impacts: Project outcomes will have a direct and positive impact on the quality of life of adults and children with severe disabilities, as well as their friends, families and caregivers. The software to be developed will be disseminated at special care facilities (including a hospital, schools for children with severe physical disabilities, and a long-term care facility for people with Multiple Sclerosis and ALS), and will also be available on the internet via free download. This work will also advance the state of the art in computer vision, through the development of technology that employs 3D models for real-time tracking of facial features with multiple pan/tilt/zoom cameras.
PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH