Award Abstract # 0945236
SBIR Phase I: Visual Information Delivery Robot for Visually Impaired Children

NSF Org: TI (Translational Impacts)
Recipient: ENURGA INC
Initial Amendment Date: November 4, 2009
Latest Amendment Date: June 7, 2010
Award Number: 0945236
Award Instrument: Standard Grant
Program Manager: Glenn H. Larsen
Directorate: TIP (Directorate for Technology, Innovation, and Partnerships)
Start Date: January 1, 2010
End Date: December 31, 2010 (Estimated)
Total Intended Award Amount: $0.00
Total Awarded Amount to Date: $200,000.00
Funds Obligated to Date: FY 2010 = $200,000.00
History of Investigator:
  • Yudaya Sivathanu (Principal Investigator)
    sivathan@enurga.com
  • Hyukseong Kwon (Former Principal Investigator)
Recipient Sponsored Research Office: EN'URGA INC
1201 CUMBERLAND AVE
WEST LAFAYETTE
IN  US  47906-1359
(765)497-3269
Sponsor Congressional District: 04
Primary Place of Performance: EN'URGA INC
1201 CUMBERLAND AVE
WEST LAFAYETTE
IN  US  47906-1359
Primary Place of Performance Congressional District: 04
Unique Entity Identifier (UEI): LFCSBTSRFJU2
Parent UEI:
NSF Program(s): SBIR Phase I
Primary Program Source: 01001011DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s): 1658, 5371, 9216, HPCC
Program Element Code(s): 537100
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.084

ABSTRACT

This Small Business Innovation Research (SBIR) Phase I project will evaluate the feasibility of a robotic system that detects and tracks a child's finger so that the scene the child wishes to see can be displayed at high magnification on a monitor. Two issues of intellectual merit will be addressed during the Phase I work. The first is analyzing the pointing patterns of children. Since each person has his or her own pattern of raising an arm and pointing with a finger, the system requires a machine learning algorithm that adapts its decisions to the individual user. This adaptive algorithm is one of the key innovations that will be evaluated during the Phase I work. The second issue is the development of an adaptive zooming and scene segmentation algorithm.
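The award text does not specify how the adaptive zoom would be computed. As a minimal sketch under a standard pinhole-camera assumption, the magnification needed for a pointed-to object to fill a set fraction of the display can be derived from the object's physical size, its distance, and the camera's focal length in pixels (all function names and parameters below are hypothetical, not from the project):

```python
def zoom_factor(object_width_m, distance_m, focal_px, display_width_px, fill=0.8):
    """Estimate the digital zoom needed so a pointed-to object fills a
    given fraction of the display.

    Pinhole model: apparent width in pixels = focal_px * object_width_m / distance_m.
    """
    apparent_px = focal_px * object_width_m / distance_m
    return fill * display_width_px / apparent_px

# Example: a 0.3 m wide sign at 3 m, seen by an 800 px focal-length
# camera, shown on a 1280 px wide display at 80% fill.
print(zoom_factor(0.3, 3.0, 800, 1280))  # -> 12.8
```

An adaptive system would recompute this factor as the estimated object distance changes, rather than using a fixed magnification.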

There are two primary commercial applications for the proposed visual information delivery robot. The first is the education of visually impaired children. Most research and products currently available for visually impaired children focus on learning while the child is seated in front of a computer monitor. The proposed system, by contrast, captures the scene that the child points to from any location, bringing a dynamic tool to education. Such tools are anticipated to have significant commercial potential in schools for the visually impaired. The second commercial application is assisting visually impaired adults in wheelchairs with enhanced dynamic information. The commercial and societal impact of the proposed project is that it will enable visually impaired children and adults to enhance their quality of life by adding a dynamic tool for visualizing near and far-off objects. The algorithms developed during the Phase I research will also aid researchers working on advanced robots for industrial automation.

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

Assistive technologies have proven to be a significant help to visually impaired children in reaching their educational goals. Most products (and research) for the visually impaired focus on learning while the child is seated in front of a computer monitor. A challenging goal is to extend such assistive technology products to a dynamic learning environment, providing greater benefits to visually impaired children. For the Phase I work, En’Urga Inc. developed a visual information delivery system for visually impaired children. Briefly, the system works as follows: upon an initial signal (a key press or a simple voice command), a stereo camera system mounted on a mobile platform detects the child's pointed finger, the system estimates where the scene or object the child is pointing to is located, and finally the system zooms in and displays an enlarged view of that scene or object for the visually impaired child.
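The report does not publish the project's actual algorithms, but the geometric core of the pipeline described above can be sketched with standard stereo vision: triangulate the fingertip's 3-D position from its pixel coordinates in a rectified stereo pair, then extend the wrist-to-fingertip ray to estimate where in the scene the child is pointing. All function names, coordinates, and camera parameters below are illustrative assumptions:

```python
def fingertip_3d(x_left, y_left, x_right, focal_px, baseline_m):
    """Triangulate a fingertip's 3-D position (camera frame, metres)
    from its pixel coordinates in a rectified stereo image pair."""
    disparity = x_left - x_right             # pixels; > 0 for a valid match
    z = focal_px * baseline_m / disparity    # depth from stereo disparity
    x = x_left * z / focal_px
    y = y_left * z / focal_px
    return (x, y, z)

def pointing_target(wrist, fingertip, scene_depth_m):
    """Extend the wrist -> fingertip ray until it reaches the plane
    z = scene_depth_m, giving a rough estimate of the pointed-at spot."""
    dx, dy, dz = (f - w for f, w in zip(fingertip, wrist))
    t = (scene_depth_m - wrist[2]) / dz
    return (wrist[0] + t * dx, wrist[1] + t * dy, scene_depth_m)

# Example: a fingertip seen at pixel (400, 100) in the left image and
# x = 352 in the right image, with an 800 px focal length and a 6 cm
# baseline, lies 1.0 m from the cameras.
tip = fingertip_3d(400, 100, 352, 800, 0.06)      # -> (0.5, 0.125, 1.0)
target = pointing_target((0.4, 0.1, 0.9), tip, 3.0)
```

A real system would obtain the wrist and fingertip pixel locations from a hand detector and would then crop and magnify the camera view around the estimated target; this sketch covers only the geometry.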

Based on the Phase I results, the feasibility of using a mobile camera system to detect a hand, estimate the direction in which the hand's index finger was pointing, and display the scene on a laptop computer was fully demonstrated. For the Phase II work, En’Urga Inc. will mount the system on a headphone. In practice, a visually impaired child or adult wearing the headphone can point to different objects in the surroundings and have a magnified view displayed on a tablet PC or an iPhone. Examples include a visually impaired person wanting to read the title of a book on the top shelf of a library, or a street sign while walking. In a classroom setting, it could be the need to observe an experiment being demonstrated by an instructor at the front of the class while the student is seated at a distance.

There are two primary commercial applications for the visual information delivery robot. The first is in the education of visually impaired children. The system captures the scene that the child points to while he or she is moving around, enabling education in a dynamic environment. The second is assisting visually impaired adults in wheelchairs with enhanced information. The commercial and societal benefit of the project is that it will enable visually impaired children and adults to enhance their quality of life by adding a dynamic tool for visualizing near and far-off objects.


Last Modified: 03/01/2011
Modified by: Yudaya R Sivathanu

