
Award Abstract # 1265129
I-Corps: BlindNav: Indoor Navigation for the Visually Impaired

NSF Org: TI
Translational Impacts
Recipient: TRUSTEES OF THE UNIVERSITY OF PENNSYLVANIA, THE
Initial Amendment Date: September 25, 2012
Latest Amendment Date: September 25, 2012
Award Number: 1265129
Award Instrument: Standard Grant
Program Manager: Rathindra DasGupta
TIP
 Directorate for Technology, Innovation, and Partnerships
Start Date: October 1, 2012
End Date: March 31, 2014 (Estimated)
Total Intended Award Amount: $50,000.00
Total Awarded Amount to Date: $50,000.00
Funds Obligated to Date: FY 2012 = $50,000.00
History of Investigator:
  • Kostas Daniilidis (Principal Investigator)
    kostas@cis.upenn.edu
Recipient Sponsored Research Office: University of Pennsylvania
3451 WALNUT ST STE 440A
PHILADELPHIA
PA  US  19104-6205
(215)898-7293
Sponsor Congressional District: 03
Primary Place of Performance: University of Pennsylvania
PA  US  19104-6205
Primary Place of Performance Congressional District: 03
Unique Entity Identifier (UEI): GM1XX56LEP58
Parent UEI: GM1XX56LEP58
NSF Program(s): I-Corps
Primary Program Source: 01001213DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s):
Program Element Code(s): 802300
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.084

ABSTRACT

There are currently very few ways for a blind person to navigate a new indoor space without the assistance of a fully sighted person. The technology proposed by this project is designed to enable a visually impaired individual to find their way through large indoor environments such as airports, train stations, and shopping malls by recognizing semantic and salient visual features of the environment. No prior visit to or mapping of the environment is required, and there is no need to deploy or utilize any special infrastructure such as WiFi access points or infrared beacons. The researchers plan to use publicly available architectural layouts and information about the locations of shops, tracks, gates, and other visual cues. The platform is a cell phone mounted on a necklace that provides turn-by-turn directions through an audio voice-command interface. The technology processes video from the cell phone camera in real time using text and logo detection, localization based on prior knowledge of the layout, and integration of accelerometer data with visual odometry.
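To make the localization-from-layout idea above concrete, here is a minimal sketch of matching text labels detected in a camera frame against a landmark map extracted from a public floor plan. All names, labels, and coordinates here are illustrative assumptions, not artifacts of the actual project:

```python
# Hypothetical landmark map derived from a public architectural layout:
# label -> (x, y) position in meters on the floor plan.
LAYOUT = {
    "TRACK 5": (12.0, 40.0),
    "GATE A": (55.0, 18.0),
    "EXIT": (3.0, 2.0),
}

def localize(detected_labels):
    """Given text/logo labels detected in the current frame, return the
    centroid of the matching landmarks as a crude position fix, or None
    if no detected label matches the layout."""
    hits = [LAYOUT[label] for label in detected_labels if label in LAYOUT]
    if not hits:
        return None
    n = len(hits)
    return (sum(x for x, _ in hits) / n, sum(y for _, y in hits) / n)
```

In a real system the OCR output would be noisy and the fix would feed a filter alongside odometry; this sketch only shows the map-lookup step.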

The blind and visually impaired population in the United States is large and expected to grow. If successfully implemented, this technology could have broader applications in location-based services, such as aiding those with spatial learning difficulties or guiding users to a specific location. The project team has the expertise required to develop this technology rapidly and economically.

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

Independent mobility is a critical human function, almost universally taken for granted by fully sighted people. It involves localization, navigation, and obstacle avoidance. While obstacle avoidance can be partially addressed with canes or guide dogs, and outdoor navigation is facilitated by GPS, indoor navigation remains an intractable challenge. The I-Corps team proposes a new product, BlindNav, which enables a visually impaired person to find her way in large indoor environments like airports, train stations, and malls by recognizing semantic and salient visual features of the environment. No prior visit to the train station or mapping of the station with SLAM techniques is needed. Instead, publicly available architectural layouts and information about the locations of shops, tracks, gates, and other visual cues are used. The platform is a cell phone mounted on a necklace, providing turn-by-turn directions through a voice interface.

BlindNav processes video from the cell phone camera in real time using three main technologies: text and logo detection; localization based on prior knowledge of the layout (such as vertical lines), integrating vanishing point detection with accelerometer data; and short-range visual odometry to mitigate hard visibility conditions.

There is currently no way for a blind person to navigate a new space without the assistance of a fully sighted person. The team aims to revolutionize the world for blind people by providing unprecedented independence. The integration of "accessibility" software such as audio-feedback touch screens has put mobile technology such as smartphones and tablets in the hands of the visually impaired. The team has partnered with several members of the blind community to develop a simple, intuitive audio interface and to foster grassroots adoption. BlindNav is scheduled for nationwide commercial launch in October 2013.
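The integration of vanishing point detection with accelerometer data mentioned above is, in spirit, a sensor-fusion step. A minimal complementary-filter sketch follows; the function name, blend weight, and interface are illustrative assumptions, not the team's actual integration filter:

```python
import math

def fuse_heading(imu_heading, vision_heading, alpha=0.9):
    # Blend an accelerometer/IMU heading estimate with a heading derived
    # from a detected vanishing point (both in radians). alpha weights
    # the IMU estimate; wrap-around is handled by blending the shortest
    # angular difference between the two headings.
    diff = math.atan2(math.sin(vision_heading - imu_heading),
                      math.cos(vision_heading - imu_heading))
    return (imu_heading + (1.0 - alpha) * diff) % (2 * math.pi)
```

For example, with `alpha=0.9` the fused heading moves the IMU estimate 10% of the way toward the vision estimate each update, damping drift without letting a single bad vanishing-point detection yank the heading.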
Currently in pre-alpha development, the team is leveraging its computer vision experience to port cutting-edge, proven research to the cell phone. Because the key to BlindNav's success is the team's integration filter, which combines multiple open-source computer vision tools for localization, all IP is owned by the company. To date, the team has implemented text detection and vertical line detection on the cell phone. With Apple's revolutionary universal access software, which makes mobile software usable by people with disabilities, other companies have followed suit, changing the entire landscape of assistive technology. The team's product would be the only one of its kind, offering a unique and elegant solution at an affordable price.


Last Modified: 07/11/2014
Modified by: Kostas Daniilidis
