Award Abstract # 1534010
STTR Phase II: An Assistive Tool to Locate People and Objects with a Multimodal Thermogram Interface

NSF Org: TI
Translational Impacts
Recipient: MOAI TECHNOLOGIES L.L.C.
Initial Amendment Date: September 14, 2015
Latest Amendment Date: November 7, 2017
Award Number: 1534010
Award Instrument: Standard Grant
Program Manager: Muralidharan Nair
TI (Translational Impacts)
TIP (Directorate for Technology, Innovation, and Partnerships)
Start Date: September 15, 2015
End Date: February 28, 2019 (Estimated)
Total Intended Award Amount: $742,386.00
Total Awarded Amount to Date: $890,856.00
Funds Obligated to Date: FY 2015 = $742,386.00
FY 2018 = $148,470.00
History of Investigator:
  • Brian Hanzal (Principal Investigator)
    brian_hanzal@yahoo.com
  • Nicholas Giudice (Co-Principal Investigator)
Recipient Sponsored Research Office: Moai Technologies L.L.C.
14300 34TH AVE N
MINNEAPOLIS
MN  US  55447-5207
(612)481-8723
Sponsor Congressional District: 03
Primary Place of Performance: University of Maine
348 Boardman Hall
Orono
ME  US  04469-5711
Primary Place of Performance Congressional District: 02
Unique Entity Identifier (UEI): HKRCLDA9BG75
Parent UEI: VF25JKJN12U4
NSF Program(s): STTR Phase II
Primary Program Source: 01001516DB NSF RESEARCH & RELATED ACTIVITIES
01001819DB NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s): 1591, 169E, 6840, 8035, 9139, HPCC
Program Element Code(s): 159100
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.084

ABSTRACT

The broader impact/commercial potential of this project is the assistive use of thermal imaging by people who are blind. The resulting product provides a person who is blind or visually impaired with relevant information about the layout of an unfamiliar public space, assisting with everyday activities. Thermal imaging can differentiate people and objects from their background without the need for complex image analysis: the shape and temperature of the human body allow the locations of people to be determined easily. The societal impact will be assisting users in navigating complex public spaces. A blind person can use a smartphone's haptic touchscreen display to examine the thermal image and determine the location of people in front of them. Information about the layout of an unfamiliar public space can be learned from the heat and shape of materials; examples would be locating vending machines such as ATMs and train-ticket kiosks. The market sector for this technology will likely extend beyond assisting blind users to include additional commercial opportunities as well.

This Small Business Technology Transfer Research (STTR) Phase II project will leverage past National Science Foundation-funded research to develop a product with which a blind or low-vision user can receive practical navigation and interaction information about the environment from a multimodal thermogram (thermal image) interface on a smartphone. Other than exploring with a cane, there are no practical assistive technologies that allow blind or low-vision users to locate people and objects and learn the layout of their surroundings. This development will address the objective of creating an interface that both provides practical utility and will be accepted by the target demographic of blind users. The project represents an excellent translational path from NSF-sponsored research programs to a product built from the ground up on solid theoretical underpinnings and empirical findings from multimodal human information processing. The development will use thermal radiation from people, machines, and lighting, along with heat-retention differences in building materials, and convert these data into a user interface that facilitates blind navigation and environment interaction. The resulting product will be a multimodal (kinesthetic, vibro-tactile, and auditory) smartphone interface that lets blind users interpret and gain useful value from thermal image information.

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

Among the assistive devices available to blind or low-vision persons are various smartphone applications. In addition, smartphones now make possible the fusion of multiple technologies, for example camera imaging and vibration (haptics), which can be combined to allow a blind or low-vision person to locate people and objects and to interpret the layout of their surroundings. Most importantly, there are now commercially available smartphone add-on (and likely, eventually, built-in) thermal imaging cameras. This brings a unique range of information-gathering capabilities to the disabled user in a single, portable, handheld assistive device. Moai Technologies LLC, through this grant from the NSF, has created a small suite of innovative applications using a smartphone and the capabilities of both an add-on thermal camera and the native optical camera. Our research partner on this effort was the VEMI Laboratory at the University of Maine. The director of the laboratory, who also served as the project's co-PI, is a blind person whose assistance in directing the app development was invaluable.
The device operates in two complementary modes. In the first, the camera and signal processing locate a person or object in a scene by its thermal signature and display it on the smartphone screen; as the blind or low-vision person moves a finger over the screen, the phone vibrates when the finger coincides with the image. In the second mode, the system can be used 'finger-free': it vibrates whenever the image falls within an adjustable-sized area in the center of the screen. For instance, by noting where the thermal image of a person is located, a blind or low-vision person can determine autonomously where an empty seat is, or where there is an opening in a line of people. This first-of-its-kind application opens the possibility for similar but diverse apps.
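The two modes can be sketched in code. The following is an illustrative reconstruction only, not Moai's implementation: the temperature threshold, the 1:1 screen-to-image mapping, and all function names are assumptions introduced for the sketch.

```python
import numpy as np

# Illustrative sketch only -- the threshold, coordinate mapping, and names
# below are assumptions, not details taken from the actual product.
BODY_TEMP_THRESHOLD = 30.0  # degrees C; warmer pixels count as "person/object"

def thermal_mask(frame_c: np.ndarray) -> np.ndarray:
    """Binary mask of pixels warm enough to be a person or heat-emitting object."""
    return frame_c > BODY_TEMP_THRESHOLD

def touch_mode_hit(frame_c: np.ndarray, touch_x: int, touch_y: int) -> bool:
    """Mode 1: the phone vibrates when the user's finger rests over a warm
    region. Screen coordinates are assumed to map 1:1 onto thermal pixels."""
    return bool(thermal_mask(frame_c)[touch_y, touch_x])

def center_mode_hit(frame_c: np.ndarray, box_frac: float = 0.2) -> bool:
    """Mode 2 ('finger-free'): vibrate when any warm pixel falls inside an
    adjustable central box covering box_frac of each screen dimension."""
    h, w = frame_c.shape
    dy, dx = int(h * box_frac / 2), int(w * box_frac / 2)
    cy, cx = h // 2, w // 2
    return bool(thermal_mask(frame_c)[cy - dy:cy + dy, cx - dx:cx + dx].any())
```

In use, the app would call `touch_mode_hit` on each touch event and trigger the phone's vibration motor on a hit, while mode 2 needs only the current frame.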
As part of an NSF Technology Enhancement for Commercial Partnerships supplemental program, we developed a sub-app that assists a blind or low-vision person in locating the open doors of a train or light-rail car. This type of assistance was felt by the co-PI to be of prime importance.
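The report does not describe how the door-locating sub-app analyzes the thermal image, but one plausible approach, sketched here purely as an assumption, is to look for a wide contiguous band of image columns whose temperature contrasts with the rest of the car body (an open doorway exposes the interior, which typically differs thermally from the exterior shell):

```python
import numpy as np

def find_door_columns(frame_c: np.ndarray, delta: float = 3.0, min_width: int = 20):
    """Hypothetical sketch (not Moai's method): flag columns whose mean
    temperature deviates from the frame-wide median by more than `delta`
    degrees C, then return the widest contiguous run of flagged columns as
    (start, end), or None if no run is at least `min_width` columns wide."""
    col_means = frame_c.mean(axis=0)          # one mean temperature per column
    baseline = np.median(col_means)           # typical car-body temperature
    candidate = np.abs(col_means - baseline) > delta
    best, start = (0, 0), None
    for i, c in enumerate(candidate):
        if c and start is None:
            start = i                          # run begins
        elif not c and start is not None:
            if i - start > best[1] - best[0]:
                best = (start, i)              # run ends; keep if widest so far
            start = None
    if start is not None and len(candidate) - start > best[1] - best[0]:
        best = (start, len(candidate))         # run extends to the frame edge
    return best if best[1] - best[0] >= min_width else None
```

The returned column band could then be fed into the same center-box vibration scheme used by the finger-free mode, guiding the user toward the opening.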
Last Modified: 03/29/2019
Modified by: Brian Hanzal
