
NSF Org: IIS (Division of Information & Intelligent Systems)
Recipient:
Initial Amendment Date: July 1, 2020
Latest Amendment Date: July 11, 2024
Award Number: 1942444
Award Instrument: Continuing Grant
Program Manager: Jie Yang, jyang@nsf.gov, (703) 292-4768, IIS Division of Information & Intelligent Systems, CSE Directorate for Computer and Information Science and Engineering
Start Date: September 1, 2020
End Date: August 31, 2026 (Estimated)
Total Intended Award Amount: $519,256.00
Total Awarded Amount to Date: $519,256.00
Funds Obligated to Date: FY 2021 = $145,538.00; FY 2022 = $60,085.00; FY 2023 = $105,759.00; FY 2024 = $107,560.00
History of Investigator:
Recipient Sponsored Research Office: 1523 UNION RD RM 207, GAINESVILLE, FL 32611-1941, US, (352) 392-3516
Sponsor Congressional District:
Primary Place of Performance: 1 University Drive, Gainesville, FL 32611-0001, US
Primary Place of Performance Congressional District:
Unique Entity Identifier (UEI):
Parent UEI:
NSF Program(s): Robust Intelligence
Primary Program Source: 01002021DB NSF RESEARCH & RELATED ACTIVIT; 01002122DB NSF RESEARCH & RELATED ACTIVIT; 01002324DB NSF RESEARCH & RELATED ACTIVIT; 01002425DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s):
Program Element Code(s):
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070
ABSTRACT
The prevalence of foveation, and its wide variety across the living world, make clear that it is an effective visual design strategy. This project aims to replicate foveation by building fast cameras that can optically concentrate sensing resources onto areas of interest in the surrounding world. Doing so can improve sensing performance for computer-vision-enabled intelligent systems; on resource-constrained platforms such as robots or spacecraft, adaptively sensing only areas of interest improves efficiency. The project will create capabilities that enable a variety of sensing applications. Throughout the project timeline, research outcomes will be integrated into the investigator's hardware/software bridging courses, which focus on fundamental procedures such as camera calibration. In addition, a program called LensLearning will be launched to spread foveating-camera concepts beyond the lab. LensLearning will reach high-school students through hands-on projects run in special University of Florida programs. It will also train one high-school student and one undergraduate senior each summer of the project, in partnership with the University of Florida's associated programs, with the goal of giving underrepresented minorities opportunities in foveated-camera research.
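As a toy illustration of the foveation idea described above (full resolution on a region of interest, coarse sensing in the periphery), the sketch below simulates a foveated readout of an image in software. It is purely illustrative and not the project's actual optical system; the function name, parameters, and the blockwise-averaging scheme are all assumptions made for this example.

```python
import numpy as np

def foveate(image, center, radius, factor=4):
    """Toy foveation: keep full resolution inside a square region of
    interest around `center`; replace the periphery with a blockwise
    average, i.e., a coarse low-resolution rendering. Illustrative only."""
    h, w = image.shape
    # Coarse periphery: average over factor x factor blocks, then
    # upsample back to the original grid.
    ch, cw = h // factor, w // factor
    coarse = image[:ch * factor, :cw * factor].reshape(
        ch, factor, cw, factor).mean(axis=(1, 3))
    out = np.repeat(np.repeat(coarse, factor, axis=0), factor, axis=1)
    # Pad back to the original size if it was not divisible by `factor`.
    out = np.pad(out, ((0, h - out.shape[0]), (0, w - out.shape[1])),
                 mode="edge")
    # Fovea: restore the original pixels inside the region of interest.
    cy, cx = center
    y0, y1 = max(cy - radius, 0), min(cy + radius, h)
    x0, x1 = max(cx - radius, 0), min(cx + radius, w)
    out[y0:y1, x0:x1] = image[y0:y1, x0:x1]
    return out

frame = np.arange(64, dtype=float).reshape(8, 8)
fov = foveate(frame, center=(4, 4), radius=2, factor=2)
```

In a real foveating camera this resolution trade-off happens optically, before the sensor, rather than in post-processing; the sketch only conveys where the sensing budget goes.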
Although artificial foveation has previously been explored with slow, mechanical means of motion, the foveating cameras and accompanying algorithms in this project will be much faster because they exploit newly available, next-generation micro-mechanical optics that can quickly and adaptively change the camera's resolution. The first phase of the project involves building the fast foveating-camera test bed and characterizing the fundamental limits of fast foveation for dynamic scenes through an optical model that accounts for modulation speed, camera field of view, noise, motion, and long-range effects. The second phase involves demonstrating tracking advantages in dynamic scenes with variants of the fast-foveation setup, such as co-located systems and arrays of foveating cameras. Evaluations in simulation will use widely available datasets, comparing processing power and imaging efficiency; real evaluation on the test bed will result in the release of a novel foveated dataset of dynamic scenes of everyday objects. In the final phase, the developed systems and algorithms will be used to demonstrate extreme imaging applications that combine large baselines and co-located multimodal systems, showing capabilities such as glasses-free eye tracking, imaging in dark environments, and fast face imaging for robotics.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH
Note: Clicking a Digital Object Identifier (DOI) link will take you to an external site maintained by the publisher. Some full-text articles may not yet be available without charge during the embargo (administrative interval). Some links on this page may lead to non-federal websites, whose policies may differ from those of this site.
Please report errors in award information by writing to: awardsearch@nsf.gov.