Award Abstract # 1942444
CAREER: Fast Foveation: Bringing Active Vision into the Camera

NSF Org: IIS (Division of Information & Intelligent Systems)
Recipient: UNIVERSITY OF FLORIDA
Initial Amendment Date: July 1, 2020
Latest Amendment Date: July 11, 2024
Award Number: 1942444
Award Instrument: Continuing Grant
Program Manager: Jie Yang
    jyang@nsf.gov
    (703)292-4768
    IIS - Division of Information & Intelligent Systems
    CSE - Directorate for Computer and Information Science and Engineering
Start Date: September 1, 2020
End Date: August 31, 2026 (Estimated)
Total Intended Award Amount: $519,256.00
Total Awarded Amount to Date: $519,256.00
Funds Obligated to Date: FY 2020 = $100,314.00
FY 2021 = $145,538.00
FY 2022 = $60,085.00
FY 2023 = $105,759.00
FY 2024 = $107,560.00
History of Investigator:
  • Sanjeev Koppal (Principal Investigator)
    sjkoppal@ece.ufl.edu
Recipient Sponsored Research Office: University of Florida
1523 UNION RD RM 207
GAINESVILLE
FL  US  32611-1941
(352)392-3516
Sponsor Congressional District: 03
Primary Place of Performance: University of Florida
1 University Drive
Gainesville
FL  US  32611-0001
Primary Place of Performance Congressional District: 03
Unique Entity Identifier (UEI): NNFQH1JAPEP3
Parent UEI:
NSF Program(s): Robust Intelligence
Primary Program Source: 01002021DB NSF RESEARCH & RELATED ACTIVIT
01002122DB NSF RESEARCH & RELATED ACTIVIT
01002223DB NSF RESEARCH & RELATED ACTIVIT
01002324DB NSF RESEARCH & RELATED ACTIVIT
01002425DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s): 1045, 7495
Program Element Code(s): 749500
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

The prevalence and wide variety of foveation in the living world make clear that it is an effective visual design strategy. This project mimics foveation by building fast cameras that optically concentrate sensing resources onto areas of interest in the surrounding world. Doing so can improve sensing performance for computer-vision-enabled intelligent systems; on resource-constrained platforms, such as robots or spacecraft, adaptively sensing only areas of interest improves efficiency. The project will create capabilities that enable a variety of sensing applications. Throughout the project timeline, research outcomes will be integrated into the investigator's hardware/software bridging courses, which focus on fundamental procedures such as camera calibration. In addition, a program called LensLearning will be started to spread foveating-camera concepts beyond the lab. LensLearning will reach high-school students through special University of Florida programs with hands-on projects, and, by working with the University of Florida's associated programs, will train one high-school student and one undergraduate senior every summer of the project, with the goal of giving underrepresented minorities opportunities in foveated-camera research.

Although artificial foveation has previously been explored with slow, mechanical means of motion, the foveating cameras and accompanying algorithms in this project will be much faster because they exploit newly available, next-generation micro-mechanical optics that can quickly and adaptively change the camera's resolution. The first phase of the project involves building the fast foveating-camera test-bed and characterizing the fundamental limits of fast foveation for dynamic scenes through an optical model that accounts for modulation speed, camera field-of-view, noise, motion, and long-range effects. The second phase demonstrates tracking advantages in dynamic scenes with variants of the fast-foveation setup, such as co-located systems and arrays of foveating cameras. Evaluations in simulation will compare processing power and imaging efficiency on widely available datasets; evaluation on the real test-bed will also be performed, resulting in the release of a novel foveated dataset of dynamic scenes of everyday objects. In the final phase, the developed systems and algorithms will demonstrate extreme imaging applications that combine large baselines with co-located multimodal systems, showing capabilities such as glasses-free eye-tracking, imaging in dark environments, and fast face imaging for robotics.
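To illustrate the core intuition behind foveated sensing described above, the toy sketch below allocates a fixed pixel budget unevenly between a region of interest and the periphery, and reports how sampling density compares to uniform imaging. This is purely a hypothetical software illustration: the function name, the `fovea_fraction` parameter, and all numbers are invented for this example, and the project's actual systems redirect resolution optically with micro-mechanical mirrors rather than reallocating pixels in software.

```python
# Hypothetical sketch of foveated pixel-budget allocation (not the
# project's actual method, which uses fast micro-mechanical optics).

def allocate_budget(total_pixels, roi_area, scene_area, fovea_fraction=0.8):
    """Split a fixed pixel budget between a region of interest (ROI)
    and the periphery, returning sampling density (pixels per unit
    area) in each region."""
    roi_pixels = int(total_pixels * fovea_fraction)
    periphery_pixels = total_pixels - roi_pixels
    roi_density = roi_pixels / roi_area
    periphery_density = periphery_pixels / (scene_area - roi_area)
    return roi_density, periphery_density

# A 1-megapixel budget over a scene where the ROI covers 5% of the area.
roi_d, per_d = allocate_budget(1_000_000, roi_area=50_000, scene_area=1_000_000)
uniform_d = 1_000_000 / 1_000_000  # uniform imaging: 1 pixel per unit area
print(f"ROI density: {roi_d / uniform_d:.1f}x uniform")        # → 16.0x
print(f"Periphery density: {per_d / uniform_d:.2f}x uniform")  # → 0.21x
```

The point of the arithmetic is the trade the abstract describes: a 16x resolution gain on the area of interest is bought with a modest loss in the periphery, all within the same fixed sensing budget.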

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH


Folden, Justin and Ingle, Atul and Koppal, Sanjeev J. "FoveaSPAD: Exploiting Depth Priors for Adaptive and Efficient Single-Photon 3D Imaging." IEEE Transactions on Computational Imaging, v.10, 2024. https://doi.org/10.1109/TCI.2024.3503360
Tilmon, Brevin and Jain, Eakta and Ferrari, Silvia and Koppal, Sanjeev Jagannatha. "Fast Foveating Cameras for Dense Adaptive Resolution." IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021. https://doi.org/10.1109/TPAMI.2021.3071588
Tilmon, Brevin and Sun, Zhanghao and Koppal, Sanjeev J. and Wu, Yicheng and Evangelidis, Georgios and Zahreddine, Ramzi and Krishnan, Gurunandan and Ma, Sizhuo and Wang, Jian. "Energy-Efficient Adaptive 3D Sensing." IEEE Conference on Computer Vision and Pattern Recognition, 2023.
Zhang, Yuxuan and Koppal, Sanjeev J. "FoveaCam++: Systems-Level Advances for Long Range Multi-Object High-Resolution Tracking." 2024. https://doi.org/10.1109/IROS58592.2024.10802188
