
Award Abstract # 2332060
Integrating Federated Split Neural Network with Artificial Stereoscopic Compound Eyes for Optical Flow Sensing in 3D Space with Precision

NSF Org: ECCS
Division of Electrical, Communications and Cyber Systems
Recipient: RECTOR & VISITORS OF THE UNIVERSITY OF VIRGINIA
Initial Amendment Date: April 1, 2024
Latest Amendment Date: April 1, 2024
Award Number: 2332060
Award Instrument: Standard Grant
Program Manager: Huaiyu Dai
hdai@nsf.gov
 (703)292-4568
ECCS
 Division of Electrical, Communications and Cyber Systems
ENG
 Directorate for Engineering
Start Date: October 1, 2024
End Date: September 30, 2027 (Estimated)
Total Intended Award Amount: $400,000.00
Total Awarded Amount to Date: $400,000.00
Funds Obligated to Date: FY 2024 = $400,000.00
History of Investigator:
  • Kyusang Lee (Principal Investigator)
    kl6ut@virginia.edu
  • Cong Shen (Co-Principal Investigator)
Recipient Sponsored Research Office: University of Virginia Main Campus
1001 EMMET ST N
CHARLOTTESVILLE
VA  US  22903-4833
(434)924-4270
Sponsor Congressional District: 05
Primary Place of Performance: University of Virginia Main Campus
1001 N EMMET ST
CHARLOTTESVILLE
VA  US  22903-4833
Primary Place of Performance Congressional District: 05
Unique Entity Identifier (UEI): JJG6HU8PA4S5
Parent UEI:
NSF Program(s): CCSS-Comms Circuits & Sens Sys
Primary Program Source: 01002425DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s): 153E
Program Element Code(s): 756400
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.041

ABSTRACT

This project aims to develop an innovative image sensor inspired by arthropod eyes, featuring a wide field of view, high-speed operation, and efficient object tracking. These smart sensors can dramatically expand the field of view, enhance response speed, and operate energy-efficiently. Central to this advance is the fusion of photodiodes with artificial synapses, together with mimicry of the eye's geometry, a pairing that mirrors the neural processing within the insect eye. This configuration not only enables rapid processing of visual information but also reduces the energy required to do so, marking a significant step beyond traditional planar imaging systems. Unlike conventional systems, the sensor arrays will adopt hemispherical designs, inspired by nature's own solution to wide-angle, efficient vision. This geometric optimization is crucial for capturing patterns of movement across a visual scene with enhanced accuracy and depth, making the technology valuable for applications that require precise motion detection and spatial awareness. To process the high-dimensional data captured by these sensors, the project introduces a specialized neural network architecture that divides data processing tasks between the sensors and a central processing unit, ensuring swift, efficient, and precise analysis. Such a configuration is particularly suited to dynamic environments where rapid decision-making is essential. By emulating the intricate vision systems found in nature, the project offers a glimpse of a future where technology and biology converge. Potential applications span many sectors, promising to enhance the efficiency and capabilities of systems in robotics, autonomous driving, and beyond, thereby advancing science and technology for the benefit of society.

The goal of this project is to develop and integrate a novel artificial compound eye system capable of advanced in-situ object tracking and depth perception. This system leverages the unique advantages of a photodiode integrated with an artificial synaptic device, embodying a computational layer for immediate data processing akin to biological synaptic functions. The hardware design, inspired by the hemispherical structure of arthropod eyes, enables a wide field of view and rapid image acquisition. To address the challenge of processing high-dimensional visual data efficiently, a communication-, storage-, and energy-efficient federated split learning framework will be employed. This framework optimizes data processing by distributing computational tasks between the sensor and a centralized processing unit, significantly enhancing the system's real-time object tracking capabilities. By integrating advanced hardware with software algorithms, this project aims to create a system that not only advances the scientific understanding of bio-inspired imaging but also offers practical solutions for real-world applications. The successful execution of this project is expected to set a new standard in optical sensing technology, contributing significantly to the fields of computer vision and neuromorphic computing.
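To illustrate the split-computation idea described above, the following is a minimal, self-contained sketch of split learning on a toy regression task. It is not the project's actual framework; all names, layer sizes, and the synthetic data are assumptions for illustration. The "sensor" holds the first linear layer and the "server" holds the second, so only the cut-layer activations (often called smashed data) and their gradients cross the sensor-server boundary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data standing in for high-dimensional sensor readings
# (shapes and noise level are illustrative assumptions).
X = rng.normal(size=(64, 8))
true_w = rng.normal(size=(8, 1))
y = X @ true_w + 0.01 * rng.normal(size=(64, 1))

# One linear layer on the sensor, one on the server.
W_sensor = rng.normal(scale=0.1, size=(8, 4))
W_server = rng.normal(scale=0.1, size=(4, 1))
lr = 0.05

def train_step(X, y, W_sensor, W_server):
    # Sensor: forward pass up to the cut layer; only `h` is transmitted.
    h = X @ W_sensor                  # "smashed data" sent to the server
    # Server: finish the forward pass and compute the loss.
    pred = h @ W_server
    err = pred - y
    loss = float(np.mean(err ** 2))
    # Server: backpropagate through its own layer, then return the
    # gradient at the cut layer to the sensor.
    g_pred = 2.0 * err / len(y)
    g_W_server = h.T @ g_pred
    g_h = g_pred @ W_server.T         # gradient sent back to the sensor
    # Sensor: finish backpropagation locally, never sharing raw data.
    g_W_sensor = X.T @ g_h
    return loss, W_sensor - lr * g_W_sensor, W_server - lr * g_W_server

losses = []
for _ in range(200):
    loss, W_sensor, W_server = train_step(X, y, W_sensor, W_server)
    losses.append(loss)

print(f"first loss {losses[0]:.4f}, last loss {losses[-1]:.4f}")
```

The key property this sketch demonstrates is that the raw input `X` stays on the sensor side: only the low-dimensional activations and their gradients are communicated, which is the source of the communication and storage savings the abstract refers to.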

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH


Liu, R., Shen, C., and Yang, J., "Federated Representation Learning in the Under-Parameterized Regime," 2024.

