
NSF Org: | OAC Office of Advanced Cyberinfrastructure (OAC) |
Recipient: | |
Initial Amendment Date: | April 22, 2021 |
Latest Amendment Date: | August 30, 2021 |
Award Number: | 2104319 |
Award Instrument: | Standard Grant |
Program Manager: | Sheikh Ghafoor, sghafoor@nsf.gov, (703)292-7116, OAC Office of Advanced Cyberinfrastructure (OAC), CSE Directorate for Computer and Information Science and Engineering |
Start Date: | September 1, 2021 |
End Date: | August 31, 2025 (Estimated) |
Total Intended Award Amount: | $174,749.00 |
Total Awarded Amount to Date: | $209,624.00 |
Funds Obligated to Date: | |
History of Investigator: | |
Recipient Sponsored Research Office: | 820 N MICHIGAN AVE CHICAGO IL US 60611-2147 (773)508-2471 |
Sponsor Congressional District: | |
Primary Place of Performance: | 1032 W. Sheridan Road Chicago IL US 60660-1537 |
Primary Place of Performance Congressional District: | |
Unique Entity Identifier (UEI): | |
Parent UEI: | |
NSF Program(s): | CDS&E |
Primary Program Source: | |
Program Reference Code(s): | |
Program Element Code(s): | |
Award Agency Code: | 4900 |
Fund Agency Code: | 4900 |
Assistance Listing Number(s): | 47.070 |
ABSTRACT
Digital cameras are deployed as network edge devices, gathering visual data for tasks such as autonomous driving, traffic analysis, and wildlife observation. Analyzing this vast amount of visual data is a challenge: existing computer vision methods require fast computers that are beyond the computational capabilities of many edge devices. This project aims to improve the efficiency of computer vision methods so that they can run on battery-powered edge devices. From the visual data and complementary metadata (e.g., geographical location, local time), the project first extracts contextual information (for example, that a city street is expected to be busy at rush hour). This contextual information can help determine whether analysis results are correct; for example, a wild animal is not expected on a city street. Contextual information can also improve efficiency: only certain pixels need to be analyzed (pixels on the road are useful for detecting cars, while pixels in the sky are not), which can significantly reduce the amount of computation and thus enable analysis on edge devices. This project constructs a cyberinfrastructure for three services: (1) understanding contextual information to reduce the search space of analysis methods, (2) reducing computation by considering only the necessary pixels, and (3) automating evaluation of analysis results based on the contextual information, without human effort.
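The context-based correctness check described above can be sketched as a simple rule lookup. This is a minimal illustration, not the project's actual implementation; the context names, object classes, and the `is_plausible` helper are all hypothetical.

```python
# Hypothetical mapping from scene context to the object classes
# expected in that context (illustrative values only).
EXPECTED_FOREGROUND = {
    "city_street": {"car", "bus", "pedestrian", "bicycle"},
    "prairie": {"bison", "deer", "coyote"},
    "sky": {"airplane", "bird"},
}

def is_plausible(context: str, detected_label: str) -> bool:
    """Flag detections that contradict the scene context,
    e.g. a wild animal detected on a city street."""
    expected = EXPECTED_FOREGROUND.get(context)
    if expected is None:
        return True  # unknown context: do not reject anything
    return detected_label in expected

print(is_plausible("city_street", "car"))    # → True
print(is_plausible("city_street", "bison"))  # → False
```

A real system would derive the context from the metadata and background analysis rather than a hand-written table, but the lookup illustrates how context narrows the set of credible detections.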
Understanding contextual information is achieved through background segmentation, GPS-location-dependent logic, and image depth maps. Background analysis leverages semantic segmentation and analysis over time to identify background pixels and then generates inference rules via a background-implies-foreground relationship. If a pixel is consistently assigned the same semantic label across a long period of time, it is classified as a background pixel. The background information can then infer which types of foreground objects to expect: if the background is a city street, the foreground objects can be vehicles or pedestrians; if a bison is detected there, the detection is likely a mistake. The project processes only the foreground pixels by adding masks to the neural network layers. Masked convolution can substantially reduce the amount of computation with no loss of accuracy and no additional training. Meanwhile, hierarchical neural networks can skip sections of a model based on context; for example, pixels in the sky only need to be processed by the hierarchy nodes that classify airplanes. The project also provides an online service that accepts input data and analysis programs and evaluates the programs automatically, without human-created labels. The evaluation is based on the correlations between background and foreground objects.
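The temporal background test above (a pixel whose semantic label is stable over time is background) can be sketched in a few lines of NumPy. This is an assumed simplification: it measures agreement with the first frame's label rather than the per-pixel majority label, and the function name and threshold are illustrative.

```python
import numpy as np

def background_mask(label_maps: np.ndarray, min_agreement: float = 0.95) -> np.ndarray:
    """label_maps: (T, H, W) array of integer semantic labels over T frames.
    Returns a boolean (H, W) mask that is True where the pixel keeps the
    same label (here: the first frame's label) in at least `min_agreement`
    of the frames -- i.e., where the pixel is treated as background."""
    # Fraction of frames whose label matches frame 0 at each pixel.
    agreement = (label_maps == label_maps[0]).mean(axis=0)
    return agreement >= min_agreement
```

The complement of this mask marks the foreground pixels that masked convolution would actually process; everything under the mask can be skipped.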
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.