Award Abstract # 2047933
CAREER: The Nature of Average-Case Computation

NSF Org: CCF
Division of Computing and Communication Foundations
Recipient: CARNEGIE MELLON UNIVERSITY
Initial Amendment Date: February 2, 2021
Latest Amendment Date: September 20, 2023
Award Number: 2047933
Award Instrument: Continuing Grant
Program Manager: Peter Brass
pbrass@nsf.gov
(703)292-0000
CCF
 Division of Computing and Communication Foundations
CSE
 Directorate for Computer and Information Science and Engineering
Start Date: February 15, 2021
End Date: March 31, 2024 (Estimated)
Total Intended Award Amount: $599,586.00
Total Awarded Amount to Date: $498,141.00
Funds Obligated to Date: FY 2021 = $304,031.00
FY 2023 = $194,110.00
History of Investigator:
  • Pravesh Kothari (Principal Investigator)
    praveshk@cs.cmu.edu
Recipient Sponsored Research Office: Carnegie-Mellon University
5000 FORBES AVE
PITTSBURGH
PA  US  15213-3815
(412)268-8746
Sponsor Congressional District: 12
Primary Place of Performance: Carnegie-Mellon University
PA  US  15213-3815
Primary Place of Performance Congressional District: 12
Unique Entity Identifier (UEI): U3NKNFLNQ613
Parent UEI: U3NKNFLNQ613
NSF Program(s): Algorithmic Foundations
Primary Program Source: 01002122DB NSF RESEARCH & RELATED ACTIVIT
01002324DB NSF RESEARCH & RELATED ACTIVIT
01002425DB NSF RESEARCH & RELATED ACTIVIT
01002526DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s): 1045, 7926, 7927
Program Element Code(s): 779600
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

The recent surge in applications of machine learning is powered by algorithms that learn hidden patterns in large volumes of data. Designing faster and more reliable data-analysis algorithms is a key challenge in broadening the scope of such applications. However, researchers have realized that the classical framework of algorithm design is inadequate for this task, because in almost every application large datasets are described by statistical models rather than the worst-case model standard in algorithm design. Consequently, central challenges involving the interplay between algorithms and statistically generated data remain widely unresolved, not just in machine learning but also in statistical physics and cryptography. This project will address this critical deficiency by building a principled theory of algorithm design for statistical (i.e., average-case) data. The new paradigms explored in this work will unify the currently fragmented set of approaches for studying average-case computation. The curriculum development plan outlined in this project will train the next generation of scientists in algorithmic methods tailor-made for problems in large-scale statistical data analysis and will disseminate modern paradigms for understanding computation to both graduate and undergraduate students.

Average-case complexity is a central thrust in the theory of computation, with direct impact on potential technological advances in machine learning and cryptography as well as on basic questions in statistical physics. Examples include training expressive statistical models such as Gaussian mixture models and sparse PCA to find patterns in large datasets in machine learning, ascertaining the security of pseudo-random generators in cryptography, and finding the lowest-energy states of spin-glass systems in statistical physics. Current understanding of such problems rests on fragmented, domain-specific algorithmic schemes: statistical query methods and the method of moments (in machine learning), belief propagation (in statistical physics), and semidefinite programming hierarchies (in computational complexity). This project is devoted to building a unified theory of average-case computation that offers new tools to design better algorithms, prove sharp lower bounds, and rigorously transfer insights between these domain-specific frameworks. The investigation will build new bridges between theoretical computer science and several adjacent areas, including machine learning, statistical physics, algebraic geometry, and probability. In addition, it will further develop the burgeoning understanding of the sum-of-squares semidefinite programming hierarchy, mixture models, and the use of solution-space geometry in solving random constraint satisfaction problems.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH

Note:  When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from this site.

Alabi, Daniel and Kothari, Pravesh K. and Tankala, Pranay and Venkat, Prayaag and Zhang, Fred "Privately Estimating a Gaussian: Efficient, Robust, and Optimal", 2023 https://doi.org/10.1145/3564246.3585194
Bafna, Mitali and Hsieh, Jun-Ting and Kothari, Pravesh K. and Xu, Jeff "Polynomial-Time Power-Sum Decomposition of Polynomials" 2022 IEEE 63rd Annual Symposium on Foundations of Computer Science (FOCS), 2022 https://doi.org/10.1109/FOCS54457.2022.00094
Błasiok, Jarosław and Buhai, Rares-Darius and Kothari, Pravesh K. and Steurer, David "Semirandom Planted Clique and the Restricted Isometry Property", 2024 https://doi.org/10.1109/FOCS61266.2024.00064
Gollakota, Aravind and Klivans, Adam R. and Kothari, Pravesh K. "A Moment-Matching Approach to Testable Learning and a New Characterization of Rademacher Complexity", 2023 https://doi.org/10.1145/3564246.3585206
Hsieh, Jun-Ting and Kothari, Pravesh K. "Algorithmic Thresholds for Refuting Random Polynomial Systems" Proceedings of the Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), 2022 https://doi.org/10.1137/1.9781611977073.49
Ivkov, Misha and Kothari, Pravesh K. "List-Decodable Covariance Estimation" Proceedings of the 54th Annual ACM SIGACT Symposium on Theory of Computing (STOC), 2022 https://doi.org/10.1145/3519935.3520006
Jia, He and Kothari, Pravesh K. and Vempala, Santosh S. "Beyond Moments: Robustly Learning Affine Transformations with Asymptotically Optimal Error", 2023 https://doi.org/10.1109/FOCS57990.2023.00147
Kothari, Pravesh K. and Manurangsi, Pasin "Private Robust Estimation by Stabilizing Convex Relaxations" Proceedings of the Thirty-Fifth Conference on Learning Theory (COLT), PMLR, 2022

Please report errors in award information by writing to: awardsearch@nsf.gov.
