Award Abstract # 1213151
AF: Large: Collaborative Research: Exploiting Duality between Meta-Algorithms and Complexity

NSF Org: CCF
Division of Computing and Communication Foundations
Recipient: UNIVERSITY OF CALIFORNIA, SAN DIEGO
Initial Amendment Date: June 20, 2012
Latest Amendment Date: July 10, 2015
Award Number: 1213151
Award Instrument: Continuing Grant
Program Manager: Tracy Kimbrel
CCF
 Division of Computing and Communication Foundations
CSE
 Directorate for Computer and Information Science and Engineering
Start Date: July 1, 2012
End Date: June 30, 2017 (Estimated)
Total Intended Award Amount: $1,250,000.00
Total Awarded Amount to Date: $1,250,000.00
Funds Obligated to Date: FY 2012 = $679,264.00
FY 2013 = $279,564.00
FY 2015 = $291,172.00
History of Investigator:
  • Russell Impagliazzo (Principal Investigator)
    russell@cs.ucsd.edu
  • Samuel Buss (Co-Principal Investigator)
  • Ramamohan Paturi (Co-Principal Investigator)
Recipient Sponsored Research Office: University of California-San Diego
9500 GILMAN DR
LA JOLLA
CA  US  92093-0021
(858)534-4896
Sponsor Congressional District: 50
Primary Place of Performance: University of California-San Diego
Department of Computer Science
La Jolla
CA  US  92093-0404
Primary Place of Performance Congressional District: 50
Unique Entity Identifier (UEI): UYTTZT6G9DT1
Parent UEI:
NSF Program(s): Algorithmic Foundations
Primary Program Source: 01001213DB NSF RESEARCH & RELATED ACTIVIT
01001314DB NSF RESEARCH & RELATED ACTIVIT
01001516DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s): 7927, 7926, 7925
Program Element Code(s): 779600
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

Meta-algorithms are algorithms that take other algorithms as input.
Meta-algorithms are important in a variety of applications, from
minimizing circuits in VLSI to verifying hardware and software to
machine learning. Lower bound proofs show that computational problems
are difficult in the sense of requiring a prohibitive
amount of time, memory, or other resource to solve.
This is particularly important in the context of cryptography,
where it is vital to ensure that no feasible adversary can break
a code. Surprisingly, recent research by the PIs and others
shows that designing meta-algorithms is, in a formal sense,
equivalent to proving lower bounds. In other words, one can prove a
negative (the non-existence of a small circuit to solve a problem) by
a positive (devising a new meta-algorithm). This was the key to a
breakthrough by PI Williams, proving lower bounds on constant-depth
circuits with modular counting gates.

The proposed research will utilize this connection both to
design new meta-algorithms and to prove new lower bounds.
A primary focus will be on meta-algorithms for
deciding if a given algorithm is 'trivial' or not, such as algorithms
for the Boolean satisfiability problem. The proposed research will devise new
algorithms that improve over exhaustive search for many variants
of satisfiability. On the other hand, it will also explore
complexity-theoretic limitations on how much improvement is
possible, using reductions and lower bounds for restricted
models. Satisfiability will provide a starting point for a more
general understanding of the exact complexities of other NP-complete
problems such as the traveling salesman problem and k-colorability.
The proposal addresses both worst-case performance and the use
of fast algorithms as heuristics for solving these problems.
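
To illustrate what "improving over exhaustive search" means here, below is a minimal Python sketch of Schöning's random-walk algorithm for 3-SAT, a well-known example from the literature (illustrative only; the function name and input format are our own, and this is not code from the project). Restarting from random assignments and flipping variables of falsified clauses finds a satisfying assignment in expected time roughly (4/3)^n, versus the 2^n of brute-force search.

import random

def schoening_3sat(clauses, n, tries=200):
    """Schoening's random-walk search for 3-SAT (illustrative sketch).
    clauses: list of clauses, each a list of nonzero ints; literal k
    means variable |k|, with k > 0 positive and k < 0 negated.
    Variables are numbered 1..n.  Returns a satisfying assignment
    (dict) or None if none was found."""
    for _ in range(tries):
        # Start from a uniformly random assignment ...
        assign = {v: random.choice([True, False]) for v in range(1, n + 1)}
        for _ in range(3 * n):  # ... and take 3n local-search steps.
            falsified = [c for c in clauses
                         if not any((lit > 0) == assign[abs(lit)] for lit in c)]
            if not falsified:
                return assign
            # Flip one variable chosen from a random falsified clause.
            lit = random.choice(random.choice(falsified))
            assign[abs(lit)] = not assign[abs(lit)]
    return None

# (x1 or x2 or not x3) and (not x1 or x2 or x3): easily satisfiable.
print(schoening_3sat([[1, 2, -3], [-1, 2, 3]], n=3))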

This exploration will be mainly mathematical. However, when
new algorithms and heuristics are developed, they will be
implemented and the resulting software made widely available.
This research will be incorporated in courses taught by
the PIs, at both graduate and undergraduate levels.
Both graduate and undergraduate students will perform research
as part of the project.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH

(Showing: 1 - 10 of 73)
Daniel Moeller, Ramamohan Paturi, Stefan Schneider "Subquadratic Algorithms for Succinct Stable Matching." CSR , v.11 , 2016 , p.294
Impagliazzo, Russell; Kabanets, Valentine; Wigderson, Avi "New Direct-Product Testers and 2-Query PCPs" SIAM Journal on Computing , v.41 , 2012 , p.1722-1768
Sam Buss and Doug Cenzer and Jeffrey B. Remmel "Sub-computable Bounded Randomness" Logical Methods in Computer Science , v.10 , 2014 , p.Article 1
Sam Buss, Mia Minnes "Probabilistic algorithmic randomness" J. Symb. Log. , v.78 , 2013 , p.579
Sam Buss, Michael Soltys "Unshuffling a square is NP-hard" J. Comput. Syst. Sci , v.80 , 2014 , p.766
Samuel R. Buss "Alternation Trading Proofs and Their Limitations" MFCS , v.38 , 2013 , p.1
Alexander Golovnev, Alexander S. Kulikov, Ivan Mihajlin: "Families with Infants: Speeding Up Algorithms for NP-Hard Problems Using FFT" ACM Transactions on Algorithms , v.12 , 2016 , p.35
Arnold Beckmann and Samuel R. Buss "Improved Witnessing and Local Improvement Principles for Second-Order Bounded Arithmetic" Transactions on Computational Logic , v.15 , 2014 , p.Article 2
Arnold Beckmann and Samuel R. Buss and Sy-David Friedman "Safe Recursive Set Functions" Journal of Symbolic Logic , v.80 , 2015 , p.730

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

Intellectual merit:

Theoretical computer science concerns itself with the computational resources, such as the number of operations or bits of memory, that a computer needs to solve a computational problem.  Traditionally, the field is divided into two sub-fields: algorithm design, which seeks new ways of solving problems quickly or with few resources; and computational complexity, which characterizes problems that are "hard" in the sense that they require large amounts of resources.  Our research exploits paradoxical connections between these two areas to make progress on both.  We use the reasons why some problems are difficult to design new algorithms that solve other problems more efficiently, and we use new algorithms for some problems to show that others are difficult.

As an example, we designed a new learning algorithm that, from random labelled examples of a function of a certain type (one expressible with a small number of layers of AND, OR, and parity gates), deduces how to compute the function.  This solved a problem that had been open for twenty-five years, and the paper won the Best Paper Award at CCC 2016.  The algorithm is completely different from any in the previous learning theory literature, because it combines two ideas from complexity theory: the circuit size lower bound of Razborov and Smolensky, and the Nisan-Wigderson method for removing randomness from algorithms.

We also bridge another divide, showing connections between the difficulty of "very hard" problems that require exponential time and that of relatively easy problems for which algorithm designers are still working to make further improvements.

For example, the stable marriage problem was introduced by Gale and Shapley (and was cited as a major contribution in Shapley's Nobel Memorial Prize in Economics).  The problem is to match partners so that no two people would both prefer each other to their assigned matches.  It is widely used for applications such as pairing medical residents with hospitals.  It has a reasonably efficient algorithm, but when the input is described succinctly, one could hope to do better.  We examined this problem from a complexity-theoretic viewpoint, obtaining both new algorithms for many special cases of stable marriage and hardness results showing that improved algorithms for other cases would violate a widely-held conjecture about the complexity of NP-complete problems.
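
For concreteness, the "reasonably efficient algorithm" for explicitly-given preference lists is the classical Gale-Shapley proposal procedure, which runs in O(n^2) time.  The Python sketch below is a textbook version for illustration only (the function name and input format are our own); the project's succinct-input algorithms and hardness results are not reproduced here.

def gale_shapley(men_prefs, women_prefs):
    """Classical Gale-Shapley stable matching (textbook sketch).
    men_prefs[m] / women_prefs[w] are complete preference lists.
    Returns a dict woman -> man that is stable: no man and woman
    both prefer each other to their assigned partners."""
    # Precompute each woman's ranking of the men for O(1) comparisons.
    rank = {w: {m: i for i, m in enumerate(prefs)}
            for w, prefs in women_prefs.items()}
    next_choice = {m: 0 for m in men_prefs}  # next woman m proposes to
    engaged = {}                             # woman -> current partner
    free_men = list(men_prefs)
    while free_men:
        m = free_men.pop()
        w = men_prefs[m][next_choice[m]]
        next_choice[m] += 1
        if w not in engaged:
            engaged[w] = m                   # w accepts her first proposal
        elif rank[w][m] < rank[w][engaged[w]]:
            free_men.append(engaged[w])      # w trades up; old partner is free
            engaged[w] = m
        else:
            free_men.append(m)               # w rejects m; he proposes again
    return engaged

print(gale_shapley({"a": ["x", "y"], "b": ["y", "x"]},
                   {"x": ["b", "a"], "y": ["a", "b"]}))
# -> {'y': 'b', 'x': 'a'}, a stable matching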

We also made progress on understanding the complexity of hard problems such as Satisfiability.  New algorithms for hard problems were designed, many using circuit lower bound arguments.  Somewhat paradoxically, satisfiability is both the canonical hard (NP-complete) problem and a problem on which heuristic algorithms solve many typical instances.  We used proof complexity to characterize both the power and the limitations of different heuristic methods; for example, we gave provable examples where the newer clause-learning technique is superior to any implementation of the older Davis-Putnam-style algorithms.  We also showed the first examples where memory restrictions would cost a SAT solver a huge amount of time, even when the memory is much larger than the input formula.
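
For background, a Davis-Putnam-style solver is a backtracking search with unit propagation; clause-learning (CDCL) solvers extend it by recording a new clause at every conflict instead of simply backtracking.  The Python sketch below is a minimal textbook DPLL for illustration only (not project code, and far simpler than the industrial solvers these results concern).

def dpll(clauses):
    """Minimal Davis-Putnam-Logemann-Loveland (DPLL) satisfiability test.
    clauses: list of sets of nonzero ints (literal k = variable |k|,
    sign = polarity).  Returns True iff the formula is satisfiable."""
    # Unit propagation: a one-literal clause forces its variable.
    while True:
        unit = next((next(iter(c)) for c in clauses if len(c) == 1), None)
        if unit is None:
            break
        simplified = []
        for c in clauses:
            if unit in c:
                continue                  # clause already satisfied
            reduced = c - {-unit}         # drop the falsified literal
            if not reduced:
                return False              # empty clause: conflict
            simplified.append(reduced)
        clauses = simplified
    if not clauses:
        return True                       # every clause satisfied
    lit = next(iter(clauses[0]))          # case split on some literal
    return dpll(clauses + [{lit}]) or dpll(clauses + [{-lit}])

# Unsatisfiable: the four clauses rule out every assignment to x1, x2.
print(dpll([{1, 2}, {-1, 2}, {1, -2}, {-1, -2}]))  # False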

Broader impact:

The divide between algorithm design and computational complexity was social as well as technical.  Since researchers in the two communities are not used to each other's ideas and approaches, and have mostly worked separately, technical results explaining the connections between the two areas do not automatically create collaborations between the fields.  With this in mind, we organized a special semester at the Simons Institute in Berkeley to bring the two groups of researchers together.  The semester program was very successful, both in the research performed during it (solving many open problems and launching many new directions) and in creating collaborations between algorithm designers and complexity theorists.  Moreover, many young researchers became interested and involved in this new approach.

We also looked for ways to use theoretical ideas to improve undergraduate education.  One way we did this was by directly involving undergraduates in our research, such as Cody Murray, now a Ph.D. student at MIT working with non-UCSD PI Williams.  We also used algorithmic problems that arose in our research as example problems in undergraduate courses.  Finally, some of us were instrumental in creating programs to ease the transition into the computer science major for new undergraduates with little computing experience.


Last Modified: 11/28/2017
Modified by: Ramamohan Paturi
