
NSF Org: DMS Division Of Mathematical Sciences
Initial Amendment Date: July 28, 2020
Latest Amendment Date: July 28, 2020
Award Number: 2012465
Award Instrument: Standard Grant
Program Manager: Yuliya Gorb, ygorb@nsf.gov, (703) 292-2113, DMS Division Of Mathematical Sciences, MPS Directorate for Mathematical and Physical Sciences
Start Date: August 1, 2020
End Date: July 31, 2024 (Estimated)
Total Intended Award Amount: $249,999.00
Total Awarded Amount to Date: $249,999.00
Recipient Sponsored Research Office: 160 Aldrich Hall, Irvine, CA 92697-0001, US; (949) 824-7295
Primary Place of Performance: Rowland Hall, Room 510F, Irvine, CA 92697-3875, US
NSF Program(s): COMPUTATIONAL MATHEMATICS
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.049
ABSTRACT
This project incorporates several recent developments in optimization methods and nonlinear multigrid methods to provide new techniques that improve the computational efficiency of practical applications. Successful integration of these fast optimization methods will open a wide range of applications, from the numerical solution of partial differential equations to optimization methods for large-scale machine learning. Social media platforms such as Facebook and GitHub will be used to disseminate the basics of applied and computational mathematics, promote the research to a wider audience in both academia and industry, and increase public awareness of how computational mathematics helps advance research in the physical and data sciences. The project will also provide training opportunities for graduate students.
The project focuses on a particular nonlinear multigrid method, the fast subspace descent (FASD) method, for solving optimization problems arising from applications such as the numerical solution of partial differential equations and data science. For example, the nonlinear multigrid methods to be studied can address challenging problems in engineering applications, including gradient flows in phase field models, the Poisson-Boltzmann equation in mathematical biology, and convex composite optimization problems in data science. Acceleration has been one of the most productive ideas in modern optimization theory, and the framework developed in this project brings new insight and mathematical tools to the design and analysis of both old and new optimization methods, especially accelerated gradient descent methods. Another important aspect of this project is a rigorous theoretical foundation for a large class of optimization methods.
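As an illustration of the subspace-descent idea underlying FASD, the following is a minimal sketch of successive subspace correction on a quadratic model problem. The block decomposition, test matrix, and function names are illustrative assumptions, not the project's method: the actual FASD scheme uses a multilevel hierarchy of coarse spaces and handles nonlinear objectives.

```python
import numpy as np

def block_coordinate_descent(A, b, blocks, sweeps):
    """Successive subspace correction for f(x) = 0.5 x^T A x - b^T x:
    each sweep minimizes f exactly over each coordinate block in turn.
    This is a generic subspace-descent skeleton, not FASD itself
    (FASD replaces coordinate blocks with a multilevel space hierarchy)."""
    x = np.zeros_like(b)
    for _ in range(sweeps):
        for idx in blocks:
            r = b - A @ x  # residual at the current iterate
            # exact minimization over the subspace spanned by block `idx`
            x[idx] += np.linalg.solve(A[np.ix_(idx, idx)], r[idx])
    return x

# Small SPD model problem: 1D Laplacian-like matrix (illustrative).
n = 8
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
blocks = [np.arange(0, n // 2), np.arange(n // 2, n)]  # two subspaces
x = block_coordinate_descent(A, b, blocks, 50)
```

With only two overlapping-free subspaces this converges linearly; the multilevel decomposition used by FASD-type methods is what makes the rate independent of the problem size.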
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH
PROJECT OUTCOMES REPORT
Disclaimer
This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.
This project made significant advances in optimization methods, data science applications, and the resolution of complex mathematical problems. Key achievements include:
- Novel Convergence Analysis: Conducted a novel convergence analysis of the Fast Subspace Descent Methods (FASD) for convex optimization problems, enhancing the robustness and applicability of these methods.
- Unified Framework for Acceleration: Developed a unified framework for designing and analyzing accelerated optimization methods, strengthening their theoretical foundation and efficiency.
- Accelerated Over-Relaxation Heavy-Ball (AOR-HB) Method: Created the AOR-HB method, which enables global and accelerated convergence for solving optimization problems with superior generalization ability.
- Transformed Primal-Dual with Variable Preconditioners (TPDv) Algorithm: Introduced a transformed primal-dual gradient flow technique and developed the TPDv algorithm, which demonstrates superior performance on nonlinear problems compared to existing methods.
- Deep Learning Method for Real-Time Simulations
The integration of advanced optimization and computational algorithms has been a crucial intellectual development, particularly in data science and machine learning. This project contributed new insights and mathematical tools for designing and analyzing optimization methods, especially accelerated optimization techniques. It also established a rigorous theoretical foundation for a broad class of optimization methods, significantly advancing modern optimization theory.
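For context, the AOR-HB method named above builds on Polyak's classical heavy-ball iteration, sketched below on a simple quadratic. The over-relaxation that gives AOR-HB its accelerated global convergence is not reproduced here; the step size and momentum shown are the textbook choices for a quadratic with known extreme eigenvalues, and the test problem is an illustrative assumption.

```python
import numpy as np

def heavy_ball(grad, x0, step, momentum, iters):
    """Classical Polyak heavy-ball iteration:
    x_{k+1} = x_k - step * grad(x_k) + momentum * (x_k - x_{k-1})."""
    x, x_prev = x0.copy(), x0.copy()
    for _ in range(iters):
        x, x_prev = x - step * grad(x) + momentum * (x - x_prev), x
    return x

# Quadratic f(x) = 0.5 x^T A x with condition number kappa = L / mu = 100.
A = np.diag([1.0, 100.0])
grad = lambda x: A @ x
L, mu = 100.0, 1.0
kappa = L / mu
# Standard heavy-ball parameters for a quadratic objective:
step = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2
momentum = ((np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)) ** 2

x = heavy_ball(grad, np.array([1.0, 1.0]), step, momentum, 300)
```

With these parameters the iteration contracts at the accelerated rate (sqrt(kappa) - 1)/(sqrt(kappa) + 1) per step on this quadratic, versus roughly (kappa - 1)/(kappa + 1) for plain gradient descent.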
Broader Impacts
Last Modified: 08/15/2024
Modified by: Long Chen
Please report errors in award information by writing to: awardsearch@nsf.gov.