Award Abstract # 2309549
Acceleration, Complexity and Implementation of Active Set Methods for Large-scale Sparse Nonlinear Optimization

NSF Org: DMS (Division Of Mathematical Sciences)
Recipient: LOUISIANA STATE UNIVERSITY
Initial Amendment Date: May 31, 2023
Latest Amendment Date: May 31, 2023
Award Number: 2309549
Award Instrument: Standard Grant
Program Manager: Jodi Mead
  jmead@nsf.gov
  (703)292-7212
  DMS (Division Of Mathematical Sciences)
  MPS (Directorate for Mathematical and Physical Sciences)
Start Date: July 15, 2023
End Date: June 30, 2026 (Estimated)
Total Intended Award Amount: $236,770.00
Total Awarded Amount to Date: $236,770.00
Funds Obligated to Date: FY 2023 = $236,770.00
History of Investigator:
  • Hongchao Zhang (Principal Investigator)
    hozhang@math.lsu.edu
Recipient Sponsored Research Office: Louisiana State University
202 HIMES HALL
BATON ROUGE
LA  US  70803-0001
(225)578-2760
Sponsor Congressional District: 06
Primary Place of Performance: Louisiana State University
202 HIMES HALL
BATON ROUGE
LA  US  70803-0001
Primary Place of Performance Congressional District: 06
Unique Entity Identifier (UEI): ECQEYCHRNKJ4
Parent UEI:
NSF Program(s): COMPUTATIONAL MATHEMATICS
Primary Program Source: 01002324DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s): 9263
Program Element Code(s): 127100
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.049

ABSTRACT

Large-scale nonconvex sparse nonlinear optimization problems arise frequently in modern applications where speed, stability, and solution accuracy are critically important, including optimal control, image processing, and stochastic learning. This project improves both the implementation and the theory of current active set methods for solving large-scale nonlinear optimization problems. Innovations in the project will help clarify the convergence rates and computational complexities of active set methods, questions that have not been fully addressed in the literature. The algorithms and software developed in the project will benefit not only research in computational optimization but also the investigation of new methods in broader areas of computational science. All graduate and undergraduate students supported by this project will have opportunities to perform interdisciplinary research in both computational mathematics and data science.

Although interior point methods successfully solve optimization problems with excellent computational complexity, global computational complexity results for active set methods for constrained optimization are rare. This project develops practical, efficient, and robust active set algorithms and software that solve large-scale sparse optimization problems to high accuracy with both fast local convergence and guaranteed global computational complexity. In particular, by exploiting affine-scaling techniques and second-order information, the developed methods will have accelerated asymptotic convergence speed and guaranteed global iteration complexity, and will converge to a (weak) second-order stationary point. In addition, by combining this approach with a generalized minimum-eigenvalue procedure and a conjugate gradient method with negative-curvature line search, the developed algorithms are expected to have excellent practical performance. All algorithms will be developed carefully from both theoretical and implementation perspectives to ensure the eventual success of the implemented software.
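The negative-curvature ingredient mentioned above is a standard building block in second-order methods. As a purely illustrative sketch (not the project's algorithm), a truncated conjugate gradient loop applied to the Newton system H p = -g can either solve the system or detect a direction d of nonpositive curvature (d'Hd <= 0), which a negative-curvature line search can then exploit to escape saddle points; all function names here are hypothetical:

```python
def matvec(H, x):
    # Dense matrix-vector product; H is given as a list of rows.
    return [sum(h * xj for h, xj in zip(row, x)) for row in H]

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def cg_detect_negative_curvature(H, g, tol=1e-10, max_iter=100):
    """Truncated CG on the quadratic model m(p) = g'p + 0.5 p'Hp.

    Returns (vec, kind): kind == 'solution' when H p = -g is solved to
    tolerance, or 'negative_curvature' together with a direction d
    satisfying d'Hd <= 0, which a negative-curvature line search can use.
    """
    n = len(g)
    p = [0.0] * n
    r = list(g)                       # residual r = H p + g (p = 0 initially)
    d = [-ri for ri in r]             # initial search direction
    for _ in range(max_iter):
        Hd = matvec(H, d)
        curv = dot(d, Hd)
        if curv <= 0.0:               # nonpositive curvature detected
            return d, "negative_curvature"
        alpha = dot(r, r) / curv
        p = [pi + alpha * di for pi, di in zip(p, d)]
        r_new = [ri + alpha * hdi for ri, hdi in zip(r, Hd)]
        if dot(r_new, r_new) ** 0.5 <= tol:
            return p, "solution"
        beta = dot(r_new, r_new) / dot(r, r)
        d = [-ri + beta * di for ri, di in zip(r_new, d)]
        r = r_new
    return p, "solution"              # iteration budget exhausted; best iterate
```

On a positive definite H this reduces to ordinary CG; on an indefinite H it stops at the first direction of nonpositive curvature rather than dividing by it, which is why such a test also underpins complexity guarantees for convergence to second-order stationary points.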

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH

Wu, Jiayuan; Hu, Jiang; Zhang, Hongchao; Wen, Zaiwen. "Convergence Analysis of an Adaptively Regularized Natural Gradient Method." IEEE Transactions on Signal Processing, v.72, 2024. https://doi.org/10.1109/TSP.2024.3398496

Zhang, Miao; Zhang, Hongchao. "A unified proximal gradient method for nonconvex composite optimization with extrapolation." Numerical Algebra, Control and Optimization, v.0, 2024. https://doi.org/10.3934/naco.2024005

Chen, Xiaobing; Zhou, Xiangwei; Zhang, Hongchao; Sun, Mingxuan; Poor, H. Vincent. "Client Selection for Wireless Federated Learning With Data and Latency Heterogeneity." IEEE Internet of Things Journal, 2024. https://doi.org/10.1109/JIOT.2024.3425757
