Award Abstract # 2110722
Optimization Methods for Nonconvex Structured Optimization

NSF Org: DMS
Division Of Mathematical Sciences
Recipient: LOUISIANA STATE UNIVERSITY
Initial Amendment Date: May 27, 2021
Latest Amendment Date: May 27, 2021
Award Number: 2110722
Award Instrument: Standard Grant
Program Manager: Yuliya Gorb
ygorb@nsf.gov
(703) 292-2113
DMS Division Of Mathematical Sciences
MPS Directorate for Mathematical and Physical Sciences
Start Date: July 15, 2021
End Date: June 30, 2025 (Estimated)
Total Intended Award Amount: $150,000.00
Total Awarded Amount to Date: $150,000.00
Funds Obligated to Date: FY 2021 = $150,000.00
History of Investigator:
  • Hongchao Zhang (Principal Investigator)
    hozhang@math.lsu.edu
Recipient Sponsored Research Office: Louisiana State University
202 HIMES HALL
BATON ROUGE
LA  US  70803-0001
(225)578-2760
Sponsor Congressional District: 06
Primary Place of Performance: Louisiana State University
Baton Rouge, LA  US  70803-2701
Primary Place of Performance Congressional District: 06
Unique Entity Identifier (UEI): ECQEYCHRNKJ4
Parent UEI:
NSF Program(s): COMPUTATIONAL MATHEMATICS
Primary Program Source: 01002122DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s): 9150, 9263
Program Element Code(s): 127100
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.049

ABSTRACT

This project will advance fundamental algorithmic theory and software tools for solving optimization problems with wide applications in science, engineering, and industry. Specifically, the project addresses structured nonconvex nonlinear optimization, a critical component of many modern applications ranging from signal/image processing and real-time optimal control to stochastic learning. The project aims to develop algorithms that emphasize speed, adaptation to problem structure, and ease of use for researchers in both the optimization and computational data science communities. Students will be involved in the research and will have opportunities for interdisciplinary work. Software implementing the new algorithms will be developed.

This project will develop theoretically sound and numerically efficient algorithms, as well as accompanying software, for solving nonconvex structured optimization problems. These algorithms will solve their subproblems inexactly while retaining guaranteed global convergence, and will achieve optimal computational complexity when the problem has convexity structure. They will build on recent work on proximal and stochastic gradient methods for structured composite minimization, inexact alternating direction methods of multipliers (ADMM) for separable convex/nonconvex optimization, and active set methods for polyhedral constrained optimization. In addition, second-order techniques for accelerating convergence will also be explored.
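To illustrate the kind of structured composite minimization the proximal gradient methods above target, here is a minimal sketch, not the project's software: it applies a fixed-step proximal gradient iteration to the LASSO problem min (1/2)||Ax - b||^2 + lam*||x||_1, whose nonsmooth term has a closed-form proximal operator (soft thresholding). The function names, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise soft thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, step, iters=500):
    # Minimize (1/2)||Ax - b||^2 + lam * ||x||_1 with a fixed step size.
    # A valid step is step <= 1 / L, where L = ||A^T A|| is the gradient's
    # Lipschitz constant.
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                      # gradient of smooth part
        x = soft_threshold(x - step * grad, step * lam)  # prox step on nonsmooth part
    return x
```

Each iteration takes a gradient step on the smooth term and then applies the proximal map of the nonsmooth term; the inexact and accelerated variants studied in this project relax the exact subproblem solves and add extrapolation or second-order information.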

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH

Note:  When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from this site.

(Showing: 1 - 10 of 12)
Chen, Xiaobing and Zhou, Xiangwei and Zhang, Hongchao and Sun, Mingxuan and Poor, H. Vincent "Client Selection for Wireless Federated Learning With Data and Latency Heterogeneity" IEEE Internet of Things Journal, 2024 https://doi.org/10.1109/JIOT.2024.3425757
Huang, Yakui and Dai, Yu-Hong and Liu, Xin-Wei and Zhang, Hongchao "On the Asymptotic Convergence and Acceleration of Gradient Methods" Journal of Scientific Computing, v.90, 2022 https://doi.org/10.1007/s10915-021-01685-8
Zhang, Miao and Zhang, Hongchao "A unified proximal gradient method for nonconvex composite optimization with extrapolation" Numerical Algebra, Control and Optimization, 2024 https://doi.org/10.3934/naco.2024005
Hager, William W. and Zhang, Hongchao "Algorithm 1035: A Gradient-based Implementation of the Polyhedral Active Set Algorithm" ACM Transactions on Mathematical Software, v.49, 2023 https://doi.org/10.1145/3583559
Jiang, Fan and Wu, Zhongming and Cai, Xingju and Zhang, Hongchao "Unified linear convergence of first-order primal-dual algorithms for saddle point problems" Optimization Letters, v.16, 2022 https://doi.org/10.1007/s11590-021-01832-y
"Convergence on a Symmetric Accelerated Stochastic ADMM with Larger Stepsizes" CSIAM Transactions on Applied Mathematics, v.3, 2022 https://doi.org/10.4208/csiam-am.SO-2021-0021
Tang, Jingyong and Zhou, Jinchuan and Zhang, Hongchao "An Accelerated Smoothing Newton Method with Cubic Convergence for Weighted Complementarity Problems" Journal of Optimization Theory and Applications, v.196, 2023 https://doi.org/10.1007/s10957-022-02152-6
Wu, Jiayuan and Hu, Jiang and Zhang, Hongchao and Wen, Zaiwen "Convergence Analysis of an Adaptively Regularized Natural Gradient Method" IEEE Transactions on Signal Processing, v.72, 2024 https://doi.org/10.1109/TSP.2024.3398496
Bai, Jianchao and Hager, William W. and Zhang, Hongchao "An inexact accelerated stochastic ADMM for separable convex optimization" Computational Optimization and Applications, v.81, 2022 https://doi.org/10.1007/s10589-021-00338-8
Brenner, Susanne C. and Sung, Li-yeng and Tan, Zhiyu and Zhang, Hongchao "A convexity enforcing $C^0$ interior penalty method for the Monge-Ampère equation on convex polygonal domains" Numerische Mathematik, 2021 https://doi.org/10.1007/s00211-021-01210-x
Chang, Xiao-Kai and Yang, Junfeng and Zhang, Hongchao "Golden Ratio Primal-Dual Algorithm with Linesearch" SIAM Journal on Optimization, v.32, 2022 https://doi.org/10.1137/21M1420319
