
NSF Org: DMS Division Of Mathematical Sciences
Initial Amendment Date: August 11, 2018
Latest Amendment Date: August 12, 2019
Award Number: 1821149
Award Instrument: Continuing Grant
Program Manager: Christopher Stark, DMS Division Of Mathematical Sciences, MPS Directorate for Mathematical and Physical Sciences
Start Date: August 15, 2018
End Date: July 31, 2022 (Estimated)
Total Intended Award Amount: $140,000.00
Total Awarded Amount to Date: $140,000.00
Funds Obligated to Date: FY 2019 = $93,742.00
Recipient Sponsored Research Office: 2601 WOLF VILLAGE WAY, RALEIGH, NC, US 27695-0001, (919) 515-2444
Primary Place of Performance: 2108 SAS Hall, Raleigh, NC, US 27695-8205
NSF Program(s): CDS&E-MSS
Primary Program Source: 01001920DB NSF RESEARCH & RELATED ACTIVITIES
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.049
ABSTRACT
Many tasks in scientific computing involve either data or operators that are inherently multidimensional: for example, a database of gray-scale images constitutes a three-dimensional array when each image is stored in a standard two-dimensional array format. Yet many standard numerical methods treat the data and associated operators as two-dimensional arrays, or matrices. This suggests that additional structure that could be leveraged for computational gain may be going undiscovered and underutilized. Recent research has shown that tensors (multidimensional arrays) and several types of corresponding decomposition methods can be instrumental in revealing latent correlations in both data and operators residing in high-dimensional spaces. Indeed, tensor decompositions can be provably superior to their matrix-based counterparts in representing certain types of data. This research project tackles two important questions: (1) how to uncover latent structure in data and operators using multidimensional tensor factorizations, and (2) how to use the revealed structure to develop a powerful computational framework that harnesses its benefits. Hands-on teaching material on randomized matrix methods and tensor decompositions will be developed for graduate courses. This teaching material, in the form of Python notebooks, along with the code developed as part of this project, will be freely available as a software library under an open-source license.
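For illustration only (not code from the award): the minimal NumPy sketch below treats a small synthetic stack of gray-scale images as a three-way tensor and compresses it with a truncated Tucker decomposition computed by the higher-order SVD. The shapes, ranks, and helper names (unfold, hosvd) are assumptions chosen for this example.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for an image database: 100 gray-scale "images" of
# size 64 x 64 pixels, stacked into a 64 x 64 x 100 tensor. The data is
# built to have low multilinear rank, so the compression is exact up to
# rounding error.
U = rng.standard_normal((64, 5))
V = rng.standard_normal((64, 5))
W = rng.standard_normal((100, 5))
G_true = rng.standard_normal((5, 5, 5))
X = np.einsum('abc,ia,jb,kc->ijk', G_true, U, V, W)

def unfold(T, mode):
    # Mode-n unfolding: bring `mode` to the front and flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T, ranks):
    # Truncated higher-order SVD: one factor matrix per mode, plus a core.
    factors = [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
               for m, r in enumerate(ranks)]
    core = np.einsum('ijk,ia,jb,kc->abc', T, *factors)
    return core, factors

core, (A, B, C) = hosvd(X, ranks=(5, 5, 5))
X_hat = np.einsum('abc,ia,jb,kc->ijk', core, A, B, C)
print("relative error:", np.linalg.norm(X_hat - X) / np.linalg.norm(X))
print("entries stored:", core.size + A.size + B.size + C.size, "vs", X.size)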
The investigators aim to answer these questions in the context of two applications of major importance and far-reaching consequences in scientific computing: model reduction, a mathematical framework for reducing the computational cost associated with high-fidelity simulations of complicated physical phenomena, and structured matrix approximation, which is important in applications such as parametric partial differential equations and image deblurring. The work approaches these questions through a new, multidimensional lens, providing computational efficiencies and structure that can only be obtained by moving to a higher-dimensional regime. Two signature features of the project are: (1) the design and analysis of structured tensor decompositions that are efficient in terms of computation and memory access, obtained by combining randomized matrix methods with the algebraic structure of tensor decompositions; and (2) the use of tensor models, and a corresponding suite of structured decompositions, to exploit latent multidimensional structure in model reduction and structured matrix approximation. The research is expected to benefit numerous other applications in science and engineering as well.
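As a hedged illustration of the randomized matrix methods mentioned above (a sketch under assumed shapes and names, not the project's implementation), the basic randomized range finder draws a Gaussian test matrix, sketches the target matrix, and orthonormalizes the result, giving a cheap low-rank factorization without a full SVD.

import numpy as np

def randomized_range_finder(A, rank, oversample=10, seed=None):
    # Sketch A with a Gaussian test matrix and orthonormalize the result;
    # the columns of Q approximately span the dominant range of A.
    rng = np.random.default_rng(seed)
    Omega = rng.standard_normal((A.shape[1], rank + oversample))
    Y = A @ Omega
    Q, _ = np.linalg.qr(Y)
    return Q

# Usage on a synthetic rank-50 matrix: A is approximated by Q @ (Q.T @ A)
# without ever computing a full SVD of A.
rng = np.random.default_rng(1)
A = rng.standard_normal((2000, 50)) @ rng.standard_normal((50, 500))
Q = randomized_range_finder(A, rank=50, seed=2)
print("relative error:", np.linalg.norm(A - Q @ (Q.T @ A)) / np.linalg.norm(A))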
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
PROJECT OUTCOMES REPORT
Disclaimer
This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.
This project used tensor decompositions to uncover and exploit latent multidimensional structure in a range of problems in scientific computing and data science. State-of-the-art techniques in this area typically treated the data and operators as two-dimensional arrays and then used matrix-based methods. In contrast, our tensor-based techniques often performed much better in terms of storage costs and accuracy because they better exploit the multidimensional structure. A key ingredient in our approach was the development of novel randomized numerical linear algebra algorithms and analysis, which led to dramatic reductions in computational cost with provable probabilistic guarantees. The key outcomes of this project are:
1. Development of randomized algorithms for tensor decompositions in the Tucker and Tensor Train formats (an illustrative Tensor Train sketch appears after this list).
2. Development of novel randomized and tensor algorithms for projection-based nonlinear model reduction for partial differential equations.
3. Development of tensor-based algorithms for function and kernel approximations with an application to rank-structured matrices.
4. Development of randomized algorithms for generalized SVD with applications to sensitivity analysis.
5. Development of randomized matrix and tensor-based algorithms for subspace system identification.
6. Development of efficient algorithms for storing and representing structured matrices by using invertible matrix-to-tensor mappings.
7. Development of tensor-based methods for optimal sensor placement for flow reconstructions.
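As a hedged illustration of outcome 1 (not the project's released code), the sketch below implements the standard TT-SVD construction of a Tensor Train decomposition in plain NumPy; the test tensor, ranks, and function names are assumptions chosen for this example.

import numpy as np

def tt_svd(T, max_rank):
    # Decompose T into tensor-train cores G_k of shape (r_{k-1}, n_k, r_k)
    # by sweeping through the modes with truncated SVDs.
    dims = T.shape
    cores, r_prev = [], 1
    C = np.asarray(T)
    for n in dims[:-1]:
        C = C.reshape(r_prev * n, -1)
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        r = min(max_rank, s.size)
        cores.append(U[:, :r].reshape(r_prev, n, r))
        C = s[:r, None] * Vt[:r]          # remainder passed down the chain
        r_prev = r
    cores.append(C.reshape(r_prev, dims[-1], 1))
    return cores

def tt_full(cores):
    # Contract the cores back into a full tensor (only to check accuracy).
    T = cores[0]
    for G in cores[1:]:
        T = np.tensordot(T, G, axes=([-1], [0]))
    return np.squeeze(T, axis=(0, -1))    # drop the size-1 boundary ranks

# A smooth 4-way test tensor (Hilbert-like), which is well approximated
# with small TT ranks.
i, j, k, l = np.ogrid[0:8, 0:9, 0:10, 0:11]
X = 1.0 / (1.0 + i + j + k + l)
cores = tt_svd(X, max_rank=6)
print("core shapes:", [G.shape for G in cores])
print("relative error:", np.linalg.norm(tt_full(cores) - X) / np.linalg.norm(X))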
The novel algorithms developed as part of this project were demonstrated on a range of test problems involving both synthetic and real-world data from several applications and disciplines. We believe the advances made in this project will benefit not only other areas of computational mathematics, such as model reduction and sensitivity analysis, but also areas of science and engineering such as statistics, electrical and mechanical engineering, and the cross-cutting area of fluid dynamics.
The project led to multiple software codes for randomized tensor decompositions, system identification, and flow reconstruction, which were released to the public via GitHub under appropriate licenses.
The project supported one PhD student (North Carolina State University, or NC State) and two Master's students (Tufts University), who were exposed to modern research areas in scientific computing and data science and to an interdisciplinary research environment.
This project also led to the development of educational materials for a special topics class held at NC State. The PIs developed tutorials on tensor decompositions and randomized algorithms, delivered them at several international conferences and workshops, and organized minisymposia on tensor decompositions at international conferences with an emphasis on inviting early-career researchers and graduate students.
Last Modified: 12/13/2022
Modified by: Arvind K Saibaba