Award Abstract # 0941530
CDI-Type II: Database enabled multiscale simulations and analysis of fluid turbulence

NSF Org: CMMI
Division of Civil, Mechanical, and Manufacturing Innovation
Recipient: THE JOHNS HOPKINS UNIVERSITY
Initial Amendment Date: August 18, 2009
Latest Amendment Date: August 18, 2009
Award Number: 0941530
Award Instrument: Standard Grant
Program Manager: Massimo Ruzzene
CMMI
 Division of Civil, Mechanical, and Manufacturing Innovation
ENG
 Directorate for Engineering
Start Date: September 1, 2009
End Date: August 31, 2015 (Estimated)
Total Intended Award Amount: $1,899,469.00
Total Awarded Amount to Date: $1,899,469.00
Funds Obligated to Date: FY 2009 = $1,899,469.00
History of Investigator:
  • Charles Meneveau (Principal Investigator)
    meneveau@jhu.edu
  • Alexander Szalay (Co-Principal Investigator)
  • Gregory Eyink (Co-Principal Investigator)
  • Shiyi Chen (Co-Principal Investigator)
  • Randal Burns (Co-Principal Investigator)
Recipient Sponsored Research Office: Johns Hopkins University
3400 N CHARLES ST
BALTIMORE
MD  US  21218-2608
(443)997-1898
Sponsor Congressional District: 07
Primary Place of Performance: Johns Hopkins University
3400 N CHARLES ST
BALTIMORE
MD  US  21218-2608
Primary Place of Performance Congressional District: 07
Unique Entity Identifier (UEI): FTMTDMBR29C7
Parent UEI: GS4PNKTRNKL3
NSF Program(s): CI REUSE, CDI TYPE II
Primary Program Source: 01000910DB NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s): 0000, 7721, 7722, 7725, 7751, OTHR
Program Element Code(s): 689200, 775100
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.041

ABSTRACT

To make significant new progress in the understanding and modeling of multi-scale, complex fluid flow phenomena such as turbulence, the entire 4-D structure of the simulated phenomena must be widely and easily accessible. To achieve such goals, the proposed project aims to couple database technology with high performance computing. Specifically, we will:
  • Develop a storage infrastructure that provides efficient parallel I/O to archived simulation data at all scales by adapting and improving techniques from recent visualization, database organization, and storage architecture research.
  • Construct a data-driven batch-processing framework that supports many concurrent data-intensive queries.
  • Develop tools to find "interesting regions" in the data, extract various features, and tag those regions with the relevant metadata.
  • Develop tools for interactive visualization over the Internet so that data are accessible at display resolution with queries that may request arbitrary sub-regions.
Several core databases will serve as content for the proposed research:
  • forced isotropic turbulent flow, to address fundamental unsolved questions on the multi-scale and Lagrangian structure of turbulent flow;
  • a large turbulent channel flow, of interest to engineering fluid dynamics;
  • simulations of magneto-hydrodynamic (MHD) turbulence of astrophysical relevance; and
  • a 24-hr daily cycle of the atmospheric boundary layer, including day-time buoyant conditions and night-time stably stratified conditions, of relevance to wind energy.
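The central query primitive the abstract describes is a "cutout": a user requests an arbitrary sub-region of an archived 4-D (time + 3-D space) field and the database returns only that slab. The sketch below is a minimal, purely illustrative model of that idea over a small in-memory field; the function names, array shapes, and row-major layout are assumptions for the example and are not the project's actual storage format or API.

```python
# Illustrative sketch (not the project's actual API): extracting an
# arbitrary sub-region ("cutout") from a 4-D field stored flat in memory,
# the kind of query the database infrastructure is designed to serve.

def flat_index(t, z, y, x, shape):
    """Row-major index into a flattened 4-D array of the given shape."""
    T, Z, Y, X = shape
    return ((t * Z + z) * Y + y) * X + x

def cutout(data, shape, t_range, z_range, y_range, x_range):
    """Extract an arbitrary sub-region of the stored field.

    Each *_range is a (start, stop) pair with Python slice semantics.
    Returns a nested list indexed as result[t][z][y][x].
    """
    return [[[[data[flat_index(t, z, y, x, shape)]
               for x in range(*x_range)]
              for y in range(*y_range)]
             for z in range(*z_range)]
            for t in range(*t_range)]

# Tiny synthetic field: each stored value encodes its own (t, z, y, x)
# coordinates, so we can verify the cutout returns the right slab.
shape = (4, 8, 8, 8)  # (time steps, z, y, x) -- sizes chosen for the demo
data = [None] * (4 * 8 * 8 * 8)
for t in range(4):
    for z in range(8):
        for y in range(8):
            for x in range(8):
                data[flat_index(t, z, y, x, shape)] = (t, z, y, x)

sub = cutout(data, shape, (1, 3), (2, 4), (0, 2), (5, 7))
print(len(sub), len(sub[0]), len(sub[0][0]), len(sub[0][0][0]))  # 2 2 2 2
print(sub[0][0][0][0])  # (1, 2, 0, 5): the slab keeps global coordinates
```

In the actual system this extraction runs server-side against chunked on-disk storage, so a remote user moves only the requested slab over the network rather than the full simulation archive.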

By tightly coupling the database approach to high-performance scientific simulation, the project has the potential to transform the way research in scientific computing of multi-scale phenomena is carried out and employed. If our approach proves successful, it will become routine for state-of-the-art computational datasets to be archived in databases with easy on-line access, flexible user-friendly analysis tools, and efficient coupling with further simulations. We will apply the tools to be developed to study various important fluid flow phenomena, such as turbulent and chaotic motions in neutral fluids, in plasmas, and in the atmosphere.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH


(Showing: 1 - 10 of 36)
C.C. Lalescu, C. Meneveau & G. Eyink "Synchronization of Chaos in Fluid Turbulence" Phys. Rev. Lett. , v.110 , 2013 , p.084102 DOI:10.1103/PhysRevLett.110.084102
C.C. Lalescu, Y.K. Shi, G.L. Eyink, T.D. Drivas, E.T. Vishniac & A. Lazarian "Inertial-Range Reconnection in Magnetohydrodynamic Turbulence and in the Solar Wind" Phys. Rev. Lett. , v.115 , 2015 , p.025001
F.J. Alexander & C. Meneveau "Open Simulation Laboratories [Guest editors' introduction]" Computing in Science & Engineering , v.17 , 2015 , p.7
G. Eyink, E. Vishniac, C. Lalescu, H. Aluie, K. Kanov, K Bürger, R. Burns, C. Meneveau, & A. Szalay "Flux-freezing breakdown observed in high-conductivity magnetohydrodynamic turbulence" Nature , v.497 , 2013 , p.466-469 doi:10.1038/nature12128
G. L. Eyink "Fluctuation dynamo and turbulent induction at small Prandtl number" Phys. Rev. E , v.82 , 2010 , p.046314
G. L. Eyink "Stochastic flux-freezing and magnetic dynamo" Phys. Rev E , v.83 , 2011 , p.056405
G. L. Eyink, A. Lazarian and E. T. Vishniac "Fast magnetic reconnection and spontaneous stochasticity" Astrophys. J. , v.743 , 2011 , p.51
G. L. Eyink and A. F. Neto "Small-scale kinematic dynamo and non-dynamo in inertial-range turbulence" New J. Phys. , v.12 , 2010 , p.023021

