
This program has been archived.


Office of Advanced Cyberinfrastructure


Benchmarks of Realistic Scientific Application Performance of Large-Scale Computing Systems (BRAP)


CONTACTS
Name                Email               Phone           Room
Rudolf Eigenmann    reigenma@nsf.gov    (703) 292-2598  1270.08
Thomas F. Russell   trussell@nsf.gov    (703) 292-4863
Kevin Thompson      kthompso@nsf.gov    (703) 292-4220


PROGRAM GUIDELINES

PD 15-7685

Important Information for Proposers

A revised version of the NSF Proposal & Award Policies & Procedures Guide (PAPPG) (NSF 22-1) is effective for proposals submitted, or due, on or after October 4, 2021. Please be advised that, depending on the specified due date, the guidelines contained in NSF 22-1 may apply to proposals submitted in response to this funding opportunity.


DUE DATES

Archived


SYNOPSIS

NSF is interested in supporting activities by the NSF Cyberinfrastructure community in the analysis of existing benchmarks, and in the development of new benchmarks, that measure real-world performance and effectiveness of large-scale computing systems for science and engineering discovery.  

Research, development, and use of performance benchmarks in high-performance computing (HPC) have been active for over 20 years, as evidenced by the development of LINPACK and the emergence of the TOP500 list in the early 1990s, followed by the development of the HPC Challenge Benchmark and the current HPCG effort (http://tiny.cc/hpcg). There have been efforts to provide benchmarks that include real applications, such as the SPEC High Performance Computing Benchmarks (http://spec.org/benchmarks.html#hpg), the Blue Waters SPP suite (http://www.ncsa.illinois.edu/assets/pdf/news/BW1year_apps.pdf), and the NERSC SSP (https://www.nersc.gov/users/computational-systems/nersc-8-system-cori/nersc-8-procurement/trinity-nersc-8-rfp/nersc-8-trinity-benchmarks/ssp/). Recent efforts have sought to broaden the set of relevant benchmarks to more effectively cover performance under different application environments, such as data-intensive analysis (e.g., Graph500). Energy efficiency has also emerged in recent years as a relevant and increasingly important area of measurement and profiling for HPC systems (e.g., Green500). Beyond HPC, the Big Data community has taken a growing interest in benchmarking; reference approaches to measuring and characterizing system performance for large-scale data analysis hardware and software systems remain an area of research, development, and community discussion (e.g., the Big Data Top 100). Industry and academe have convened an ongoing series of workshops and meetings on the topic of Big Data benchmarking (http://clds.ucsd.edu/bdbc/workshops).
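As an illustration of how application-based composites such as the SSP are typically constructed, the minimal Python sketch below computes an SSP-style figure: the geometric mean of per-application sustained rates, scaled by system size. The function name, per-node rates, and node count are hypothetical placeholders, not values or definitions drawn from the benchmark suites cited above.

    from math import prod

    def ssp_style_metric(rates_per_node, num_nodes):
        # Geometric mean of per-application sustained rates, scaled by node count.
        geo_mean = prod(rates_per_node) ** (1.0 / len(rates_per_node))
        return geo_mean * num_nodes

    # Hypothetical per-node sustained rates (GFLOP/s) for a small application suite.
    rates = [12.4, 3.8, 27.1, 9.6]
    print(ssp_style_metric(rates, num_nodes=5000))  # composite system-level rate

The geometric mean is a common choice for such composites because it weights each application equally, regardless of its absolute rate.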

Given the emergence of inference-based computing, the growing role of data analysis, changes in scientific workflow due to dynamic availability of sensor and instrument data, the expanding use of large-scale computing in all scientific disciplines, the growing role of clouds, and a diversity of architectural approaches, NSF sees a timely opportunity to engage the community in benchmarking analysis and development activities.

NSF welcomes benchmarking proposals in the following general areas: (1) the analysis, evaluation, and assessment of the effectiveness of one or more existing benchmarks used in industry and academe today; (2) the development (including algorithm development and prototype implementation) and experimental use of one or more new benchmarks; or (3) workshops and community engagement events to advance discussion, dissemination, and community building around benchmarks. Proposals focused on areas (1) and (2) must include some work in area (3). Industry engagement is encouraged.

Proposals should describe key aspects of the targeted systems and run-time environments, including relevant scales, types of platforms, and I/O processing, and should explain what new information about the targeted systems can be expected to emerge from the project. They should also describe the scientific applicability of the targeted systems and the relevance of that new information to realistic use in the proposed applications. Because the act of measuring a quantity often leads to efforts to improve that quantity, proposals should justify the choice of characteristic(s) being measured, such as sustained performance, throughput, productivity, energy efficiency, or time to solution, and should include an application perspective in this justification.

Proposals should further describe the scope of the interested research community, within and/or beyond NSF; the likelihood that this community will accept the proposed benchmark as a useful measure; and the practicality and feasibility of the benchmark as a tool for that community. They should explain how the proposed measurements might create incentives for vendors to design systems that serve the application area(s). Proposers should also address the evolution and sustainability of the benchmarks across future system generations, and the usefulness of the benchmarks in contributing to NSF's future efforts to acquire systems that best serve the research community. Proposals should include a project plan with milestones. For proposals addressing Big Data benchmarks, NSF encourages proposers to consider the characteristics and topics described in the guidance for the Fifth Workshop on Big Data Benchmarking (http://clds.ucsd.edu/wbdb2014.de).
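To make the measured characteristics named above concrete, the following minimal Python sketch shows one way a benchmark harness might record time to solution and derive a sustained rate and an energy-to-solution figure from it. The workload, operation count, and average power value are hypothetical placeholders rather than elements of any existing benchmark suite.

    import time

    def run_workload():
        # Hypothetical stand-in for a real scientific kernel.
        return sum(i * i for i in range(10_000_000))

    def measure(workload, op_count, avg_power_watts):
        # Time to solution is the wall-clock time of the complete run.
        start = time.perf_counter()
        workload()
        elapsed = time.perf_counter() - start
        sustained = op_count / elapsed        # sustained rate (operations per second)
        energy = avg_power_watts * elapsed    # energy to solution (joules)
        return {"time_s": elapsed, "ops_per_s": sustained, "energy_j": energy}

    # Hypothetical operation count and average power draw for the placeholder workload.
    print(measure(run_workload, op_count=2e7, avg_power_watts=350.0))

A real benchmark would replace the placeholder workload with a representative application kernel and obtain power from measured data rather than an assumed constant; the point here is only to show how the different reported quantities relate to one another.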

Proposals should be submitted via this Program Description (PD 15-7685) on or before February 2, 2015. The requested starting date should be no earlier than May 1, 2015.

For further information about the appropriateness of potential project ideas for this program, investigators are strongly encouraged to contact a program officer before formulating a proposal.


RELATED URLS

What Has Been Funded (Recent Awards Made Through This Program, with Abstracts)

Map of Recent Awards Made Through This Program