Award Abstract # 2217493
Conferences on Reproducibility and Replicability in Economics and the Social Sciences (CRRESS)

NSF Org: SES (Division of Social and Economic Sciences)
Recipient: Cornell University
Initial Amendment Date: May 31, 2022
Latest Amendment Date: May 31, 2022
Award Number: 2217493
Award Instrument: Standard Grant
Program Manager: Nicholas N. Nagle
  nnagle@nsf.gov
  (703) 292-4490
  SES (Division of Social and Economic Sciences)
  SBE (Directorate for Social, Behavioral and Economic Sciences)
Start Date: August 1, 2022
End Date: July 31, 2025 (Estimated)
Total Intended Award Amount: $50,000.00
Total Awarded Amount to Date: $50,000.00
Funds Obligated to Date: FY 2022 = $50,000.00
History of Investigator:
  • Lars Vilhuber (Principal Investigator)
    lars.vilhuber@cornell.edu
  • Aleksandr Michuda (Co-Principal Investigator)
Recipient Sponsored Research Office: Cornell University
  341 Pine Tree Rd, Ithaca, NY 14850-2820, US
  (607) 255-5014
Sponsor Congressional District: 19
Primary Place of Performance: Cornell University
  373 Pine Tree Road, Ithaca, NY 14850-2820, US
Primary Place of Performance Congressional District: 19
Unique Entity Identifier (UEI): G56PUALJ3KT5
Parent UEI:
NSF Program(s): Economics; Methodology, Measurement & Statistics
Primary Program Source: 01002223DB NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s):
Program Element Code(s): 132000, 133300
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.075

ABSTRACT

This award provides partial support for a series of virtual and in-person conferences on reproducibility, replicability, and transparency in the social sciences. The purpose of scientific publishing is the dissemination of robust research findings, exposing them to the scrutiny of peers and other interested parties. Scientific articles should accurately and completely report the origin and provenance of data and the analytical and computational methods used. Yet in recent years, doubts about the adequacy of the information provided in scientific articles and their addenda have been voiced; this has come to be called the replication crisis. The conferences will address four stages of the research lifecycle: the initiation of research, the conduct of research, the preparation of research for publication, and scrutiny after publication. The products of these meetings will be available to non-participants through videos, presentation materials, and manuscripts, so that undergraduates, graduate students, and career researchers can learn about best practices for transparent, reproducible, and scientifically sound research in the social sciences. Research that follows the best practices discussed in these meetings will be more verifiable, and thus more credible. These qualities are especially important for policymakers who wish to implement evidence-based policy and for a public that wishes to understand the foundations of such policies.

Scientific practices during the conduct of research, during peer review, and after the dissemination of results all interact to enable a discourse about the veracity of scientific claims. The investigators will organize a sequence of conferences discussing the educational and procedural barriers that slow the adoption of best practices, whether journals should be the verifiers of reproducibility, whether (and how) scientists' work can be made reproducible at every stage of the research process, and the implications for funding, technical infrastructure, and the training of undergraduate and graduate students. The topics chosen for the series are not usually part of disciplinary seminars or conferences and will be brought to a broader audience here for the first time. Most sessions will be held virtually (online), while others will be co-located with, or submitted as complete sessions to, professional meetings. The availability of permanent artifacts (presentations, recordings, manuscripts) after the conferences will make the series a resource with persistent impact.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH

(Showing 11 of 13 publications)
Ball, Richard. "Yes We Can!: A Practical Approach to Teaching Reproducibility to Undergraduates." Harvard Data Science Review, v.5, 2023. https://doi.org/10.1162/99608f92.9e002f7b
Buck, Stuart. "We Should Do More Direct Replications in Science." Harvard Data Science Review, v.6, 2024. https://doi.org/10.1162/99608f92.4eccc443
Butler, Courtney R. "Publishing Replication Packages: Insights From the Federal Reserve Bank of Kansas City." Harvard Data Science Review, v.5, 2023. https://doi.org/10.1162/99608f92.aba61304
Guimarães, Paulo. "Reproducibility With Confidential Data: The Experience of BPLIM." Harvard Data Science Review, v.5, 2023. https://doi.org/10.1162/99608f92.54a00239
Hoynes, Hilary. "Reproducibility in Economics: Status and Update." Harvard Data Science Review, v.5, 2023. https://doi.org/10.1162/99608f92.80a1b88b
MacDonald, Graham. "Open Data and Code at the Urban Institute." Harvard Data Science Review, v.5, 2023. https://doi.org/10.1162/99608f92.a631dfc5
Mendez-Carbajo, Diego and Dellachiesa, Alejandro. "Data Citations and Reproducibility in the Undergraduate Curriculum." Harvard Data Science Review, v.5, 2023. https://doi.org/10.1162/99608f92.c2835391
Peer, Limor. "Why and How We Share Reproducible Research at Yale University's Institution for Social and Policy Studies." Harvard Data Science Review, v.6, 2024. https://doi.org/10.1162/99608f92.dca148ba
Pérignon, Christophe. "The Role of Third-Party Verification in Research Reproducibility." Harvard Data Science Review, v.6, 2024. https://doi.org/10.1162/99608f92.6d4bf9eb
Salmon, Timothy C. "The Case for Data Archives at Journals." Harvard Data Science Review, v.5, 2023. https://doi.org/10.1162/99608f92.db2a2554
Vilhuber, Lars and Schmutte, Ian and Michuda, Aleksandr and Connolly, Marie. "Reinforcing Reproducibility and Replicability: An Introduction." Harvard Data Science Review, v.5, 2023. https://doi.org/10.1162/99608f92.9ba2bd43

