
NSF Org: BCS Division of Behavioral and Cognitive Sciences
Initial Amendment Date: August 3, 2017
Latest Amendment Date: August 3, 2017
Award Number: 1728332
Award Instrument: Standard Grant
Program Manager: Steven Breckler, BCS Division of Behavioral and Cognitive Sciences, SBE Directorate for Social, Behavioral and Economic Sciences
Start Date: August 1, 2017
End Date: July 31, 2021 (Estimated)
Total Intended Award Amount: $324,458.00
Total Awarded Amount to Date: $324,458.00
Recipient Sponsored Research Office: 801 University Blvd, Tuscaloosa, AL 35401-2029, US; (205) 348-5152
Primary Place of Performance: 505 Hackberry Lane, Tuscaloosa, AL 35401-2029, US
NSF Program(s): Social Psychology
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.075
ABSTRACT
A basic principle of the scientific method is that scientists are expected to update their theories in light of new evidence. Indeed, public investment in science is assumed to produce theoretical advances that lead to real-world changes such as better health care, more efficient technology, and a deeper understanding of human behavior. Recent findings, however, raise concerns about the robustness and reliability of scientific research. Some published findings cannot be replicated by other researchers, and thus fail to meet basic standards of scientific value. To address these concerns, the scientific community needs to (1) collect new evidence, testing and re-testing existing theories, and (2) update theories in response to this evidence. Psychological science has been a leader in initiating the first step, with multiple large-scale replication projects underway to test the robustness of existing theories. The current project complements these efforts by providing a real-world test of the second step, examining whether scientists adjust their theories in response to new evidence. Documenting how scientists respond to new evidence is fundamental to establishing the value of research.
This research tracks psychological scientists' beliefs in psychological theories before and after the release of results from two large-scale replication projects: Many Labs 5 (ML5) and Registered Replication Reports (RRRs). These projects offer evidence that reaches the highest standards of scientific rigor: researchers commit to methodology and data analysis strategy ahead of time, data are made available to the public, and results are published regardless of outcome. If psychological science is progressing as it should, this high-quality evidence should lead researchers to update their theories. If such updating is not happening, this project will help to determine why this might be the case and whether some researchers might be better than others in updating their theories. By identifying factors that promote scientific progress, and those that stand in the way, this project should contribute broadly to improving the robustness and reliability of research across all fields of science.
PROJECT OUTCOMES REPORT
Disclaimer
This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.
In order for scientific progress to occur, scientists must adjust their beliefs in light of new evidence. But does this actually happen in practice? This project was designed to answer three key questions: (1) How much do psychologists update their beliefs in response to new evidence? (2) Do psychologists show signs of trying to preserve their pre-existing beliefs? (3) What predicts the extent of psychologists' belief updating?
To address these questions, we asked over 1,000 psychological scientists to rate their belief in psychological effects before and after new evidence became available from large-scale replication studies. We found that psychologists did update their beliefs; they updated as much as they predicted they would, but not as much as they "should" according to a statistical model (which assumes they trust the replication results). We found no evidence that psychologists attempted to preserve their pre-existing beliefs, though they were generally not strongly invested in the findings in the first place. We also found no evidence that experts updated their beliefs more, but researchers higher in intellectual humility updated their beliefs slightly more.
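The report does not specify the statistical model used as the normative benchmark, but a standard choice for "how much a belief should move given new evidence" is Bayes' rule. The sketch below is purely illustrative, not the project's actual model: all numbers (the 60% prior, the likelihoods of a failed replication under each hypothesis) are hypothetical, chosen only to show the mechanics of normative belief updating.

```python
def update_belief(prior, p_data_given_effect, p_data_given_null):
    """Bayes' rule: posterior probability that an effect is real,
    given an observed replication outcome."""
    numerator = p_data_given_effect * prior
    denominator = numerator + p_data_given_null * (1 - prior)
    return numerator / denominator

# Hypothetical scenario: a researcher starts at 60% belief in an effect.
prior = 0.60

# Suppose a well-powered replication fails, and (illustratively) a failure
# is four times as likely if the effect is null as if it is real.
posterior = update_belief(prior, p_data_given_effect=0.20, p_data_given_null=0.80)
print(round(posterior, 3))  # 0.273
```

A researcher who fully trusts the replication would revise from 60% down to about 27%; updating less than this, as the project found, means the evidence is being discounted relative to the Bayesian benchmark.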
These results have been accepted for publication in the journal Nature Human Behaviour. They have also been presented in talks at both the University of Alabama and the University of California, Davis, and are scheduled to be presented at the global Metascience Conference in September 2021. The data from these studies are also being used to explore questions about psychologists' perceptions of the field and the accuracy of psychologists' predictions about replication outcomes.
Overall, our results have implications for the National Science Foundation and the research that it funds. Our findings suggest that replication projects can make important contributions to self-correction within psychology, and that their value may currently be underestimated.
Last Modified: 08/06/2021
Modified by: Alexa Tullett