University Library
A gateway to Melbourne's research publications
Minerva Access is the University's Institutional Repository. It aims to collect, preserve, and showcase the intellectual output of staff and students of the University of Melbourne for a global audience.

    The low statistical power of psychological research: Causes, consequences and potential remedies

    Download
    Final thesis file (6.41 MB)

    Author
    Singleton Thorn, Felix
    Date
    2020
    Affiliation
    Melbourne School of Psychological Sciences
    Document Type
    PhD thesis
    Access Status
    Open Access
    URI
    http://hdl.handle.net/11343/251890
    Description

    © 2020 Felix Singleton Thorn

    Abstract
    This dissertation examines two major issues in psychological research: formal sample size planning and reporting biases. It is organized into three main parts. The first part examines the history of formal sample size planning and reporting biases in the psychology research literature, outlining the history of the dominant approach to statistical analysis (Chapter 2), demonstrating the implications of low statistical power and reporting biases for research literatures (Chapter 3), and examining the history of statistical power analysis as represented in the psychology research literature (Chapter 4). The second part of this dissertation examines psychologists’ research and publication practices. Chapter 5 presents a meta-analysis of previous power surveys and finds that the average statistical power of psychology research at Cohen’s small and medium effect size benchmarks was lower than typical goal levels and that this value remained approximately constant from the 1960s to 2014. Chapter 6 presents an analysis of more than 130,000 effect size estimates from over 9,000 articles published in 5 APA journals from 1985 to 2013 and finds that the average effect size reported in this body of psychological research decreased over time. Together, Chapters 5 and 6 suggest that the average statistical power of psychological research remained stable or may even have decreased over time. In order to investigate why this is the case, Chapter 7 presents the results of a survey of researchers from across fields of psychological research about their research planning practices. This survey highlights the most important barriers that prevent researchers from using formal sample size planning during the design phase of their research and shows that while most researchers believe statistical power is important for their research purposes, practical constraints act to limit achieved sample sizes in most studies.
The final part of this thesis examines the implications of low statistical power and reporting biases for scientific research and provides suggestions on how research planning methods could be improved. Bringing together all of the previous large-scale replication projects that have been conducted in the behavioral sciences, Chapter 8 shows that effect sizes in replication studies are, on average, considerably lower than those reported in original studies, and quantifies the substantial heterogeneity in this value across replication projects. Finally, Chapter 9 examines sample size planning efforts reported in recent Psychological Science articles and uses this to illustrate a guide to effect size selection for formal sample size planning. Taken together, these chapters show that low statistical power and reporting biases remain serious problems for the behavioral sciences research literature. Contrasting the long history of efforts to improve the statistical power of psychology research with the lack of change in the average power of research from 1962 to 2014, I argue that new methods of avoiding the negative impact of low statistical power and reporting biases are necessary. Several recent publication and methodological developments, namely (a) preregistration, (b) pre-prints and data repositories, (c) the registered reports publication format and (d) the increasing use of large-scale collaborative research projects, provide possible mechanisms with which to reduce the negative impact of low statistical power and reporting biases on the published scientific literature.
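    The role of formal sample size planning and Cohen's effect size benchmarks discussed in the abstract can be illustrated with a short, hedged sketch (not taken from the dissertation itself): a standard normal-approximation power calculation for a two-sided, two-sample comparison, using only the Python standard library. The function names and the choice of a z-approximation (rather than the exact noncentral t) are illustrative assumptions.

    ```python
    # Normal-approximation sketch of two-sample test power and sample size
    # planning. Illustrative only; real analyses typically use the exact
    # noncentral t distribution (e.g. G*Power or statsmodels).
    from math import ceil, sqrt
    from statistics import NormalDist

    Z = NormalDist()  # standard normal distribution

    def power(d, n_per_group, alpha=0.05):
        """Approximate power of a two-sided two-sample test at effect size d."""
        z_crit = Z.inv_cdf(1 - alpha / 2)
        return Z.cdf(d * sqrt(n_per_group / 2) - z_crit)

    def n_per_group(d, target_power=0.80, alpha=0.05):
        """Approximate per-group sample size needed to reach target_power."""
        z_crit = Z.inv_cdf(1 - alpha / 2)
        z_beta = Z.inv_cdf(target_power)
        return ceil(2 * ((z_crit + z_beta) / d) ** 2)

    # Cohen's benchmarks: small d = 0.2, medium d = 0.5.
    # With 64 participants per group, a medium effect is reasonably
    # well powered, but a small effect is badly underpowered.
    print(round(power(0.5, 64), 2))
    print(round(power(0.2, 64), 2))
    print(n_per_group(0.2))  # hundreds per group for a small effect
    ```

    The asymmetry this sketch exposes — a sample adequate for medium effects detects small effects only around one time in five — is the practical core of the "low statistical power" problem the abstract describes.
    
    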
    Keywords
    Publication bias; Effect size; Effect sizes; Power analysis; Sample size; QRPs; Questionable research practices; Statistical power; Metascience; Metaresearch; Research practices; Methodology



    Collections
    • Minerva Elements Records [45689]
    • Melbourne School of Psychological Sciences - Theses [294]