Bem, D. Feeling the future: experimental evidence for anomalous retroactive influences on cognition and affect. J. Pers. Soc. Psychol. 100, 407 (2011).
Crocker, J. The road to fraud starts with a single step. Nature 479, 151 (2011).
Wagenmakers, E.-J., Wetzels, R., Borsboom, D. & Van Der Maas, H. L. Why psychologists must change the way they analyze their data: the case of psi: comment on Bem (2011). J. Pers. Soc. Psychol. 100, 426–432 (2011).
Munafò, M. R. et al. A manifesto for reproducible science. Nat. Hum. Behav. 1, 1–9 (2017).
Open Science Collaboration. Estimating the reproducibility of psychological science. Science 349, aac4716 (2015). This study was one of the first large-scale replication projects, showing low replication rates and smaller effect sizes among "successfully" replicated findings.
Field, S. M., Hoekstra, R., Bringmann, L. & van Ravenzwaaij, D. When and why to replicate: as easy as 1, 2, 3? Collabra Psychol. 5, 46 (2019).
Nosek, B. A. et al. Replicability, robustness, and reproducibility in psychological science. Annu. Rev. Psychol. 73, 719–748 (2022). This paper highlights the importance of addressing issues related to replicability, robustness, and reproducibility in psychological research to ensure the validity and reliability of findings.
Farrar, B. G., Boeckle, M. & Clayton, N. S. Replications in comparative cognition: what should we expect and how can we improve? Anim. Behav. Cognit. 7, 1 (2020).
Farrar, B. G., Voudouris, K. & Clayton, N. S. Replications, comparisons, sampling and the problem of representativeness in animal cognition research. Anim. Behav. Cognit. 8, 273 (2021).
Farrar, B. G. et al. Reporting and interpreting non-significant results in animal cognition research. PeerJ 11, e14963 (2023).
Errington, T. M. et al. Investigating the replicability of preclinical cancer biology. eLife 10, e71601 (2021).
Camerer, C. F. et al. Evaluating replicability of laboratory experiments in economics. Science 351, 1433–1436 (2016).
Frith, U. Fast lane to slow science. Trends Cogn. Sci. 24, 1–2 (2020).
Pennington, C. A Student's Guide to Open Science: Using the Replication Crisis to Reform Psychology (Open University Press, 2023).
Hendriks, F., Kienhues, D. & Bromme, R. Replication crisis = trust crisis? The effect of successful vs failed replications on laypeople's trust in researchers and research. Public Underst. Sci. 29, 270–288 (2020).
Sanders, M., Snijders, V. & Hallsworth, M. Behavioural science and policy: where are we now and where are we going? Behav. Public Policy 2, 144–167 (2018).
Vazire, S. Implications of the credibility revolution for productivity, creativity, and progress. Perspect. Psychol. Sci. 13, 411–417 (2018). This paper explores how the credibility revolution, which emphasizes evidence-based knowledge and critical thinking, can increase productivity, creativity, and progress across fields.
Freese, J., Rauf, T. & Voelkel, J. G. Advances in transparency and reproducibility in the social sciences. Soc. Sci. Res. 107, 102770 (2022).
Trafimow, D. et al. Manipulating the alpha level cannot cure significance testing. Front. Psychol. 9, 699 (2018).
Loken, E. & Gelman, A. Measurement error and the replication crisis. Science 355, 584–585 (2017).
Azevedo, F. et al. Towards a culture of open scholarship: the role of pedagogical communities. BMC Res. Notes 15, 75 (2022). This paper details (a) the need to integrate open scholarship principles into research training within higher education; (b) the benefits of pedagogical communities and the role they play in fostering an inclusive culture of open scholarship; and (c) a call for greater collaboration with pedagogical communities, paving the way for a much-needed integration of top-down and grassroots open scholarship initiatives.
Grahe, J. E., Cuccolo, K., Leighton, D. C. & Cramblet Alvarez, L. D. Open science promotes diverse, just, and sustainable research and educational outcomes. Psychol. Learn. Teach. 19, 5–20 (2020).
Norris, E. & O'Connor, D. B. Science as behaviour: using a behaviour change approach to increase uptake of open science. Psychol. Health 34, 1397–1406 (2019).
Azevedo, F. et al. Introducing a framework for open and reproducible research training (FORRT). Preprint at https://osf.io/bnh7p/ (2019). This paper describes the importance of integrating open scholarship into higher education, its benefits and challenges, and FORRT's initiatives to support educators in this endeavor.
Nuijten, M. B. & Polanin, J. R. "statcheck": automatically detect statistical reporting inconsistencies to increase reproducibility of meta-analyses. Res. Synth. Methods 11, 574–579 (2020).
McAleer, P. et al. Embedding data skills in research methods education: preparing students for reproducible research. Preprint at https://psyarxiv.com/hq68s/ (2022).
Holcombe, A. O., Kovacs, M., Aust, F. & Aczel, B. Documenting contributions to scholarly articles using CRediT and tenzing. PLoS ONE 15, e0244611 (2020).
Koole, S. L. & Lakens, D. Rewarding replications: a sure and simple way to improve psychological science. Perspect. Psychol. Sci. 7, 608–614 (2012).
Bauer, G. et al. Teaching constructive replications in the social sciences. Preprint at https://osf.io/g3k5t/ (2022).
Wagge, J. R. et al. A demonstration of the Collaborative Replication and Education Project: replication attempts of the red-romance effect. Collabra Psychol. 5, 5 (2019). This paper presents a multi-institutional effort to teach research methods by collaboratively conducting and evaluating replications of three psychology experiments.
Wagge, J. R. et al. Publishing research with undergraduate students via replication work: the Collaborative Replications and Education Project. Front. Psychol. 10, 247 (2019).
Quintana, D. S. Replication studies for undergraduate theses to improve science and education. Nat. Hum. Behav. 5, 1117–1118 (2021).
Button, K. S., Chambers, C. D., Lawrence, N. & Munafò, M. R. Grassroots training for reproducible science: a consortium-based approach to the empirical dissertation. Psychol. Learn. Teach. 19, 77–90 (2020). The article argues that improving the reliability and efficiency of scientific research requires a cultural shift in both thinking and practice, and that better education in reproducible science should start at the grassroots; it presents a model of consortium-based student projects that trains undergraduates in reproducible team science and reflects on the pedagogical benefits of this approach.
Feldman, G. Replications and extensions of classic findings in Judgment and Decision Making. https://doi.org/10.17605/OSF.IO/5Z4A8 (2020). A team of early-career researchers whose main activities in 2018–2023 focused on: (1) a mass-scale project completing over 120 replications and extensions of classic findings in social psychology and judgment and decision making; and (2) building collaborative resources (tools, templates, and guides) to assist others in implementing open science.
Efendić, E. et al. Risky therefore not beneficial: replication and extension of Finucane et al.'s (2000) affect heuristic experiment. Soc. Psychol. Personal. Sci. 13, 1173–1184 (2022).
Ziano, I., Yao, J. D., Gao, Y. & Feldman, G. Impact of ownership on liking and value: replications and extensions of three ownership effect experiments. J. Exp. Soc. Psychol. 89, 103972 (2020).
Pownall, M. et al. Embedding open and reproducible science into teaching: a bank of lesson plans and resources. Schol. Teach. Learn. Psychol. (in press) (2021). To support open science training in higher education, FORRT compiled lesson plans and activities and categorized them by theme, learning outcome, and method of delivery; they are publicly available as FORRT's Lesson Plans.
Coles, N. A., DeBruine, L. M., Azevedo, F., Baumgartner, H. A. & Frank, M. C. 'Big team' science challenges us to reconsider authorship. Nat. Hum. Behav. 7, 665–667 (2023).
Allen, L., O'Connell, A. & Kiermer, V. How can we ensure visibility and diversity in research contributions? How the Contributor Role Taxonomy (CRediT) is helping the shift from authorship to contributorship. Learn. Publ. 32, 71–74 (2019).
Allen, L., Scott, J., Brand, A., Hlava, M. & Altman, M. Publishing: credit where credit is due. Nature 508, 312–313 (2014).
Pownall, M. et al. The impact of open and reproducible scholarship on students' scientific literacy, engagement, and attitudes towards science: a review and synthesis of the evidence. Roy. Soc. Open Sci. 10, 221255 (2023). This review describes the available empirical evidence on the impact (and importance) of integrating open scholarship into higher education in three specific areas: students' (a) scientific literacy; (b) engagement with science; and (c) attitudes towards science.
Chopik, W. J., Bremner, R. H., Defever, A. M. & Keller, V. N. How (and whether) to teach undergraduates about the replication crisis in psychological science. Teach. Psychol. 45, 158–163 (2018).
Frank, M. C. & Saxe, R. Teaching replication. Perspect. Psychol. Sci. 7, 600–604 (2012). In this perspective article, Frank and Saxe advocate for incorporating replication as a fundamental component of research training in psychology and other disciplines.
Levin, N. & Leonelli, S. How does one "open" science? Questions of value in biological research. Sci. Technol. Human Values 42, 280–305 (2017).
Van Dijk, D., Manor, O. & Carey, L. B. Publication metrics and success on the academic job market. Curr. Biol. 24, R516–R517 (2014).
Elsherif, M. M. et al. Bridging neurodiversity and open scholarship: how shared values can guide best practices for research integrity, social justice, and principled education. Preprint at https://osf.io/preprints/metaarxiv/k7a9p/ (2022). The authors describe systematic barriers, issues with disclosure, prevalence and stigma, and the intersection of neurodiversity and open scholarship, and provide recommendations that can lead to personal and systemic changes to improve acceptance of neurodivergent individuals. Perspectives of neurodivergent authors, most of whom have lived experience of neurodivergence(s), are presented, and possible improvements in research integrity, inclusivity and diversity are discussed.
Onie, S. Redesign open science for Asia, Africa and Latin America. Nature 587, 35–37 (2020).
Roberts, S. O., Bareket-Shavit, C., Dollins, F. A., Goldie, P. D. & Mortenson, E. Racial inequality in psychological research: trends of the past and recommendations for the future. Perspect. Psychol. Sci. 15, 1295–1309 (2020). Roberts et al. highlight historical and current trends of racial inequality in psychological research and provide recommendations for addressing and reducing these disparities in the future.
Steltenpohl, C. N. et al. Society for the Improvement of Psychological Science global engagement task force report. Collabra Psychol. 7, 22968 (2021).
Parsons, S. et al. A community-sourced glossary of open scholarship terms. Nat. Hum. Behav. 6, 312–318 (2022). In response to the varied and plural new terminology introduced by the open scholarship movement, which has transformed academia's lexicon, FORRT members have produced a community- and consensus-based glossary to facilitate education and effective communication between experts and newcomers.
Pownall, M. et al. Navigating open science as early career feminist researchers. Psychol. Women Q. 45, 526–539 (2021).
Gourdon-Kanhukamwe, A. et al. Opening up understanding of neurodiversity: a call for applying participatory and open scholarship practices. Preprint at https://osf.io/preprints/metaarxiv/jq23s/ (2022).
Leech, G. Reversals in psychology. Behavioural and Social Sciences (Nature Portfolio) https://socialsciences.nature.com/posts/reversals-in-psychology (2021).
Orben, A. A journal club to fix science. Nature 573, 465–466 (2019).
Arnold, B. et al. The Turing Way: a handbook for reproducible data science. Zenodo https://doi.org/10.5281/zenodo.3233986 (2019).
Open Life Science. A mentoring & training program for Open Science ambassadors. https://openlifesci.org/ (2023).
Almarzouq, B. et al. Opensciency—a core open science curriculum by and for the research community (2023).
Schönbrodt, F. et al. Netzwerk der Open-Science-Initiativen (NOSI). https://osf.io/tbkzh/ (2016).
Ball, R. et al. Course Syllabi for Open and Reproducible Methods. https://osf.io/vkhbt/ (2022).
The Carpentries. https://carpentries.org/ (2023).
The Embassy of Good Science. https://embassy.science/wiki/Main_Page (2023).
Berkeley Initiative for Transparency in the Social Sciences. https://www.bitss.org/ (2023).
Institute for Replication. https://i4replication.org/ (2023).
Reproducibility for Everyone. https://www.repro4everyone.org/ (2023).
Armeni, K. et al. Towards wide-scale adoption of open science practices: the role of open science communities. Sci. Public Policy 48, 605–611 (2021).
The UK Reproducibility Network (UKRN). https://www.ukrn.org/ (2023).
Collyer, F. M. Global patterns in the publishing of academic knowledge: Global North, global South. Curr. Sociol. 66, 56–73 (2018).
Ali-Khan, S. E., Harris, L. W. & Gold, E. R. Motivating participation in open science by examining researcher incentives. eLife 6, e29319 (2017).
Robson, S. G. et al. Promoting open science: a holistic approach to changing behaviour. Collabra Psychol. 7, 30137 (2021).
Coalition for Advancing Research Assessment. https://coara.eu/ (2023).
Vanclay, J. K. Impact factor: outdated artefact or stepping-stone to journal certification? Scientometrics 92, 211–238 (2012).
Kidwell, M. C. et al. Badges to acknowledge open practices: a simple, low-cost, effective method for increasing transparency. PLoS Biol. 14, e1002456 (2016).
Rowhani-Farid, A., Aldcroft, A. & Barnett, A. G. Did awarding badges increase data sharing in BMJ Open? A randomized controlled trial. Roy. Soc. Open Sci. 7, 191818 (2020).
Thibault, R. T., Pennington, C. R. & Munafò, M. Reflections on preregistration: core criteria, badges, complementary workflows. J. Trial & Err. https://doi.org/10.36850/mr6 (2022).
Chambers, C. D. Registered reports: a new publishing initiative at Cortex. Cortex 49, 609–610 (2013).
Chambers, C. D. & Tzavella, L. The past, present and future of registered reports. Nat. Hum. Behav. 6, 29–42 (2022).
Soderberg, C. K. et al. Initial evidence of research quality of registered reports compared with the standard publishing model. Nat. Hum. Behav. 5, 990–997 (2021).
Scheel, A. M., Schijen, M. R. & Lakens, D. An excess of positive results: comparing the standard psychology literature with Registered Reports. Adv. Meth. Pract. Psychol. Sci. 4, 25152459211007467 (2021).
Renbarger, R. et al. Champions of transparency in education: what journal reviewers can do to encourage open science practices. Preprint at https://doi.org/10.35542/osf.io/xqfwb.
Nosek, B. A. et al. Transparency and Openness Promotion (TOP) Guidelines. Center for Open Science project. https://osf.io/9f6gx/ (2022).
Silverstein, P. et al. A guide for social science journal editors on easing into open science. Preprint at https://doi.org/10.31219/osf.io/hstcx (2023).
NASA. Transform to Open Science (TOPS). https://github.com/nasa/Transform-to-Open-Science (2023).
UNESCO. UNESCO Recommendation on Open Science. https://unesdoc.unesco.org/ark:/48223/pf0000379949.locale=en (2021).
European University Association. https://eua.eu (2023).
Munafò, M. R. Improving the efficiency of grant and journal peer review: registered reports funding. Nicotine Tob. Res. 19, 773 (2017).
Else, H. A guide to Plan S: the open-access initiative shaking up science publishing. Nature (2021).
Mills, M. Plan S: what is its meaning for open access journals and for the JACMP? J. Appl. Clin. Med. Phys. 20, 4 (2019).
McNutt, M. Plan S falls short for society publishers—and for the researchers they serve. Proc. Natl Acad. Sci. USA 116, 2400–2403 (2019).
Zhang, L., Wei, Y., Huang, Y. & Sivertsen, G. Should open access lead to closed research? The trends towards paying to perform research. Scientometrics 127, 7653–7679 (2022).
PeerCommunityIn. https://peercommunityin.org/ (2023).
eLife. https://elifesciences.org/for-the-press/b2329859/elife-ends-accept-reject-decisions-following-peer-review (2023).
Nosek, B. A., Spies, J. R. & Motyl, M. Scientific utopia II: restructuring incentives and practices to promote truth over publishability. Perspect. Psychol. Sci. 7, 615–631 (2012).
Schönbrodt, F. https://www.nicebread.de/open-science-hiring-practices/ (2016).
Delios, A. et al. Examining the generalizability of research findings from archival data. Proc. Natl Acad. Sci. USA 119, e2120377119 (2022).
Dreber, A. et al. Using prediction markets to estimate the reproducibility of scientific research. Proc. Natl Acad. Sci. USA 112, 15343–15347 (2015).
Fraser, H. et al. Predicting reliability through structured expert elicitation with the repliCATS (Collaborative Assessments for Trustworthy Science) process. PLoS ONE 18, e0274429 (2023).
Gordon, M., Viganola, D., Dreber, A., Johannesson, M. & Pfeiffer, T. Predicting replicability: analysis of survey and prediction market data from large-scale forecasting projects. PLoS ONE 16, e0248780 (2021).
Tierney, W. et al. Creative destruction in science. Organ. Behav. Hum. Decis. Process. 161, 291–309 (2020).
Tierney, W. et al. A creative destruction approach to replication: implicit work and sex morality across cultures. J. Exp. Soc. Psychol. 93, 104060 (2021).
Hoogeveen, S., Sarafoglou, A. & Wagenmakers, E.-J. Laypeople can predict which social-science studies will be replicated successfully. Adv. Meth. Pract. Psychol. Sci. 3, 267–285 (2020).
Lewandowsky, S. & Oberauer, K. Low replicability can support robust and efficient science. Nat. Commun. 11, 358 (2020).
Button, K. S. & Munafò, M. R. in Psychological Science under Scrutiny: Recent Challenges and Proposed Solutions 22–33 (Wiley, 2017).
Świątkowski, W. & Dompnier, B. Replicability crisis in social psychology: looking at the past to find new pathways for the future. Int. Rev. Soc. Psychol. 30, 111–124 (2017).
Simonsohn, U., Nelson, L. D. & Simmons, J. P. P-curve: a key to the file-drawer. J. Exp. Psychol. Gen. 143, 534 (2014).
Brunner, J. & Schimmack, U. Estimating population mean power under conditions of heterogeneity and selection for significance. Meta-Psychol. 4, 1–22 (2020).
Benjamin, D. J. et al. Redefine statistical significance. Nat. Hum. Behav. 2, 6–10 (2018).
Rubin, M. & Donkin, C. Exploratory hypothesis tests can be more compelling than confirmatory hypothesis tests. Philos. Psychol. (in press) 1–29 (2022).
Amrhein, V. & Greenland, S. Remove, rather than redefine, statistical significance. Nat. Hum. Behav. 2, 4 (2018).
Trafimow, D. & Marks, M. Editorial in Basic and Applied Social Psychology. Basic Appl. Soc. Psych. 37, 1–2 (2015).
Lakens, D. et al. Justify your alpha. Nat. Hum. Behav. 2, 168–171 (2018).
Lakens, D., Scheel, A. M. & Isager, P. M. Equivalence testing for psychological research: a tutorial. Adv. Meth. Pract. Psychol. Sci. 1, 259–269 (2018).
Verhagen, J. & Wagenmakers, E.-J. Bayesian tests to quantify the result of a replication attempt. J. Exp. Psychol. Gen. 143, 1457 (2014).
Dienes, Z. Bayesian versus orthodox statistics: which side are you on? Perspect. Psychol. Sci. 6, 274–290 (2011).
Love, J. et al. JASP: graphical statistical software for common statistical designs. J. Stat. Softw. 88, 1–17 (2019).
Şahin, M. & Aybek, E. Jamovi: an easy to use statistical software for the social scientists. Int. J. Assess. Tools Educ. 6, 670–692 (2019).
Brown, N. J. & Heathers, J. A. The GRIM test: a simple technique detects numerous anomalies in the reporting of results in psychology. Soc. Psychol. Personal. Sci. 8, 363–369 (2017).
Heathers, J. A., Anaya, J., van der Zee, T. & Brown, N. J. Recovering data from summary statistics: sample parameter reconstruction via iterative techniques (SPRITE). PeerJ Preprints 6, e26968v1 (2018).
Botvinik-Nezer, R. et al. Variability in the analysis of a single neuroimaging dataset by many teams. Nature 582, 84–88 (2020).
Breznau, N. et al. Observing many researchers using the same data and hypothesis reveals a hidden universe of uncertainty. Proc. Natl Acad. Sci. USA 119, e2203150119 (2022).
Breznau, N. et al. The Crowdsourced Replication Initiative: investigating immigration and social policy preferences: executive report. https://osf.io/preprints/socarxiv/6j9qb/ (2019).
Gelman, A. & Loken, E. The statistical crisis in science: data-dependent analysis – a 'garden of forking paths' – explains why many statistically significant comparisons don't hold up. Am. Sci. 102, 460 (2014).
Azevedo, F. & Jost, J. T. The ideological basis of antiscientific attitudes: effects of authoritarianism, conservatism, religiosity, social dominance, and system justification. Group Process. Intergroup Relat. 24, 518–549 (2021).
Heininga, V. E., Oldehinkel, A. J., Veenstra, R. & Nederhof, E. I just ran a thousand analyses: benefits of multiple testing in understanding equivocal evidence on gene-environment interactions. PLoS ONE 10, e0125383 (2015).
Liu, Y., Kale, A., Althoff, T. & Heer, J. Boba: authoring and visualizing multiverse analyses. IEEE Trans. Vis. Comput. Graph. 27, 1753–1763 (2020).
Harder, J. A. The multiverse of methods: extending the multiverse analysis to address data-collection decisions. Perspect. Psychol. Sci. 15, 1158–1177 (2020).
Steegen, S., Tuerlinckx, F., Gelman, A. & Vanpaemel, W. Increasing transparency through a multiverse analysis. Perspect. Psychol. Sci. 11, 702–712 (2016).
Azevedo, F., Marques, T. & Micheli, L. In pursuit of racial equality: identifying the determinants of support for the Black Lives Matter movement with a systematic review and multiple meta-analyses. Perspect. Polit. (in press) 1–23 (2022).
Borenstein, M., Hedges, L. V., Higgins, J. P. & Rothstein, H. R. Introduction to Meta-analysis (John Wiley & Sons, 2021).
Higgins, J. P. et al. Cochrane Handbook for Systematic Reviews of Interventions (John Wiley & Sons, 2022).
Carter, E. C., Schönbrodt, F. D., Gervais, W. M. & Hilgard, J. Correcting for bias in psychology: a comparison of meta-analytic methods. Adv. Meth. Pract. Psychol. Sci. 2, 115–144 (2019).
Nuijten, M. B., Hartgerink, C. H., Van Assen, M. A., Epskamp, S. & Wicherts, J. M. The prevalence of statistical reporting errors in psychology (1985–2013). Behav. Res. Methods 48, 1205–1226 (2016).
Van Assen, M. A., van Aert, R. & Wicherts, J. M. Meta-analysis using effect size distributions of only statistically significant studies. Psychol. Methods 20, 293 (2015).
Page, M. J. et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. Int. J. Surg. 88, 105906 (2021).
Topor, M. K. et al. An integrative framework for planning and conducting non-intervention, reproducible, and open systematic reviews (NIRO-SR). Meta-Psychol. (in press) (2022).
Van den Akker, O. et al. Generalized systematic review registration form. Preprint at https://doi.org/10.31222/osf.io/3nbea (2020).
Booth, A. et al. The nuts and bolts of PROSPERO: an international prospective register of systematic reviews. Syst. Rev. 1, 1–9 (2012).
Cristea, I. A., Naudet, F. & Caquelin, L. Meta-research studies should improve and evaluate their own data sharing practices. J. Clin. Epidemiol. 149, 183–189 (2022).
Knobloch, K., Yoon, U. & Vogt, P. M. Preferred reporting items for systematic reviews and meta-analyses (PRISMA) statement and publication bias. J. Craniomaxillofac. Surg. 39, 91–92 (2011).
Lakens, D., Hilgard, J. & Staaks, J. On the reproducibility of meta-analyses: six practical recommendations. BMC Psychol. 4, 1–10 (2016).
The PLoS Medicine Editors. Best practice in systematic reviews: the importance of protocols and registration. PLoS Med. 8, e1001009 (2011).
Tsujimoto, Y. et al. Majority of systematic reviews published in high-impact journals neglected to register the protocols: a meta-epidemiological study. J. Clin. Epidemiol. 84, 54–60 (2017).
Xu, C. et al. Protocol registration or development may benefit the design, conduct and reporting of dose-response meta-analysis: empirical evidence from a literature survey. BMC Med. Res. Methodol. 19, 1–10 (2019).
Polanin, J. R., Hennessy, E. A. & Tsuji, S. Transparency and reproducibility of meta-analyses in psychology: a meta-review. Perspect. Psychol. Sci. 15, 1026–1041 (2020).
Uhlmann, E. L. et al. Scientific utopia III: crowdsourcing science. Perspect. Psychol. Sci. 14, 711–733 (2019).
So, T. Classroom experiments as a replication device. J. Behav. Exp. Econ. 86, 101525 (2020).
Ebersole, C. R. et al. Many Labs 3: evaluating participant pool quality across the academic semester via replication. J. Exp. Soc. Psychol. 67, 68–82 (2016).
Klein, R. A. et al. Investigating variation in replicability: a "many labs" replication project. Soc. Psychol. 45, 142–152 (2014).
Glöckner, A. et al. Hagen Cumulative Science Project. Project overview at https://osf.io/d7za8 (2015).
Moshontz, H. et al. The Psychological Science Accelerator: advancing psychology through a distributed collaborative network. Adv. Meth. Pract. Psychol. Sci. 1, 501–515 (2018).
Forscher, P. S. et al. The benefits, barriers, and risks of big-team science. Perspect. Psychol. Sci. 17456916221082970 (2020). The paper discusses the advantages and challenges of conducting large-scale collaborative research projects, highlighting the potential for increased innovation and impact, as well as the difficulties of managing complex collaborations and of addressing issues related to authorship and credit.
Lieck, D. S. N. & Lakens, D. An overview of team science projects in the social behavioral sciences. https://doi.org/10.17605/OSF.IO/WX4ZD (2022).
Jarke, H. et al. A roadmap to large-scale multi-country replications in psychology. Collabra Psychol. 8, 57538 (2022).
Pennington, C. R., Jones, A. J., Tzavella, L., Chambers, C. D. & Button, K. S. Beyond online participant crowdsourcing: the benefits and opportunities of big team addiction science. Exp. Clin. Psychopharmacol. 30, 444–451 (2022).
Disis, M. L. & Slattery, J. T. The road we must take: multidisciplinary team science. Sci. Transl. Med. 2, 22cm9 (2010).
Ledgerwood, A. et al. The pandemic as a portal: reimagining psychological science as truly open and inclusive. Perspect. Psychol. Sci. 17, 937–959 (2022).
Legate, N. et al. A global experiment on motivating social distancing during the COVID-19 pandemic. Proc. Natl Acad. Sci. USA 119, e2111091119 (2022).
Pavlović, T. et al. Predicting attitudinal and behavioral responses to COVID-19 pandemic using machine learning. PNAS Nexus 1, pgac093 (2022).
Van Bavel, J. J. et al. National identity predicts public health support during a global pandemic. Nat. Commun. 13, 517 (2022).
Buchanan, E. M. et al. The Psychological Science Accelerator's COVID-19 rapid-response dataset. Sci. Data 10, 87 (2023).
Azevedo, F. et al. Social and moral psychology of COVID-19 across 69 countries. Sci. Data https://kar.kent.ac.uk/99184/ (2022).
Wang, K. et al. A multi-country test of brief reappraisal interventions on emotions during the COVID-19 pandemic. Nat. Hum. Behav. 5, 1089–1110 (2021).
Dorison, C. A. et al. In COVID-19 health messaging, loss framing increases anxiety with little-to-no concomitant benefits: experimental evidence from 84 countries. Affect. Sci. 3, 577–602 (2022).
Coles, N. A. et al. A multi-lab test of the facial feedback hypothesis by the Many Smiles Collaboration. Nat. Hum. Behav. 6, 1731–1742 (2022).
Coles, N. A., Gaertner, L., Frohlich, B., Larsen, J. T. & Basnight-Brown, D. M. Fact or artifact? Demand characteristics and participants' beliefs can moderate, but do not fully account for, the effects of facial feedback on emotional experience. J. Pers. Soc. Psychol. 124, 287 (2023).
Cowan, N. et al. How do scientific views change? Notes from an extended adversarial collaboration. Perspect. Psychol. Sci. 15, 1011–1025 (2020).
Forscher, P. S. et al. Stereotype threat in Black college students across many operationalizations. Preprint at https://psyarxiv.com/6hju9/ (2019).
Kahneman, D. & Klein, G. Conditions for intuitive expertise: a failure to disagree. Am. Psychol. 64, 515 (2009).
Kekecs, Z. et al. Raising the value of research studies in psychological science by increasing the credibility of research reports: the Transparent Psi Project. Roy. Soc. Open Sci. 10, 191375 (2023).
Henrich, J., Heine, S. J. & Norenzayan, A. The weirdest people in the world? Behav. Brain Sci. 33, 61–83 (2010).
Yarkoni, T. The generalizability crisis. Behav. Brain Sci. 45, e1 (2022).
Ghai, S. It's time to reimagine sample diversity and retire the WEIRD dichotomy. Nat. Hum. Behav. 5, 971–972 (2021). The paper argues that the reliance on WEIRD (Western, educated, industrialized, rich, and democratic) samples in psychological research limits the generalizability of findings and suggests reimagining sample diversity to ensure greater external validity.
Nielsen, M. W. & Andersen, J. P. Global citation inequality is on the rise. Proc. Natl Acad. Sci. USA 118, e2012208118 (2021).
Oberauer, K. & Lewandowsky, S. Addressing the theory crisis in psychology. Psychon. Bull. Rev. 26, 1596–1618 (2019).
Devezer, B., Navarro, D. J., Vandekerckhove, J. & Buzbas, E. O. The case for formal methodology in scientific reform. Roy. Soc. Open Sci. 8, 200805 (2021).
Scheel, A. M., Tiokhin, L., Isager, P. M. & Lakens, D. Why hypothesis testers should spend less time testing hypotheses. Perspect. Psychol. Sci. 16, 744–755 (2021).
Chauvette, A., Schick-Makaroff, K. & Molzahn, A. E. Open data in qualitative research. Int. J. Qual. Meth. 18, 1609406918823863 (2019).
Field, S. M., van Ravenzwaaij, D., Pittelkow, M.-M., Hoek, J. M. & Derksen, M. Qualitative open science—pain points and perspectives. Preprint at https://doi.org/10.31219/osf.io/e3cq4 (2021).
Steltenpohl, C. N. et al. Rethinking transparency and rigor from a qualitative open science perspective. J. Trial & Err. https://doi.org/10.36850/mr7 (2023).
Branney, P. et al. Three steps to open science for qualitative research in psychology. Soc. Personal. Psychol. Compass 17, 1–16 (2023).
VandeVusse, A., Mueller, J. & Karcher, S. Qualitative data sharing: participant understanding, motivation, and consent. Qual. Health Res. 32, 182–191 (2022).
Çelik, H., Baykal, N. B. & Memur, H. N. K. Qualitative data analysis and fundamental principles. J. Qual. Res. Educ. 8, 379–406 (2020).
Class, B., de Bruyne, M., Wuillemin, C., Donzé, D. & Claivaz, J.-B. Towards open science for the qualitative researcher: from a positivist to an open interpretation. Int. J. Qual. Meth. 20, 16094069211034641 (2021).
Humphreys, L., Lewis Jr, N. A., Sender, K. & Won, A. S. Integrating qualitative methods and open science: five principles for more trustworthy research. J. Commun. 71, 855–874 (2021).
Steinhardt, I., Bauer, M., Wünsche, H. & Schimmler, S. The connection of open science practices and the methodological approach of researchers. Qual. Quant. (in press) 1–16 (2022).
Haven, T. L. & Van Grootel, L. Preregistering qualitative research. Account. Res. 26, 229–244 (2019).
Frohwirth, L., Karcher, S. & Lever, T. A. A transparency checklist for qualitative research. Preprint at https://doi.org/10.31235/osf.io/wc35g (2023).
Demgenski, R., Karcher, S., Kirilova, D. & Weber, N. Introducing the Qualitative Data Repository's curation handbook. J. eSci. Librariansh. 10, 1–11 (2021).
Karcher, S., Kirilova, D., Pagé, C. & Weber, N. How data curation enables epistemically responsible reuse of qualitative data. Qual. Rep. 26, 1996–2010 (2021).
Bergmann, C. How to integrate open science into language acquisition research? Student workshop at BUCLD 43 (2018).
Bergmann, C. The buffet approach to open science. https://cogtales.wordpress.com/2023/04/16/the-buffet-approach-to-open-science/ (2023).
Field, S. M. & Derksen, M. Experimenter as automaton; experimenter as human: exploring the position of the researcher in scientific research. Eur. J. Philos. Sci. 11, 11 (2021).
Chenail, R. J. Communicating your qualitative research better. Fam. Bus. Rev. 22, 105–108 (2009).
Levitt, H. M. et al. The meaning of scientific objectivity and subjectivity: from the perspective of methodologists. Psychol. Methods 27, 589–605 (2020).
Candela, A. G. Exploring the function of member checking. Qual. Rep. 24, 619–628 (2019).
Petersen, O. H. Inequality of research funding between different countries and regions is a serious problem for global science. Function 2, zqab060 (2021).
Puthillam, A. et al. Guidelines to improve internationalization in psychological science. Preprint at https://psyarxiv.com/2u4h5/ (2022).
Taffe, M. & Gilpin, N. Equity, diversity and inclusion: racial inequity in grant funding from the US National Institutes of Health. eLife 10, e65697 (2021).
Burns, K. E., Straus, S. E., Liu, K., Rizvi, L. & Guyatt, G. Gender differences in grant and personnel award funding rates at the Canadian Institutes of Health Research based on research content area: a retrospective analysis. PLoS Med. 16, e1002935 (2019).
Sato, S., Gygax, P. M., Randall, J. & Schmid Mast, M. The leaky pipeline in research grant peer review and funding decisions: challenges and future directions. High. Educ. 82, 145–162 (2021).
Guttinger, S. The limits of replicability. Eur. J. Philos. Sci. 10, 10 (2020).
Evans, T. Developments in open data norms. J. Open Psychol. Data 10, 1–6 (2022).
John, L. K., Loewenstein, G. & Prelec, D. Measuring the prevalence of questionable research practices with incentives for truth telling. Psychol. Sci. 23, 524–532 (2012).
Simmons, J. P., Nelson, L. D. & Simonsohn, U. False-positive psychology: undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychol. Sci. 22, 1359–1366 (2011).
Wicherts, J. M. et al. Degrees of freedom in planning, running, analyzing, and reporting psychological studies: a checklist to avoid p-hacking. Front. Psychol. 7, 1832 (2016).
Flake, J. K., Pek, J. & Hehman, E. Construct validation in social and personality research: current practice and recommendations. Soc. Psychol. Personal. Sci. 8, 370–378 (2017).
Flake, J. K. & Fried, E. I. Measurement schmeasurement: questionable measurement practices and how to avoid them. Adv. Meth. Pract. Psychol. Sci. 3, 456–465 (2020).
Agnoli, F., Wicherts, J. M., Veldkamp, C. L., Albiero, P. & Cubelli, R. Questionable research practices among Italian research psychologists. PLoS ONE 12, e0172792 (2017).
Fiedler, K. & Schwarz, N. Questionable research practices revisited. Soc. Psychol. Personal. Sci. 7, 45–52 (2016).
Kerr, N. L. HARKing: hypothesizing after the results are known. Pers. Soc. Psychol. Rev. 2, 196–217 (1998).
Molléri, J. S. Research incentives in academia leading to unethical behavior. In Research Challenges in Information Science: 16th International Conference, RCIS 2022, Barcelona, Spain, May 17–20, 2022, Proceedings 744–751 (Springer, 2022).
Gerrits, R. G. et al. Occurrence and nature of questionable research practices in the reporting of messages and conclusions in international scientific Health Services Research publications: a structured assessment of publications authored by researchers in the Netherlands. BMJ Open 9, e027903 (2019).
Checchi, D., De Fraja, G. & Verzillo, S. Incentives and careers in academia: theory and empirical analysis. Rev. Econ. Stat. 103, 786–802 (2021).
Grove, L. The Effects of Funding Policies on Academic Research. Ph.D. thesis, University College London (2017).
Frias-Navarro, D., Pascual-Soler, M., Perezgonzalez, J., Monterde-i-Bort, H. & Pascual-Llobell, J. Spanish scientists' opinion about science and researcher behavior. Span. J. Psychol. 24, e7 (2021).
Bornmann, L. & Daniel, H.-D. The state of h index research: is the h index the ideal way to measure research performance? EMBO Rep. 10, 2–6 (2009).
Munafò, M. et al. Scientific rigor and the art of motorcycle maintenance. Nat. Biotechnol. 32, 871–873 (2014).
Primbs, M. A. et al. Are small effects the indispensable foundation for a cumulative psychological science? A reply to Götz et al. (2022). Perspect. Psychol. Sci. 18, 508–512 (2022).
Martin, G. & Clarke, R. M. Are psychology journals anti-replication? A snapshot of editorial practices. Front. Psychol. 8, 523 (2017).
Cohen, B. A. How should novelty be valued in science? eLife 6, e28699 (2017).
Tijdink, J. K., Vergouwen, A. C. & Smulders, Y. M. Publication pressure and burn out among Dutch medical professors: a nationwide survey. PLoS ONE 8, e73381 (2013).
Tijdink, J. K., Verbeke, R. & Smulders, Y. M. Publication pressure and scientific misconduct in medical scientists. J. Empir. Res. Hum. Res. Ethics 9, 64–71 (2014).
Laitin, D. D. et al. Reporting all results efficiently: a RARE proposal to open up the file drawer. Proc. Natl Acad. Sci. USA 118, e2106178118 (2021).
Franco, A., Malhotra, N. & Simonovits, G. Publication bias in the social sciences: unlocking the file drawer. Science 345, 1502–1505 (2014).
Matarese, V. Kinds of replicability: different terms and different functions. Axiomathes 1–24 (2022).
Maxwell, S. E., Lau, M. Y. & Howard, G. S. Is psychology suffering from a replication crisis? What does "failure to replicate" really mean? Am. Psychol. 70, 487 (2015).
Ulrich, R. & Miller, J. Questionable research practices may have little effect on replicability. eLife 9, e58237 (2020).
Devezer, B. & Buzbas, E. Minimum viable experiment to replicate. Preprint at http://philsci-archive.pitt.edu/21475/ (2021).
Stroebe, W. & Strack, F. The alleged crisis and the illusion of exact replication. Perspect. Psychol. Sci. 9, 59–71 (2014).
Feest, U. Why replication is overrated. Philos. Sci. 86, 895–905 (2019).
Eronen, M. I. & Bringmann, L. F. The theory crisis in psychology: how to move forward. Perspect. Psychol. Sci. 16, 779–788 (2021).