The quality of scientific research is assessed not only by its positive impact on socio-economic development and human well-being, but also by its contribution to the production of valid and reliable scientific knowledge. Researchers are therefore expected, regardless of their scientific discipline, to adopt research practices grounded in transparency and rigor. However, the history of science and the scientific literature tell us that a large share of scientific results cannot be systematically reproduced (Ioannidis, 2005). This is what is commonly known as the “replication crisis,” which concerns the natural sciences as well as the human and social sciences, and psychology is no exception.
In this article, we first discuss some aspects of the replication crisis and questionable research practices. We then discuss how more laboratories in Africa can be involved in global scientific research, including through the Psychological Science Accelerator (PSA). To that end, we offer a tutorial for laboratories in Africa that highlights open science practices. Finally, we discuss how to make psychological science more participatory and inclusive.
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Ioannidis, J. P. A. (2005). Why most published research findings are false. PLoS Medicine, 2(8), e124. doi: 10.1371/journal.pmed.0020124.
Doyen, S., Klein, O., Pichon, C. L., & Cleeremans, A. (2012). Behavioral priming: It's all in the mind, but whose mind? PLoS ONE, 7(1), e29081. doi: 10.1371/journal.pone.0029081.
Pashler, H., Rohrer, D., & Harris, C. (2013). Can the goal of honesty be primed? Journal of Experimental Social Psychology, 49, 959–964. doi: 10.1016/j.jesp.2013.05.011.
Rodgers, J. L., & Shrout, P. E. (2018). Psychology’s replication crisis as scientific opportunity: A précis for policymakers. Policy Insights from the Behavioral and Brain Sciences, 5(1), 134–141. doi: 10.1177/2372732217749254.
Bem, D. J. (2011). Feeling the future: Experimental evidence for anomalous retroactive influences on cognition and affect. Journal of Personality and Social Psychology, 100(3), 407–425. doi: 10.1037/a0021524.
Francis, G. (2012). Too good to be true: Publication bias in two prominent studies from experimental psychology. Psychonomic Bulletin & Review, 19, 151–156. doi: 10.3758/s13423-012-0227-9.
Galak, J., LeBoeuf, R. A., Nelson, L. D., & Simmons, J. P. (2012). Correcting the past: Failures to replicate psi. Journal of Personality and Social Psychology, 103(6), 933–948. doi: 10.1037/a0029709.
Schimmack, U. (2012). The ironic effect of significant results on the credibility of multiple-study articles. Psychological Methods, 17(4), 551–566. doi: 10.1037/a0029487.
Klein, R. A., Ratliff, K. A., Vianello, M., Adams, R. B., Jr., Bahník, Š., Bernstein, M. J., . . . Nosek, B. A. (2014). Investigating variation in replicability: A “Many Labs” replication project. Social Psychology, 45, 142–152. doi: 10.1027/1864-9335/a000178.
Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251). doi: 10.1126/science.aac4716.
Camerer, C. F., Dreber, A., Forsell, E., Ho, T.-H., Huber, J., Johannesson, M., . . . Wu, H. (2016). Evaluating replicability of laboratory experiments in economics. Science, 351, 1433–1436. doi: 10.1126/science.aaf0918.
Ebersole, C. R., Atherton, O. E., Belanger, A. L., Skulborstad, H. M., Allen, J. M., Banks, J. B., . . . Nosek, B. A. (2016). Many Labs 3: Evaluating participant pool quality across the academic semester via replication. Journal of Experimental Social Psychology, 67, 68–82. doi: 10.1016/j.jesp.2015.10.012.
Cova, F., Strickland, B., Abatista, A., Allard, A., Andow, J., Attie, M., . . . Zhou, X. (2018). Estimating the reproducibility of experimental philosophy. Review of Philosophy and Psychology, 1–36. doi: 10.1007/s13164-018-0400-9.
Camerer, C. F., Dreber, A., Holzmeister, F., Ho, T.-H., Huber, J., Johannesson, M., . . . Wu, H. (2018). Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015. Nature Human Behaviour, 2, 637–644. doi: 10.1038/s41562-018-0399-z.
Klein, R. A., Vianello, M., Hasselman, F., Adams, B. G., Adams, R. B., Alper, S., . . . Nosek, B. A. (2018). Many Labs 2: Investigating variation in replicability across samples and settings. Advances in Methods and Practices in Psychological Science, 1(4), 443–490. doi: 10.1177/2515245918810225.
Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science, 22, 1359–1366. doi: 10.1177/0956797611417632.
John, L. K., Loewenstein, G., & Prelec, D. (2012). Measuring the prevalence of questionable research practices with incentives for truth telling. Psychological Science, 23, 524–532. doi: 10.1177/0956797611430953.
Kerr, N. L. (1998). HARKing: Hypothesizing after the results are known. Personality and Social Psychology Review, 2(3), 196–217. doi: 10.1207/s15327957pspr0203_4.
Sterling, T. D., Rosenbaum, W. L., & Weinkam, J. J. (1995). Publication decisions revisited: The effect of the outcome of statistical tests on the decision to publish and vice versa. The American Statistician, 49, 108–112.
Fanelli, D. (2010). “Positive” results increase down the hierarchy of the sciences. PLoS ONE, 5(3), e10068. doi: 10.1371/journal.pone.0010068.
Szucs, D., & Ioannidis, J. P. A. (2017). Empirical assessment of published effect sizes and power in the recent cognitive neuroscience and psychology literature. PLoS Biology, 15(3), e2000797. doi: 10.1371/journal.pbio.2000797.
Scheel, A. M., Schijen, M., & Lakens, D. (2020). An excess of positive results: Comparing the standard Psychology literature with Registered Reports. PsyArXiv.
Jones, B., DeBruine, L., Flake, J., Liuzza, M. T., Antfolk, J., Arinze, N., … Coles, N. (under review). To which world regions does the valence-dominance model of social perception apply? (PSA001; Registered Report Stage 2). Nature Human Behaviour.
Moshontz, H., Campbell, L., Ebersole, C., IJzerman, H., Urry, H. L., Forscher, P. L., … Chartier, C. R. (2018). The Psychological Science Accelerator: Advancing psychology through a distributed collaborative network. Advances in Methods and Practices in Psychological Science, 1(4), 501–515. doi: 10.1177/2515245918797607.
Wagge, J. R., Brandt, M. J., Lazarevic, L. B., Legate, N., Christopherson, C., Wiggins, B., & Grahe, J. E. (2019). Publishing research with undergraduate students via replication work: The collaborative replications and education project. Frontiers in Psychology, 10, 247.
Crandall, C. S., & Sherman, J. W. (2016). On the scientific superiority of conceptual replications for scientific progress. Journal of Experimental Social Psychology, 66, 93–99. doi: 10.1016/j.jesp.2015.10.002.
Chambers, C. D. (2013). Registered reports: A new publishing initiative at Cortex. Cortex, 49, 609–610.
Nosek, B. A., & Lakens, D. (2014). Registered reports: A method to increase the credibility of published results. Social Psychology, 45(3), 137–141. doi: 10.1027/1864-9335/a000192.