Introducing the Journal Editors Discussion Interface
DOI: https://doi.org/10.29173/iq1146
Keywords: open science, scholarly publishing, community, social science
Abstract
Journal editors play an important role in advancing open science in their respective fields. However, their role is temporary and (usually) part-time, so many do not have enough time to dedicate to changing policies, practices, and procedures at their journals. The Journal Editors Discussion Interface (JEDI, https://dpjedi.org), launched in 2021, is an online community for journal editors in the social sciences consisting of a listserv and a resource page. JEDI aims to increase the uptake of open science at social science journals by providing journal editors with a space to learn and discuss. In this paper, we examine JEDI’s progress over its first two years, presenting data on membership, posts, and a member survey. We see a reasonable mix of people participating in listserv conversations, and there are no detectable differences among groups in the number of replies received by thread-starters. The community survey suggests JEDI members find conversations and resources on JEDI generally informative and useful, and see JEDI primarily as a community for getting honest opinions from others on editorial practices. However, JEDI membership is not as heterogeneous as would be ideal for the purpose of the group, especially with respect to geographic diversity.
License
Copyright (c) 2025 Julia Bottesini, Priya Silverstein, Sebastian Karcher, Colin Elman

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
This license lets others remix, tweak, and build upon your work non-commercially, and although their new works must also acknowledge you and be non-commercial, they don’t have to license their derivative works on the same terms.
The Creative Commons Attribution-NonCommercial 4.0 International License applies to all works published by IASSIST Quarterly. Authors retain copyright of their work. Your contribution will be available on the IASSIST Quarterly website when announced on the IASSIST listserv.