New NVSQ Data Transparency Policy for Results Based on Experiments

Transparency is a key condition for robust and reliable knowledge and for the advancement of scholarship over time. To improve the transparency of research published in NVSQ, the journal is introducing a policy requiring authors of manuscripts reporting on data from experiments to provide, upon submission, access to the data and the code that produced the reported results. This will be a condition for the manuscript to proceed through the blind peer review process.

The policy will be implemented as a pilot, covering only papers that report results of experiments. For manuscripts reporting on other types of data, the submission guidelines will not be changed at this time.

Rationale

This policy is a step toward strengthening research in our field through greater transparency about research design, data collection, and analysis. Greater transparency of data and analytic procedures will produce fairer, more constructive reviews and, ultimately, even higher quality articles published in NVSQ. Reviewers can only evaluate the methodology and findings fully when authors describe the choices they made and provide the materials used in their study.

Sample composition and research design features can affect the results of experiments, as can sheer coincidence. To assist reviewers and readers in interpreting the research, it is important that authors describe relevant features of the research design, data collection, and analysis. Such details are also crucial to facilitate replication. NVSQ receives very few replications, and thus rarely publishes them, although we are open to doing so. Greater transparency will make it easier to reinforce, or question, research results through replication (Peters, 1973; Smith, 1994; Helmig, Spraul & Tremp, 2012).

Greater transparency is also good for authors. Articles with open data appear to have a citation advantage: they are cited more frequently in subsequent research (Colavizza et al., 2020; Drachen et al., 2016). The evidence is not experimental: the higher citation rates of articles providing access to data may be a result of higher research quality. But regardless of whether the policy improves the quality of new research or attracts higher quality existing research, the outcome is higher quality research, which is exactly what we want.

Previously, the official policy of our publisher, SAGE, was that authors were ‘encouraged’ to make the data available. It is likely, though, that authors were not aware of this policy because it was not mentioned on the journal website. In any case, the voluntary policy clearly did not stimulate the provision of data: data are available for only a small fraction of papers in the journal. Evidence indicates that a data sharing policy alone, without enforcement, is ineffective (Stodden, Seiler, & Ma, 2018; Christensen et al., 2019). Even when authors include a phrase in their article such as ‘data are available upon request,’ research shows that they often do not comply with such requests (Wicherts et al., 2006; Krawczyk & Reuben, 2012). Therefore, we are making the provision of data a requirement for the assignment of reviewers.

Data Transparency Guidance for Manuscripts using Experiments

Authors submitting manuscripts to NVSQ that report results from experiments are kindly requested to provide a detailed description of the target sample and of the way in which participants were invited, informed, instructed, paid, and debriefed. Authors are also requested to describe all decisions made and questions answered by the participants, and to provide access to the stimulus materials and questionnaires. Most importantly, authors are requested to make the data and code that produced the reported findings available to the editors and reviewers. Please do so anonymously, i.e., without identifying yourself as an author of the manuscript.

When you submit the data, please ensure that you comply with the requirements of your institution’s Institutional Review Board or Ethics Review Committee, the privacy laws in your country (such as the GDPR), and any other regulations that may apply. Remove personal information from the data you provide (Ursin et al., 2019). For example, avoid logging IP addresses, email addresses, and any other personal information that may reveal participants’ identities in online experiments.
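
As a minimal illustration of this de-identification step, the sketch below (written in Python with pandas for brevity; the file and column names such as ip_address and email are hypothetical, and the same logic can be expressed in R, Stata, or SPSS syntax) drops direct identifiers and replaces internal participant IDs with neutral sequential codes before the file is shared.

```python
# Minimal de-identification sketch (pandas); file and column names are
# hypothetical examples, not prescribed by NVSQ.
import pandas as pd

raw = pd.read_csv("experiment_raw.csv")

# Drop direct identifiers such as IP addresses, email addresses, and names.
identifying_columns = ["ip_address", "email", "name", "participant_id"]
shared = raw.drop(columns=identifying_columns, errors="ignore")

# Assign neutral sequential codes in place of internal participant IDs.
shared.insert(0, "participant", list(range(1, len(shared) + 1)))

# Write the anonymized file in one of the accepted formats (.csv).
shared.to_csv("experiment_shared.csv", index=False)
```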

The journal will not host a separate archive. Instead, deposit the data on a platform of your choice, such as Dataverse, GitHub, Zenodo, or the Open Science Framework. We accept data in Excel (.xls, .csv), SPSS data (.sav, .por) with syntax (.sps), Stata data (.dta) with a do-file, and R projects.

When authors have successfully submitted the data and code along with the paper, the Area Editor will verify whether the submitted data and code actually produce the results reported. If (and only if) this is the case, the submission will be sent out to reviewers. Reviewers will therefore not have to verify the computational reproducibility of the results; they can instead focus on the integrity of the data and the robustness of the reported results.
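
To illustrate what such a verification could involve, here is a hypothetical, minimal analysis script (shown in Python; the file name, the variables "condition" and "donation", and the test are illustrative only, not a required template). Running it end to end on the shared data should reproduce a reported result, such as a treatment effect, without any manual steps.

```python
# Hypothetical reproduction script: reads the shared data file and
# recomputes one reported result (a treatment vs. control comparison).
import pandas as pd
from scipy import stats

data = pd.read_csv("experiment_shared.csv")

treatment = data.loc[data["condition"] == "treatment", "donation"]
control = data.loc[data["condition"] == "control", "donation"]

# Welch's t-test as an example of a reported analysis.
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)

print(f"Mean (treatment): {treatment.mean():.2f}")
print(f"Mean (control):   {control.mean():.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```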

As we introduce the data availability policy, we will closely monitor changes in the number and quality of submissions, as well as their scholarly impact, anticipating both collective and private benefits (Popkin, 2019). We have scored the data transparency of 20 experiments submitted in the first six months of 2020, using a checklist of 49 criteria. In 4 of these submissions, some elements of the research were preregistered. The average transparency score was 38 percent. We anticipate that the new policy will improve these scores.
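
For readers curious how such a score translates into a percentage, the sketch below assumes a simple scoring rule in which each checklist item is marked as met or not met and the score is the share of items met. The items shown are placeholders for illustration, not the actual NVSQ checklist.

```python
# Sketch: a transparency score computed as the share of checklist
# criteria met. The items below are placeholders; the actual NVSQ
# checklist contains 49 criteria.
checklist = {
    "data shared": True,
    "code shared": True,
    "target sample described": True,
    "stimulus materials provided": False,
    "preregistered": False,
    # ... remaining criteria ...
}

score = 100 * sum(checklist.values()) / len(checklist)
print(f"Transparency score: {score:.0f} percent")
```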

The policy takes effect for new submissions on July 1, 2020.

Background: Development of the Policy

The NVSQ Editorial Team has been working on policies for enhanced data and analytic transparency for several years, moving forward in a consultative manner. We established a Working Group on Data Management and Access, which provided valuable guidance in its 2018 report, including a preliminary set of transparency guidelines for research based on data from experiments and surveys, interviews and ethnography, and archival sources and social media. A wider discussion of data transparency criteria was held at the 2019 ARNOVA conference in San Diego, as reported here. Participants working with survey and experimental data frequently mentioned access to the data and code as a desirable practice for research to be published in NVSQ.

Eventually, separate sets of guidelines for each type of data will be created, recognizing that commonly accepted standards vary between communities of researchers (Malički et al., 2019; Beugelsdijk, Van Witteloostuijn, & Meyer, 2020). Regardless of which criteria are used, reviewers can only evaluate them when authors describe the choices they made and provide the materials used in their study.

References

Beugelsdijk, S., Van Witteloostuijn, A., & Meyer, K.E. (2020). A new approach to data access and research transparency (DART). Journal of International Business Studies. https://doi.org/10.1057/s41267-020-00323-z

Christensen, G., Dafoe, A., Miguel, E., Moore, D.A., & Rose, A.K. (2019). A study of the impact of data sharing on article citations using journal policies as a natural experiment. PLoS ONE 14(12): e0225883. https://doi.org/10.1371/journal.pone.0225883

Colavizza, G., Hrynaszkiewicz, I., Staden, I., Whitaker, K., & McGillivray, B. (2020). The citation advantage of linking publications to research data. PLoS ONE 15(4): e0230416, https://doi.org/10.1371/journal.pone.0230416

Drachen, T.M., Ellegaard, O., Larsen, A.V., & Dorch, S.B.F. (2016). Sharing Data Increases Citations. Liber Quarterly, 26(2): 67–82. https://doi.org/10.18352/lq.10149

Helmig, B., Spraul, K., & Tremp, K. (2012). Replication Studies in Nonprofit Research: A Generalization and Extension of Findings Regarding the Media Publicity of Nonprofit Organizations. Nonprofit and Voluntary Sector Quarterly, 41(3): 360–385. https://doi.org/10.1177/0899764011404081

Krawczyk, M., & Reuben, E. (2012). (Un)Available upon Request: Field Experiment on Researchers’ Willingness to Share Supplementary Materials. Accountability in Research, 19(3): 175-186. https://doi.org/10.1080/08989621.2012.678688

Malički, M., Aalbersberg, IJ.J., Bouter, L., & Ter Riet, G. (2019). Journals’ instructions to authors: A cross-sectional study across scientific disciplines. PLoS ONE, 14(9): e0222157. https://doi.org/10.1371/journal.pone.0222157

Peters, C. (1973). Research in the Field of Volunteers in Courts and Corrections: What Exists and What Is Needed. Journal of Voluntary Action Research, 2(3): 121-134. https://doi.org/10.1177/089976407300200301

Popkin, G. (2019). Data sharing and how it can benefit your scientific career. Nature, 569: 445-447. https://www.nature.com/articles/d41586-019-01506-x

Smith, D.H. (1994). Determinants of Voluntary Association Participation and Volunteering: A Literature Review. Nonprofit and Voluntary Sector Quarterly, 23(3): 243-263. https://doi.org/10.1177/089976409402300305

Stodden, V., Seiler, J., & Ma, Z. (2018). An empirical analysis of journal policy effectiveness for computational reproducibility. PNAS, 115(11): 2584-2589. https://doi.org/10.1073/pnas.1708290115

Ursin, G., et al. (2019). Sharing data safely while preserving privacy. The Lancet, 394: 1902. https://doi.org/10.1016/S0140-6736(19)32633-9

Wicherts, J.M., Borsboom, D., Kats, J., & Molenaar, D. (2006). The poor availability of psychological research data for reanalysis. American Psychologist, 61(7), 726-728. http://dx.doi.org/10.1037/0003-066X.61.7.726

Working Group on Data Management and Access (2018). A Data Availability Policy for NVSQ. April 15, 2018. https://renebekkers.files.wordpress.com/2020/06/18_04_15-nvsq-working-group-on-data.pdf
