October 26, 2018

OAWeek 2018: Barriers to Practice

In our final OAWeek post, I will present the current barriers to "open" practice. While there are many potential obstacles to living up to the principles of complete openness, four major categories of barriers shape whether people or institutions decide to practice openly. These include (but are not limited to): technological, financial, formal conventions, and learning curve.



Technological. The past few years have seen a boom in innovations and digital tools that enable open access, open science, and open source. As the accompanying figure shows, all areas of the conventional scientific process have been touched by this revolution. Distribution, publishing, notetaking, bibliographies, and engagement with the broader community have all been impacted by new tools and (more importantly) their adoption by a critical mass of scientists. The development of formal pipelines for organizing this proliferation of tools into actionable steps [1] has also been a technological advance. Despite this convergence, there is no single "killer app" that will solve the open problem. Nor should there be, as killer apps are often concentrated in the hands of single entities that are vulnerable to profiteering. Importantly, open-enabling technologies must be available to smaller research groups, particularly generators of smaller datasets [2], to get the most out of the scientific community's efforts.

101 Innovations in Scholarly Communication. ORIGINAL SOURCE: https://innoscholcomm.silk.co/  License: CC-BY.

Financial. While many tools are relatively cheap to use, other aspects of open science can be quite costly to individual scientists or even laboratories. In Wednesday's post on the three "opens", the various models of open access were discussed. Depending on which route to open access and/or open science is chosen, there are costs associated with manuscript processing, data archiving, curation, and annotation. A successful "open" strategy should include a consideration of these costs to ensure sustainability over the long term. There are also issues with the cost and public funding of large-scale community resources such as open access journals, preprint servers, and data repositories that must be solved without making their use unaffordable or (by extension) unavailable. One open question is the incentive structure for sharing resources and making them accessible. This is particularly true for datasets, which require incentives related to research efficiency, social prestige, and intellectual growth [3]. Such incentives can also help to reinforce higher reproducibility standards and overall levels of scientific integrity [4].

An example of a set of formal conventions chosen from a large number of potential tools. COURTESY: Nate Angell, Joint Roadmap for Open Science Tools. License: CC-0.

Formal Conventions. Another barrier to "open" is cultural practice. In moving from concept to finished product, we follow a set of internalized practices. While science requires much formal training, many scientific practices are taught implicitly during the course of laboratory and scholarly research. Several recent studies characterize openness as a matter of evolving norms [5, 6], which define openness in terms of collegiality and do not punish non-open endeavors. One critical aspect of encouraging open practices is education, and there does seem to be a generational shift in attitudes and educational opportunities surrounding open practices. This has occurred at the same time that information and computational technologies have emerged which encourage sharing and transparency. Whether this will change standards and expectations in a decade is unclear, although governments and funding agencies are now embracing open access and open science in ways they previously have not.

Learning curve as compared to the diffusion of innovations [7]. COURTESY: Wikimedia.

Learning Curve. With all of the potential tools and steps in making research open, there is a learning curve for both individual scientists and small organizations (e.g. laboratories). While the learning curve for some practices (e.g. preprint posting) is trivial, other "open" practices (e.g. transparent protocols and methods) require more commitment and formal training. The learning curve is one major factor in the difference between merely "making things open" and making things accessible. In the domain of open datasets, accessibility can be hampered by the fragmentation of resources across many obscure locations rather than a highly-discoverable set of repositories with fixed identifiers [8]. There are two additional barriers to accessibility and/or practice adoption: difficulty of learning and cultural learning. Difficulty in learning a specific tool or programming language does affect how open practices become, and the harder or more time-consuming a certain task is, the less likely the associated practice will be adopted. Cultural learning involves being exposed to a specific practice and then adopting that practice. This generally has little relation to difficulty, and depends more on personal and institutional preference. It is important to keep both of these in mind, both when adopting an "open" strategy and when setting expectations for members of the broader community.


NOTES:
[1] Toelch, U. and Ostwald, D. (2018). Digital open science: Teaching digital tools for reproducible and transparent research. PLoS Biology, 16(7), e2006022. doi:10.1371/journal.pbio.2006022.

[2] Ferguson, A.R., Nielson, J.L., Cragin, M.H., Bandrowski, A.E., and Martone, M.E. (2014). Big Data from Small Data: Data-sharing in the ‘long tail’ of neuroscience. Nature Neuroscience, 17(11), 1442-1448. doi:10.1038/nn.3838.

[3] Gardner, D. et al. (2003). Towards Effective and Rewarding Data Sharing. Neuroinformatics, 1(3), 289-295. AND Piwowar, H.A., Becich, M.J., Bilofsky, H., Crowley, R.S. (2008). Towards a Data Sharing Culture: Recommendations for Leadership from Academic Health Centers. PLoS Medicine, 5(9), e183. doi:10.1371/journal.pmed.0050183.

[4] Gall, T., Ioannidis, J.P.A., Maniadis, Z. (2017). The credibility crisis in research: Can economics tools help? PLoS Biology, 15(4), e2001846. doi:10.1371/journal.pbio.2001846.

[5] Pham-Kanter, G., Zinner, D.E., and Campbell, E.G. (2014). Codifying Collegiality: recent developments in data sharing policy in the life sciences. PLoS One, 9(9), e108451. doi:10.1371/journal.pone.0108451.

[6] Fecher, B., Friesike, S., and Hebing, M. (2015). What Drives Academic Data Sharing? PLoS One, 10(2), e0118053. doi:10.1371/journal.pone.0118053.

[7] Rogers, E. (1962). Diffusion of Innovations. Free Press of Glencoe, New York.

[8] Culina, A., Woutersen-Windhouwer, S., Manghi, P., Baglioni, M., Crowther, T.W., Visser, M.E. (2018). Navigating the unfolding open data landscape in ecology and evolution. Nature Ecology and Evolution, 2, 420-426. doi:10.1038/s41559-017-0458-2.
