Open Methodology tools, Concepts of Open Evaluation

From Wikiversity

VO Sharing is daring: Open Science approaches to Digital Humanities

Please read the lesson script below and complete the tasks.

Questions, remarks, issues? Participate in the Zoom meeting on Mon, 25.05.2020, 5 p.m. - 6 p.m.!
This week's topic of discussion:
How Open would you be willing to go? What approaches would you be willing to implement consistently in your own research practice?

Mon, 25.05., 16:45 - 18:15: Open Methodology tools, Concepts of Open Evaluation

In the past weeks, we have learned about the most established fields of Open Science: Open Access (to publications) and Open (Research) Data. This week, we want to look at ways to do Open Science "all the way", which means opening up every single step of the research process as widely as (legally) possible in order to provide anyone who might be interested with a "live ticker" of what is happening in the research process. We will first look at a few of the areas that are part of this concept (Open Methodology, Open Evaluation, Citizen Science) and will subsequently take a look at a project that not only investigates Open Science practices but also puts Open Science into impressive practice: (101) Innovations in Scholarly Communication.

First, let's look at the concept of Open Evaluation. There are two main sub-areas of this Open Science approach: one focuses on the way in which researchers evaluate each other's work (Open Peer Review); the other focuses on how the general public (mainly funders and employers, but also other researchers) measures and evaluates the "impact", i.e. the significance and importance, of a person's research (Open Metrics).

OANA (do you remember what OANA is from our session on networks & infrastructures?) defines these concepts as follows:

Open Peer Review is an umbrella term for a number of overlapping ways that peer review models can be adapted to be in line with the aims of Open Science, including transparency about reviewer and author identities, the publication of review reports and the enablement of greater participation in the peer review process.
Open Metrics means openness of data, methods, and results of bibliometric analyses. The traceability and reusability of evaluation procedures opens up new possibilities in dealing with scientific findings in the fields of research, technology, and innovation. ( About Open Science)

Task 1

Read at least one of the following two papers to gain a better understanding of either the concept of Open Peer Review or of Open Metrics. While peer review is a method that is increasingly widely implemented in the humanities, metrics (fortunately) do not (yet) concern us very much and are a bigger influence in the hard sciences. However, understanding the system of research metrics will also help you understand the strange tilt that helped Open Science to gain momentum as a movement.

In the article listed above, Ross-Hellauer pointed out that there is (was) no proper, widely accepted definition of the term "Open Peer Review" available, which is why this article was written. An even fuzzier area of Open Science is Open Methodology, another field that comprises several sub-areas. Let's have a look at the OANA definition again:

The term Open Methodology refers to opening up methods that are used by researchers to achieve scientific results and making them publicly available. Even though the description of methods is a core element of the research process, results based on these descriptions are often not comprehensible in detail and, even more importantly, not reproducible. The Open Methodology approach aims to counteract this problem.
One of the most established implementations of Open Methodology is the Open Source movement. Its goal is to make programming code publicly available in order to make results understandable and reproducible on the one hand, and to enable further development on the basis of existing code on the other.
Another rather widely-implemented approach is the use of Open Notebooks, in which daily research work is publicly documented. Other approaches to Open Methodology include Open Workflows (documented and transparent workflows) and Open Annotations (open and collaborative classifications and comments). ( About Open Science)

While Open Source, as we heard at the beginning of the semester, is somewhat of a predecessor or "parent movement" (maybe even grandparent movement) of Open Science and a very established approach in the programming world, the concepts of Open Notebooks, Open Workflows and Open Annotations may be among the least established Open Science practices.

Task 2

Wikipedia time! Choose at least one of the Open Methodology approaches and use Wikipedia to first gain a better understanding of what it is. (Tip: this will work really well for some of the concepts and not for others.) Subsequently, find out what tools and infrastructures are most commonly used to implement the approach(es) you chose to research. Is there maybe a tool that is used in more than one field?

One final concept that we will only look at very briefly is Citizen Science. Like "Openness" itself, the degree to which citizens are (and can even be) involved in science can vary greatly due to numerous reasons. We will take a final look at the OANA definition of the concept of citizen science (which is fairly well established in Austria), but not investigate this area further for now as it can be argued that citizen science is not so much an area of Open Science, but an approach in its own right that is often, but not necessarily combined with Openness.

The term Citizen Science is defined in different ways across the globe. Österreich forscht points out the variety of definitions and defines the term in the following way: Citizen Science carries out scientific projects with the help of or completely by interested amateurs (lat. amator "lover"). The Citizen Scientists formulate research questions, report observations, carry out measurements, evaluate data, and/or write publications. Compliance with scientific criteria is a prerequisite. This does not only facilitate new scientific projects and new insights, but also enables a dialogue between science and society that is otherwise impossible or very difficult to achieve. ( About Open Science)

Working only with openly available data and research papers, making all of your own research output and data fully available in open, public places, involving citizens and other researchers, openly documenting every step of the way, only participating in conferences and other communication channels that are open to everyone, and only publishing in journals or at conferences where opening up and giving credit for reviews of the presented research is lived practice - that's a little much to ask all at once, isn't it?

Open source. Open access. Open society. Open knowledge. Open government. Even open food. Until quite recently, the word “open” had a fairly constant meaning. The over-use of the word “open” has led to its meaning becoming increasingly ambiguous. This presents a critical problem for this important word, as ambiguity leads to misinterpretation. (Pomerantz, Peek: 50 Shades of Open)

Subsequently, Pomerantz and Peek offer a definition, which is highly recommended reading. What we can learn from their article is that the appropriate amount of Openness - the right "shade of open" - depends on who you're asking. Personally, I sometimes feel it is unfeasible to put all elements of Open Science into practice. However, there are researchers out there who have already truly and fully embraced Open Science on all levels. One example is the project (101) Innovations in Scholarly Communication by Jeroen Bosman and Bianca Kramer, which investigates Open Science practices, but also practices Open Science.

Task 3

To familiarize yourself with the project, watch the short video "Open Science workflows - putting the pieces together". Are you inspired? Would you maybe like to take a more detailed look at one of the many materials and graphics you saw in the video, or see the underlying data? Go to the 101innovations WordPress site and find your resource.

101innovations is a brilliant resource for finding new tools and possibilities to practice research openly, but the downside of this resource is the information overload it provides. In my experience, when I'm in the middle of a project and quickly need to find a new tool for a certain task, the 400+ tools list by 101innovations does not always help, because it offers too many options when "quick answers" are needed. However, the makers of 101innovations have tried to tackle this problem by providing good "selections" of open tools that are actually very useful. Still, the comprehensive list is worth taking a look at even without a specific tool for a specific task in mind - it gives a great overview of the Open tool landscape, thus allowing us to think of the right tools when we need them in the future.

Task 4

Try it yourself! First, take a look at the Rainbow of Open Science practices. Next, investigate the example workflows identified by 101innovations and see how many of the tools included in the Open Science workflow you already know. With this basic knowledge in mind, take some time to browse the 400+ tools list and find more tools that can be useful for you.

Congratulations, we have covered all our bases and investigated all areas of Open Science. Next week will be a public holiday (Pentecost), which is why we will use our time for some digital detox and fresh air. After this short break, we will meet again to take a look at a few example projects in which I myself implemented Open Science approaches - with varying success. In that session, we will therefore focus on the obstacles to Open Science. But before that:

Do you have questions, remarks, issues about this week's topic? Participate in the Zoom meeting on Mon, 25.05.2020, 5 p.m. - 6 p.m.!
This week's topic of discussion:
How Open would you be willing to go? What approaches would you be willing to implement consistently in your own research practice?