WCRI 2017 Poster Session: INTEGRITY – FRAGMENTED & OUTSOURCED

[If you arrived here using the short link bit.ly/fear_of_scooping, you might be interested in reading about my first article, “Afraid of Scooping – Case Study on Researcher Strategies against Fear of Scooping in the Context of Open Science“. It was first presented at SciDataCon2016 as a conference paper and then submitted to the Data Science Journal as a research paper. It was accepted there with minor revisions and is currently in the final stages of editing. I have published a preprint manuscript on Zenodo.]

The abstract for this poster can be seen in my Zenodo repository. It was first submitted to the World Conference on Research Integrity 2017 as a paper proposal (this too will someday be noted in the updated version of my CV of failures).

In the poster I claim that the social conscience of academic research, at least in Finland and very likely in many other countries, is fragmented and outsourced. Fragmented, because discussion, development and implementation of issues concerning research integrity, research ethics and open science have been siloed into separate containers, based on arbitrary organizational boundaries and in a way that doesn’t encourage inter-silo dialogue. Outsourced, because the bodies regulating research integrity and research ethics and promoting open science are governmental, and the people involved are an institutional elite of near-mandarin status.

For many historical reasons the Finnish learned societies, the four academies that would be the natural outlets for multidisciplinary opinion and evidence-based advice, have long been very passive in all kinds of matters, especially national research policy. The research behind the poster is still ongoing, but so far it seems that the academies had virtually no part to play in the formation of the Finnish research ethics landscape.

When the Finnish Advisory Board on Research Integrity (TENK, which was for its first twenty years called the Board on Research Ethics) was founded in the early 1990s, the initiative didn’t come from the research community, but from the national parliament and government. The Finnish equivalent of a national research council, the Academy of Finland, was the first to take on the task of advising and guiding the community in matters of research ethics, only to be soon replaced by the Ministry of Education and its specially appointed board.

In the beginning the motivation for ethical guidance and oversight was tackling research misconduct and keeping a watchful eye on the development of genetic research. Quite soon the former motivation overshadowed the latter in the work of TENK, as the tasks related to medical ethics, gene technology and animal testing were taken over by the ministries that dealt with legislation in those domains. These emerging bodies didn’t concentrate only on the challenges of scientific research, but also dealt with industry concerns, for example. Each body got a small secretariat, sometimes less than one full person-year per year, and an expert board of its own.

Fortunately there are currently international efforts to bridge these gaps of yesteryear, such as the ENERI project, which tries to bring European research ethics committees and research integrity offices closer together. The secretariats of the Finnish boards have also done their best with their scarce resources and held semi-regular joint meetings.

Unfortunately mistakes have a tendency to be repeated. Open science, the pursuit of making academic research more accessible, transparent, efficient and influential, is in Finland once again being driven forward by none other than the government, through the Ministry of Education and Culture’s Open Science & Research initiative. The universities and other research institutions are getting more and more on board at the top governing level, but once more, the community lags behind. I have personally had the questionable honor of speaking about the benefits of open research at an official University of Helsinki doctoral school in human sciences event to a crowd of fewer than ten people, in a hall that could have fit a hundred (another one for the CV of failures).

The ALLEA Code of Conduct on Research Integrity is another disappointing example. First of all, it was apparently updated because the EU Commission asked for it to be. When discussing dissemination, the code focuses only on formal print-era publications, leaving out dissemination throughout the research life cycle, new digital tools and media, popular science and citizen engagement altogether. When I asked someone who had been involved in the writing process about this oversight, they said that the code was written in a way that reflects the status quo of the research community’s understanding of what is responsible practice and what isn’t.

Sometimes the lack of initiative and enthusiasm from the research community in the face of the societal responsibility of research makes me want to scream. I understand that people need to put bread on the table and that requires articles, which require lots and lots of very hard work. But is the load of publish-or-perish really so burdensome that it doesn’t leave room for anything else (in a professional capacity)? Or could the outsourcing of social responsibility have created a vicious circle: out of plain sight, out of mind? That’s something to reflect upon in my next print-era high-impact journal article.

 


AFRAID OF SCOOPING? Case Study on Perceived vs. Actualized Risks of Sharing Research Outputs

This post is based on my conference paper presentation at SciDataCon 2016. The paper was part of a session called “Getting the incentives right: removing social, institutional and economic barriers to data sharing“. My slides can be seen here: bit.ly/fearofscooping. The abstract for the paper has been published on this blog here. I will be submitting a full paper to the Data Science Journal’s special SciDataCon 2016 collection. A preprint will be available on Zenodo. [Edit: link to preprint added.]

Getting scooped, having your research idea or results published by someone else, is a common fear among researchers. It can be a major stress factor and an energy drain.

The risk of scooping is often used as a counterargument to open science, especially open research data.

One recent testament to this line of argument is a New England Journal of Medicine editorial, in which a group of over 200 medical researchers presented their conditions for data sharing, including embargoes of up to five years, fees for data reuse and processes for quality control.

Not all scooping is illegitimate. Most often scooping is accidental. Science has trends and big questions that researchers flock to address. Scooping becomes misconduct only when the idea or content used was taken from another researcher without giving them credit. In the language of research integrity and research ethics, illegitimate scooping is called misappropriation.

In the context of open science, making the distinction between legitimate and illegitimate scooping boils down to credit: it’s okay to take inspiration from others and use someone else’s data published with an open license, as long as you cite the source.

I was interviewing researchers doing radically open science for my PhD, which deals with research integrity, and the fear of scooping came up. I started to wonder what made it possible for these individuals to do what they did despite the fear, seen as an obstacle to openness by so many of their peers.

In this case study I have looked into the openness strategies and practices of two open collaboration research projects created by Finnish researchers.

The Social Media for Citizen Participation, or SOMUS, project was created by a loose online collective called the Open Research Swarm. The Swarm operated on the Finnish microblogging service Jaiku, which has since ceased to exist. SOMUS combined methods from engineering sciences and social sciences, co-creating and testing applications and social media platforms with citizen stakeholders.

The NMR Lipids project is an ongoing open scientific collaboration project to understand the atomistic resolution structures of lipid bilayers. The discussion happens on a Blogspot-based blog, while manuscripts and data are developed on GitHub. ArXiv and Zenodo repositories are used for preprints.

The primary sources of this study are interviews with two key researchers from each project. My methodological toolbox includes influences from social science history and the social psychology of science. For understanding and describing the projects I have used cultural-historical activity theory and its activity system model.

The openness strategies of the two projects could be described by the term “open by default”. There were no conscious measures taken to prevent scooping. The SOMUS project proposal was drafted completely openly. Instead of scooping them, competitors aiming for the same funding call ended up actively avoiding overlaps in their proposals; some even contributed to the SOMUS proposal. The tragedy of SOMUS was that sharing funding among the Open Swarm was difficult and created tension, catalysing the collective’s disintegration. The carefree enthusiasm also resulted in a lack of concern for data management. Today only a PDF report remains openly available online.

The NMRLP turned the tables: instead of worrying about being scooped, the project has made sure that it doesn’t itself inadvertently scoop anyone. Participants are expected to credit ideas received even in informal discussions. Anyone who has commented on the blog is eligible for co-authorship. It remains for each participant to decide for themselves whether their contribution is enough. The interviewed researchers said that none of the project’s published articles has free-rider co-authors, something that is actually rare in their field.

I found out that instead of being a disincentive, the fear of scooping actually acted as an incentive for one of the projects. For the researcher in question it was a way of getting rid of a constant stress. When your work is published online from the get-go, it is easier to prove priority.

All of the interviewed researchers named the unfairness of the academic publishing model as a motivation for creating alternative ways of disseminating research. One of the projects also wanted to address the normalization of p-hacking-style bad practices in their field.

Even though the interviewees showed a level of mistrust for the research community at large, they had a high level of trust in their immediate community.

They worried about advancing their careers, and a precarious career stage, meaning the lack of funding or a permanent position, was a key motivation for most, but they were not prepared to compromise their principles in pursuit of success. All named scientific curiosity as a source of fulfilment and motivation. They were aware of their pioneer status and excited by it.

Making generalizations based on a case study is always a risky business, but with maybe half a grain of salt there are general lessons to be learned here.

The interviews show a link between understanding and recognising research integrity and research ethics issues and the willingness to share. Research integrity training for researchers is already a policy priority, at least in Europe, and this conclusion only adds to its importance.

Exploring beyond one’s immediate research field can foster new ideas, research methods and questions. Again, multidisciplinarity is a research policy staple, but more should be done to make it mainstream.

Data citation principles and practices need to be in place ASAP, so that researchers’ work doesn’t get scooped just because no one knows how to give proper credit for data.

The open collaboration projects show a worrying lack of engagement from women. The level of female participation was in both cases below what can be considered typical for the fields: about 25% for SOMUS and 1/30 for NMRLP. The gender-specific concerns standing in the way of sharing should be recognised and addressed.

The experiences of empowerment, excitement and curiosity that open sharing and collaboration can offer should be communicated more, with less focus on buzzwords, policies, requirements and demands.

This could be done through case examples and champions. But in order to have them, efforts in sharing should be rewarded. One of my four interviewees grew tired of the precariousness of a researcher’s life and is working as a college teacher, while another is contemplating leaving academia altogether because the numerous pats on the back are yet to translate into project funding or a position.

 

SciDataCon 2016: Accepted Conference Paper Abstract and Reviewers’ Comments

As I knew I was going to attend SciDataCon 2016 in my role as the secretary of the Finnish Committee for Research Data, I decided to submit an abstract for a conference paper for one of the sessions. To kill two birds with four layovers, if you will (flying from Helsinki to Denver and back on a budget is truly an endurance sport). The session that I felt was most suitable in terms of my ongoing research was one titled Getting the incentives right: removing social, institutional and economic barriers to data sharing. I was happy to find out earlier this week that my abstract has been accepted. The reviewers provided some feedback, which I’m thankful for, and requested a few revisions. Below is the abstract (in its original, unrevised form) with the reviewers’ comments at the bottom. Now all that is left for me to do (in addition to revising the abstract and making it camera ready) is the small task of writing the actual paper. And enduring the gruesome flight itinerary.

Afraid of Scooping? – Case Study on Perceived vs. Actualized Risks of Sharing Research Outputs

Summary

Fear of being scooped is among the concerns most commonly voiced by researchers in discussions concerning Open Science in general and Open Data in particular. Does practicing openness make researchers more likely targets of research misconduct? This abstract describes a study based on two cases of “ultra open” science collaboration, with the aim of comparing the perceived risks of sharing a wide array of research outputs in real time to those that have actually been realized. The focus is on research integrity related concerns. Preliminary findings are promising: for example, an experiment in openly drafting a funding proposal resulted in other teams focusing their projects on different research themes to avoid direct competition.

Introduction

The current volume of Open Science themed policy initiatives, discussions, events and community action is a clear indicator of the recognition of a need for wider access to scientific knowledge. Open Access to scientific publications is the strand of the movement with the widest acceptance and most success.[1] Progress on other fronts of Open Science, e.g. Open Data and Open Methods (Open Source), has been significantly slower. There are several reasons for this, some due to institutional factors, funding mechanisms or the lack of established workflows. One major factor that stems from the grassroots of research, from individual attitudes and the professional cultures adopted by researchers, is fear.

A study commissioned by the Knowledge Exchange network reviewed incentives and disincentives for data sharing. Fear experienced by researchers was at the top of the list of barriers: “fear of competition, of being scooped and therefore reduced publication opportunities.” According to the study, these fears plagued especially early career researchers. They were both afraid of being ridiculed for their immaturity and wary of losing badly needed publications to scooping. When moving up the academic ladder, the possibility of getting laughed at faded in the researchers’ minds, while the threat of scooping persisted. (Van den Eynden & Bishop 2014)

In the academic world, getting scooped means that a researcher (or a team of researchers) beats another to the punch in publishing a research finding. This happens often and only becomes a research integrity offence if one of the researchers/teams got the research idea from the other.[2] Fear of scooping in the context of Open Science can thus be interpreted as fear of becoming a victim of research misconduct.

How well founded is the fear of being unethically scooped due to sharing? What about being ridiculed? Belittling someone’s professional efforts is of course not research fraud, but it doesn’t exactly make one a shining beacon of integrity either. There is very little evidence on the occurrence of research misconduct, not to mention its effects on the victims. Because of the rarity of sharing research outputs beyond publications, our understanding of the social implications of practicing Open Science is also scarce. As a counterbalance to the fear of scooping there are hopes of the transparency brought by Open Science acting as a cure to at least certain forms of irresponsible research behavior. To encourage more researchers to share their research outputs as widely as possible, we need to understand the situation better.

For this case study I have interviewed key researchers from two “ultra open” research collaboration projects. In addition to Open Data, the projects have produced all of their content openly online, inviting and welcoming outside participation. One of the projects even allocated funding to a “research swarm”, an open membership online community operating on a microblogging platform. These two individual cases offer us a glimpse into the challenges and possibilities of research integrity in an Open Science era, as well as the social implications of sharing. The case study is part of an ongoing PhD research project on research integrity regulation in Finland.

Open Collaboration Cases

The two open collaboration cases that form the basis of this study are the NMR Lipids Project (NMRLP) and the Social Media for Citizens and Public Sector Collaboration (SOMUS) project.

The projects were chosen based on their approach of extreme openness towards collaboration and co-authorship. Despite their somewhat radical nature, both of the projects have produced traditional research articles, but for example in the case of the NMRLP, the authorship of these publications has been based on self-assessment by the contributors (contributors meaning anyone who has commented on the project blog). Another factor was that the two projects have been coordinated by Finland-trained researchers. This is due to the case study being part of a PhD project focusing on national Finnish research integrity regulation. For this reason, for example, the Polymath Project was excluded from the cases.

The NMRLP belongs to the field of molecular physics. It is ongoing at the time of writing, with all of its research outputs available online, either on the project blog or on the GitHub service. SOMUS ran for two years, in 2009-2010, although the open online community, the Open Research Swarm, that gave birth to the project predates SOMUS by at least a year. SOMUS was a project in the field of multidisciplinary media studies and all of its research outputs were openly published online during the running of the project. Unfortunately, the outputs were not placed in an open repository post-project and are currently accessible only by request. This is an issue that I will also address in the finished paper.

Research Methods

The primary sources for this study are thematic interviews with five coordinating researchers from the chosen open projects, together with the open online content of the NMRLP and the archives of SOMUS.

This research is multidisciplinary both in terms of methods and theoretical framework, drawing inspiration from behavioral sciences and social sciences, especially sociology of science & technology, with strong roots in social science history, itself very much multidisciplinary in nature. Oral history has been an important point of reference in conducting the interviews.

The goal is to make the collected interview data available for further research use with an open license and through an open online repository, in both audio and textual formats. This is not common practice in qualitative humanities and social scientific research. Sharing qualitative human data is ethically complicated, but I argue that the obstacles are being somewhat exaggerated and the benefits not discussed enough. There is evidence that research subjects can be quite willing to allow re-use of their data, even if the data in question is sensitive (Borg & Kuula 2007), which is not the case for this study. Currently the biggest obstacle to sharing identifiable interview data in Finland is the data privacy ombudsman’s interpretation that so-called broad consent is in breach of the law regulating data privacy.

Preliminary Findings

The case study is still ongoing at the time of writing. More definitive results will be available at the time of SciDataCon 2016. Nevertheless, there are some interesting preliminary observations and findings to be shared. Markus Miettinen, one of the key researchers of the NMRLP, compared in a workshop presentation his initial fears when initiating the project with what actually happened. He listed eight risk scenarios, ranging from lack of participants to scooping and personal conflicts. According to him (Laine et al. 2015):

[…] the experiences gained on open research during the NMRlipids project have been extremely positive, and none of the major fears we had before starting the project have actualized. Quite the contrary, the open research approach has proven to be an extremely fruitful as well as rewarding way to do research.

SOMUS has gathered similarly promising experiences. Opening up research funding proposals, possibly one of the most radical and controversial ideas developed under the Open Science movement umbrella, gets backing in the SOMUS project report (Kronkvist 2011):

Could research funding generally be applied to ‘open calls’? An often-used reason for a closed doors policy is that of competition. Research ideas can be stolen, and project applicants engage in tough battles for funding. It became clear through discussions with other applicants, however, that the open draft actually helped them focus their projects on different research themes to avoid direct competition with Somus. When a text is documented in a wiki, it is easy to find an author and time stamp for the text, making it uncomplicated to solve authorship questions.

To what extent these and other examples drawn from the cases can be generalized and translated to other research fields and environments, will be discussed in the final paper.

Acknowledgements

The author would like to thank Tiina and Antti Herlin Foundation for their continuous financial support to her PhD research, which this study is one part of.

Competing Interests

The author declares that she has no competing interests.

Notes

1   See for example the recent statement made by EU member states about making all scientific articles freely accessible by 2020: http://english.eu2016.nl/documents/press-releases/2016/05/27/all-european-scientific-articles-to-be-freely-accessible-by-2020

2   Many research integrity guidelines limit the categories of research misconduct to falsification, fabrication and plagiarism (FFP), but for example the Responsible conduct of research and procedures for handling allegations of misconduct in Finland (2013) guideline by the Finnish Advisory Board on Research Integrity names misappropriation as a form of research fraud.

References

Borg, S and Kuula, A 2007 Julkisrahoitteisen tutkimusdatan avoin saatavuus ja elinkaari – valmisteluraportti OECD:n datasuosituksen toimeenpanomahdollisuuksista Suomessa. Yhteiskuntatieteellisen tietoarkiston julkaisuja. Tampere, Finland: Yhteiskuntatieteellinen tietoarkisto (Finnish Social Science Data Archive). pp. 37.

Kronkvist, J 2011 Somus as an attempt at a new paradigm. In: Näkki et al Social media for citizen participation – Report on the Somus project. VTT Publication 755. Espoo, Finland: VTT Technical Research Centre of Finland.

Laine, H, Lahti, L, Lehto, A, Miettinen, M and Ollila, S 2015 Beyond Open Access – The Changing Culture of Producing and Disseminating Scientific Knowledge. Proceedings of the 19th International Academic Mindtrek Conference. New York: ACM. pp. 202-205 DOI: http://dx.doi.org/10.1145/2818187.2818282

Van den Eynden, V and Bishop, L 2014 Incentives and motivations for sharing research data, a researcher’s perspective. Knowledge Exchange. Available at http://www.knowledge-exchange.info/event/sowing-the-seed [Last accessed 30 May 2016].

Comments and suggestions by the SciDataCon 2016 reviewers

Very concise description of the work done, and the methods. Please have a look at C. Borgman’s recent book about Big Data, Little Data, No Data (MIT), if you have not done so. Have a look at the specific research culture (in the field, institution, countries) which enables, fosters these experiments. What do you think is the probability of such research practices to diffuse more widely, to be adopted by many?
The paper provides a Finland/ Finnish perspective on openly available research outputs – also the project data, for two projects. The content appears interesting and addresses one of the biggest stumbling blocks for data publishing – the fear to be scooped … to become the victim of research misconduct.
Fear of being scooped is one of the major reasons given by researchers as to why they don’t share their data (amongst many others!). This paper reports on 2 case studies in which that fear was found to be unsubstantiated, and in which none of the fears about making data available were realised. Whilst this is only a limited survey, having hard evidence on actuality instead of possibility removes yet another social barrier to the sharing of research data. Although this paper doesn’t discuss incentives, it is suitable for the session as it discusses removing a disincentive. The authors indicate that more results will be available at the time of the conference and these should be included in the paper.

Informed consent

I am currently working on a case study on responsible conduct of research in the context of open research collaboration. I have chosen three research projects that have been conducted online completely openly, with an open invitation for anyone to participate. I won’t name the projects yet, for the simple reason that I am intending to interview the key researchers of those three projects but haven’t approached all of them yet, and I feel it would be a little bit tacky if they were to hear about my project indirectly. So I’ll get back to the projects as soon as I have made my plans known to the persons involved.

What I want to discuss here is the practical aspects of forming informed consent for research subjects. As I am planning on contacting my hopefully-soon-to-be interviewees, I have been thinking a lot about the information that I owe them about my project.

Informed consent is a central concept in the ethics of human research, i.e. research on human subjects. Here’s what my wise friend Wikipedia says about informed consent:

“An informed consent can be said to have been given based upon a clear appreciation and understanding of the facts, implications, and consequences of an action. To give informed consent, the individual concerned must have adequate reasoning faculties and be in possession of all relevant facts.”

The concept was first introduced in the domain of medical sciences, but is equally important to research in the humanities and social and behavioural sciences, which can often involve minors or deal with sensitive issues, such as domestic abuse, sexual orientation or political views, just to name a few examples.

My research subject, which deals with people in their professional roles, is not sensitive, but because of my commitment to the principles of openness it requires thorough ethical reflection. There is practically no precedent concerning open unanonymized qualitative interview data, at least that I know of. I will get back to the challenges I have experienced when trying to find a repository for archiving and sharing my data in another blog post. But before I can archive, let alone share, any data, I need to be sure that my research subjects understand what they are getting involved in and agree to everything that I’m doing.

In order to inform my interviewees I have drafted a project description, which can be found as a Google document here. The document has been approved by the University of Helsinki Ethical Review Board in the Humanities and Social and Behavioural Sciences (my second supervisor is the chair of the board, but she recused herself from the decision making in my case). I have followed the Finnish Advisory Board on Research Integrity (FABRI) ethical principles in the humanities and social and behavioural sciences in putting together the document:

Information regarding a study should include at least the following:

1) the researcher’s contact information,

2) the research topic,

3) the method of collecting data and the estimated time required,

4) the purpose for which data will be collected, how it will be archived for secondary use, and

5) the voluntary nature of participation.

Subjects may ask for additional information regarding the study and researchers should prepare for this in advance.

Assenting to be interviewed can be considered as consenting to the interview data being used for the purpose of the research project in question without any additional paperwork. Concerning the archiving and sharing of the data, I have decided to ask for written consent. The consent form I have formulated follows the example given by the Language Bank of Finland (I can’t find the model form anymore after they updated the website, sorry for that), with some minor alterations and additions. At the moment Zenodo looks like the most likely repository for my data, but as I mentioned above, I will get back to this issue, since it has caused me a lot of headaches. Stay tuned.

New Year’s Resolutions: My Open Pledge

Ever since I heard Erin McKiernan tell about her Open Pledge at OpenCon 2015, I have been thinking about making my own. It seems only appropriate to publish it at the beginning of a new year. In addition to McKiernan, I have been inspired for example by the Open Science Peer Review Oath and the responsible conduct of research guidelines I’m studying.

My research is open by default.*

I will not be open at the cost of my research subjects’ privacy; when in doubt I will refrain from publishing.

When I don’t share something I will explain the decision openly and honestly.

I will share both my successes and my failures.

I will blog my research.

I will communicate my research in a way that is understandable also to people outside the research community.

I will either publish in full open access journals or traditional journals that allow self-archiving.

I will not publish in so called hybrid open access journals.

I will not publish in journals owned by companies that exploit the research community.

I will be an advocate of open science and speak about my choices.

*See here what I mean by ‘open by default’.

Closed for good? – How ethical issues can limit openness

This is a lightning talk I gave at the Knowledge Exchange Pathways to Open Science event in Helsinki today as part of the ‘Benefits, risks & limitations of Open Scholarship’ theme.

I am a doctoral student in economic and social history at the University of Helsinki. One of the research questions I am working on is about the answers that open research processes provide to ethical challenges in research, such as plagiarism, data fabrication and author misconduct.

But today, for the next few minutes, I’m taking on the role of a devil’s advocate.

I argue that human-related data can never be open in a way that is ethically sustainable.

That is, at least if we understand open along the lines of the Open Knowledge Foundation’s definition, which states that open data is something that can be freely used, modified, and shared by anyone, for any purpose.

I collect interview data for my research. I want to share that data already during the research process. My interviewees are easily identifiable: former chairs and secretary generals of the Finnish Advisory Board on Research Integrity and key researchers of certain open research projects. There isn’t much point in anonymizing the interview data, although I’m prepared to go there if they request it. I’m giving my subjects a consent form to sign. By signing they agree that their interviews can be freely used for educational and research purposes, without embargo. Commercial use is not included in the consent, which is already a breach of the open definition.

The topic of my research isn’t sensitive in the traditional sense, since it deals with people in their professional roles, without going into matters of health or family relationships. I have subjected my research plan to ethical evaluation, which deemed it ethically responsible.

Do I feel like I can declare with certainty that no harm will come to my subjects because of my research? No, I can’t. We live in a world where researchers studying subjects such as (and these are real examples from Finnish research) nutrition, wolf populations and indoor air problems get death threats. The Internet is an unpredictable and often unkind environment.

The Helsinki University ethical review board that evaluated my research gave me two questions to ponder:

“How does the choice to not anonymize the interview data affect the quality of gathered information (sample, content)? There is a danger, that ethically critical aspects will not fully surface due to fear of labeling, leading to a subdued result.”

I understand this concern, but in the case of historical research and oral history, if we hide the identity of the speaker, we might hide the historical context, and in so doing destroy the historical value of the data.

The second question is even more haunting.

“Could interview data that is in principle harmless give rise to new sensitive information on research subjects?”

This one is giving me sleepless nights.

Currently a lot of human-related research data is being routinely destroyed due to privacy concerns. All the while, private companies are collecting vast amounts of human-related data from citizens who don’t really like giving up their data and certainly don’t trust these companies, but are just too resigned and without alternatives (because digital has become the prerequisite of the social) to resist.

So what do we do? Change the definition of ‘open’? Change the definition of ‘ethical’? Accept as inevitable that at least some human sciences get left behind? Or is there a way we can move from closing human data for good into opening it for good?

To rephrase the question: how can we build a culture of trust, and what kind of mechanisms are needed to support it, so that we can preserve qualitative human data for generations to come?

 

Ethical evaluation: passed

My research and hence this blog have both been on a slow-burner for a couple of months now. The reason has been that I have subjected my research plan to ethical review by the University of Helsinki Ethical Review Board in the Humanities and Social and Behavioral Sciences. I have now received a positive statement from the board, meaning that I can proceed with my research and start conducting the interviews.

Ethical review in the human sciences in Finland follows a set of recommendations established by the Finnish Advisory Board on Research Integrity. In a fashion similar to the Finnish RCR guidelines, most of the Finnish research institutions have signed up to the recommendations and appointed boards or committees like the one at the University of Helsinki. The purpose of the review is to make sure that non-medical research projects with human subjects respect certain key ethical principles in dealing with the subjects, namely

  • right of self-determination,
  • prevention of harm, and
  • privacy and data protection.

In the case of my research the review wasn’t mandatory, since it does not involve physical intervention, it does not deviate from the principle of informed consent, all of my subjects are adults, there are no exceptionally strong stimuli involved or other mental harm beyond the risks of normal life, and it doesn’t create a security threat to the participants. But because I am in uncharted territory with my pursuit of openness, and since research misconduct can be considered a delicate issue, my instructor Erika Löfström, who is also the chair of the ethical review board in question (she of course recused herself from the decision making), advised me to go through the process.

The statement itself is very brief, but I found the thought process involved in preparing the review request very useful. The request requires quite a few documents, such as a cover letter explaining the need for review, a research plan, the researcher’s own evaluation of the research’s ethical aspects, handouts for the subjects, interview questions, etc. One of the things it got me doing was an openness plan for my research. I realized I didn’t have a clear idea of what I was doing in terms of opening my process: how, where, which content and to which audience and end-user. I will translate the plan into English and publish it on this blog in the near future.

The statement I received reads in its entirety as follows (it has the same content in both Finnish and English):

“ETHICAL REVIEW STATEMENT

University of Helsinki Ethical review board in humanities and social and behavioral sciences has reviewed Heidi Laines study ”Hyvää tieteellistä käytäntöä määrittelemässä: suomalainen hyvän tieteellisen käytännön ohjeistus ja muuttuva tiedeyhteisö” in the board meeting on the 3 rd – 6 th of November 2015. The review board finds that based on the received material the planned study follows the Ethical principles of research in the humanities and social and behavioral sciences issued by the Finnish Advisory Board on Research Integrity. Thus the review board states that the mentioned study is ethically acceptable.”

In addition to the standard statement I received more personalized and unofficial comments about things to take into consideration. I have translated them from Finnish here, and I presume that they don’t reflect the official view of the board in the same way as the statement proper, and should not be taken as such:

  • Reflections on research methods: how does the choice to not anonymize the interview data affect the quality of gathered information (sample, content)? There is a danger, that ethically critical aspects will not fully surface due to fear of labeling, leading to a subdued result.
  • Reflections on responsible research conduct: could interview data that is in principle harmless give rise to new sensitive information on research subjects? It is advisable that researchers try to anticipate possible challenges and consider how to handle them if they should emerge.

I will get back to my research ethics reflections and choices in coming blog posts. Especially the first point about anonymity is something that I have given a lot of thought to and am still on the fence about. But I did decide to offer the choice to my research subjects (earlier I was of the opinion that the interviews of FABR chairs and secretary generals aren’t worth doing if they are anonymous, since the historical context will make them recognizable even without names, but if there is a danger that they will decline, it’s better to have an anonymous interview than nothing at all, and just try to write the analysis in a way that doesn’t point to individuals).

Can too much openness ruin a research interview?

Interviewing as a method of acquiring research material calls for a lot of sensitivity. When doing a research interview, you try to influence the person you are talking to as little as possible. If the other person is searching for the right words, you don’t jump in with helpful suggestions, like you might normally. You certainly don’t try to convince them of anything. The questions posed should be as neutral as they can be, allowing a wide array of different possible answers. Instead of asking “Was it like this?” you go “What was it like?”.

But what if one, like me, is trying to do research openly? The work plan is published for everyone to see, revealing some hypotheses and other preconceptions about research outcomes. The risk that this information will influence the interviews, and through them the research results, is to me very real. I am an active advocate of open science, so writing the following words is not the easiest thing: at least for those of us who call ourselves social scientists, there really exists such a thing as too much openness, and it can jeopardize the validity of our research.

In addition to (contemporary) archival documents, interview data is an important part of my source material. I talked the problem of too much openness over last week with my supervisor, Erika Löfström. We didn’t find a simple fix-it-all solution, but instead came up with a concept called “hallittu avoimuus”. Controlled openness would be the literal translation, but to me it sounds like a euphemism for anti-openness, whatever that could be. I prefer the term conscious openness. What it means is basically that it’s good to pause and think before pressing “enter”. Not a revolutionary idea, I know, but I’ve noticed that saying seemingly obvious things aloud can often prove surprisingly fruitful.

To me openness is not a value in itself, but a means to an end. For example, my primary motive for this blog is not attention just for the sake of attention. The point of open science and research on a systemic level is to increase the quality of research, strengthen the role of evidence-based knowledge in society, and make research more resource-efficient and more accessible. On an individual level the benefits are networks and community, ideas and feedback, as well as increased impact of one’s work (none of these of course come for free, but that’s another blog post).

In order for science to improve through openness, we need to be conscious about what goes out there. That it’s information, not just noise. That it’s not counter-productive. Like data without proper metadata is just numbers, a research plan that becomes a self-fulfilling hypothesis is just letters (and that’s the positive scenario). No new (reliable) knowledge gets created in either case.

Conscious openness is actually very close to what anyone dealing with human subjects has already had to practice for a long time. A concept called informed consent is at the core of ethical research with human subjects, both in invasive medical research and non-invasive social scientific research. The subjects need to know what they are getting themselves involved in, which is easier said than done. How to inform the subjects without affecting the results? How to tell them, often non-experts in the field involved, about complex research questions so that they really, truly understand all of the aspects? How to do this in a way that doesn’t scare them, insult them or bore them to death? It’s not enough to explain the premise of the research; you also have to give detailed accounts of data management. This is often where ethical review steps in.

Open research as a process could take its lead from the practices of forming informed consent for research subjects: What does this public have the right / need / interest to know? How should I choose my words in order to avoid misunderstandings? The biggest difference to earlier practices is that open becomes the default setting, from which you refrain only with good cause. A key aspect of the process, one that determines whether the research in question is legitimately open or just being open-washed, is transparency concerning the various bits and pieces you are not publishing and why.

How shall I be putting this conscious, transparent openness construction of mine into practice? Right now I’m frozen in the finger-on-enter-but-pausing-and-thinking phase as I’m preparing to collect interview data. I will publish an updated work plan and a post detailing the intellectual whereabouts of my work sometime during the coming weeks.

Openness is scary

I published my research plan along with my CV yesterday on this blog (here). I uploaded the document in PDF format to Google Drive and shared the link. It was scary.

Even though I am currently working for a governmental initiative designed to increase openness in science (the Open Science and Research Initiative), I don’t have a crystal clear idea of how to go about conducting an open research process. But that’s what I’m determined to do, so I guess I will just have to invent as I go along, at least until I come across others doing the same thing (I already know some in the fields of natural science and digital humanities, but none in the social sciences studying a contemporary subject). One of the things that I will have to decide as my research progresses is how much of my data I will open and to what degree. There will probably be at least something left unopened, since an important part of my data will be transcripts of interviews, and I will have to respect the wishes of my interviewees. One of the first things I’m going to do as I start my research full-on in May is a research data management plan.

What is so scary about this, then, telling in a blog about what you do? That should be normal in research, right? Well, there is public and then there is open. Your stuff is public when someone can see it. It becomes open only when anyone can see it, and not only see, but also use it. And that is scary. First of all, someone could steal my text or my ideas. I don’t think that is very likely, though. My research has absolutely no commercial potential, and if someone should plagiarize my text they would get caught quite quickly, because the subject is so specific. And you can be plagiarized even if you don’t do open research. Actually, openness might save you from plagiarism: there would be no contest about who came up with the idea first or whose text it was originally, since after publishing it openly I could very easily prove it was me. The second fear is much scarier and, I think, more real: that of being ridiculed, even harassed, and/or having my work and even my person heavily and possibly unjustly criticized. But I think that genuine and consistent openness and honesty can be a remedy to that too. I guess I will just have to wait and see.