Nature

Research integrity: nine ways to move from talk to walk

In 2018, Delft University of Technology in the Netherlands began building a community of data champions across all faculties, from aerospace engineering to technology, policy and management. These champions’ role? To nudge staff and students to manage their research data better. Among other incentives, they can apply for dedicated grants to do so. Imperial College London now shuns journal-based metrics in staff assessment; it relies more on peer judgement of research quality. At Mahidol University in Bangkok, Thailand, all staff sign the university’s code of good governance, agreeing to uphold integrity, impartiality and social responsibility, for example.

These are just three of dozens of efforts we found when investigating how institutions worldwide are working to improve research integrity. They form part of our long-term study on this topic, a project that is funded by the European Commission (see Table S2 in Supplementary information for more examples).

Three years ago, the US National Academies of Sciences, Engineering, and Medicine called for resources to help research leaders improve scientific integrity in their institutions1. Since we started our study in 2019, we have found that universities can struggle to work out where to start, to think comprehensively and to craft concrete policies and procedures tailored to their needs. One participant told us that institutions “only have bits and pieces — but it needs to be a system”.

Over the past two decades, there have been plenty of declarations that outline the components of trustworthy research and the principles of research integrity. These include the Singapore Statement in 2010, the Montreal Statement in 2013, the Hong Kong Principles in 2019 and the European Code of Conduct for Research Integrity, published in 2011 and revised in 2017, among others (see Supplementary Table S3). Many hundreds of articles have been written on the topic: about threats to research quality from hypercompetitiveness and poor training; the unquestioning and inept reliance on metrics in evaluation; and systematic biases in peer review and publication. There are also multiple reports of shocking cases of fraud, alarming rates of questionable research practices and foot-dragging from practitioners, editors, authors and institutions when dealing with retractions and corrections. To avoid all this, research institutions must translate integrity principles into practice2.

We set out to assess the current situation and to learn what topics should be addressed in organizations’ plans to promote research integrity. Our study, called Standard Operating Procedures for Research Integrity (SOPs4RI), included two scoping reviews of the literature; 23 interviews with research-integrity experts across research institutions, funding organizations and committees; a Delphi study — an iterative, consensus-oriented method — involving a panel of 69 research-integrity policymakers; and 30 focus groups across European countries. These represented the natural, social and biomedical sciences, as well as the humanities (see Supplementary Table S3 for links to project outputs). We found firm consensus on nine topics (see ‘Better research: three areas, nine topics, many actions’ and Supplementary Table S1), which are also well represented in statements, declarations and codes.

The European Union’s next research-funding programme, Horizon Europe — which starts next year and runs until 2027 — will confirm a strong commitment to research integrity (see go.nature.com/2gvcxt3). It is expected that institutions receiving funding from the €81-billion (US$96-billion) programme will be required to have clear plans and procedures in place for research integrity3. Here are some ideas to help them do so.

Better research: three areas, nine topics, many actions

Area | Topic | Action*
Support | Research environment | Ensure fair assessment procedures and prevent hypercompetition and excessive publication pressure.
Support | Supervision and mentoring | Create clear guidelines for PhD supervision (such as on meeting frequency); set up skills training and mentoring.
Support | Integrity training | Establish training and confidential counselling for all researchers.
Organization | Ethics structures | Establish review procedures that accommodate different types of research and disciplines.
Organization | Integrity breaches | Formalize procedures that protect both whistle-blowers and those accused of misconduct.
Organization | Data practices and management | Provide training, incentives and infrastructure to curate and share data according to FAIR principles.
Communication | Research collaboration | Establish sound rules for transparent working with industry and international partners.
Communication | Declaration of interests | State conflicts (financial and personal) in research, review and other professional activities.
Communication | Publication and communication | Respect guidelines for authorship and ensure openness and clarity in public engagement.

Pockets of promise

Even without incentives, institutions seem newly interested in reform. “Self-inspection is in the air,” wrote Marcus Munafò, co-founder of the UK Reproducibility Network, in a Nature opinion piece last December4. The scientific community has shifted from its dominant focus on individual actions and begun to accept that the research culture has a role in sustaining research integrity (and discouraging questionable research practices). In a similar way to funders, publishers and scientific societies, institutions are starting to publicly scrutinize how they go about research assessment, supervision and mentoring, collaboration, public engagement, data management and publication. The goal? To dismantle structural dysfunction and to reform the incentives that sustain it.

Take the recent efforts of Ghent University in Belgium to “become a place where talent feels valued and nurtured” (see go.nature.com/3itv56b). To assess researchers for appointment and tenure, it de-emphasized quantitative metrics such as bibliometric output measures, reduced the frequency of evaluations and removed explicit targets for publication. Instead, it increased collegial supervision and emphasized more qualitative, holistic assessment. Similar changes are under way at the Catholic University of Leuven (KU Leuven) in Belgium. There, people applying for jobs are asked to submit a biographical sketch alongside their conventional CV. At the University of Glasgow, UK, ‘collegiality’ was introduced as a formal assessment criterion for a professorship. Candidates must demonstrate contributions to other colleagues’ work and careers, for example by helping with conference submissions, sharing data, acting as a co-supervisor, enabling co-authorships or contributing to others’ projects and grant applications.

Also getting a makeover is the education and counselling aspect of research integrity. The largest universities in Denmark now mandate integrity training for PhD students, and offer access to designated counsellors across career stages. Both junior and senior researchers have dedicated people they can talk to confidentially if something in their laboratory or collaboration seems off. The University of Luxembourg has research-integrity coaches available for consultation at all stages of planning and publishing a project. In Ireland, University College Cork has introduced a Digital Badge programme to show that people have completed training in good research practice. At the University of Oxford, UK, a grassroots effort to provide training in effective computing for research reproducibility has grown into a local hub with a cross-faculty steering group, a broad portfolio of activities and links with the UK Reproducibility Network (see Supplementary Table S2).

Comprehensive help

Putting principles into practice is not easy, and efforts are often ad hoc. Leaders in each organization need to work through the topics that could be addressed and then tailor measures as appropriate for, say, a medical school versus a business school. Those conducting clinical trials, environmental-impact assessments and behavioural economic surveys all need to preserve integrity when they collect and manage data, but how they do so will differ substantially. And similar institutions in different countries will need to accommodate national laws.

To ensure that new procedures and policies work as intended, institutions need a comprehensive plan that makes sure the broad goals don’t get lost. It should specify how policies will be implemented, maintained and evaluated. It should identify what risks there are to implementation, and how to mitigate them. And it should be updated as the organization and conditions change. This plan can provide continuity, consistency and accountability; less-formal efforts are likely to wane as attention fades or resistance rises.

The nine topics our work identified map easily onto the European Code of Conduct for Research Integrity. Some focus on enhancing researchers’ capabilities; some on building research integrity into organizational processes, including the handling of breaches; others target transparency and communication.

The League of European Research Universities, based in Leuven, and the Bonn PRINTEGER Statement both offer guidance on how to develop and implement plans that promote research integrity5,6. Funders are adding to the momentum. For example, Horizon Europe will require applicants to ensure compliance with the European Code of Conduct for Research Integrity. That will go a long way towards overcoming institutional inertia.

Avoiding snags

The dynamics of how organizations can bring effective change to their research culture are not well studied. It’s fair to say that change takes time, intellectual effort and financial investment. It also needs local champions and might well be contentious. Studies show how research managers and academic leaders resist new research-integrity policies7: department heads acknowledge that such issues are real, but not in their own departments, and so see no need for change. Hierarchical, top-down implementation is doomed to fail.

Any policy initiative must highlight the issue that most concerns the people affected (be it doing good work, salvaging reputation or accessing funding), using terms that make sense to them8. For example, an appeal for reliable, applicable research will be better received than a demand for compliance with codes and regulations. Furthermore, policies have the best chance of shaping behaviour if those affected share the aspirations behind them, and if they see the policies as supportive rather than controlling. Researchers are generally eager to do high-quality research; reforms perceived as bureaucratic will undermine that intrinsic motivation, so institutions should avoid them.

Plans to improve research integrity must therefore be co-created with all stakeholders. They need to be involved in analysing the problem, devising solutions, and maintaining and updating plans to implement those solutions. Differing perceptions must be explored and negotiated, and solutions crafted for each institution.

In one successful example, the University of Amsterdam in the Netherlands introduced a comprehensive set of discipline-sensitive policies for promoting research integrity. First, a cross-departmental working group of experienced researchers committed to two years of analyses, including scoping of existing governance arrangements, as well as interacting and consulting with researchers across disciplines and career levels. Then, after the university’s board had consulted senior faculty members and the institution’s research advisory council, it adopted the plan, lending it credibility and attention. Importantly, the strategy sought to implement and monitor policies that decrease workload — by integrating and digitizing ethics review, for instance — and asked deans to monitor efforts and impacts.

In another example, when University College London set out to change how bibliometrics were used in research assessment, it set up a working group to involve stakeholders. It consulted some 250 individuals, including department heads and faculty members, representing a majority of the university’s departments.

It is difficult to assess how much these projects increased research integrity, let alone compare them in terms of time and effort. We know that they required local champions. The fact that we were able to identify dozens of these projects suggests that people can be convinced that such internally driven efforts are worthwhile.

Critics will counter that requiring policies and procedures to promote research integrity amounts to using a sledgehammer to crack a nut, or that miscreants are highly visible but rare. And they will say that, in practice, it will just add another couple of pages of box-ticking to research-grant applications.

These are legitimate concerns, and our project asks participants about the perceived costs and benefits of local reforms. To avoid excess bureaucracy, it will be necessary to tailor the plan to actual problems in the specific institution and explicitly weigh administrative and other costs.

But we think that current challenges to research integrity are real, that the primary objective is quality, and that the research system must demonstrate to society that the system and its contributions are trustworthy.

We see a parallel with the history of research ethics9, which has confronted similar challenges. Few would argue today that informed consent, the protection of children and vulnerable people or the ethics of gene editing are irrelevant, or that practical ways to address them are unnecessary. We should continuously discuss and adapt specific procedures to organizations and their changing contexts, but action is required.

Getting started

How can a European mandate support organizational reform without squashing grassroots enthusiasm? By supporting choices and offering tools that individual organizations can adopt. Examples include the UK Research Integrity Office’s procedure to investigate research misconduct10 and the European Network of Research Integrity Offices’ recommendations for doing so11.

SOPs4RI has collected documents describing such recommendations, together with procedures and other resources. These are tagged according to the type of organization, discipline and purpose, and are accessible through the SOPs4RI website (see www.sops4ri.eu/). Over the next two years, we will refine and curate these, using pilot studies of institutions that implement plans, and international surveys. Readers are invited to share views, concerns, examples of best practices and any other input. Achieving research integrity requires structures and practices that are tailored to fit. The more arrangements we can all draw from, the better.
