Strategy

UKRI evaluation strategy

From:
UKRI

Our commitment to effective evaluation of our investments.

Introduction

UK Research and Innovation (UKRI) is the largest public funder of research and innovation in the UK, spanning all disciplines and sectors. We invest more than £8 billion each year in people, places, ideas, innovation and impacts, to advance our understanding of the world around us, and tackle the challenges we face both nationally and globally.

Our UKRI strategy 2022 to 2027: transforming tomorrow together sets out our long-term, high-level priorities for how we will deliver our vision for an outstanding research and innovation system in the UK. The UKRI strategy emphasises the vital role that evaluation must play in achieving that vision, by helping us to understand our impact and learn from what we do.

This UKRI evaluation strategy builds upon the good practice in evaluation that is already embedded across the organisation. It outlines why evaluation is important, our vision for evaluation at UKRI, our areas of focus to achieve that vision, and features of evaluation within a research and innovation context.

We are publishing this strategy to aid transparency and to demonstrate our commitment to effective evaluation of our investments.

UKRI vision

Our vision is for an outstanding research and innovation system in the UK that gives everyone the opportunity to contribute and to benefit, enriching lives locally, nationally, and internationally.

We are committed to monitoring and evaluating our investments rigorously to understand our impact and to learn from ‘what works’… This will include commissioning independent evaluations of our major research and innovation grants and investments, as well as continuing to build our in-house skills, capabilities and analytical frameworks. (UKRI strategy 2022 to 2027: transforming tomorrow together)

Importance of evaluation for UKRI

Evaluation is defined by HM Treasury’s Magenta Book (the Treasury’s guidance on evaluation) as the “systematic assessment of the design, implementation and outcomes of an intervention”. At UKRI, evaluations provide us with a deep understanding of how our investments were delivered and the outcomes and impacts they went on to achieve.

We are strongly committed to undertaking high-quality evaluation at UKRI, recognising the vital role it plays in making us an effective and efficient organisation.

Evaluation is important for UKRI as it enables us to:

  • make evidence-based decisions on how best to support research and innovation priorities, maximising the value and impact we deliver for the UK taxpayer
  • learn what went well and what went less well in the delivery of our investments and in achieving impact, so we can continuously improve what we do
  • demonstrate that we are a responsible investor of public funding and that our investments represent value for money, enabling our stakeholders and the public to hold us to account
  • communicate and celebrate our impact and success, helping to justify and secure future investment in research and innovation

Features of research and innovation evaluation

The nature of research and innovation activity introduces a variety of factors that need to be considered during the evaluation process.

For example, impacts from research and innovation investments can take a long time to arise, and often occur after the lifetime of the investment. At the same time, researchers, innovators, and organisations involved in the delivery of research and innovation are often fluid, with dynamic employment situations and frequent changes in staffing.

The timing of the evaluation therefore needs to be considered carefully to allow sufficient time for evidence of outcomes and impacts to arise, while avoiding loss of historical knowledge amongst staff.

There are also challenges associated with attributing outcomes and impacts to research and innovation. The pathway from investment to impact is often complex and rarely linear, and there is often a lot of activity taking place within the same sector or location that could have influenced the impact.

Experimental and quasi-experimental methods are also often not possible in a research and innovation context, as we do not allocate our funding randomly. Theory-based methods are therefore often preferable when measuring the impact of research and innovation.

Proactive steps can be taken through evaluation design and implementation to address these factors. Common mitigation strategies include:

  • use of mixed-method approaches (often both qualitative and quantitative)
  • establishing an understanding of the starting point and of what would have happened if no funding was provided (a baseline and a counterfactual scenario)
  • undertaking interim evaluations during the lifetime of the programme
  • ensuring that adequate data is collected through monitoring processes while the programme is progressing

Our vision for evaluation

Our vision is for high-quality evaluation to be embedded throughout UKRI, generating robust, reliable, and trusted evidence, helping us understand and improve our impact.

We have come a long way towards achieving our vision, with strong evaluation networks in place, growing evaluation expertise, and evaluation reports published in our evaluation reports section.

To continue to strengthen our evaluation system and achieve our vision we are focusing on the following four key areas of action:

  • proportionate evaluation practices
  • high-quality evaluation
  • a culture of continuous improvement
  • ethical evaluation

Across these areas we will work collaboratively with the Department for Science, Innovation and Technology (DSIT) and with other research and innovation funders. We will collectively tackle challenges in undertaking research and innovation evaluation and promote the use of evaluation evidence within programme design and decision-making.

Achieving our vision

Proportionate evaluation practices

UKRI invests in a diverse range of activities that support research and innovation, with investments ranging from small grants to large programmes worth hundreds of millions of pounds.

With such diversity in our investments, we must ensure our evaluations are proportionate and that resource is targeted at providing UKRI with the highest-priority evidence.

Appropriate evaluation activity may include:

  • internally run evaluations
  • expert external peer review panels
  • full externally commissioned evaluations

Decisions on what to prioritise, and the extent of evaluation activity to undertake, are based upon the size of the investment, stakeholder areas of interest and the mechanisms used to deliver the investment.

Evaluation activity that goes beyond routine monitoring is considered if a UKRI investment:

  • is of a significant scale, such as UKRI’s large cross-cutting funding programmes
  • is strategically important, with a clear need for evidence about delivery or impact
  • is novel or complex in its process, delivery mechanism or the results it is trying to achieve
  • represents an opportunity to better understand what works in the delivery of research and innovation funding

We plan and design our evaluation activity to ensure the best use of resources, by carefully considering the level at which an evaluation is to be carried out.

For example, in some instances it is more appropriate to conduct an evaluation at portfolio or theme level, rather than at the level of the individual investment. The level that is most appropriate is decided based upon the purpose of the evaluation and the questions we need to answer.

We have cross-council evaluation networks in place to support the coordination and planning of our evaluation activity. Through these networks, we keep track of ongoing and upcoming evaluations across the organisation.

High-quality evaluation

High-quality evaluation is essential to ensure our evaluations generate robust, reliable, and trusted evidence. To achieve this, we must design our evaluations carefully, selecting the most appropriate evaluation type and the best methods.

Evaluations are classified into three main types depending on the questions to be answered:

  • process evaluations, which explore how effective the overall operation of the investment was (focusing on the decision-making process, financing, resourcing, and progress against objectives)
  • impact evaluations, which focus on what difference the investment made to the economy and society (in terms of knowledge creation, outcomes delivered and wider impacts)
  • value for money evaluations, which help us understand whether the investment was a good, or the most effective, use of resources

In practice, there is a lot of interaction between these types. For example, process evaluation may feed into impact evaluation by helping us understand more about how the impacts were achieved. A value for money evaluation is also likely to draw on evidence related to both process and impact.

We often need to use a combination of evaluation types and evaluation methods to fully answer our evaluation questions. We are also keen to explore innovative approaches where they can help us to better answer our evaluation questions and enhance our evidence base.

The advice of evaluation practitioners, external consultants, and HM Treasury’s Magenta Book is used to guide the selection of evaluation methods. We also endeavour to align value for money and economic analysis with HM Treasury’s Green Book (the Treasury’s guidance on how to appraise policies, programmes and projects), with the aim of using this analysis to inform new investments through business cases.

An evaluation plan or evaluation framework (or both) is often developed, outlining the type or types of evaluation to be conducted and when, the evaluation questions to be answered, and the methods to be used.

High-quality evaluation is also dependent on having access to the vital data needed to support evaluation activity. Evaluation practitioners work closely with UKRI’s data teams to review the data that’s required to meet our evaluation needs and ensure it’s accessible via UKRI’s data systems.

A culture of continuous improvement

A culture of continuous improvement embedded throughout the organisation is vital to achieve our vision and maximise use of evaluation evidence within our decision-making processes. There are three key elements we are focusing on to help us achieve this culture:

  • embedding evaluation practices
  • maximising learning and underpinning decision-making
  • building our evaluation capacity and expertise

Embedding evaluation practices

To generate evidence of impact and learn what works across our diverse portfolio of investments, it is important for evaluation to be embedded in business processes across UKRI.

The role of evaluation and the evidence it creates should be considered at all stages of the funding lifecycle, from investment design through to investment governance. Robust monitoring processes must also be in place, so that data needed to support the evaluation is collected proactively throughout the lifetime of the investment.

Due to our size and diversity, embedding these practices must be a collective endeavour, where staff across the organisation understand what evaluation is, why it is important and what the processes are that they need to follow.

To support this, we have an internal UKRI Evaluation Framework, which is aligned with our evaluation strategy and guides UKRI staff on how to approach and deliver evaluation activity. The framework is accessible to all UKRI staff from our internal website, and staff are supported in its implementation by expert evaluation practitioners and a comprehensive set of evaluation training materials.

The framework also highlights the need for staff to involve evaluation practitioners from the start, so they can advise on evaluation design and are aware of evaluation activity ongoing in their area. This evaluation framework and other supporting materials are reviewed and updated when necessary to ensure continued support is provided.

Maximising learning and underpinning decision-making

To maximise learning and optimise use of evaluation evidence in our decision-making processes, effective communication and engagement is needed throughout. Involving anticipated users of the evaluation in designing the evaluation helps to ensure the evidence generated is relevant, and that key findings and learning can be shared with the right people at the right time to influence decisions.

When communicated effectively, evaluation evidence can underpin decision-making on investment design, strategy, and funding allocation.

The UKRI strategy also outlines six strategic priorities for the organisation, such as enhancing place-based benefits, accelerating commercialisation, and addressing major national and global challenges via our five strategic themes. By incorporating questions aligned with these priorities into our evaluations, we can enhance the evidence base on what works in these areas.

As such, the audience for our evaluation activity is broad (both internally and externally), and can include business case analysts, programme boards, senior leadership teams, government sponsors and the wider research and innovation community.

We therefore use a variety of mechanisms to actively promote the findings of our evaluations, including via senior board meetings, our cross-UKRI networks, intranet pages, webinars and learning series. We have also developed a guide to help evaluation managers engage with their stakeholders early on and disseminate their evaluation findings effectively.

We are committed to sharing our evaluation findings externally, aiding transparency and enhancing the evidence base on research and innovation. We will continue to publish all our commissioned evaluations in our evaluation reports section as agreed in our evaluation publication protocol.

Building our capacity and expertise

To embed evaluation and adopt a culture of continuous improvement, it is vital we build evaluation capacity and expertise within our workforce. To do this, we are focused on embedding an evaluation community of practice throughout UKRI, supporting knowledge sharing, career development, and increased opportunities for staff.

Through the community of practice, we will continue to deliver cross-UKRI training on evaluation. Our training includes topics such as developing a theory of change, disseminating evaluation findings, and undertaking economic appraisal. This includes both expert-led training sessions and written training materials aligned with our evaluation framework. All staff involved in evaluation activity are encouraged to access this UKRI-wide training, accessible via our internal website.

We will also continue to invest in developing the expertise of our evaluation teams, by encouraging continuous professional development and membership of the government analytical professions. Staff are encouraged to participate in relevant conferences, training, and seminar series, and share knowledge gained with others via the community of practice. These activities support a culture of peer-to-peer learning and generate active discussion on how to further improve our evaluation processes.

Ethical evaluation

Ethical evaluation (where evaluation activity is conducted with integrity and respect) is fundamental to what we do. To ensure objectivity and independence, we will continue to commission external organisations to carry out evaluations of our major investments.

When a high level of subject matter expertise is required to assess the relative impact of outputs and outcomes, an independent peer review body may be needed to convene expert external advice. We will continue to provide guidance and support to external evaluators in the form of access to programme documents and other relevant material, while maintaining impartiality.

Great emphasis is also placed on information security and data protection, and UKRI has stringent measures in place in this area, compliant with its obligations under the UK General Data Protection Regulation. Our Information Security Team promotes, encourages, and supports the safe and secure use of information by UKRI and those working with us.

The issues in these areas can be complex and sensitive, often requiring consideration on a case-by-case basis. Those commissioning evaluations therefore seek advice from our Information Security Team and Information Governance Data Protection specialists at an early stage to embed privacy by design and default. UKRI’s privacy notice and data protection policy set out what data we need, why we need it, and what we will do with it. We also work closely with our externally commissioned organisations to ensure they uphold our expected standards and adhere to UKRI policy.

In addition, all evaluation activity will continue to reflect UKRI’s wider commitments. These include prohibiting the use of journal impact factors to assess the quality of a research publication and promoting best practice in the use of metrics, for example by supplementing quantitative indicators with qualitative evidence rather than using them in isolation.

Feedback

We are committed to improving our evaluation strategy and approach to evaluation at UKRI and we value feedback, suggestions, or insights. For any queries or to provide feedback on our evaluation strategy contact us at:

Email: evaluation-performance@ukri.org

