Documents
Counterfactuals for Research and Innovation Evaluation - Report (PDF)
Details
UK Research and Innovation (UKRI) commissioned Frontier Economics to conduct a rapid evidence assessment of counterfactual methodologies used in evaluating research and innovation programmes. The aim was to identify the methods used, the rationale for using them, the success factors, strengths and challenges associated with different methods, and potential learning for UKRI. Thirty studies were reviewed, and the methods were categorised according to the Maryland Scientific Methods Scale, a measure of robustness in identifying causal impact.
The review identified more counterfactual evaluations of innovation-related programmes than of research-related programmes. Difference-in-Differences, Regression Discontinuity Design, Qualitative Comparative Analysis and mixed methods were commonly used approaches. Fewer studies applied Synthetic Control Methods, Instrumental Variables, Randomised Controlled Trials or Machine Learning methods to research and innovation (R&I) evaluation. The success factors and challenges associated with each methodology interact with the characteristics of research and innovation programmes, which adds to the challenge of robust impact evaluation.