Peer reviews - STFC

Carrying out a peer review – STFC

This page provides general advice and guidance for reviewers on completing reports for the STFC peer review process. You may also find it useful to refer to How reviewers use the UKRI Funding Service.

Reviewers’ input is the single most important element in the peer review process, providing advice on the qualities of the many research proposals we receive each year. For the process to work effectively, reviewer comments should be timely, objective, fair and informed.

Reviews are based on a series of assessment criteria, which directly relate to the questions an applicant is asked to address. These may vary between opportunity types but the key areas will generally be:

  • vision
  • approach
  • applicant and team capability to deliver
  • costs and resources
  • ethics

Some funding opportunities also have additional specific assessment or selection criteria, which can be found in the relevant opportunity on the funding finder.

You can find details of specific STFC funding opportunities in the funding finder.

Good reviewing

To maximise their value to the peer review process, reviewer reports should aim to:

  • provide clear comments and recommendations
  • avoid making statements that may give away your identity
  • avoid repeating or summarising the content of the application
  • give justification for markings
  • be consistent between box markings and comments
  • provide enough information without being over-long
  • provide constructive criticism and avoid personal or aggressive comments
  • clearly identify strengths and weaknesses
  • raise concerns in the form of questions for the applicant
  • be mindful of the language you use and avoid terms that refer to ‘protected characteristics’ under the Equality Act 2010 or that could be interpreted as discriminatory.

It is important to bear in mind how your report will be used. Your report will be fed back anonymously to the applicant, who will then be allowed to respond to factual inaccuracies or questions you raise. Following this, members of peer review panels will be asked to use your reports as a tool for ranking proposals.

Using generative artificial intelligence (AI) in peer review

Reviewers and panellists are not permitted to use generative AI tools to develop their assessment. Using these tools could compromise the confidentiality of the ideas that applicants have entrusted to UKRI to safeguard.

For more detail see our policy on the use of generative AI.

How it works

When you are asked to review a proposal, all the necessary documents will be sent to your Funding Service account. We may contact you in advance of the formal request where, for example, we only need you to focus on certain aspects of the proposal.

The proposal you are asked to review may include a link, or links, to a website containing information on the proposed research. You are not required to consider this additional information when commenting on a proposal. If you do choose to look at it, be aware that your anonymity to the applicant could be compromised.

All reviews on our behalf are submitted using the Funding Service.

Timescales and confidentiality of comments

If you cannot comment within the indicated timescale, let us know immediately so we have time to approach an alternative reviewer or, where possible, extend the deadline. Similarly, let us know immediately if you do not feel qualified to comment at all.

Declaration on Research Assessment (DORA)

Guidance for UKRI grant assessors (reviewers and panel members)

We are committed to supporting the recommendations and principles set out by the San Francisco Declaration on Research Assessment (DORA). You should not use journal-based metrics, such as journal impact factors, as a surrogate measure of the quality of individual research articles, to assess an investigator’s contributions, or to make funding decisions.

For the purpose of research assessment, consider the value and impact of all research outputs (including datasets, software, inventions, patents, preprints, and other commercial activities) in addition to research publications. You should consider a broad range of impact measures, including qualitative indicators of research impact such as influence on policy and practice.

The content of a paper is more important than publication metrics or the identity of the journal in which it was published, especially for early-stage investigators. Therefore, you should not use journal impact factors (or any hierarchy of journals), conference rankings, or metrics such as the h-index or i10-index when assessing UKRI grants.

For panel members only

We encourage you to challenge research assessment practices that rely inappropriately on journal impact factors or conference rankings and promote and teach best practice that focuses on the value and influence of specific research outputs. If you are unsure about DORA, speak to the panel convenor or the panel chair.

Unconscious bias

A particular equality issue in peer review is unconscious bias. Despite striving to be objective, people often hold implicit or unconscious assumptions that influence their judgement. Examples range from expectations or assumptions about physical or social characteristics associated with gender, ethnicity and age to those associated with certain jobs, academic institutions and fields of study.

Read a briefing note on unconscious bias.

Last updated: 12 November 2024