Evaluation of research
At the University of Warwick we are committed to fair and transparent mechanisms for monitoring and reporting research performance, as well as for recruiting staff and assessing their research.
The University uses both qualitative and quantitative indicators to assess individual and institutional performance. We acknowledge the limitations of using either approach alone: qualitative indicators can be perceived as subjective, and quantitative indicators as unsophisticated. Conversely, qualitative indicators allow the application of expert disciplinary and interdisciplinary judgement, while quantitative indicators support assessment methodologies that are transparent and consistent.
Both approaches are important, and indeed both are used, and sometimes combined, successfully in the assessment processes of the UK Research Excellence Framework (REF). The University additionally recognises the ever-increasing role of quantitative indicators in external measures of our reputation, such as league tables and funding agency assessments, as well as the risks and opportunities that metrics-based evaluation opens up for the valuation of research, both in terms of its excellence and its role in society.
Below we list the principles by which the University uses quantitative indicators and qualitative methods, and then describe how we apply them specifically in assessing research outputs (e.g. journal articles, book chapters, monographs), income, postgraduate research (PGR) supervision, and in recruitment, performance management and promotion.
The principles were approved during the 2019-2020 committee year by the University Faculty Boards, University Research Committee, and the University Executive Board. A working group, chaired by Professor Noortje Marres, is taking forward the implementation of the principles during 2021, and will report regularly to the Open Research Group and Research Committee.
These principles were developed by the Open Research Group (ORG). Based initially on the Leiden Manifesto for Research Metrics, they were refined over 2019-2020 in discussion with our research community to incorporate its views and to reflect the disciplinary and interdisciplinary make-up of the University of Warwick.
The University of Warwick is committed to finding the most appropriate mix of qualitative and quantitative measures to assess the quality of our research. Expert assessment is at the heart of this, but metrics-based indicators may be used to support expert assessment in a range of processes including, but not limited to, recruitment, probation, reward, promotion, and performance review. Such indicators will never be used in isolation or supersede expert assessment of the research outputs or the context in which they sit, especially when reviewing the contributions of an individual researcher.
The University may also choose to use quantitative metrics when undertaking the collective assessment of a research group, unit, department, or the institution. These indicators will be used with sensitivity to the context in which they sit and will be made explicit to the group being reviewed.
As one of the world’s leading universities, we pride ourselves on the excellence of our research. Expert assessment is fundamental to assuring the relevance, rigour, and quality of our contribution to society, and our expert assessment must in turn be of the highest standard. To that end, wherever possible, it will be undertaken in ways that are open and transparent. There are situations or processes where openness may not be appropriate, for example probation and promotion, merit pay, and occasions where competitive advantage is critical. However, the University will keep these under review, adopting the principle of openness where possible.
In the 2019 Research Strategy several work strands are supported by the strategic priority of disciplinary and interdisciplinary excellence. While the quality of the research undertaken is best assessed through expert evaluation, appropriate indicators of research performance, such as article-based metrics and indicators of collaboration levels, can be helpful in monitoring progress towards these objectives.
To accommodate variation in missions and in the relative effectiveness of indicators, each Department and/or Research Unit will set its goals within the Strategy Renewal process, with the support of the Academic Resourcing Committee (ARC) and with reference to Research Committee. As part of this process the indicators will be reviewed at least once every five years to ensure they best fit the mission of the departments. Any relevant indicators, along with their potential uses, advantages, and disadvantages, will be selected in line with any declarations to which the University of Warwick has committed, and will be reviewed whenever a new declaration is signed. The list will be developed by the Academic Departments and Research Units in conjunction with the Library, Research and Impact Services, and the Strategic Planning and Analytics Team to reflect discipline-specific needs and best practice.
In line with the principle above, Research Committee and others will work to ensure that Departments and Research Units can select from the list those measures they judge best suited to evaluating their individual work and collective missions. Selected indicators will then be used consistently across all areas of research performance monitoring in that Unit/Department.
In creating the list of relevant indicators, the University will ensure that, wherever possible, both the suggested indicators and the underlying data are openly available, and that it is clear in every situation which indicators will be used. Academics will therefore be able to see the data relating to themselves and to make corrections where necessary. Staff managing publication systems will also continue to ensure that the data in these systems is as accurate and robust as possible.
Research practices vary widely between disciplines, and bibliometric indicators serve some disciplines better than others. For example, citation tools are primarily based on journal and conference outputs rather than monographs or other forms of output, although there are developments in this area. International collaboration indicators will be less relevant to disciplines where academics tend to publish alone rather than in teams. In line with best practice, indicators will be normalised wherever appropriate and, ideally, based on percentiles rather than averages. The availability (or otherwise) of data will not by itself drive decision-making about research activities and priorities, and some departments may choose not to use metrics at all. Moreover, not all research will be submitted to the REF. For the purposes of promotion and reward, the University recognises excellence both where staff on non-research contracts (e.g. teaching-only contracts) publish high-quality work that is not evaluated by the REF exercise, and where such staff publish high-quality and impactful pedagogical research.
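The preference for percentile-based indicators over averages can be illustrated with a short sketch. This is purely illustrative and not part of University policy; the citation counts below are hypothetical, chosen to mimic the heavily skewed distributions typical of real citation data:

```python
# Illustrative sketch only (not University policy): why percentile-based
# indicators can be more robust than averages for skewed citation data.
from statistics import mean

def percentile_rank(value, reference):
    """Share of reference values less than or equal to `value`, as 0-100."""
    return 100 * sum(v <= value for v in reference) / len(reference)

# Hypothetical citation counts for papers in one field: most papers have
# few citations, while a single outlier has many.
field_citations = [0, 1, 1, 2, 2, 3, 4, 70]

paper = 4  # a paper with 4 citations

# The mean is inflated by the single outlier, so the paper looks
# 'below average' ...
print(f"field mean: {mean(field_citations):.1f}")   # 10.4

# ... while its percentile rank shows it outperforms most of the field.
print(f"percentile rank: {percentile_rank(paper, field_citations):.0f}")  # 88
```

The same reasoning underlies field normalisation: comparing an output against the distribution of its own field and publication year, rather than against a raw average, reduces distortion from a few highly cited outliers.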
The University places high importance on its strong international theme, one integrated with institutional strategy. We recognise that most citation tools are inherently biased towards English-language publications in certain fields, and towards publications from English-speaking countries, because of the historical development of these tools and the prevalence of English as the ‘language of scholarship’. It is important that academics producing work in languages other than English, or publishing in venues outside the English-speaking world, are not penalised for this. The University recognises that outputs of this nature are as valuable as any others, but accepts that their impact may be demonstrated differently.
Performance as measured by indicators is affected by career stage, gender, and discipline. We will take these factors into account when interpreting metrics. It is also recognised that academics undertake a wide range of research communication activities, not all of which can be easily measured or benchmarked. When assessing the performance of individuals, expert reviewers will take as wide a view as possible of their expertise, experience, activities, and influence.
Any review of an individual researcher’s outputs, be it for recruitment, promotion, performance review, or another purpose, will be based primarily on qualitative expert review, supported where appropriate by output metrics.
We commit to using multiple, appropriate indicators to provide a more robust and wide-ranging picture in any review. Indicators will be used to support qualitative, expert assessment, with awareness of the danger of conveying false precision or misplaced absolutes, whether to the individual or group being reviewed or to the audience for the review.
It is accepted that any measurement can, in itself, affect the system it is used to assess, through the incentives it inevitably establishes. To minimise such effects, a suite of indicators will be used wherever practical to provide the necessary counterbalances, especially in the evaluation of individuals.
As the research activity of the University and the external environment develop, any bibliometric indicators chosen by the institution for use will be revisited and revised where appropriate. This will be overseen by the Open Research Group and the University Research Committee, with advice and guidance from the Library, Research and Impact Services and the Strategic Planning and Analytics Team, working with Academic Departments and Research Units.
Open research risks and opportunities
Listen to a recording of the round table discussion on 5 February 2020 exploring the opportunities and risks of Open Research.
The discussion was hosted by Noortje Marres (Centre for Interdisciplinary Methodologies) with Professor Sarah de Rijcke and Professor Ludo Waltman, two of the co-authors of the Leiden Manifesto for Research Metrics, Warwick researchers Sarah Richardson (History) and Robin Ball (Physics), and Yvonne Budden and Robin Green from the Library.
Photo credit: video still from The Leiden Manifesto for Research Metrics (2016).