(Note for readers who don’t work in a UK University: you may or may not know about the Research Excellence Framework, the process by which the Government assesses the quality of research produced by publicly funded Universities in the UK in order to determine how much money each institution should get from central funds. The most recent iteration of this exercise (REF2014) included, for the first time, assessment of the “impact” achieved by each institution’s research – in other words, what use has the research been to users outside of academia? If your research has resulted in the development of a cure for cancer, a significant new piece of legislation, or you are Brian Cox, your impact is 4* (the highest possible). If your impact is published in a journal no-one reads, never to be thought of by anyone outside of academia, your impact is 0*.)
This morning, in my capacity as Glasgow School of Law’s Impact Seneschal (technically Officer, but come on) I attended a conference entitled “Research Impact: Evidencing the REF”, a discussion of the…er, impact of including impact in REF2014 and what might happen re impact on the next go-round, presumably REF2020.
Presentation of the results of HEFCE-commissioned research into the use of impact in REF2014 formed part of the discussion. Catriona Manville at RAND Europe spoke about her evaluation of the impact process at the submission stage and at the evaluation stage (see the reports here). Jonathan Grant at KCL carried out a “synthesizing” of the submitted impact case studies with the aim of finding out what sort of impacts were being achieved, where and in what disciplines (pilot study reported here, with full report to follow). Steven Hill of HEFCE also gave a presentation, and there was a Q&A session with experts including Michael Pidd, a member of REF Panel C (which is where law is assessed).
From the discussion, the headline news is roughly as follows:
- First things first: impact is here to stay. There may be some tinkering with the process for the next REF – the impact template may be subsumed within the environment template, for example – but the overall gist will be the same. (Disclaimer: probably. Unless they change their minds.)
- RAND found that, on average, production of each impact case study cost the submitting University £7,500 and took 30 days. The average cost for production of the impact template was £4,500. Some took the view that this is reasonable given the amount of money being distributed by way of the REF. Others did not.
- 84% of case studies were rated 3* or 4*. It seems that evaluation panels found the 0* – 4* scale a bit too limited to differentiate properly between different case studies, and it may be that a wider scale (0* – 10*? No one mentioned specifics) is employed next time.
- Case studies were assessed by the REF panel members together with non-academic users. There wasn’t much support for changing this mechanism next time around. Most of the speakers seemed fairly suspicious of the idea of trying to formulate some kind of metric for impact assessment – neither Panel B nor C had used journal rankings as part of the impact evaluation process, and the KCL research suggests that the range of impacts is so diverse that developing any meaningful metric would be impossible.
- According to ~~Simon~~ Steven (thanks @_loveresearch!) Hill, the ratio of case studies:FTE staff submitted is unlikely to change (unless the staff selection process as a whole changes, which I gather is likely to be a question in a forthcoming post-REF2014 consultation – staff selection processes cost Universities a lot of money, plus, as some of us may have noticed, they are not great for staff morale.)
- It is likely that case studies submitted last time will be eligible for resubmission provided that they have had further/ongoing impact since 2014. The general understanding seems to be that impacts can take a long time to develop – longer than one REF cycle in many cases – and that should be reflected in the assessment process.
Some difficult questions and things to think about:
- There was quite a lot of discussion about whether or when public engagement counts as impact. Some pointers: (a) engagement needs to be about research, not just basic knowledge of the subject matter – giving a talk that could come from a textbook is unlikely to count; (b) it doesn’t need to be your own research, so long as it is research being conducted by someone in your department/school/whatever – the example given here was Brian Cox, who is actually a particle physicist but covers a lot of the astronomy work conducted by others in his department in his media work; (c) numbers of people attending your talk/downloading your podcast/visiting your webpage etc are not evidence of impact in themselves – you need to capture how it affected their views, by e.g. having exhibition visitors fill in evaluation forms or recording what was tweeted about your programme.
- How are you supposed to compare an economic impact with a policy impact with a cultural impact? It seems the panels calibrated by comparing like with like, not comparing across categories, since it’s apples and oranges. (Yes, I know about that paper comparing apples with oranges.)
- How do you measure negative impacts – research which resulted in something not happening? The KCL research suggests Universities just didn’t attempt this in their case studies, but I think that’s a problem for something like law – excellent legal research might well produce the conclusion that legislation should not be introduced, for example.
- Luke Georghiou from the University of Manchester pointed out that, for all the political talk about the need for academics to engage with users, there is little mention of the need for users to actually listen to anything we’re saying. So it’s probably worth thinking about ways that we, institutionally or individually, can contribute to a bit of cultural change there.
The conference Twitter hashtag #refmanc15 had a few busy users, for anyone who wants more detail. Now all I need to do is evaluate the impact of my post about the impact of impact.