Rachel Hewitt suggests alternatives to the National Student Survey
Before the pandemic, I spent a lot of time talking about how much higher education policy had changed in recent years and how the government's focus on the sector seemed to have intensified.
The coronavirus threw this up in the air. For much of the past eight months, it has felt like both higher education and the government have been in reactive mode, trying to get on top of a response to something that has drastically changed how we operate.
But in responding to the pandemic, the government has managed to slip in other policy changes and messaging that give an indication of its direction of travel on higher education.
One example of this is the policy paper published on reducing bureaucratic burden in research, innovation and higher education. This stemmed from the higher education restructuring regime, in which the government committed to reducing its bureaucratic burden on universities as well as encouraging universities to strip back their own bureaucracy.
These plans were broad-ranging, including cutting the Office for Students’ registration costs, reviewing research funding processes and, notably, a “radical root-and-branch” review of the National Student Survey.
While the review of the NSS was ostensibly described as a way of reducing the burden on universities that had been stretched by Covid-19, the language was clearly political in nature. The NSS was described as exerting a “downwards pressure on standards”, with universities achieving good scores by “dumbing down and spoon-feeding students”.
These were described as “valid concerns from some in the sector”. While I have heard many concerns raised about the NSS, I have never heard it described in quite this way before. I certainly don’t remember the National Union of Students’ boycott of the NSS focusing on “dumbing down”.
The government’s expectations of the review’s conclusions are clearly set, including no longer relying on a universal annual sample and possibly not making all the data publicly available (something that goes against the general trend towards open data).
This, to me, seems incompatible with a "radical root-and-branch" review. Before I joined the Higher Education Policy Institute, I led the review of graduate outcomes data at the Higher Education Statistics Agency. It included two consultations (on the principles and on the final model), two commissioned pieces of research, quality assurance assessments and steering groups of experts. Some have complaints about the Graduate Outcomes survey that resulted, but nobody can deny the process to get there was robust. It also took two years.
The Office for Students has subsequently confirmed plans for pushing forward the NSS review, including splitting it into two phases, the first of which will conclude by the end of the year.
However, this has left it in a tricky position for the 2021 survey, for which the expensive contract has already been awarded. The consequence is that the OfS is actively discouraging universities from advertising the 2021 survey.
Given the significant role that universities' own advertising campaigns play in promoting the survey, this seems to be the worst of all worlds. We are likely to be left with unusable data while still going through the exercise. It also feels ethically challenging. When we ask data subjects to share their data with us, we are meant to be clear about the uses it will be put to. Given the OfS has said it will come to a conclusion about how the data will be used in early 2021, after surveying has begun, surely we cannot be truly transparent with students? And is 2021 really the year to put less effort into getting students' views of higher education?
Watching this all play out leads me to think about what the purpose of the NSS should be. According to the OfS, its aims are to: inform prospective students' choices; provide data that support universities and colleges in improving the student experience; and support public accountability.
While the latest interventions from the government suggest it does not believe the survey is achieving the third aim, it is also important to consider how any alternative might achieve the first two.
The survey’s role in informing prospective students’ choices is mostly done through its use in league tables and official resources such as the Teaching Excellence Framework and Discover Uni. It seems inevitable that the role of the NSS in official resources will be reduced or removed altogether and that league tables may well be left without any measure of the student voice. Even those opposed to the existence of league tables are likely to agree that this will make them worse.
An alternative way to inform prospective students' choices could be better use of existing data sources beyond the NSS. For example, the Higher Education Policy Institute and Advance HE Student Academic Experience Survey offers a long-running series of data and plenty of information about students' experience of higher education.
Information beyond salary that is collected about graduates’ future careers could also be useful to prospective students. Using information from the Graduate Outcomes survey on whether graduates are in the career of their choice, whether they find their work meaningful and if they are using what they learned at university could be helpful here.
When considering how universities will address the second aim of the NSS—to provide data that support universities and colleges in improving the student experience—I start to doubt whether this review of the NSS will culminate in reducing bureaucracy.
It is unlikely that universities are going to want to stop having methods of tracking the student voice, and this will lead them to conduct more surveys of their own (notwithstanding that many have been doing this already). No longer having the efficiency benefits of doing this centrally is unlikely to result in savings.
However, it does provide an opportunity for universities to think about how they listen to the student voice. As Eve Alcock, a former student union president at the University of Bath, has highlighted, we could move towards greater use of qualitative data to get a fuller picture of the student experience, although none of this activity is likely to “reduce bureaucracy”.
In terms of the government's use of the survey, I think it would be foolish to celebrate the culling of the NSS as a move away from metrics. Instead, it is likely to result in a focus on only the metrics the government perceives as important, which seem to be retention and graduate outcomes. This has been crystallised by the OfS developing a "start to success" metric, combining the two measures.
Universities are likely to struggle on both these measures this year, given the challenges of delivering teaching in a pandemic and their students graduating into a recession. We can fight the government on its use of metrics, which feels like swimming against the tide, or we can make the case for why it should be using as broad a set of metrics as possible, including the student voice, to avoid a narrow picture of higher education and its value.
Rachel Hewitt is director of policy and advocacy at the Higher Education Policy Institute.