The National Student Survey

The great sociologist and social entrepreneur Michael Young had a reasonable claim to be one of the most influential shapers of contemporary Britain. In 1945, he wrote the Labour Party manifesto, which paved the way for the establishment of the NHS, the post-war welfare state, public ownership, the National Parks and the new towns. In 1958, his biting satire The Rise of the Meritocracy coined a new word and a new political language – albeit one misunderstood by almost all politicians who have talked about meritocracy since. He was one of the brains behind the establishment of the Open University, in the face of entrenched opposition from established universities. In the late 1950s, he founded the Consumers’ Association and Which? magazine in the face of entrenched opposition from manufacturers and retailers. When he died in 2002, his tombstone in Highgate Cemetery described him as a ‘social visionary and innovator’.

Michael Young would have been in favour of the National Student Survey. The NSS was established in 2005, in the face of entrenched opposition from universities (there’s a theme here), which means that when the 2017 survey opens this week it is well into its second decade. It puts a battery of questions to final-year students around the country: on assessment and feedback, on library and learning resources, on teaching, on students’ unions, and on students’ overall satisfaction with their courses. Because students around the country complete it, it enables institutions – for better or worse – to benchmark performance. The intention of the National Student Survey – which is why Michael Young would have been in favour of it – was to empower student voice. Over the last twelve years it has done that, and, for all the pain, it has been worth it because it has made universities listen harder to students and value their contributions.

The NSS is intended to empower student voice

The NSS is far from perfect. It is not, of course, completed by all students – completion rates run at about 70%, which means that it is a sample at best, although Hallam has achieved over 78% for the last three years. It is, as its name suggests, a satisfaction survey. It is not, and it should never be, the only means by which universities collect feedback from students – although its existence has stimulated a range of innovations in student engagement and the student voice around the sector. Its use as one element in the metrics underpinning the Teaching Excellence Framework is controversial – and I should know, being chair of the TEF panel.

But all social statistics are imperfect. Anyone – or any institution – who bases policy and intervention on a single data point would be foolish: in terms of the TEF, I, as chair, have been clear that the NSS scores are not over-weighted. But the NSS does provide valuable information. As with any data, there are some basic questions to ask: about the response rate, about the pattern, about the ways in which it aligns with other data sources. At Sheffield Hallam we have used, and continue to use, the NSS data to help us build a picture of the University. The analysis undertaken at Hallam – led by Neil McKay, Dean of Students and Director of STEER – is, frankly, exceptional. It looks at patterns in the data; at trends over time; at the patterns across the University and across the sector. It is viewed alongside the other information we have, emerging from a range of sources including the invaluable Student Voice report compiled by our Students’ Union. Our response to all surveys is grounded not in populism or crude customer satisfaction but in the professional expertise of our colleagues and the standpoints of our students. Used in this way, the NSS is an invaluable tool in supporting our drive to improve the student experience and further enhance quality. Specific improvements for students – library opening hours, online submission, improved facilities in the Students’ Union, faster feedback turnaround – have emerged directly from responses to the NSS.

Over the next few weeks, final-year students at Hallam, and around the country, will be completing the NSS. The NUS have called for a boycott. I think they are wrong. I understand their concerns about the use of NSS data in the TEF, in league tables and so on, but the data the survey generates is incredibly valuable to the University. If the only question it asked was about overall satisfaction, it would be of little use, but the granular questions and detailed comments are invaluable in identifying student concerns and benchmarking those, and also in understanding what students value. That is why it is these more granular questions that are used in the TEF. In none of this does the NSS stand alone; it is part of a patchwork of evidence to underpin improvement. All data needs treating with caution; all data needs to be stress-tested and triangulated. But, as the saying goes – without data, you are just another person with an opinion.

One thought on “The National Student Survey”

  1. I really enjoyed this blog and the links to the politics of the student experience. I’m interested in the NSS from a diversity standpoint – how will it intersect with our Prevent agenda, and how will it help us to ensure students have an ‘equal chance’ of a good experience? I think that’s a challenge to all universities, and one I will support however I can.
