Building an Evaluative Mindset at Hallam: Our Strategic Context

Written by Alan Donnelly

Blog #2 of 6 on Building an Evaluative Mindset at Hallam: Our Strategic Context

Within Hallam’s Access and Participation Plan (APP), the institution has committed to developing and maintaining a more sustainable evaluative culture. The strategic context of the APP is to eliminate inequality in Higher Education, particularly for underrepresented groups, but the approach at Hallam is broader: we aim to ensure that evidence-informed practice is applied to all areas seeking impact, so that it works for all students. This involves taking steps to ensure that the use of evidence and evaluation to inform decision-making is prioritised and embedded into the foundations of the university.

[Image: The five sections of the Evaluation Framework adopted by the Office for Students]

Why evaluate?

Given the context of metric-informed provision, there is an increasing imperative to gather evidence concerning the impact of actions upon student experiences and outcomes:

  • Measuring change effectively, to find out whether an initiative has met its objectives and how well those objectives have been realised.
  • Understanding the ‘counterfactual’ and its importance: identifying what would have occurred if an activity had not been implemented, and comparing this against the outcomes measured after the initiative.
  • Promoting the use of evidence and understanding of ‘what works, for whom, and in what circumstances’ to feed into decision-making.
  • Becoming a learning organisation that demonstrates a commitment to continuous improvement in practice and outcomes, including across its leadership and staff base (Centre for Social Mobility, 2019).
  • Demonstrating accountability for the use or value of funding (Parsons, 2017).

Developing an evaluative mindset

In order to cultivate and maintain an evaluative culture, a university-wide approach is required that goes beyond processes and structures by providing meaningful opportunities for staff to engage in self-reflection, inquiry and evidence-informed learning (Mayne, 2008). This includes:

  • Applying evidence-informed practices to direct the design and delivery of activities;
  • Developing initiatives with specific and measurable objectives and monitoring their progress against commitments;
  • Progressing from data-driven approaches to the development of theoretical reasoning for change;
  • Disseminating knowledge and best practice;
  • Designing, piloting and rolling out evaluative leadership provision to enhance stakeholders’ skills and capabilities;
  • Establishing a working climate that identifies opportunities to learn from past experiences and to explore issues appreciatively, as opposed to apportioning blame.

Initiating change through leadership, cohesion and collaboration

Senior leaders are in a key position to make the role of evaluation more formal and visible, an expectation that is often overlooked. One of the main suggestions from relevant guidance is to ensure that evaluations are ‘joined-up’ across the whole institution and linked to wider strategic work that is being undertaken (Centre for Social Mobility, 2019). A collective approach is required to initiate change, and includes:

  • Outlining the role of evaluation within Hallam’s APP as the first step in raising its strategic profile.
  • Setting up forums and developing an overall evaluation framework, with the aim of ensuring that evaluative work is carried out in a consistent, cohesive and collaborative manner.
  • Consulting students in planning, monitoring and evaluation.
  • Ensuring that there is adequate budget, designated roles and timeframes for evaluation (Parsons, 2017), which are often considered to be a ‘hidden’ resource.

Building capacity

Shifting from a reactive to a proactive stance within an organisation requires capacity building for all staff. Stakeholders must be provided with opportunities to enhance their capabilities for evaluation and their ability to interrogate, synthesise and apply evidence. A number of STEER initiatives currently provide staff with access to relevant training, up-skilling opportunities and systematic ways of sharing evidence.

Emphasising a focus on learning and reflective practice

The emphasis on learning is not restricted to up-skilling staff members; it includes creating a climate in which there is a willingness to use the learning from evaluation results to inform decision-making, irrespective of whether the findings are positive or negative. Higher Education has shown a tendency towards reporting success rather than failure (Dawson & Dawson, 2018). However, identifying why the impact of an activity has not been realised can provide valuable information about the changes that are necessary and generate greater understanding of the issues that are prevalent. Encouraging learning through experience, by recording the reflections of a range of stakeholders about what has happened during initiatives, can also help to generate further points for learning.

The third blog post in this series will cover Programme (Activity/Intervention) Design, which relates to the use of evidence and evaluation to inform the design of these initiatives.

References and Further Reading

A Digital Glossary of key terms used in these blogs (created by Liz Austen, STEER, and Stella Jones-Devitt) can be found here: (last accessed 18th November 2019)

Adesola. (2019). Student Researcher Reflections: Working on a National Evaluation [Blog post]. Retrieved from (last accessed 18th November 2019)

Austen, L. (2019). Making a Difference with Data [Blog post]. Retrieved from (last accessed 18th November 2019)

Austen, L. (2019). Using Evidence for Enhancement [Blog post]. Retrieved from (last accessed 18th November 2019)

Centre for Social Mobility (2019). Using standards of evidence to evaluate impact of outreach. Retrieved from (last accessed 18th November 2019)

Dawson, P., & Dawson, S. L. (2018). Sharing successes and hiding failures: ‘reporting bias’ in learning and teaching research. Studies in Higher Education, 43(8), 1405-1416.

Evaluation Support Scotland (2019). Reflective Practice. Retrieved from the Evaluation Support Scotland website: (last accessed 18th November 2019)

Mayne, J. (2008). Building an evaluative culture for effective evaluation and results management. ILAC Brief No. 20. Rome: Institutional Learning and Change (ILAC) Initiative.

Parsons, D. (2017). Demystifying evaluation: Practical approaches for researchers and users. Policy Press.

Pickering, N. (2019). Case Study: National Student Survey and Appreciative Inquiry. Retrieved from the QAA Scotland website: (last accessed 18th November 2019)

Student Engagement, Evaluation and Research [STEER] (2019). Digital Storytelling @ SHU. Retrieved from (last accessed 18th November 2019)

Student Engagement, Evaluation and Research [STEER] (2019). STEER Events. Retrieved from (last accessed 18th November 2019)