Evaluation

The evaluation of the programme has run alongside our activities since the programme began and is embedded in all that we do. As a formative evaluation, it has allowed us to learn and adapt as we have gone along, making both small tweaks and larger adjustments so that we maximise our impact on Fellows’ and Mentors’ practice and learning and respond to shifts in the education landscape.

Using theoretical models of professional development to build an understanding of the programme’s impact [1], the evaluation draws on a range of data collected through surveys, feedback forms and interviews. These data are collected from each cohort of Fellows and Mentors, and are analysed and reported on annually. The evaluation follows the university’s policies on ethical research [2], and additional information is given to Fellows and Mentors when the programme begins so that they can give informed consent to take part in data collection and analysis.

The outcomes of the evaluation are used to improve the programme for future cohorts, shared with Fellows, Mentors and Wipro Limited, and disseminated to other STEM educators through professional networks, conferences and journal articles.

[1] Boylan, M., Coldwell, M., Maxwell, B. and Jordan, J. (2017) Rethinking models of professional learning as tools: a conceptual analysis to inform research and practice, Professional Development in Education, DOI: http://dx.doi.org/10.1080.

[2] https://www.shu.ac.uk/research/ethics-integrity-and-practice

For more information, or to express an interest in joining the programme, get in touch by email at WiproTeacherFellow@shu.ac.uk, by phone on 0114 2256060, or via X @teacherfellow.
