Tag Archives: feedback

Use of video: thinking outside the lecture

Claire Cornock & Mike Robinson
@SHUMaths

Parallel session 4, CoLab 4.4

Short Abstract
Various staff in the Engineering and Mathematics department have been using videos to enhance their practice. This session will include discussion, demonstration and hands-on experience of different technologies that can be used to create videos along with several case studies of how they are being used in teaching.

Detailed Outline
In this session, we aim to give staff enough knowledge, experience and confidence to start producing their own videos to enhance their teaching practice. To this end, the session will include:

• an outline of the advantages and disadvantages of using videos in teaching
• brief case studies based on current use within the department, including how technology has been used to address particular problems

• presentation of the available technologies for producing videos, to include desktop PCs, tablet PCs, Android tablets, iPads, and the more traditional whiteboard and camera
• hands-on practice with a selection of the available technology.

The opportunity for colleagues to use the technology to start producing their own videos is the key part of this workshop, along with the opportunity to establish links between staff with shared interests. It is hoped that one outcome from this session will be an ongoing common-interest group for staff interested in using videos.

Several case studies will be presented within the session. These include:

• Addressing the problem of trying to teach how to use technology within a lecture room by creating short video examples. This was within a first-year module centred around the use of Excel. The physical constraints of having a large group within a lecture room meant that students were struggling to carry out tasks shown in lectures. Now students make use of the short video examples in their own independent study time, and more of the students are attempting the more technical tasks in assignments. The students often make unprompted comments on the usefulness of the videos.

• Videos of mathematical processes. Doing mathematics is a process which often requires correction, modification, and thought, and yet the finished product is typically concise and static, and much of the thinking is hidden (this is particularly true of any attempts which do not lead to a solution). In lectures the process of doing maths can be demonstrated, typically with speech describing the thought process, and writing recording the finished product. Students understandably struggle to keep notes on this without losing some important details. Videos offer an opportunity to provide students with a reference source in which both the thought process and the final product are recorded.

• Providing video feedback to students. Typical feedback on mathematical problems might include a set of model answers, but as with the mathematical processes above, these often lack detail about the thought processes behind the approach. Producing model solutions using screencasting can provide students with an audio commentary alongside the concise formal solution and has proved effective at engaging students.

Within these examples, several different types of technology have been used. These include PCs, tablet PCs, Android tablets and the traditional whiteboard and camera. We will discuss and demonstrate how some of these can be used.

Subverting Multiple-choice Questions for Deep Learning

Lee-Ann Sequeira

Parallel session 3, CoLab 3.6

Short Abstract
The aim of this workshop is to show through examples and evidence how online formative multiple-choice questions (MCQs) can be used to promote conceptual understanding and peer learning for students studying in and away from the classroom.

By the end of the workshop, participants will have:
• Understood how formative MCQs can be used to promote deep learning in face-to-face and distance learning
• Completed a short online formative MCQ quiz to see how it works from a participant’s perspective
• Begun to construct an online formative MCQ quiz

Detailed Outline
An online formative MCQ has key features that distinguish it from a traditional MCQ that is used to test recall of information.

• The topic for the MCQ should be complex and difficult to understand. It should be fundamental to that field/module, such that it is a conceptual building block on which more advanced concepts are built (e.g. threshold concepts).
• As the aim is to further the learner’s understanding, it is advisable to create a set of about five to eight MCQs on a single topic that explore different aspects of that topic and/or increase in complexity.
• Similar to traditional MCQs, formative MCQs have a question stem, correct answer(s) and distractors. Distractors (incorrect options) are key as they help to identify areas of confusion or poor understanding.
• MCQs can have varying levels of pre-programmed, automatic feedback, for example, referring students to a particular topic in the text or explaining certain problematic parts of the question or inviting them to attend a mop-up tutorial to discuss problematic areas identified in the MCQs.
• MCQs are often created in a virtual learning environment (VLE) which tracks the learners’ responses and generates reports showing the learners’ attempts by question, by user, overall class performance, etc. in real time. These metrics can be used by the tutor to diagnose which questions or topics the students are finding problematic and to focus their efforts accordingly (a minimal illustrative sketch of such a quiz structure and its report follows this list).
• Learning activities such as online discussion fora and webinars can be used to explain and clarify issues identified in the MCQs (determined by the metrics), and they increase opportunities for learners to ask follow-up questions and discuss them, thereby advancing their understanding.
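
As a concrete illustration of the features listed above, the minimal sketch below (written in Python purely for illustration; the names FormativeMCQ, Option, respond and report are hypothetical and do not correspond to any particular VLE's API) shows one way to represent a question stem, a correct answer and distractors with pre-programmed feedback, and to summarise tracked responses into a simple per-question report. In practice the quiz would be built in the VLE itself; the sketch simply makes the components of a formative MCQ explicit.

# Hypothetical sketch of a formative MCQ with per-option feedback and a
# simple response report, as described in the list above. Not a real VLE API.

from collections import Counter
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Option:
    text: str
    correct: bool = False
    # Targeted feedback, e.g. a pointer to a topic in the text or to a mop-up tutorial.
    feedback: str = ""


@dataclass
class FormativeMCQ:
    stem: str
    options: List[Option] = field(default_factory=list)

    def respond(self, choice: int) -> str:
        """Return the pre-programmed feedback for the chosen option."""
        option = self.options[choice]
        prefix = "Correct. " if option.correct else "Not quite. "
        return prefix + option.feedback

    def report(self, attempts: List[int]) -> Dict[str, int]:
        """Summarise tracked responses: how often each option was chosen."""
        counts = Counter(attempts)
        return {opt.text: counts.get(i, 0) for i, opt in enumerate(self.options)}


# Example: one question from a set of five to eight on a single threshold concept.
q1 = FormativeMCQ(
    stem="Which statement best describes a threshold concept?",
    options=[
        Option("Any fact that appears on the exam",
               feedback="Revisit the distinction between recall and understanding."),
        Option("A transformative idea on which more advanced concepts are built",
               correct=True,
               feedback="This is why the quiz explores it from several angles."),
        Option("A topic that can safely be skipped",
               feedback="See the module guide, or raise this at the mop-up tutorial."),
    ],
)

print(q1.respond(0))                  # feedback aimed at a common misconception
print(q1.report([0, 1, 1, 2, 0, 1]))  # which distractors attract learners, i.e. where to focus teaching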

In this way, online formative MCQs are compatible with a variety of pedagogical purposes – peer learning, the flipped classroom, self-assessment and revision, etc. Examples and evidence will be presented from a number of disciplines – economics, health sciences, physics, education, etc., including feedback from students and staff who are involved in a pilot study.

This is the proposed outline for the CoLab session:
• Introduction to MCQs
(Types and features of MCQs, merits and limitations)
• Quiz time!
(Participants take an MCQ quiz to get an idea of how it works from a student’s perspective)
• Hands-on activity: DIY MCQ
(Participants begin constructing an MCQ quiz using the guidance provided.)

Transcending Modularity through Flexible Formative Feedback

Oli Johnson

Hallam L&T Conference, Session 3.7

What is the role of feedback in promoting learning across modules and how can feedback be embedded more effectively into the student journey? The HEA-funded Flexible Formative Feedback Project is led by a team of Student Ambassadors for Learning and Teaching (SALT)—a nationally-recognised student-staff partnership scheme in which teams of students design and lead on learning and teaching enhancement activities. Working in collaboration with students and staff in a cross-section of ten departments, the project team are constructing a feedback profile of existing practice and mapping student experiences onto the feedback environments of their disciplinary areas. Data will be used to identify case studies of best practice and to inform the development of discipline-specific tools for the provision, collation and use of feedback that is both flexible—i.e. adaptable to disciplinary and individual student needs—and formative—i.e. action-oriented and developmental.

As a longitudinal study, the project will revisit students over the course of the academic year to find out the extent to which their expectations have been met by the feedback process. Although data collection is ongoing, our initial consultation has identified a ‘feedback gap’ between student expectations and experiences, the fault lines of which appear early at level one. This paper, which will be presented by a staff-student team, will share the initial findings of the project and explore strategies to narrow this gap including the development of study skills training packages and feedback collation tools. It will consider the underdeveloped role of feedback as a synoptic learning tool with the potential to transcend the modular nature of assessment as part of a broader transition to self-regulated learning. It will conclude with a reflection on the implications of this process for the future-proofing of feedback in the context of rapid technological development and the changing university environment.

The Essence of Belonging: Sheffield Hallam Students’ Union ethnographic research into student communities (2014)

Jessica Baily, Emily Connor & Emmet Cleaver, SHU Students’ Union

Sheffield Hallam Students’ Union conducted a research project on the theme of ‘belonging’. The Education Officer and the Welfare and Community Officer ran a series of filmed student interviews across a wide range of demographics to discover exactly how students at Sheffield Hallam found their sense of belonging to their course, campus, sports team or society, or within the institution. Following the interviews, students were issued with cameras and were required to visually ‘capture’ this sense of belonging in a series of photographs. The research culminated in ‘The Belonging Hub’, a room and event at the Students’ Union showcasing the students’ photographs alongside a video documentary presenting all of their responses.

Following on from Sheffield Hallam Students’ Union’s research into how students forge a sense of belonging with their course, society or institution, this session was designed to engage staff with an approach to understanding how to create communities in the classroom. Using the research as a basis for discussion, participants will relate and apply the student perspective to the various challenges of teaching, from classroom engagement and attendance to feedback. Participants will hopefully leave with an understanding of the kinds of atmospheres in which students work best, and with ideas for how to replicate, if not recreate, these climates in their own practice.

296 – Course-Centred Assessment – Andrew Middleton, Christine O’Leary, Graham Holden, Serena Bufton, Mike Bramhall, Alison Purvis

This session aims to inspire, inform and challenge participants towards finding holistic approaches to course-centred assessment. With reference to good practice assessment principles (e.g. Nicol & Macfarlane-Dick, 2006), this panel session will provide examples of course-centred assessment strategies and models designed to engage and empower the learner through their course (Nicol, 2009). A course-centred approach to assessment lends itself to the development of student self-regulation and to authentic assessment practices, and supports a more dialogic approach (Freeman & Dobbins, 2013). It can encourage a shift away from fragmented learning experiences, which can be an inadvertent result of module-centred assessment tasks (Gibbs, 2012; Price et al., 2011). The examples discussed will demonstrate how assessment and feedback can help students to make formative connections across and through their course. A series of short presentations will be given exploring what a course-centred approach means for assessment practice, how it can enable integrated and authentic approaches to assessment, and the benefits it presents for the student experience.

Session activities for engagement: in the second half of the workshop, participants will be involved in small group activities aimed at developing and sharing key ideas on the various integrated course assessment strategies.

Click to visit presentations:  296 LTA Conf -AssessmentPatterns-Course View Blanks

296 Integrated assessment

296 LTA Conf -AssessmentPatterns-Course View Example

289 – Course ethos: it’s not the students who are strange – Neil Challis, Michael Robinson

If you talk to some lecturers, you will find no shortage of opinion about the shortcomings of our students.  They are ill-prepared for the university curriculum; don’t turn up to class; aren’t interested if it isn’t assessed; lack motivation; don’t love the subject; don’t know what they should; are only interested in how to pass the exam; lack basic skills; can’t write properly… Teaching in a university would be great, if it weren’t for the students.

Our own experiences are atypical. Contrast the caricature presented above with an equally broad-brush picture of a lecturer: we love our subject; are good at it; are motivated and hard-working; are interested in a deeper understanding; could cope with exam stress; and we’ve spent our adult lives surrounded by similar colleagues.

Drawing on work from the More Maths Grads project, which examined four diverse departments ranging from 30 to 350 students per cohort, we compare what students and staff say about their aspirations and consider how this impacts the students’ enjoyment and confidence in their subject. Whilst we found evidence of special effort being made to overcome the perceived student shortcomings, we nevertheless detect some frustration at these. If we become frustrated, is it us or them that have the problem? Is it reasonable of us to expect our students to share our outlook?

We suggest that our perceptions can lead to messages, explicit or implied, to students about their abilities which can easily damage their confidence and well-being.

In particular we discuss ways to generate a more positive attitude so that more of our students might report, as one did:

“The … tutors treat the students as equals, I have never been talked down to …  I feel that the tutors and students work as a team aiming for one goal and that is the students understanding and enjoyment of the subject.”

Click to view:  289 course ethos it’s not the students who are strange

284 – Exploring ways of using formative feedback to improve student engagement with simulation modules – Vicky Thirlaway, Amy Musgrove

It is now well established that courses should seek to use Assessment for Learning, rather than Assessment of Learning, and that the form of assessment can have a significant impact on the student experience (the “backwash” effect described by Biggs (1996)). A key component of any assessment for learning strategy is to include authentic assessment tasks which the students can see have a relationship to the “real world” (McDowell, 2012; Gikandi et al., 2011). Fostering student engagement with a “make believe” scenario is a challenge: the activities must be perceived to be “credible” if students are going to engage with them.

Simulation can be an effective way of allowing students to contextualise their learning and develop the skills they will need to turn theory into practice. It can, therefore, have a role to play in developing employability skills.

Research demonstrates that formative assessment and feedback can significantly improve student performance (Black and Wiliam, 1998) and arguably, this is even more crucial when the assessment measures skills of application that the student may not have had to demonstrate in their previous educational experiences (Ramaprasad, 1983; Sadler, 1989 cited in Jordan, 2012).  Tutors often feel that students fail to engage with formative assessment, and take little notice of feedback provided (Orsmond et al., 2013).

This presentation will argue that cohorts of students are primarily strategic learners, and therefore are reluctant to engage with learning activities that do not directly feed into assessment, even where they acknowledge the validity of the exercise (Coles, 2009). It will be suggested that “formative” assessment should be compulsory and, therefore, must be part of the totality of summative assessment on the module. We have some experience of embedding compulsory formative assessment and feedback within a simulation exercise. The presentation will evaluate the successes and shortcomings of our experience, and consider alternative ways of providing formative feedback within the simulation whilst maintaining the authenticity of the task and the credibility of the summative assessment.

References

Biggs, J.B. (1996) “Assessing learning quality: reconciling institutional, staff and educational demands”, Assessment & Evaluation in Higher Education, 21: 5-16.

Black, P. and Wiliam, D. (1998) “Assessment and classroom learning”, Assessment in Education: Principles, Policy and Practice, 5(1): 7-73.

Cauley, K.M. and McMillan, J.H. (2010) “Formative assessment techniques to support student motivation and achievement”, The Clearing House: A Journal of Educational Strategies, Issues and Ideas, 83(1): 1-6.

Carless, D., Salter, D., Yang, M. and Lam, J. (2011) “Developing sustainable feedback practices”, Studies in Higher Education, 36(4): 395-407.

Clark, I. (2008) “Assessment is for learning: formative assessment and positive learning interactions”, Florida Journal of Educational Administration & Policy, 2(1): 1-15.

Clark, I. (2012) “Formative assessment: assessment is for self-regulated learning”, Educational Psychology Review, 24: 205-249.

Coles, C. (2009) “The role of new technology in improving engagement among law students in higher education”, Journal of Information, Law & Technology (JILT), 3, <http://go.warwick.ac.uk/jilt/2009_3/coles>.

Duncan, N. (2007) “‘Feedforward’: improving students’ use of tutors’ comments”, Assessment & Evaluation in Higher Education, 32(3): 271-283.

Gikandi, J.W., Morrow, D. and Davis, N.E. (2011) “Online formative assessment in higher education: a review of the literature”, Computers and Education, 57: 2333-2351.

Handley, K., Price, M. and Millar, J. (2011) “Beyond ‘doing time’: investigating the concept of student engagement with feedback”, Oxford Review of Education, 37(4): 543-560.

Jordan, S. (2012) “Student engagement with assessment and feedback: some lessons from short-answer free-text e-assessment questions”, Computers and Education, 818-834.

Mann, S.J. (2001) “Alternative perspectives on the student experience: alienation and engagement”, Studies in Higher Education, 26(1): 7-19.

McDowell, L. (2012) “Assessment for learning”, in L. Clouder and Broughan (eds), Improving Student Engagement and Development Through Assessment, London: Taylor & Francis.

Orsmond, P., Maw, S., Park, S., Gomez, S. and Crook, A. (2013) “Moving feedback forward: theory to practice”, Assessment & Evaluation in Higher Education, 38(2): 240-252.

Parkin, H.J., Hepplestone, S., Holden, G., Irwin, B. and Thorpe, L. (2012) “A role for technology in enhancing students’ engagement with feedback”, Assessment & Evaluation in Higher Education, 37(8): 963-973.

Price, M., Handley, K. and Millar, J. (2011) “Feedback: focusing attention on engagement”, Studies in Higher Education, 36(8): 879-896.

Schartel, S.A. (2012) “Giving feedback: an integral part of education”, Best Practice and Clinical Anaesthesiology, 26: 77-87.

Wingate, U. (2010) “The impact of formative feedback on the development of academic writing”, Assessment & Evaluation in Higher Education, 35(5): 519-533.

284 Final power point SHU conference

269 – Does e-learning and mobile technology have a place within HE Learning, Teaching and Assessment? – Jo Marsden

Strand: The technology enhanced course

Anticipated outcomes:
• Discuss the viability and usefulness of iPads within LTA

Session outline:
Technology has brought about irreversible change to the world (Su 2009), and educators have had to acknowledge the reality of technologically-induced change and its constantly evolving pace. This extreme growth in the capabilities of technology, especially mobile technology, alongside increasing affordability, has led to the acknowledgement of a ubiquitous learning tool within higher education (Pollaro and Broussard 2011). As non-traditional methods of education become more established and, as a consequence, informal and flexible learning environments become necessary for students in an ever-connected society, e-learning will play a significant role (Fetaji 2008).

Within this example, e-learning has been utilised as a tool for:
• student engagement
• student learning
• a teaching aid
• assessment support

This has been through the use of:
• iPads
• screencasting
• online feedback
• Google Docs
• Google Forms

The views of the students, teaching staff and support staff have been collected on the use of these tools within different settings. Student feedback on the use of different learning environments and technologies was positive, and the tools assisted in student engagement; however, questions were raised over the impact on student learning. The use of iPads for assessment support assisted in achieving the new assessment regulations of a three-week turnaround and in facilitating online student feedback, which was also favourably received. The examples incorporated a blended approach to teaching and learning for isolated modules within the Department of Sport. As technology usage within Higher Education becomes more prevalent, and staff become more aware of the options and of the ways in which to combine technology into the classroom, the real focus needs to shift to course design and the integration of technology within it.

247 – Open Badges: Supporting Learning and Employability by Recognising Skills Development – Ian Glover

Open Badges were developed in 2010 by the Mozilla Foundation, with support from Peer2Peer University and the MacArthur Foundation. They are designed to be a method of validating and certifying knowledge and experience in a less formal manner than degree certificates and grade transcripts, and have been identified as having a high potential impact on education, likely to be felt within the next 2-5 years (Open University, 2012, p. 16-18). Additionally, they have the potential to be a motivational tool to encourage students to take control of their studies and help emphasise the need for extra-curricular experience and achievement. In this way, Open Badges can support employability strategies by providing students with clear targets that are relevant to industry.

Open Badges support linking to evidence to justify their award, meaning that they can aid students in developing portfolios of work and, by making the badges publicly viewable, provide evidence of their work to prospective employers. Another major benefit of Open Badges is that they help expose the skills and competencies students have acquired through their studies. Students often overlook this aspect of Higher Education because the focus is on grades, yet it is the underlying skills that are often most valued by employers (McDowell, 2013).

This paper discusses the implications of Open Badge adoption for Higher Education, highlights examples of their use, and aims to stimulate consideration of the potential of this recent innovation. Several existing online systems are available, and these are discussed along with some suggestions on possible uses for Open Badges.

References

McDowell, L. M. (2013). Skills and Labour market change. White Paper. http://www.nelep.co.uk/media/2624/linda-mcdowell-skills.pdf [accessed 04 May 2013].

Open University. (2012). Innovating Pedagogy 2012. White Paper. http://www.open.ac.uk/personalpages/mike.sharples/Reports/Innovating_Pedagogy_report_July_2012.pdf [accessed 04 May 2013].

Click to view presentation:  247 Open Badges – SHULT13

246 – Making Connections: Using technology to improve student engagement with feedback – Stuart Hepplestone, Helen Parkin

This paper will present the findings of a research study at SHU to identify technological interventions that might help students make connections between the feedback that they receive and their future learning. Using a qualitative approach, the study worked with ten tutors and twenty students. This was made up of four Level 5 cohorts (one from each faculty) including one module tutor and between three and six students, and an additional six tutors who taught on unrelated modules. The findings of the project cover each aspect of the assessment process from both the staff and student perspective including submission, giving and receiving feedback, storage and future use of feedback. In summary:

• the process of submitting assignments should be easy and convenient, from anywhere and at any time

• any tool should embrace the current variety of feedback practice, yet achieve consistency in publishing feedback alongside the rest of the students’ learning materials

• students store all their feedback in one place; there is a preference for hard copy because of circumstance, i.e. it is easier to print an electronic copy than to convert hard copy to an electronic format

• students were more likely to look at and use feedback at the point of their next assignment if it was online

In light of these findings, a range of technological developments is proposed that might help students establish or better make connections between the feedback that they receive and their future learning, including:

• An end-to-end online marking experience that facilitates ease and efficiency of marking online.

• An online assessment and feedback tool that enables students to store all feedback from all modules in one place, alongside an assessment calendar, advice on how to use feedback effectively, and space for action planning and dialogue around their feedback.

Please click to view presentation:  246 LT conf 2013 – making connections