Academic Integrity
Welcome to the new homepage for all things Academic Integrity/Conduct. This page will link you to both staff and student guidance on the following:
This page is currently under refresh/development. If you have any enquiries about the content, please contact the team.
Conduct and Discipline
- University’s Rules and Regulations – conduct and discipline guidance.
- Student Conduct – student conduct rules and regulations.
- Academic Conduct – for staff managing academic conduct cases.
What is contract cheating?
Contract cheating, and more broadly what Phillip Dawson (2021) calls ‘cognitive offloading’, occurs when students ask other people, or in some cases artificial intelligence, to create assignments for them. Students can also purchase and trade ready-made answers, custom answers, notes and reflective case studies through a variety of commercial and informal routes.
If you want to find out a bit more about this, just have a look for your module or your assignments on sites such as StuDocU, Chegg and Fiverr; or Google one of your recent assignment or exam questions and see what turns up.
Using TurnItIn and Authorship
Although TurnItIn is helpful for detecting conventional types of plagiarism, the normal originality report does not help with allegations of contract cheating, where the work is original but has been created by someone other than the student who submitted it. However, we now have an additional service as part of our TurnItIn licence, called Authorship, which compares all the work a student has submitted to TurnItIn and creates a set of indicators that might flag a risk of contract cheating.
Authorship reports have to be generated by the team in Student Experience, Teaching and Learning (SETL). If you suspect that an assignment submission involves contract cheating, we can support your investigation by generating an Authorship report. Please include the paper ID number from TurnItIn. You do not need to include a copy of the student’s work if it has already been uploaded to TurnItIn; if it has not, you will need to supply us with a Word document of the work, or with the full student ID and the module information.
Visit the TEL Help website’s ‘contract cheating referral form‘ guidance for instructions on how to request an investigation if you suspect contract cheating.
More support available for you
We can also offer training sessions for staff on how to identify cheating and bring an allegation; on how to understand the regulations and become a panel member on an Academic Conduct panel; and on good practice for assessment design to help build a culture of academic integrity. Please contact Jill Lebihan (j.lebihan@shu.ac.uk) if you would like more information on bespoke training sessions for your academic teams.
In addition, there are many useful resources on identifying different types of plagiarism and other kinds of cheating. Here are a few of the most helpful:
- Contract Cheating Checklist (developed by the LSEAIN contract cheating working group)
- SHU Guidance on substantiating contract cheating
- TurnItIn text instructions on interrogating reports
- Sheffield Hallam video instructions on analysing a TurnItIn report.
Assessment and Feedback
There is reference to this within the existing staff resources under:
- Course Design > Design Principles > Assessment and Awards > Academic Integrity – October 2017.
- Supporting students > academic misconduct – links to regulations page.
Assessment 4 Students
There is reference to this within the existing students’ resources under:
- Regulations, policies and procedures > conduct and discipline
- Preparing to submit your work > academic conduct
Generative AI for student, teaching, assessment and general use
Hallam Resources
The Digital Learning Team have produced a Frequently Asked Questions (FAQ) handout on Generative AI for student, teaching, assessment and general use (opens in new window).
Mick Marriott, Course Leader for Computer Science with AI, spoke to Hallam FM about the UK AI Safety Summit, arguing that there is a lot of ‘unnecessary speculation’ around the threats of artificial intelligence to humanity.
Nigel Francis (Cardiff University) and our very own Professor David Smith MSc, BSc, Senior Lecturer and NTF Fellow, Department of Biosciences and Chemistry, HWLS, have written the following guides:
- Generative AI (GenAI) exploded into the media in November 2022 when OpenAI launched its latest version of ChatGPT (GPT-3.5). Since then, there has been a near-daily launch of new tools and features, which shows little sign of slowing down, leaving the higher education sector with little choice but to embrace these new tools. Read their Generative AI in assessment (opens in new window) staff guide.
- The growing use of artificial intelligence (AI) tools offers new ways to aid your learning and research. However, these tools must be used responsibly to maintain academic integrity. Their guidelines on using AI in academic assessments (opens in new window) give you a clear framework for using AI ethically across different types of assessments.
- As generative Artificial Intelligence (AI) tools become more widely available, it is important to use them appropriately, ethically and in a way that upholds academic integrity. AI systems can provide helpful information and boost your studies, but they have several limitations that you should be aware of. Read their student guidance on using generative artificial intelligence for assessment (opens in new window).
External Resources
- Ethan and Lilach Mollick, The Wharton School: 6 x 10-minute bite-size YouTube videos on generative AI – possibly the best starting point.
- UNESCO Quick Start Guide by Mike Sharples.
- A useful little intro to generative AI from the University of Saskatchewan – a learning resource for staff and students.
- Phillip Dawson, Don’t Fear The Robot video.
- You might enjoy Demis Hassabis on Radio 4, The Bottom Line, where he talks about the potential and the risks of AI, and also models ethical citizenship in the use of DeepMind’s research.
Last updated: 3rd November 2023