Read guidance on assessment design to better support student digital literacy and mitigate academic integrity challenges associated with generative AI.
Review existing assessment practices
Try out Bing, ChatGPT, Midjourney or other generative AI tools for yourself. Try entering a sample past exam question (not a real one), problem set or writing task and see what the tool comes up with.
Consider your current assessments critically. Does the current design really measure what you want it to measure? How else might you assess students’ learning? If needed, consider amending your exam questions or written assessment tasks to make them more difficult for an AI tool to answer (see below).
Make clear to your students what is acceptable use of AI in the context of your assignments, and what is not. Talk with your students about the uses of generative AI that would constitute academic misconduct.
Explore our AI web pages for further guidance on AI in teaching and assessment, check dates for workshops and contact your Academic Developer or Learning Technologist if you would like more support.
Before you make changes
Before making changes, discuss any proposed changes with your course convenor or Board of Study.
If implementing short-term changes, ensure all adaptations align with existing assessment criteria and learning outcomes.
If implementing long-term changes, check AQP’s timetable for compliant ‘material/definitive’ course changes and their guidance on ensuring that any changes made comply with Competition and Markets Authority (CMA) requirements.
What assessment types are most at risk?
Generative AI tools expose a wide range of current assessment modes to intentional or accidental academic misconduct. The assessment modes below are each summarised with their vulnerabilities.
- Essays
Tools like ChatGPT will produce human-like text outputs that students could submit as their own without engaging in any critical thinking. Iterative prompting can dramatically increase the output quality, even mimicking the student's writing style. Generative AI can be used to redraft existing text to bypass plagiarism checkers, with tools like Consensus and Jenni being capable of inserting genuine research citations.
Note: we strongly advise against using AI “detectors” to check students’ work for personation, as these detectors can be biased against non-native English speakers. In addition, by inputting student work into detection tools, you are feeding generative AI with material specifically relevant to your subject.
- Lab reports and data interpretation
Generative AI is capable of fabricating data that fit defined research trends. Some AI tools allow CSV or PDF files to be uploaded, which opens the scope for automated data analysis and interpretation, bypassing the critical thinking and understanding typically required. Many tools, including ChatGPT, Claude and Bard, can write or debug code in common programming languages like Python or R to visualise the analysed data. Analysis can then be performed using natural-language prompts to interrogate the data.
- Paper critiques
PDFs can be uploaded to some generative AI tools, which can then be asked to compare and contrast the studies. Some tools allow PDFs to be interrogated with guided questions, or can rephrase the contents in a different manner, for example producing summaries for a lay audience. Such tools can also suggest future work, although the suggestions may not be practical.
- Literature reviews
Tools like Claude allow multiple papers to be uploaded in various formats. The tools can then be asked to summarise the papers for common themes and ideas, producing a synthesised review. Here, again, iterative prompting can dramatically enhance the quality of the output.
- Research and grant proposals
As with paper critiques, generative AI can synthesise ideas from across multiple papers, for example to suggest potential experimental designs. Suggestions are likely to be accurate, but experimental approaches may not be optimal or may be impractical from a time or cost perspective.
- Presentations
Students can use generative AI to write a script for a presentation, and tools like Gamma can create entire presentations from simple text inputs. If the presentation is not delivered in person, it is hypothesised that generative AI could be used to synthesise a student’s speech in real time during a remote oral examination, allowing someone else to take the exam on their behalf. Some tools can also make it look as if someone is looking directly at a camera while they are reading from a script prepared by generative AI.
- Podcasts
As with presentations, generative AI could be used to fabricate a student’s voice, although this remains speculative and technically challenging at this point. Generative AI can, however, readily create a conversational script between two speakers.
- Posters
Generative AI can suggest layouts, provide an outline, write content, and fabricate, analyse and visually present data for posters.
Changes you can make now
The changes suggested here focus on actions you can take now to make your existing assessments more AI resilient. All can be made without changing your existing formative or summative assessment mode, timing, or weighting. As such, they do not need to be approved by your Board of Study.
Nevertheless, before making changes, it’s recommended you discuss them with your course convenor or Board of Study, especially if you intend to allow the use of AI in assessment.
It is important to ensure that your formative assessments continue to give students the opportunity to practise elements of the summative assessments, be that subject matter or submission types and technologies. Adjust your formative assessments accordingly.
Note that making such changes does not remove the need to consider more significant, longer-term changes to approaches to teaching, learning and assessment. It is recommended that this is done in parallel with making smaller, more immediate changes.
- 1. Discuss AI and academic integrity
When discussing AI, explain that the default position of the University of Sussex is that the use of AI in assessments is not permitted unless specified by the module convenor.
If you have chosen to incorporate or allow the use of AI in an assessment, you should make clear in the assessment brief what is permissible and explain how you want students to acknowledge the use of AI. This brief should be accessible on the assignment information page in Canvas.
Explore the uses and limitations of AI with students and colleagues in a context of transparency. Such conversations should be dynamic and collaborative and make explicit links to discipline specific challenges or opportunities of technological advancements.
You should also discuss the purpose of your assessments and the value for students’ own learning and development of engaging with them.
See also: Talking with students about AI
- 2. Review writing assessments
AI can easily replicate the predictable structure of an essay and provide contextually relevant information and examples. Make it more difficult for students to rely on AI-generated material by:
- creating writing assignments based on scenarios
- asking students to adapt information for various audiences and purposes
- inviting students to reflect on their personal experiences, opinions or observations.
See our page on 'Developing writing assignments' in the 'Assessment design' section of our website for additional guidance and examples.
- 3. Revise exam questions
- pose hypothetical questions with no simple answer.
- pose hypothetical scenario-based questions.
- ask students to interpret information from a range of sources.
- require students to draw on specific sources, e.g. a chapter in the course textbook.
- ask students to evaluate AI responses (helping to develop students’ critical AI literacy).
Students will need to be given the opportunity to practise responding to novel types of exam questions.
- 4. Revise your MCQs
- consider including images in the question (ensuring the provision of accurate alt text)
- reduce the number, but increase the complexity, of your questions
- present questions that require the student to apply a concept or principle to an up-to-date scenario or case study
- use distractors that are all plausible, consistent in content and structure, and that share important information with the correct option, so that students must evaluate all options to identify the correct answer
- for mathematical or formulaic questions, consider adding a final file-upload question requiring the student to upload a file or image of their working.
- 5. Revise your assessment format (not mode)
For example, require students to upload a film of themselves talking through their approach and solution (this won’t work with all submission points, so you should check with your learning technologist).
- 6. Incorporate peer evaluation into group assessments
- engage students in reflection on their own and their peers’ contributions to group submissions
- can be used with existing group assessments without changing mode
- the new Buddycheck tool, integrated into Canvas, supports peer evaluation and feedback.
If using AI in teaching and assessment
This is a fast-moving space and AI will soon be embedded in our everyday tools. Nevertheless, if using generative AI in your teaching or assessment please take into consideration that:
- most free-to-access applications require users to create an account
- issues of access and equity can arise, particularly when paid-for versions of AI tools are available
- it is important to consider inclusive teaching practice: how might you reduce or remove the barriers identified above, for example by pairing or grouping students, or by preparing generative AI content in advance for them to review or critique?
- it is advisable, therefore, that staff and students use the protected web version of Microsoft Copilot, which can be accessed by anyone with a Sussex account and helps to protect user data (although you will need to remind students to save their chats if they need to refer to them at a later date).
The AI and academic integrity page provides further guidance on ensuring that permissions for the use of AI in all modules are communicated clearly and consistently, and on how students might be asked to acknowledge the use of AI.
Longer term changes
“AI provides a much-needed catalyst to re-think how and why we assess within education.”
In their response to the Department for Education consultation on generative AI in education, the QAA argue that the rise of generative AI provides a further incentive to re-imagine the purpose and methods of assessment in education. This is also an opportunity to interrogate why and how we assess students, both to ensure assessments are valid and robust, and that they adequately prepare students for life after graduation.
This might include, for example:
- updating learning outcomes and associated criteria to reflect capabilities that graduates will need in an AI-enabled world
- curriculum design which focuses on assessment for (rather than ‘of’) learning, e.g. applying Authentic Assessment principles
- designing assessments that utilise research skills, new technologies, fact-checking and critical thinking, delivered through alternative means including vivas, presentations or portfolios.
- designing in opportunities for students to develop their feedback literacy
- assessing the process by which a final product / submission was generated, rather than the product itself
- making assessment more flexible.
This includes creating space to develop students’ and staff’s digital literacy (specifically AI literacy in this case). Recent advancements in consumer technology serve to remind us of the need to ensure we’re developing critical and transferable digital skills. They also demand that we review and, where necessary, reimagine assessment strategies.
For an overview of assessment strategies, and their strengths and weaknesses, see the QAA advice on developing sustainable assessment strategies in the era of ChatGPT.
For options to consider when making changes to assessment, see UCL’s assessment menu of fifty learning activities that require higher order thinking and/or engage students critically with AI.
Find out how to access help and support, and find additional links and examples of practice, on the main AI in Teaching and Assessment page.
See more from Artificial Intelligence in teaching and assessment