Guidance on why and how to incorporate generative artificial intelligence (AI) into your teaching and assessment.

What is generative AI?

‘Generative artificial intelligence’ encompasses a range of tools that use AI models to create seemingly new, realistic content (such as text, images, audio and code) in response to questions and prompts from users. AI tools such as ChatGPT and Google’s Bard are built on models trained on large collections of data, much of it sourced from the internet, and generate responses based on that material.

The capabilities and uses of AI tools are developing rapidly. With this in mind, we’ve put together suggestions for why and how you might incorporate generative AI into your teaching and assessment, and engage students in productive conversations about its limitations and acceptable use.

Responding to advances in AI

Generative AI's rapid development has made us all part of a learning community.

We need to positively embrace opportunities to work in partnership with students, to build trust and transparency in assessment processes that are co-created, iterative, and supportive of critical thinking.

This includes supporting students to develop their digital literacy, enabling them to use generative AI tools effectively, ethically and critically. We need to continue to reflect critically on where the boundaries lie between using AI to support learning and assessment and academic misconduct, and then to communicate those boundaries clearly. These are also concerns for students who are learning and working in the era of AI.

What AI is good for

Applications such as Grammarly and Microsoft Word have long used algorithms to suggest next words, correct spellings or offer alternative sentence structures. We don’t use such tools passively. Rather, we learn from them how to communicate knowledge and ideas effectively.

Listed below are some examples, informed by a student-facing resource developed by UCL, of the ways in which generative AI tools expand such learning opportunities:

  • answering questions where answers are based on material which can be found on the internet
  • drafting ideas and planning or structuring written materials or presentations
  • generating ideas for graphics, images and visuals
  • providing explanations
  • summarising text and transcripts
  • reviewing and critically analysing written materials to assess their validity or quality
  • helping to improve grammar and writing structure (especially helpful if English is a second language)
  • providing opportunities to experiment with different writing styles
  • debugging code
  • helping to overcome writer’s block
  • helping students who struggle with written tasks or planning
  • enhancing accessibility technologies such as auto-generated video captions.

The limitations of AI

OpenAI, the creators of ChatGPT, have provided helpful guidance for educators and students. The limitations can be summarised as:

  • AI tools are not intelligent or sentient.
  • While their output can appear confident, plausible and well written, AI tools frequently get things wrong and can’t be relied upon to be accurate. They can sometimes distort the truth or offer arguments that are wrong or don’t make sense.
  • They perform better in subjects that are widely written about, and less well in niche or specialist areas.
  • They cannot currently provide accurate references – they might fabricate well-formatted but fictitious citations.
  • They can perpetuate harmful stereotypes and biases and are skewed towards Western perspectives and people.

AI and cheating

Our response to generative AI should focus on engaging students in conversations about the importance of academic integrity, as well as the effective use of sources and tools in their work.

The University's policy on academic misconduct via ‘personation’ was expanded in January 2023 to include AI-generated text or content. However, as explained in the University’s March 2023 statement on Advances in Technology and Academic Integrity, it does not ban the use of AI. It is up to module convenors to decide how students may be allowed to use AI-generated text or responses in their submissions. If students have been given explicit permission to use AI, its use should be referenced or acknowledged (e.g. for use of examples or images in assignments).

  • Personation Policy, terminology and process

    Our Academic Misconduct Policy now defines personation in written submissions as ‘where someone, or software (unless explicitly permitted in the assessment guidance from the module convenor) other than the student prepares the work, part of the work, or provides substantial assistance with work submitted for assessment. This includes but is not limited to: AI generated text or responses.’

    As with any suspected personation, cases should be reported by module convenors to their School investigating officer. Cases may be referred to an academic misconduct panel, which will consider evidence and make a determination on the balance of probabilities.

    Email queries to academicmisconduct@sussex.ac.uk

  • Talking with students about appropriate use of sources

    Transparency with students about how to use sources, from academic publications to Wikipedia and web-based sources, and about the pros, cons and acceptable use of AI tools, is vital. You might talk with students about:

    • the extent to which using a tool like ChatGPT could be considered dishonest. Where is the line? Is generating a framework or an entire essay cheating? What do they think the quality will be like?
    • how they already use calculators, spell checkers, translators, Google searches and AI tools.
    • how generative AI tools may lead to academic misconduct, and how they can be used ethically and in support of learning.
    • how AI-generated responses get things wrong.
    • how detection tools already exist (with more sophisticated ones in development), alongside a web-based sub-culture of ways to ‘fool’ the detection systems. Explain that all are fallible and cannot be relied upon.
    • acceptable use of AI and other sources in academic work.
  • Enabling acknowledgement of the use of AI in assessment preparation

    If module convenors choose to allow students to use AI this must be made clear in the assessment brief, including guidance on what is acceptable and how it should be acknowledged. This could be through:

    • the Assessment section of your Canvas module site (essential)
    • lecture assessment briefings (in addition to above)
    • the course handbook, if a policy has been agreed at School or Course level (check with your Director for Teaching and Learning).

    It's important to discuss with colleagues and marking teams the implications of such statements for applying marking criteria; for example, if AI use is permitted, students should not be penalised for acknowledging it. Your School may also have a policy about acceptable use.

    The following guides provide useful examples of how to talk to students about acceptable use of AI and how to acknowledge its use in submissions:

  • AI checking tools and Turnitin

    Do not submit student work to online ‘AI checkers’ (there are many). Doing so shares student work with unregulated sites and further feeds the AI tools, and the results of these checkers can be inconsistent.

Teaching and assessing with generative AI

There are many ways that we, as educators, can use AI tools and their outputs to help engage students in the kind of analysis and critical thinking that is key to deep and meaningful learning.

You can create examples or case studies to teach with, create submission exemplars, and have discussions about:

Examples and links

Some examples of teaching and assessing with generative AI are being shared across the Higher Education sector.

These include:

Visit our Teaching with AI Collaborative Padlet to explore more ideas, examples, and links, and to share your own.

Next steps and further guidance

Through innovative, authentic, and appropriate assessment design, as well as staff and student education, we can continue to measure attainment through a wide range of assessments.

Find support and guidance on curriculum and assessment design, including guidance on authentic and flexible assessment models and approaches.

To request workshops, training or additional support, contact your Academic Developer.

 

Page last updated March 2023.
