Learn about the practical applications, limitations and ethical considerations of using AI.

What is generative AI?

‘Generative artificial intelligence’ encompasses a range of tools that use AI models to generate seemingly new, realistic content such as text, images, audio and code in response to questions and prompts from users. Such AI models, which include ChatGPT and Microsoft Copilot, are trained on a large corpus of data, such as material sourced from the internet, and then taught to generate responses based upon it.

The capabilities of, and uses for, generative AI tools are developing rapidly, and they will become increasingly embedded in everyday online tools such as search engines and MS Office.

What is generative AI good for?

Applications such as Grammarly and Microsoft Word have been using algorithms for a long time to suggest next words, correct spellings or provide alternative sentence structures. We don’t always use such tools passively. Rather, we can learn from them about how to effectively communicate knowledge and ideas. Similarly, if used critically, the internet provides an easily accessible treasure trove of ideas and inspiration.

Listed below are some examples, informed by student-facing resources developed by UCL and Sydney University, of the ways in which generative AI tools can expand learning opportunities. Note that these tools should be used critically and purposefully. Both students and staff must draw on their own knowledge and expertise to check that the content generated by AI is accurate (see below for the limitations of AI). With this critical perspective in mind, AI can be useful in the following ways.

Using generative AI for learning

This could include:

  • providing explanations
  • reviewing and critically analysing written materials to assess their validity or quality
  • summarising text and transcripts
  • helping students who struggle with written tasks or planning
  • looking for literature sources.

Using generative AI for creating

This could include:

  • overcoming writer's block
  • suggesting a structure for a piece of work
  • generating ideas for graphics, images and visuals
  • getting standards-based feedback
  • improving written expression
  • helping to improve grammar and writing structure (especially helpful if English is a second language)
  • debugging code
  • providing opportunities to experiment with different writing styles.

Using generative AI for employability

This could include:

  • creating resumes and cover letters
  • identifying career opportunities
  • making sense of job descriptions
  • preparing for interviews.

Accessibility applications

In addition to the uses above, generative AI tools can help students overcome barriers to inclusion, including the barrier of assumed knowledge often referred to as the ‘hidden curriculum’. The list below is informed by a 2023 presentation by Melissa Chard Hall (student, study skills tutor and ADHD self-advocate), who argues that generative AI can help students:

  • overcome ‘mental blocks’
  • interpret assessment briefs, learning outcomes or feedback
  • engage with and revise from notes or lecture recordings/transcripts
  • provide plans for study periods (e.g. ‘what should I do next?’)
  • enhance accessibility technologies such as auto-generated video captions.

Applications for teachers and administrators

Generative AI tools can also be helpful in creating:

  • multiple-choice questions, answers and feedback
  • simulated data sets
  • submission exemplars
  • fictional case studies
  • educational games and characters (personas)
  • ideas for essay questions
  • learning outcomes
  • marking criteria and rubrics
  • references
  • interview questions
  • pub quiz questions, and much more.

The tools are also useful for enhancing student engagement in the classroom.

Limitations of AI

Although generative AI offers numerous benefits, it is important to be aware of its limitations and the considerations that come with its use, particularly in educational settings.

The limitations listed below are adapted from helpful guidance for educators and students produced by OpenAI, the creators of ChatGPT:

  • AI tools are not intelligent or sentient.
  • While their output can appear confident, plausible and well written, AI tools frequently get things wrong and can’t be relied upon to be accurate.
  • They are prone to ‘hallucination’, whereby they sometimes make up facts, distort the truth or offer arguments that are wrong or don’t make sense.
  • They don’t consistently provide sources for the content they produce.
  • They perform better in subjects that are widely written about, and less well in niche or specialist areas.
  • They cannot currently provide accurate references – they might fabricate well-formatted but fictitious citations.
  • They can perpetuate harmful stereotypes and biases and are skewed towards Western perspectives and people.

This means these tools should be avoided for:

  • research, e.g. as a substitute for Google Scholar/Web of Science etc.
  • writing from scratch
  • unsupervised editing
  • tasks one lacks the knowledge to carry out or verify.

Ethical implications

Generative AI tools are not neutral. The world is still in the early days of grappling with the ethical and legal impacts and implications of their development and use. Engaging critically with generative AI is, therefore, important for staff and students. 

Practical considerations

When using AI yourself, or guiding students in its use, it is important to protect your own and others' information and rights. If you are entering university data, such as learning and teaching content, into an AI tool, use the data-protected version of Microsoft Copilot accessed via your Sussex account so that the data remains protected.

Before using generative AI tools

Read the terms of use and, if you decide to use the tool, comply with them. When interacting with generative AI, avoid providing the tool with certain information in order to protect your own and others' information and rights.

Avoid sharing protected or highly protected data with AI tools

This includes personally identifiable information, names, addresses, unpublished research data and results, biometric data, health and medical information, geolocation data, government-issued personal identifiers, confidential or commercially sensitive material, unpublished exams, and security credentials.

Obtain express permission from the copyright holder

Third-party copyrighted material, including e-texts, should not be entered into generative AI tools without permission from the person who controls the copyright.

Where possible, opt out of data collection

Microsoft Copilot, when accessed via your Sussex account, protects your data and is the recommended AI tool. For other AI tools, you will need to opt out of data collection yourself, where possible.

For help and support, as well as additional links and examples of practice, see the main AI in Teaching and Assessment page.

See more from Artificial Intelligence in teaching and assessment