Learn about the practical applications, limitations and ethical considerations of using AI.
What is generative AI?
‘Generative artificial intelligence’ encompasses a range of tools that use AI models to generate seemingly new, realistic content such as text, images, audio and code in response to questions and prompts from users. Such AI models, which include ChatGPT and Microsoft Copilot, are trained on a large corpus of data, largely material sourced from the internet, and are then taught to generate responses based upon it.
The capabilities of, and uses for, generative AI tools are developing rapidly and will become increasingly embedded in everyday online tools, such as search engines, MS Office and more.
What is generative AI good for?
Applications such as Grammarly and Microsoft Word have been using algorithms for a long time to suggest next words, correct spellings or provide alternative sentence structures. We don’t always use such tools passively. Rather, we can learn from them about how to effectively communicate knowledge and ideas. Similarly, if used critically, the internet provides an easily accessible treasure trove of ideas and inspiration.
Listed below are some examples, informed by student-facing resources developed by UCL and Sydney University, of the ways in which generative AI tools can expand learning opportunities. Note that these tools should be used critically and purposefully. Both students and staff must draw on their own knowledge and expertise to check that the content generated by AI is accurate (see below for the limitations of AI). With this critical perspective in mind, AI can be useful in the following ways.
Using generative AI for learning
This could include:
- providing explanations
- reviewing and critically analysing written materials to assess their validity or quality
- summarising text and transcripts
- helping students who struggle with written tasks or planning
- looking for literature sources.
Using generative AI for creating
This could include:
- overcoming writer's block
- suggesting a structure for a piece of work
- generating ideas for graphics, images and visuals
- getting standards-based feedback
- improving written expression
- helping to improve grammar and writing structure (especially helpful if English is a second language)
- debugging code
- providing opportunities to experiment with different writing styles.
Using generative AI for employability
This could include:
- creating resumes and cover letters
- identifying career opportunities
- making sense of job descriptions
- preparing for interviews.
Accessibility applications
In addition to the uses above, generative AI tools can also help students overcome barriers to inclusivity, including barriers of assumed knowledge, often referred to as the ‘hidden curriculum’. The list below is informed by a 2023 presentation by Melissa Chard Hall (student, Study Skills tutor and ADHD self-advocate), who argues that generative AI can help students:
- overcome ‘mental blocks’
- interpret assessment briefs, learning outcomes or feedback
- engage with and revise from notes or lecture recordings/transcripts
- provide plans for study periods (e.g. ‘what should I do next?’)
- enhance accessibility technologies such as auto-generated video captions.
Applications for teachers and administrators
Generative AI tools can also be helpful in creating:
- multiple-choice questions, answers and feedback
- simulated data sets
- submission exemplars
- fictional case studies
- educational games and characters (personas)
- ideas for essay questions
- learning outcomes
- marking criteria and rubrics
- references
- interview questions
- pub quiz questions, and much more.
The tools are also useful for enhancing student engagement in the classroom.
Limitations of AI
Although generative AI offers numerous benefits, it is important to be aware of its limitations, particularly in educational settings.
The limitations listed below are adapted from helpful guidance for educators and students produced by OpenAI, the creators of ChatGPT:
- AI tools are not intelligent or sentient.
- While their output can appear confident, plausible and well written, AI tools frequently get things wrong and can’t be relied upon to be accurate.
- They are prone to ‘hallucination’, whereby they sometimes make up facts, distort the truth or offer arguments that are wrong or don’t make sense.
- They don’t consistently provide sources for the content they produce.
- They perform better in subjects that are widely written about, and less well in niche or specialist areas.
- They cannot currently provide accurate references – they might fabricate well-formatted but fictitious citations.
- They can perpetuate harmful stereotypes and biases and are skewed towards Western perspectives and people.
This means these tools should be avoided for:
- research, e.g. as a substitute for Google Scholar/Web of Science etc.
- writing from scratch
- unsupervised editing
- tasks one lacks the knowledge to carry out or verify properly.
Ethical implications
Generative AI tools are not neutral. The world is still in the early days of grappling with the ethical and legal impacts and implications of their development and use. Engaging critically with generative AI is, therefore, important for staff and students.
- Exploitation and bias
Generative AI tools have been trained on sources that perpetuate biases (i.e. the internet). This is as much a concern for image generation tools as it is for text-based tools. See, for example, this discussion by Dustin Hosseni in 2023, which illustrates the problematic intersections of racialised gender, race and ethnicity in AI-generated representations.
Such AI tools have also been refined using human labour, which can be exploitative and damaging. See, for example, this January 2023 Time article, which reports on how low-paid Kenyan workers were tasked by OpenAI with labelling textual descriptions of sexual abuse, hate speech and violence.
- Plagiarism and copyright considerations
AI platforms recover patterns and relationships, which they then use to create rules, and then make judgments and predictions when responding to a prompt. As this April 2023 report in the Harvard Business Review explains, “this process comes with legal risks, including intellectual property infringement. For example, does copyright, patent, trademark infringement apply to AI creations? Is it clear who owns the content that generative AI platforms create?” These are legal challenges that are still being resolved.
The legal relationship between intellectual property law and generative AI is currently under review by the UK government. A first step in this process is the creation of a working group of industry representatives from the technology, creative and research sectors and the Intellectual Property Office, which began meeting in June 2023 to draft a voluntary code of practice on copyright and AI.
Practical considerations
When using AI yourself, or guiding students in its use, it is important to protect your own and others' information and rights. If you are entering university data, such as learning and teaching content, into an AI tool, use the data-protected version of Microsoft Copilot accessed via your Sussex account.
Before using generative AI tools
Read the terms of use and, if you decide to use the tool, comply with its terms. While interacting with generative AI, avoid providing the tools with certain kinds of information, in order to protect your own and others' information and rights.
Avoid sharing protected or highly protected data with AI tools
This includes personally identifiable information, names, addresses, unpublished research data and results, biometric data, health and medical information, geolocation data, government-issued personal identifiers, confidential or commercially sensitive material, unpublished exams, and security credentials.
Obtain express permission from the copyright holder
Third-party copyrighted material, including e-texts, should not be entered into generative AI products without permission from the person who controls the copyright.
Where possible, opt out of data collection
Microsoft Copilot, when accessed via your Sussex account, will ensure your data is protected and is the recommended AI tool. For other AI tools, you will need to opt out of data collection where possible.
Find out how to access help and support, and find additional links and examples of practice, on the main AI in Teaching and Assessment page.