Discover ways to discuss acceptable uses of artificial intelligence with students.

Our response to generative AI, as with the emergence of Google, Wikipedia, essay mills and so on, needs to include conversations with students about the importance of academic integrity and the effective use of sources and tools in their work. These concerns also matter to students, who are learning and working in the era of AI.

Also, given the pace of change in the capabilities of generative AI and its applications, we are all, educators and students alike, part of a learning community. We should therefore embrace opportunities to work in partnership with students, building trust and transparency in learning and assessment processes that are co-created, iterative, and supportive of critical thinking.

It is important to create a collaborative and open environment that encourages students to ask questions and share their thoughts about generative AI tools alongside discussions about their use.

Here are a few suggestions for how to foster an open dialogue with students:

  • Tell students when and how the use of AI is permitted in assessment (please make sure you check the specific guidance on allowing and acknowledging the use of AI in assessments). 
  • Explain how using generative AI tools may result in academic misconduct, and how they can be used ethically to support learning.
  • Acknowledge that AI detection tools already exist, that more sophisticated ones are in development and that, predictably, a web-based sub-culture of ways to ‘fool’ the detection systems is also growing. Explain that all of these are fallible and cannot be relied upon.
  • Consider co-creating a contract with students or asking students to include a cover page with their submission that details how AI has been used.
  • Engage students in a discussion about the potential benefits and challenges associated with AI. Highlight the positive aspects, such as improved efficiency, innovation and convenience, while also addressing concerns such as job displacement, data privacy and algorithmic bias.
  • Engage students in a discussion of the extent to which using a tool like ChatGPT could be considered dishonest. Where is the line? Is generating a framework cheating? An entire essay? What do they think the quality will be like?
  • Discuss the purpose of learning, the importance of students being proactive and engaged learners, and how assessment supports them in monitoring their own development through their course.
  • Consider setting learning tasks that incorporate AI. For instance, students could use AI to brainstorm and generate ideas, create outlines, refine presentations, and review and edit their work. Ensure you explain how you expect such use of AI to be acknowledged by students in their submissions.
  • Develop activities to build critical AI literacy. For example, students can evaluate whether AI identifies accurate and credible sources, assess what datasets are used and consider whose commercial interests are served and what biases are reproduced or created. For more ideas, Digital Assessment Advisors at UCL have developed an assessment menu of fifty learning activities that require higher order thinking and/or engage students critically with AI.

See more from Artificial Intelligence in teaching and assessment