Introduction
One of the most important things we can do for our students is to have a conversation with them at the beginning of the course about AI:
- How AI is likely to be used in their work life.
- What use of AI is appropriate and relevant in your course.
- How they can use it in assessments.
Provide detailed explanations of permitted uses of AI in your assessment instructions (refer to the Guidance on AI in assessment page). Then have a conversation at the end of the course to find out how students have used AI and what their reflections are.
You should be having discussions within your School/Faculty to ensure that programmatic outcomes are being achieved and that there is a scaffolded approach to the use of AI in assessments across the program. What matters most is that you have sufficient evidence that students are doing the learning themselves and achieving the learning outcomes of your course. You should also keep up to date with advice from external accreditation bodies on the impact of generative AI on assessment. Communication is key: students need to understand how and why they should (or shouldn't) use AI in your course.
You can also ask questions and share ideas with colleagues across the university in the ChatGPT/AI Ed Community – see the AI Upskilling and events page.
Prompt techniques and ideas to try in your teaching
Generative AI is likely to be helpful for teaching academics in creating class and course outlines, summarising foundational concepts, developing assessment questions and rubrics, and providing general feedback.
The following guides show academic staff how to harness generative AI as a teaching assistant, with the aim of streamlining workload and improving efficiency. However, because generative AI outputs are inherently unpredictable, it is crucial to independently verify all generated content before you use it.
How to use GenAI effectively – the art of prompting with examples
Adobe Firefly guide
Learn how to create images and improve creative workflows using text prompts.
Generating assessment rubrics with AI
Using AI for Rubric Writing
Helena Pacitti, Nexus Fellow and Lecturer, introduces the benefits and limitations of using generative AI to write assessment rubrics.
Part 1: Utilising Generative AI to Design Assessment Rubrics
Part 1 focuses on using generative AI to structure the layout of a rubric and to review and refine rubric criteria and weightings.
Part 2: Utilising Generative AI to Design Assessment Rubrics
Part 2 focuses on reviewing and adjusting performance descriptors: ensuring appropriate use of terminology, providing enough detail for students to demonstrate their learning, and reconciling any conflicting descriptions.
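If you prefer to experiment programmatically rather than through a chat interface, the sketch below shows how a rubric-drafting prompt of the kind described in Parts 1 and 2 could be sent to a model. It is a minimal illustration only, not the workflow used in the videos above: it assumes the OpenAI Python SDK and an API key are available, and the essay task, criteria, weightings and model name are placeholders you would replace with your own.

```python
# Illustrative sketch only. Assumes the OpenAI Python SDK is installed
# (pip install openai) and OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Placeholder assessment task, criteria and weightings - replace with your own.
prompt = (
    "Draft a marking rubric for a 1,500-word undergraduate essay.\n"
    "Use four criteria: argument (40%), use of evidence (30%), "
    "structure (20%) and written expression (10%).\n"
    "Provide four performance levels (High Distinction, Distinction, Credit, Pass) "
    "and write a concise descriptor for each criterion at each level.\n"
    "Present the rubric as a table."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name; use whichever model you have access to
    messages=[
        {"role": "system", "content": "You are an experienced university educator."},
        {"role": "user", "content": prompt},
    ],
)

# Generated drafts are unpredictable: review and edit the rubric before using it.
print(response.choices[0].message.content)
```

The same prompt can simply be pasted into ChatGPT or Microsoft Copilot; whichever route you take, the generated draft still needs independent review before it reaches students.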
Enhancing equity and accessibility with AI
Supporting Universal Design for Learning using AI tools
Universal Design for Learning (UDL) expert Prof. Terry Cumming, Professor of Special Education in the School of Education and UNSW Scientia Education Fellow, discusses the role of UDL in creating inclusive educational environments, as well as how AI can support UDL implementation.
Creativity for Accessibility: Using AI tools to implement UDL
Prof. Terry Cumming discusses how getting creative with AI tools can make teaching and assessment more accessible from both the teacher and the student perspective.
Creating Lightbulb Moments: Implement Universal Design for Learning with AI
Lucy Jellema (Educational Developer, Equity) explores how teachers can use AI to present course materials in accessible formats, help students see how course content relates to their own lives, and brainstorm activities that can engage classes of all sizes.
Enhancing learning for neurodiverse learners with AI
Prof. Terry Cumming discusses how AI technology can enhance learning for neurodiverse learners.
Case studies
Explore how we can harness the power of AI to enhance learning for our students through these case studies written by UNSW academics.
Implementing Generative AI with Students Using the RECESS Model
Nexus Fellow and Senior Lecturer Andrew Dymock explains how genAI can be implemented systematically with students, using the RECESS model to promote learning.
GenAI in first year
Dr Sharon Aris presents a case study on introducing genAI use into a first-year social sciences research course.
Minimising the use of generative AI through simulated role-play
Discover how Associate Professor Andrea Benvenuti of the School of Social Sciences made use of simulated role-play to reduce the use of generative AI, combat plagiarism and prepare students for real-world policy-making.
Critiquing an AI output to strengthen communication skills
Learn how Associate Professor Jenny Richmond from the School of Psychology explored generative AI with her students, critiquing its output over a series of rounds with the aim of strengthening their communication skills.
Collaborative project with personal learning reflection
Learn how Associate Professor Zixiu Guo and Associate Professor Carmen Leong rethought their recent assessment to minimise the impact of generative AI responses.
Generative AI as an "assisted format" for generating a research rationale
Learn how EF Lecturer Gee Chong Ling and Associate Professor Jai Tree from the School of Biotechnology and Biomolecular Sciences joined forces to explore AI as a planner and potential learning assistant, helping their students understand the data generated from a science experiment.
Using ChatGPT for support in statistical computing
Associate Professor Sam Kirshner, from UNSW's School of Information Systems and Technology Management, discusses how to use ChatGPT for support in statistical computing.
Collaborative discussions and experimentation with students
Professor Andy Baker from UNSW's School of Biological, Earth and Environmental Sciences discusses how he experimented with AI in a course.
Using Elicit as a research tool
Dr May Lim, Senior Lecturer in UNSW's School of Chemical Engineering, highlights how she uses Elicit as a research tool in her course.
Envisioning the future – using data insights for student learning and support
Professor Simon McIntyre, Director of Educational Innovation, outlines the vision behind UNSW's ambitious and exciting project to use data, learning analytics and innovative technologies for a much more personalised student experience. Learn more about the project, including practical perspectives from an early adopter, Associate Professor Lynn Gribble.
Examples
Students already have access to a wide range of software that can improve their work. Whether the software/service is appropriate to use depends on the nature of the assessment and the instructions they have been given. Possible legitimate uses include:
- Checking spelling, punctuation and grammar
- Adjusting tone or style, e.g. removing colloquialisms and creating a professional tone
- Linking articles via citations and creating summaries
- Using speech-to-text tools via dictation
- Automatically translating text into another language
- Organising and outlining text passages
Similarly, generative AI can be used by students for a range of purposes, such as:
- Suggesting research topics
- Explaining course content in an easily understandable way
- Producing an outline as a starting point for an assignment
- Proofreading and correcting text in a similar way to grammar tools
- Giving feedback on style and content
- Generating an answer that the student can critically reflect on
- Generating images or videos
All of these uses for generative AI could be legitimate or illegitimate, depending on what is explicitly allowed in the course.
AI software is now able to generate substantial portions of the work required for certain assessment types (for example, answers to short-answer questions, multiple-choice questions, coding tasks and numerical working). Submitting purely AI-generated work, or failing to attribute the role of AI in student work where attribution is mandated, is not permissible under the University's academic integrity policies.
For these reasons, students will need explicit instructions on when and how their GenAI usage should be acknowledged in referencing (see Guidance on AI in assessment for more information). They will also need a clear understanding of the uses and limitations of different GenAI tools, especially as AI can produce misinformation or 'hallucinations' (see Limitations and pitfalls for more information).
How do UNSW students feel about AI?
How do UNSW students use AI tools?
Limitations and pitfalls
Prior to using generative AI tools such as ChatGPT or Microsoft Copilot, it is crucial to consider both their implications and inherent limitations. These can include:
- Privacy: Avoid using GenAI for any task that requires you to provide a person's personal or sensitive information, or for detecting AI-generated content. Use Microsoft Copilot with commercial data protection when analysing UNSW data with generative AI (e.g., when providing feedback on student work), as this ensures the data remains confidential.
- Hallucination: AI tools may struggle to admit when they lack information and may offer an answer anyway. They can also struggle to cite sources and may create fake sources.
- Biases: Despite the common perception of GenAI tools as neutral, they can harbour biases, including:
  - Cultural bias: GenAI tools are predominantly trained on datasets composed mainly of U.S. and British content. Gender and racial biases are also present in GenAI tools.
  - Confirmation bias: GenAI algorithms tend to favour content aligned with their pre-existing knowledge and can overlook sources that challenge or contradict their current understanding.
  - Authority bias: Algorithms attribute higher accuracy to the opinions of individuals deemed experts in a field rather than prioritising objective truth.
- Incomplete data: ChatGPT's GPT-3.5 model has only been trained on data up to a certain point in time (September 2021). Other tools, such as GPT-4o and Microsoft Copilot, can search current web pages.
- Environmental impact: Training GenAI involves significant energy consumption and carbon emissions.
- Copyright and ownership: The copyright and ownership status of generated materials (e.g., text, images, videos, code) remains unclear.
- Misinformation: The use of GenAI to create and disseminate false information (including deepfakes) raises concerns. Students must be educated to critically evaluate generative AI outputs.
- Equity: While GenAI can enhance access, especially as a personalised tutor, limited availability of necessary technology (e.g., laptops, internet) in financially constrained communities worsens existing educational inequities if not addressed.
- Ethical use and development of GenAI in business: With discussion still ongoing about the respective responsibilities of government and business, this remains an unclear area.
With the increasing use of AI in society, it is likely that university courses will need to include the study and understanding of GenAI and its ethical implications. It is important to consider the frequency and placement of GenAI content within a curriculum, so that students are not overwhelmed with similar activities across all their courses. A programmatic approach to incorporating GenAI may be considered within your Faculty.
For more information, please see Ethical and Responsible Use of Artificial Intelligence at UNSW and Academic Integrity.