University of Otago
AI in Higher Education Symposium

AI in Higher Education Symposium Presenters

Jonathan Adams, Toi Ohomai | Te Pūkenga

Jonathan Adams has been the Education Technology Advisor in Whanake Ake | Academic Development at Toi Ohomai for the past six years. He has a background in EdTech design and product development for Samsung and Pearson Education, working with TAFE and higher education clients in Australia and Singapore. He has been leading the trial and evaluation of Cogniti.ai at Toi Ohomai.

Christine Cheyne, Toi Ohomai | Te Pūkenga

Christine is the Regional Kaiako Technology Advisor in Whanake Ake | Academic Development at Toi Ohomai. Previously, she was Research Development Manager at EIT, and Associate Professor and Co-ordinator of the Resource and Environmental Planning course at Massey University. Her research in public policy and engagement has informed thinking around AI design and ethical design issues, and she has been piloting the Cogniti.ai application in teaching and learning.

AI design and policy for education

The increasing availability and testing of Artificial Intelligence in Education (AIED) is highlighting a concurrent gap in, and demand for, ethical design and use. This paper proposes the design thinking framework for use in AI design. Design thinking inverts the current AI development process, which builds the AI application first and then looks to apply it to human problems. In contrast, the human-centred focus of design thinking places empathy and agency with users and with marginalised or affected parties at the heart of the design process. Design thinking shifts the dominant discourse from the technological merits of AI development to the merits of the AI design for the needs and interests of ākonga (students) and kaiako (teachers), as defined by them. It ensures that AI tools are not just feasible but also desirable from end-user perspectives. Applying design thinking principles aligns AI applications intrinsically with ākonga needs. We consider design thinking to be grounded in the human-centric ethical and cultural influences that shape educational technology uptake in Aotearoa New Zealand.


Sophia Li, Manukau Institute of Technology

Sophia Li is a senior lecturer in the School of Education, Manukau Institute of Technology, with fifteen years of experience in early childhood education in China and New Zealand. She teaches early childhood education courses in the degree and diploma programmes.

Her research interests are applied linguistics, early childhood education and tertiary education. Her new passion is to explore the legitimate use of AI in tertiary teaching. 

When we have to use AI: Exploring AI-supported assessment tools in tertiary education 

Generative Artificial Intelligence (GenAI) is claiming its status as an effective and powerful learning tool in tertiary education. However, assessment tools remain dominated by mainstream academic writing-based tasks. This study explores integrating GenAI into assessment designs to strengthen learners’ critical thinking, problem-solving and academic skills. Two examples are provided in this presentation. One assessment seeks insight from New Zealand’s pre-contact literacy (Rāwiri, 2016) and visual anthropology (El Guindi, 2004), using a visual/written dual model (Paivio, 1971) for a brand-new assessment design. The other focuses on integrating GenAI into an existing assessment tool for a smoother adaptation of GenAI in tertiary education settings. The authors argue that the ethical and responsible use of GenAI in assessments improves learners’ AI literacy, addresses equity and accessibility issues in higher education, and is highly beneficial for the ongoing learning of tertiary learners. Based on this ongoing exploration of using GenAI for assessments, the authors argue that new assessment tools are urgent and necessary for the current and future New Zealand tertiary landscape. It is the responsibility of all educators to face the challenges and opportunities brought by GenAI and to create an ethical future for higher education.


Geri Harris, Auckland University of Technology

Geri Harris is a Senior Lecturer in the AUT Business School. She is a technology futurist who brings complex tech concepts (including AI) into engaging classroom discussions. Geri has a unique blend of ongoing professional industry engagement, academic excellence, and a deep understanding of the business, social, environmental and ethical aspects of technology innovation. 

She works with students, organisations, and the public to explore how emerging digital technologies are transforming the way we do business and how people go about their daily lives. She regularly speaks on expert panels and gives presentations addressing responsible uses of AI, particularly in higher education.    

Embracing generative AI in higher education: Navigating the transition in AUT  

The emergence of generative artificial intelligence (GenAI) has prompted significant changes in higher education. We are undergoing a profound transformation, prompting universities to re-evaluate pedagogical and assessment strategies. In this short paper, we adapt the Kübler-Ross five stages of grief to reveal how Auckland University of Technology (AUT) has come to recognise GenAI as an indispensable part of modern education. The paper uses empirical accounts from early AI adopters to show that AUT is at a place of acceptance, embracing AI not just as a tool but as a transformative force. AUT educators are adapting to this new technology, despite traditional resistance to changing teaching practices. Accounts of integrating GenAI into learning, teaching and assessment demonstrate that our educators are taking the lead in guiding this transformation. We are not blind to GenAI’s flaws, which include bias and concerns around transparency and privacy, but we debate, explore and upskill ourselves to address them. Our students keep pushing us forward in this cycle, from anger at GenAI’s arrival to acceptance that it is here to stay, and we must continue to find ways to move forward in collaboration with our learners.


John Davies, Auckland University of Technology 

John is a Senior Manager in the Office of Learning, Teaching and Educational Design (LTED) at Auckland University of Technology. His role involves directing and implementing the university's strategic aims for learning and teaching, with specific responsibility for educational technologies.  

Current priority projects include leading the roll-out of a university-wide Assessment Project, integrating generative AI into curriculum design and delivery, and overseeing the design of digital education experiences at AUT.

Nell Mann, Auckland University of Technology 

Nell is the Director of the Office of Learning, Teaching and Educational Design (LTED) at Auckland University of Technology.

She provides the leadership to drive forward key aspects of university policy to strengthen the quality of learning and teaching through the delivery of prioritised initiatives, projects and services. 

LTED has been the key driver in AUT’s response to AI in assessment, working both at a strategic level in policy development and in the design and operationalisation of its implementation.

A university's comprehensive and integrated response to generative AI in assessment: Preparing for a new educational landscape

The continued development of generative artificial intelligence (GenAI) has caused tertiary institutions to review and evaluate their assessment practices. At Auckland University of Technology (AUT), we have taken a whole-of-institution approach to the systematic integration of GenAI into assessment design. This work is grounded in a new set of assessment principles, policy and procedures that provide a foundation on which to build a sustainable approach to the integration of GenAI into assessment and feedback design. Alongside the policy, a framework has been created to enable teaching staff to make informed short- and longer-term decisions about assessment design. In this short paper, we aim to showcase our approach by focusing on three areas: (1) exploring the broader contexts related to GenAI and its influence on our work at AUT, (2) detailing our specific responses to GenAI and assessment that align with institutional strategy, and (3) anticipating future opportunities and challenges in implementing our approach at scale.


Jonathan Adams, Toi Ohomai | Te Pūkenga 

Jonathan Adams has been the Education Technology Advisor in Whanake Ake | Academic Development at Toi Ohomai for the past six years. He has a background in EdTech design and product development for Samsung and Pearson Education, working with TAFE and higher education clients in Australia and Singapore. He has been leading the trial and evaluation of Cogniti.ai at Toi Ohomai.

Christine Cheyne, Toi Ohomai | Te Pūkenga

Christine is the Regional Kaiako Technology Advisor in Whanake Ake | Academic Development at Toi Ohomai. Previously, she was Research Development Manager at EIT, and Associate Professor and Co-ordinator of the Resource and Environmental Planning course at Massey University. Her research in public policy and engagement has informed thinking around AI design and ethical design issues, and she has been piloting the Cogniti.ai application in teaching and learning.

GenAI chatbots for tertiary students using Cogniti.ai

This report shares initial results from trialling generative AI (GenAI) agents, or chatbots, built with Cogniti.ai in a tertiary setting in New Zealand. The report evaluates the utility and value of GenAI chatbots in the context of personalised learning and equity of access for EdTech and learning technology. The increasing availability and testing of Artificial Intelligence in Education (AIED) is highlighting a concurrent gap in, and demand for, ethical design and use. The speed of change with this technology makes it imperative that we explore the capabilities of newly developed chatbots as quickly as possible and make recommendations for use in our tertiary sector, taking into account the wide range of current chatbot uses described by Liu (2023). This report confirms that AI chatbots are valuable for students in high-stakes testing scenarios, or when used in tandem with a “flipped classroom” delivery.


John Williams, University of Otago

John has been a staff member at the Department of Marketing at the Otago Business School since 1996. Prior to that, he was a student of Marketing at Otago from 1984. His research and scholarly interests include research methods and philosophy of science; consumer behaviour; tourism; business ethics and social marketing; and information technology and its impact on business and society. John teaches Digital Marketing, one of the most popular papers in the Department, which reflects his interests. He is also an adjunct research fellow of the Ehrenberg-Bass Institute for Marketing Science, University of South Australia.

Is GAI any different from other forms of academic integrity?

Generative AI (GenAI) has burst onto the higher education scene dramatically, and while some faculty had been thinking about what this means for learning and teaching before now, others seemingly confronted these issues in practice for the first time in semester 1, 2024. This phenomenon mirrors recently published popular surveys (Fletcher & Neilsen, 2024), which report that although GenAI is in the news, only a very small proportion of the general population (2%) have used the most widely known GenAI service, ChatGPT, even once, and furthermore see little or no use for this technology in their work or private lives. However, higher education faculty who see GenAI as a threat and would like to ban it have limited, if any, ability to do so. The implication is that activities designed to facilitate and assess learning need to be revised or devised such that they can either be completed using GenAI, with that use inherent in the activity, or they cannot be completed using GenAI. However, faculty are cautioned not to assume that GenAI cannot be used in such a case, and to make a genuine effort to test any assumptions.


Patrick Mazzocco, University of Otago

Patrick has a diverse professional background, having worked as a photographer, horse wrangler, tour guide, strength and conditioning coach, corporate program coordinator, and consultant, among other professions. His work has taken him around the world, providing many unique opportunities that have led to a career in learning and development. Collectively, these wide-ranging experiences allow him to apply the information and skills he has acquired towards facilitating transformative learning initiatives and developing career readiness programmes.

Leveraging AI tools for academic writing among L2 doctoral students

Doctoral students, particularly those for whom English is not their first language, face challenges in academic writing, which is troubling given the importance of thesis completion for graduation. In recent years, the use of artificial intelligence (AI) in academia has become increasingly prevalent, providing significant support in research and writing tasks. However, debates persist regarding the extent to which AI should assist students in their thesis writing endeavours. From the perspective of second language socialisation theory, this study explores the perceptions and experiences of non-native English-speaking (L2) doctoral students regarding AI integration in academic writing. Focus group interviews with seven doctoral students in New Zealand and Malaysia revealed challenges faced by these students in relation to English language proficiency, organisation, comparative analysis, critical thinking, writing and reading comprehension. To overcome these hurdles, participants indicated that they use AI tools for proofreading, initiating writing, idea generation and reading research papers. While AI aids in overcoming writing difficulties, over-reliance on such tools may not enhance academic writing skills. This study provides insights into the role of AI in academic writing and the ethical considerations surrounding its use in supporting L2 doctoral students’ academic journeys.


Rachel Spronken-Smith, University of Otago

Rachel Spronken-Smith is a Professor in Higher Education and Geography in the Graduate Research School at the University of Otago. She has been a geography lecturer, academic developer, head and dean, and is now working in researcher development. Her current research interests are in doctoral education and doctoral outcomes, undergraduate research and inquiry, and curriculum change. She has won several university teaching awards and a national tertiary teaching award in 2015. In 2016, she won the TERNZ-HERDSA medal for Sustained Contribution to the Research Environment in NZ and was a Fulbright Scholar in 2018.

Embedding ChatGPT as a tutor and career adviser in an online course

Given a need for improved career development of doctoral candidates, we developed a short online course to support career planning. We brought together a large team with relevant expertise to develop a career readiness course, which was launched mid-2024. This innovative, self-paced, fully online course uses an EdX platform and incorporates video and text content, as well as using ChatGPT as a tutor and career adviser. The objectives of this case study are to discuss how the course was developed – including how ChatGPT was incorporated in the course – and to relay preliminary feedback from participants regarding their experiences of ChatGPT as both a tutor and a career adviser. Although there have been some technical difficulties embedding ChatGPT in the course, the inclusion of ChatGPT as both a tutor and a career adviser has received positive feedback from testers and participants. This case study illustrates how ChatGPT can be used to provide personalised, instant and helpful feedback.


David Parsons, academyEX

David is Research Director at academyEX and Adjunct Professor at the University of Canterbury. His experience includes academic roles at Southampton Solent University in the UK and Massey University in New Zealand, and senior professional roles in the international software industry. He holds a PhD in Information Technology and has research interests in the role of AI and digital technologies in education, agile and lean thinking, curriculum design, and contemporary software development. He has produced over 200 publications, including several books on various aspects of software development. He is a Certified Member of the Association for Learning Technology.

Kathryn MacCallum, University of Canterbury

Kathryn is an Associate Professor of Digital Education Futures at the University of Canterbury. As Director of the Digital Education Futures Lab, she leads a community of researchers exploring digital technologies in education across all contexts, from kindergarten to tertiary education. Kathryn's research focuses on the role and impact of technology in education, particularly how emerging technologies like AI can support the development of digital literacies and how this shifts educational practices and norms. Her innovative work has influenced educational practices in New Zealand and internationally. Kathryn has published extensively and serves as editor-in-chief for several leading international journals.

Supporting tertiary students' critical AI literacy: A practical tool for educators

Ongoing developments in AI challenge educators to ensure that their course delivery integrates relevant aspects of AI literacy within their domains of study. This paper presents an online AI literacy course design evaluation tool, based on an AI literacy framework developed from an international Delphi study. The framework has four capability levels, each addressing six categories of knowledge and skills. The tool addresses the initial level of the four-level framework, “know and understand AI”, since this includes all the foundational components of AI literacy that are important to all learners. An illustrative example of the tool in use is presented, applied to a level 7 micro-credential designed to develop the digital skills of professionals in context. The results, which include recommendations generated by the tool, are discussed against the course learning outcomes to explore how they can be useful for enhancing course design to support AI literacy.


Tim Gander, academyEX

Tim Gander is Postgraduate Director at academyEX and consultant at Futurelearning.nz. His recent publication (Gander & Shaw, 2024) highlighted the urgent requirement for learners to build AI literacy to support access to learning. With over 20 years of experience, Tim has worked extensively with teachers, school leaders, and tertiary organisations to enable innovation and equity. Tim is the founder and editor-in-chief of the peer-reviewed journal He Rourou and runs a community of practice with over 500 educators nationwide that focuses exclusively on AI in education. Tim regularly writes about practice innovations and education leadership on his blog.

Geri Harris, Auckland University of Technology

Geri Harris is a Senior Lecturer in the AUT Business School. She is a technology futurist who brings complex tech concepts (including AI) into engaging classroom discussions. 

Geri has a unique blend of ongoing professional industry engagement, academic excellence, and a deep understanding of the business, social, environmental, and ethical aspects of technology innovation. 

She works with students, organisations and the public to explore how emerging digital technologies are transforming the way we do business and how people go about their daily lives. She regularly speaks on expert panels and gives presentations addressing responsible uses of AI, particularly in higher education. 

Understanding AI literacy for higher education students: Implications for assessment

The level of AI literacy among New Zealand's learners varies significantly, and the use of AI tools in assessments shows a lack of consistency in ethical and responsible use. This paper aims to lay the groundwork for understanding AI literacy and the ethical use of AI tools in assessments. The research has practical implications for educators, policymakers and students. Existing AI literacy frameworks have not been empirically tested, so the true nature of AI literacy in New Zealand is largely unknown. The authors propose a comprehensive study to evaluate and enhance AI literacy among higher education learners, using a mixed-methods approach. This includes conducting a survey to assess students' familiarity with AI concepts, practical application skills, and understanding of ethical considerations within assessments. The survey findings are expected to illustrate the impacts of AI technologies on assessment equity, access and quality, and will contribute to the development of a theory on the effective and ethical use of AI tools in academic assessments.