
GenAI Tells a Different Story: Teaching Science when AI Hallucinates

ETHICAL AI

How to cite this learning scenario

Arantes, J. (2025). GenAI Tells a Different Story. Case Studies in AI Governance for Education. www.AI4education.org. Licensed under a Creative Commons Attribution 4.0 International License.
Abstract
This scenario explores the epistemic and ethical tensions arising as GenAI becomes embedded in pre-service science education. When a GenAI system presents fabricated or biased scientific claims as fact, pre-service teachers must critically interrogate the outputs, confronting the shifting nature of ‘truth’ and ‘objectivity’ in science teaching. The scenario foregrounds technical agonism as a lens to examine how future educators negotiate competing accounts of knowledge, evidence, and authority. It invites inquiry into the evolving skills, literacies, and ethical dispositions required to teach science in a post-truth, GenAI-mediated world.

“In a world where even our machines can invent convincing falsehoods, the mark of a true science educator is not in knowing all the answers, but in teaching students to question, verify, and think together.”

Teaching Science when AI Hallucinates

In a methods class, pre-service science teachers are tasked with designing lesson plans using a popular GenAI tool. Maria, a student, prompts the GenAI to generate an explanation of photosynthesis suitable for Year 7. The output is polished but subtly incorrect, presenting a fictional compound as critical to plant energy conversion. During a class review, some students trust the output because “AI knows more than us,” while others spot inconsistencies based on their foundational knowledge.

Before the lesson, the class had read Arantes (2024), "Understanding Intersections Between GenAI and Pre-Service Teacher Education: What Do We Need to Understand About the Changing Face of Truth in Science Education?" (Journal of Science Education and Technology, 1-12). The article introduced the concept of technical agonism: the ongoing struggle or debate between different sources of knowledge, especially when technology is involved. In simple terms, technical agonism means not taking any one answer (even from AI) at face value, but instead actively questioning, comparing, and weighing competing explanations and sources. It is about embracing disagreement and discussion as a natural part of understanding, rather than looking for one easy ‘correct’ answer.

Prompted by this reading, the lecturer intervenes, guiding the group to use technical agonism in practice: “How do we verify scientific ‘facts’ in an era when GenAI can confidently fabricate plausible-sounding content? Who, or what, becomes the arbiter of truth when machine-generated authority challenges established knowledge and evidence?” The group debates objectivity, the limits of evidence, and the role of critical discussion when GenAI is part of classroom knowledge. They negotiate their responsibilities as future science educators, learning that healthy skepticism, dialogue, and collaborative inquiry are essential skills, not only for themselves but for the students they will one day teach.

Research Topics

• Epistemic agency and critical literacy in GenAI-mediated science education
• Technical agonism and its role in teacher preparation for AI-rich classrooms
• The impact of GenAI hallucinations on pre-service teachers’ understanding of scientific objectivity
• Gender, authority, and trust in AI-generated science content
• Pedagogical frameworks for teaching ‘post-truth’ science in teacher education programs
• The ethics of GenAI use in curriculum design and knowledge validation

Research Questions

• How do pre-service teachers evaluate and respond to GenAI-generated scientific inaccuracies in lesson planning?
• What skills and dispositions do teacher educators identify as essential for navigating epistemic uncertainty introduced by GenAI?
• How does technical agonism shape discussions of evidence, authority, and objectivity among pre-service science teachers?
• In what ways do GenAI hallucinations affect pre-service teachers’ trust in technology versus traditional sources of scientific knowledge?
• How are gendered perceptions of expertise and authority reinforced or challenged in AI-mediated science education settings?

Data collection:

• Facilitated discussions using vignettes (like the above) to prompt reflection on truth, evidence, and authority.
• Systematic examination of lesson plans created with and without GenAI assistance to identify patterns in fact-checking and citation.
• Reflective diaries in which pre-service teachers detail their decision-making processes when confronting conflicting or inaccurate AI-generated content.
• In-depth interviews with pre-service teachers and educators about their experiences, beliefs, and strategies for navigating GenAI in science education.
• Quantitative measures to assess changes in epistemic beliefs, trust in AI, and critical digital literacy before and after exposure to GenAI-based tasks.

Acknowledgement of Country

We acknowledge the Ancestors, Elders and families of the Kulin Nation, the Traditional Owners of the land where this work has been predominantly completed. As we share our own knowledge practices, we pay respect to the deep knowledge embedded within the Aboriginal community and recognise their ownership of Country. We acknowledge that the land on which we meet, learn, and share knowledge is a place of age-old ceremonies of celebration, initiation and renewal, and that the Traditional Owners' living culture and practices have a unique role in the life of this region.
