
‘Not Meant to Exclude’: A Case Exploring Biases and Unintended Discrimination in Educational AI Systems

ETHICAL AI

How to cite this learning scenario

Arantes, J. (2025). Not Meant to Exclude. Case Studies in AI Governance for Education. www.AI4education.org. Licensed under a Creative Commons Attribution 4.0 International License.
Abstract
This case study investigates how well-meaning AI systems can produce discriminatory outcomes when they are designed or deployed without a critical understanding of social context, structural bias, and inclusion. Set in a culturally and linguistically diverse school, the fictionalised—but research-informed—scenario traces the rollout of an AI-powered career pathways tool. Despite aiming to promote equity, the system reinforced stereotypes about gender, race, and disability in its recommendations. The case illustrates the importance of equity audits, inclusive datasets, and collaborative design processes to reduce harm and create AI systems that genuinely support all learners.

AI doesn’t have to mean to discriminate for discrimination to occur. Equity must be designed into every stage of AI development—because intention is not the same as impact.

Not Meant to Exclude

In 2024, Sunrise District School Board introduced PathwayAI, a career exploration tool using AI to match students with potential post-school destinations based on academic history, behaviour reports, and interests. Marketed as a tool to “unlock every student’s potential,” the algorithm quickly became a central part of student wellbeing conversations and subject selection.

But troubling patterns began to emerge. Students with disabilities were frequently steered toward low-skill, manual careers—even when their academic results were strong. Girls were rarely recommended for STEM careers. Students from refugee backgrounds were advised against university, based on incomplete or misunderstood educational histories. When challenged, developers explained that the system was trained on “success patterns” from previous student cohorts—cohorts shaped by historic inequities, migration barriers, and exclusionary policies. Teachers reported frustration at having to “undo” the assumptions students internalised from the tool. Parents raised concerns about algorithmic bias reinforcing disadvantage rather than challenging it.

A public investigation followed, which identified discriminatory outcomes and recommended the urgent redesign of PathwayAI using inclusive design principles, intersectional audit processes, and stakeholder co-design. Following these findings, the district adopted an equity-by-design framework for AI in education. This included partnerships with equity advisors, culturally and linguistically diverse communities, and disability advocates to review datasets and reframe system logic. This case reinforces the need for proactive identification and remediation of bias in AI—and for educators, developers, and policymakers to centre inclusion from the outset.
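PathwayAI’s internals were never made public, so the following sketch is purely illustrative (all names and numbers are hypothetical). It shows the mechanism the developers described: a model fitted to historical “success patterns” absorbs the structural barriers baked into those patterns and reproduces them as recommendations.

```python
# Hypothetical sketch of how "success patterns" learned from historically
# biased cohorts can reproduce that bias. Not PathwayAI's actual code.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
grades = rng.normal(70, 10, n)      # academic results
group = rng.integers(0, 2, n)       # 0 = majority, 1 = marginalised cohort

# Historical labels: past students from the marginalised cohort rarely
# entered STEM, for structural reasons unrelated to ability.
entered_stem = ((grades > 65) & ((group == 0) | (rng.random(n) < 0.1))).astype(int)

model = LogisticRegression().fit(np.column_stack([grades, group]), entered_stem)

# Two students with identical strong grades receive different predictions,
# purely because of group membership.
for g in (0, 1):
    p = model.predict_proba([[85, g]])[0, 1]
    print(f"group={g}, grade=85 -> P(recommend STEM) = {p:.2f}")
```

The model is never told to exclude anyone; it simply learns that group membership predicted past outcomes, which is exactly how intention and impact come apart.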

Overview

This case challenges stakeholders to confront the often invisible ways AI can reinforce discrimination—and explore what it takes to design for justice, not just convenience.

Learning Objectives

Participants will:
• Understand how algorithmic bias can emerge from both design and deployment choices.
• Identify examples of unintended discrimination in educational AI systems.
• Explore frameworks and tools to audit and redress bias in existing AI technologies.
• Develop strategies to embed equity, inclusion, and anti-discrimination principles in AI planning and implementation.

Discussion and Application

Discussion Questions
• How can biases become embedded in AI tools, even when the intention is to promote equity?
• What steps can institutions take to uncover and address discriminatory outcomes in AI systems?
• How can educators support students who experience exclusion or stereotyping through AI-based tools?
• What role can inclusive datasets and lived experience play in shaping ethical AI?
• How might an “equity-by-design” approach reshape the development and use of AI in schools?

Suggested Activity

Conduct a sample equity audit of an AI or data tool currently used in your school or institution. Whose data is represented? Whose isn't?
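As a starting point, a small script can surface the representation and outcome-rate questions the audit asks. The sketch below assumes a hypothetical export with `student_group` and `recommended_stem` columns; real tools will differ, and a quick numerical check is only one input into a fuller equity audit.

```python
# Minimal equity-audit sketch (hypothetical data and column names; adapt
# to whatever export your school's tool actually provides).
import pandas as pd

# Stand-in for a real export of tool recommendations joined with demographics.
df = pd.DataFrame({
    "student_group": ["A", "A", "A", "B", "B", "B", "B", "B"],
    "recommended_stem": [1, 1, 0, 0, 0, 1, 0, 0],
})

# 1. Representation: is each group present in the data at all?
print(df["student_group"].value_counts(normalize=True))

# 2. Outcome rates: how often does each group receive a STEM recommendation?
rates = df.groupby("student_group")["recommended_stem"].mean()
print(rates)

# 3. Disparate-impact ratio: lowest group rate over highest group rate.
#    A common (and contested) rule of thumb flags ratios below 0.8.
ratio = rates.min() / rates.max()
print(f"Disparate-impact ratio: {ratio:.2f}")
```

Numbers like these cannot show who is missing from the data entirely, so pair any script with the qualitative question the activity poses: whose data isn’t represented at all?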

Acknowledgement of Country

We acknowledge the Ancestors, Elders and families of the Kulin Nation, who are the Traditional Owners of the land where this work has been predominantly completed. As we share our own knowledge practices, we pay respect to the deep knowledge embedded within the Aboriginal community and recognise their ownership of Country. We acknowledge that the land on which we meet, learn, and share knowledge is a place of age-old ceremonies of celebration, initiation and renewal, and that the Traditional Owners' living culture and practices have a unique role in the life of this region.
