
‘Justice Deferred’: A Case Exploring Equity and Justice in AI-Driven Educational Decision-Making

ACCOUNTABILITY & LEADERSHIP IN AI

How to cite this learning scenario

Arantes, J. (2025). Justice Deferred. Case Studies in AI Governance for Education. www.AI4education.org. Licensed under a Creative Commons Attribution 4.0 International License.
Abstract
This case study explores how AI-driven decisions—while efficient and scalable—can contradict the principles of educational equity and justice if not designed and implemented with care. Set in a national education reform initiative, this fictionalised case illustrates how a centralised AI tool designed to automate school resourcing and teacher allocation inadvertently entrenched inequalities in low-income, remote, and historically underfunded communities. Despite being data-informed, the system failed to acknowledge structural disadvantage, cultural context, and Indigenous self-determination. This case highlights the need for AI systems to be accountable to justice—not just to performance metrics.

Equity isn’t just a principle—it’s a responsibility. If AI decisions reproduce structural injustice, they are not neutral. They are complicit.

Justice Deferred

As part of a 2024 national reform initiative, the Department of Education implemented Resourcely, an AI-driven system to allocate teaching staff and funding based on enrolment numbers, student performance, and historic attendance trends. The tool was pitched as a way to improve fairness and efficiency by removing human discretion and bias from decision-making. But within the first year of deployment, several First Nations communities and rural schools reported sharp drops in support. The AI had interpreted historic underperformance and absenteeism as signs of reduced need—rather than as symptoms of entrenched disadvantage and systemic neglect. These schools were allocated fewer staff, received less learning support, and were deprioritised in digital upgrades.

Community leaders, educators, and local principals pushed back, arguing that the system rewarded past advantage while penalising those already marginalised. The AI couldn’t read cultural context, intergenerational trauma, or the need for community-led recovery. Teachers reported burnout, students disengaged, and trust in the system deteriorated.

An inquiry found that while the algorithm had functioned as designed, it failed to centre the principles of educational justice. A new framework was introduced, requiring all AI-based education reforms to undergo an Equity Impact Review, co-designed with communities, culturally informed stakeholders, and rights-based organisations. Resourcely was updated to include contextual indicators—like remoteness, cultural load, and histories of funding exclusion—and to ensure that AI-supported decisions helped to close gaps, not widen them. This case reminds us that justice must be intentionally designed into AI systems. Efficiency alone is not enough—especially in systems meant to serve the public good.
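The failure mode described above can be made concrete with a small sketch. The function names, weights, and school data below are entirely illustrative (they are not Resourcely's actual model): the point is that a naive "need" score built only from enrolment, performance, and attendance treats the symptoms of disadvantage as evidence of reduced need, while a context-adjusted score inverts that signal.

```python
# Hypothetical sketch of the failure mode in the case narrative.
# All names, weights, and data are illustrative assumptions.

def naive_need_score(school):
    # Low attendance and low performance REDUCE the score, so a
    # historically neglected school ranks as "lower need".
    return (school["enrolment"] * 1.0
            + school["performance"] * 0.5
            + school["attendance"] * 0.5)

def context_adjusted_score(school):
    # Contextual indicators (remoteness, cultural load, funding
    # exclusion) are treated as markers of GREATER need, and low
    # performance/attendance now add to need rather than subtract.
    base = school["enrolment"] * 1.0
    disadvantage = (school["remoteness"]
                    + school["cultural_load"]
                    + school["funding_exclusion"])
    unmet_need = (1 - school["performance"]) + (1 - school["attendance"])
    return base + 0.5 * (disadvantage + unmet_need)

remote_school = {"enrolment": 120, "performance": 0.45, "attendance": 0.70,
                 "remoteness": 1.0, "cultural_load": 1.0, "funding_exclusion": 1.0}
urban_school = {"enrolment": 120, "performance": 0.85, "attendance": 0.95,
                "remoteness": 0.0, "cultural_load": 0.0, "funding_exclusion": 0.2}

# Same enrolment, very different circumstances: the naive model ranks
# the advantaged school higher; the adjusted model recognises the
# remote school's greater need.
assert naive_need_score(urban_school) > naive_need_score(remote_school)
assert context_adjusted_score(remote_school) > context_adjusted_score(urban_school)
```

The sketch is deliberately simple: real allocation systems involve many more variables, but the reversal between the two scores shows how the choice of what a signal *means* (deficit of need versus evidence of neglect) is a design decision, not a neutral fact of the data.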

Research Topics

Research Questions

How can AI systems designed for efficiency unintentionally create or reinforce injustice?
What does it mean to embed equity and justice into the logic of an algorithm?
Who should be involved in designing and reviewing AI decision-making tools in education?
What frameworks exist to help evaluate AI systems from a social justice lens?
How can institutions balance data-driven governance with the need for human judgement, lived experience, and cultural wisdom?
Learning Objectives

Understand how AI systems can reproduce or amplify structural inequities without deliberate intervention.
Identify strategies to evaluate and redesign AI decision-making processes to uphold equity and justice.
Explore participatory and rights-based approaches to AI governance in education.
Develop capacity to assess AI decisions through a social justice lens.

Data collection:

Conduct an Equity Impact Review through stakeholder interviews and data analysis to evaluate who benefits and who is excluded by an AI system currently in use.
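The quantitative side of such a review can start with a simple disaggregation: compare each community group's share of allocated resources to its share of enrolment. The groups, counts, and threshold below are hypothetical placeholders, not data from the case.

```python
# Minimal sketch of the data-analysis step of an Equity Impact Review:
# disaggregate an AI system's allocations by community group. The data
# and group labels are entirely hypothetical.
from collections import defaultdict

allocations = [
    {"group": "metro",  "students": 900, "staff_allocated": 45},
    {"group": "rural",  "students": 300, "staff_allocated": 10},
    {"group": "remote", "students": 150, "staff_allocated": 4},
]

totals = defaultdict(lambda: {"students": 0, "staff": 0})
for row in allocations:
    totals[row["group"]]["students"] += row["students"]
    totals[row["group"]]["staff"] += row["staff_allocated"]

all_students = sum(t["students"] for t in totals.values())
all_staff = sum(t["staff"] for t in totals.values())

for group, t in totals.items():
    enrol_share = t["students"] / all_students
    staff_share = t["staff"] / all_staff
    # A staffing share well below enrolment share flags a group the
    # system may be excluding; it is a prompt for the qualitative side
    # of the review (interviews, community consultation), not a verdict.
    flag = "  << review" if staff_share < 0.8 * enrol_share else ""
    print(f"{group}: enrolment {enrol_share:.0%}, staffing {staff_share:.0%}{flag}")
```

Disaggregation of this kind only surfaces *where* to look; who is interviewed, and how their accounts reshape the system, is the substantive work the case study points to.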
Acknowledgement of Country: We acknowledge the Ancestors, Elders and families of the Kulin Nation, who are the Traditional Owners of the land where this work has been predominantly completed. As we share our own knowledge practices, we pay respect to the deep knowledge embedded within the Aboriginal community and recognise their ownership of Country. We acknowledge that the land on which we meet, learn, and share knowledge is a place of age-old ceremonies of celebration, initiation and renewal, and that the Traditional Owners' living culture and practices have a unique role in the life of this region.
