
Supporting and Safeguarding



AI Note-Taking in Supervision and Research

How to cite this learning scenario

Arantes, J. (2025). Supporting and Safeguarding: AI Note-Taking in Supervision and Research. www.AI4education.org. Licensed under a Creative Commons Attribution 4.0 International License.
Abstract

AI note-taking tools such as Otter.ai and Microsoft Copilot are increasingly used in academic settings to record, transcribe, and summarise meetings. These tools offer substantial benefits for international students, neurodivergent learners, and others who rely on transcripts to support comprehension and review of complex material. At the same time, they introduce risks regarding consent, data security, and privacy. Rather than framing support and surveillance as competing forces, this scenario recognises that accessibility and safeguarding obligations must be pursued together. It first establishes the benefits of AI tools for inclusion and accessibility as essential, then turns to recent developments that underscore the importance of communicating clearly what counts as acceptable use. A class action against Otter.ai in the United States highlights the risks that arise when recording occurs without consent, while the Office of the Victorian Information Commissioner (OVIC) stresses that generative AI use in meetings requires clear communication, transparency, and governance. With wearable devices such as smart glasses now widely available, universities must anticipate similar issues beyond transcription apps. This scenario explores how institutions can provide the benefits of AI note-taking while mitigating risks through effective governance, communication, and practical guidance. It demonstrates that both accessibility and privacy are achievable, provided universities embed robust policies and transparent practices into supervision and research environments.

“Good governance and effective communication are essential to achieving both inclusion and privacy.”

Ensuring Accessibility While Managing Privacy Risks Through Governance

Amira, a first-year PhD student, opened her laptop as her supervision meeting began and activated Otter.ai. The tool captured the conversation in real time, producing a transcript as her supervisor spoke and flagging key action points for later. For Amira, who is both neurodivergent and working in her second language, the transcript was more than just a record. It gave her the chance to return to the complex detail of feedback after the meeting ended, reduced her anxiety, and allowed her to participate more fully in the discussion. The technology felt a little like a lifeline, levelling a space that often felt stacked against her.

Half an hour in, her supervisor glanced at the screen and realised the conversation was being recorded. No one had asked for consent. A flicker of unease passed over his face as he remembered recent media coverage of a class action lawsuit against Otter.ai, which alleged that its “autonomous note taker” had recorded conversations without permission. He also recalled the warning from the Office of the Victorian Information Commissioner that any use of AI to capture meetings arguably requires clear notice, explicit consent, and strong governance. What for Amira felt like a tool of inclusion now seemed, to her supervisor, like a risk to consent, privacy, confidentiality, and the integrity of the institution.

The moment created tension between two truths. Amira needed the transcript to study effectively and manage her workload. Her supervisor worried about the ethical and legal implications of recording without agreement. Both felt uncertain about how to proceed.

In response to situations like this, the university began drafting clearer protocols. Supervisors were encouraged to begin meetings by stating openly whether AI note-taking tools would be used and asking for agreement from all present. Transcripts were redirected to secure university servers rather than commercial platforms, and staff and students were given practical training on how to use the tools responsibly. Over time, these measures gave people confidence that accessibility and privacy could exist together.

The sense of urgency only grew as smart glasses, now selling for less than a hundred dollars, entered supervision meetings. Covert recording was no longer science fiction but an everyday possibility. As a result, the university explicitly named smart glasses in its policy and issued a clear communications piece about covert recording. Aiming to be proactive and to anticipate the governance needed, it prepared students like Amira and their supervisors for this new landscape, embedding practices that preserved trust while keeping inclusion at the centre.

For Amira, the transcript remained a vital support. For her supervisor, the governance protocols became a reassurance that rights and protections were being upheld. Together they showed that technology in higher education does not have to be framed as a trade-off between support and safeguarding. With clear rules and transparent communication, it is possible to achieve both.

Research Topics

    • Institutional governance strategies for balancing accessibility and privacy in AI note-taking.
    • The role of communication and consent in fostering trust in AI-mediated academic meetings.
    • Anticipatory governance of wearable AI technologies in supervision and research contexts.

Research Questions

    • How can universities design governance frameworks that simultaneously support accessibility and mitigate privacy risks in AI note-taking?
    • What forms of communication most effectively secure meaningful consent from participants in academic meetings?
    • How might anticipatory policies address emerging risks from wearable recording technologies in higher education?

Data Collection

    • Case studies of universities that have implemented policies on AI note-taking and recording.
    • Deliberative workshops with students, supervisors, and administrators to co-design governance scenarios.
    • Comparative legal analysis of privacy, accessibility, and AI use across jurisdictions (e.g., Australia, EU, US).

References

WebProNews. (2025, August 16). Otter.ai hit with class action lawsuit over unconsented recordings. Retrieved from https://www.webpronews.com/otter-ai-hit-with-class-action-lawsuit-over-unconsented-recordings

Office of the Victorian Information Commissioner (OVIC). (2023). Guiding principles for surveillance. Retrieved from https://ovic.vic.gov.au/privacy/resources-for-organisations/guiding-principles-for-surveillance

August 2025

Acknowledgement of Country

We acknowledge the Ancestors, Elders, and families of the Kulin Nation, who are the Traditional Owners of the land where this work has been predominantly completed. As we share our own knowledge practices, we pay respect to the deep knowledge embedded within the Aboriginal community and recognise their custodianship of Country. We acknowledge that the land on which we meet, learn, and share knowledge is a place of age-old ceremonies of celebration, initiation, and renewal, and that the Traditional Owners’ living culture and practices continue to have a unique role in the life of this region.