Supporting and Safeguarding
AI Note-Taking in Supervision and Research
How to cite this learning scenario
Arantes, J. (2025). Supporting and Safeguarding: AI Note-Taking in Supervision and Research. www.AI4education.org. Licensed under a Creative Commons Attribution 4.0 International License.
Abstract
AI note-taking tools such as Otter.ai and Microsoft Copilot are increasingly used in academic settings to record, transcribe, and summarise meetings. These tools offer substantial benefits for international students, neurodivergent learners, and others who rely on transcripts to support comprehension and review of complex material. At the same time, they introduce risks regarding consent, data security, and privacy. Rather than framing support and surveillance as competing forces, this scenario recognises that both accessibility and safeguarding obligations must be pursued together. The scenario first establishes the inclusion and accessibility benefits of AI tools as essential, then focuses on recent developments that underscore the importance of clearly communicating what constitutes acceptable use. A class action against Otter.ai in the United States highlights risks when recording occurs without consent, while the Office of the Victorian Information Commissioner (OVIC) stresses that generative AI use in meetings requires clear communication, transparency, and governance. With wearable devices such as smart glasses now widely available, universities must anticipate similar issues beyond transcription apps. This scenario explores how institutions can provide the benefits of AI note-taking while mitigating risks through effective governance, communication, and practical guidance. It demonstrates that both accessibility and privacy are achievable, provided universities embed robust policies and transparent practices into supervision and research environments.
“Good governance and effective communication are essential to achieving both inclusion and privacy.”
Ensuring Accessibility While Managing Privacy Risks Through Governance
Amira, a first-year PhD student, opened her laptop as her supervision meeting began and activated Otter.ai. The tool captured the conversation in real time, producing a transcript as her supervisor spoke and flagging key action points for later. For Amira, who is both neurodivergent and working in her second language, the transcript was more than just a record. It gave her the chance to return to the complex detail of feedback after the meeting ended, reduced her anxiety, and allowed her to participate more fully in the discussion. The technology felt a little like a lifeline, levelling a space that often felt stacked against her.
Half an hour in, her supervisor glanced at the screen and realised the conversation was being recorded. No one had asked for consent. A flicker of unease passed over his face as he remembered recent media coverage of a class action lawsuit against Otter.ai, which alleged that its “autonomous note taker” had recorded conversations without permission. He also recalled the warning from the Office of the Victorian Information Commissioner that any use of AI to capture meetings arguably required clear notice, explicit consent, and strong governance. What for Amira felt like a tool of inclusion now seemed, to her supervisor, like a risk to consent, privacy, confidentiality, and the integrity of the institution.
The moment created tension between two truths. Amira needed the transcript to study effectively and manage her workload. Her supervisor worried about the ethical and legal implications of recording without agreement. Both felt uncertain about how to proceed.
In response to situations like this, the university began drafting clearer protocols. Supervisors were encouraged to begin meetings by stating openly whether AI note-taking tools would be used and asking for agreement from all present. Transcripts were redirected to secure university servers rather than commercial platforms, and staff and students were provided with practical training on how to use the tools responsibly. Over time, these measures gave people confidence that accessibility and privacy could exist together.
The sense of urgency only grew as smart glasses, now selling for less than a hundred dollars, entered supervision meetings. Covert recording was no longer science fiction but an everyday possibility. As a result, the university explicitly addressed smart glasses in its policy and issued a clear communications piece about covert recording. Aiming to be proactive and anticipate the governance needed, it opted to prepare students like Amira and their supervisors for this new landscape, embedding practices that preserved trust while keeping inclusion at the centre.
For Amira, the transcript remained a vital support. For her supervisor, the governance protocols became a reassurance that rights and protections were being upheld. Together they showed that technology in higher education does not have to be framed as a trade-off between support and safeguarding. With clear rules and transparent communication, it is possible to achieve both.
Research Questions
How can universities design governance frameworks that simultaneously support accessibility and mitigate privacy risks in AI note-taking?
What forms of communication most effectively secure meaningful consent from participants in academic meetings?
How might anticipatory policies address emerging risks from wearable recording technologies in higher education?
Research Topics
Institutional governance strategies for balancing accessibility and privacy in AI note-taking.
The role of communication and consent in fostering trust in AI-mediated academic meetings.
Anticipatory governance of wearable AI technologies in supervision and research contexts.
Data Collection
Case studies of universities that have implemented policies on AI note-taking and recording.
Deliberative workshops with students, supervisors, and administrators to co-design governance scenarios.
Comparative legal analysis of privacy, accessibility, and AI use across jurisdictions (e.g., Australia, EU, US).
References
WebProNews. (2025, August 16). Otter.ai hit with class action lawsuit over unconsented recordings. https://www.webpronews.com/otter-ai-hit-with-class-action-lawsuit-over-unconsented-recordings
Office of the Victorian Information Commissioner (OVIC). (2023). Guiding principles for surveillance. https://ovic.vic.gov.au/privacy/resources-for-organisations/guiding-principles-for-surveillance
August 2025