
EU AI Act in Education

AI systems that determine access to education, evaluate learning outcomes or monitor examinations are classified as high-risk under Annex III Category 3. This affects EdTech providers, universities and schools across the EU.

Updated: March 2026 · 8 min read

What Annex III Category 3 covers

The EU AI Act explicitly lists three high-risk sub-areas in education and vocational training:

  1. Access to education – AI systems that determine whether a person is admitted to an educational institution or programme (e.g. university admissions, school placement)
  2. Evaluation of learning outcomes – AI used to assess students, including automated essay grading, exam scoring and competency evaluation
  3. Monitoring and detection during examinations – AI-based proctoring systems that monitor student behaviour during tests, including gaze tracking, movement detection and browser monitoring

Typical scenarios and risk levels

Application | Risk level | EU AI Act basis
University admissions AI | High risk | Annex III Nr. 3(a)
Automated essay/exam grading | High risk | Annex III Nr. 3(b)
AI proctoring / exam surveillance | High risk | Annex III Nr. 3(c)
Adaptive learning platforms | Limited risk | Art. 50
AI tutoring chatbot | Limited risk | Art. 50(1)
Learning management system (LMS) | Minimal risk | —
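The mapping in the table above can be expressed as a small lookup, useful when auditing an AI tool inventory. This is an illustrative sketch only: the names, structure, and `classify` helper are assumptions for this example, not an official taxonomy or API.

```python
from enum import Enum

class RiskLevel(Enum):
    HIGH = "High risk"
    LIMITED = "Limited risk"
    MINIMAL = "Minimal risk"

# (risk level, legal basis) per application type, mirroring the table above.
# The LMS row has no legal basis because minimal-risk systems carry no
# specific obligations under the EU AI Act.
EDUCATION_AI_RISK = {
    "university_admissions": (RiskLevel.HIGH, "Annex III Nr. 3(a)"),
    "automated_grading": (RiskLevel.HIGH, "Annex III Nr. 3(b)"),
    "exam_proctoring": (RiskLevel.HIGH, "Annex III Nr. 3(c)"),
    "adaptive_learning": (RiskLevel.LIMITED, "Art. 50"),
    "tutoring_chatbot": (RiskLevel.LIMITED, "Art. 50(1)"),
    "lms": (RiskLevel.MINIMAL, None),
}

def classify(application: str):
    """Return (risk level, legal basis) for a known application type."""
    return EDUCATION_AI_RISK[application]
```

In practice, classification depends on the specific deployment context, so a static lookup like this can only be a starting point for a proper Annex III assessment.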

Why education AI is particularly sensitive

AI in education affects a uniquely vulnerable population – children and young people. The EU AI Act's Recital 47 specifically highlights that AI systems in education can have significant consequences for the future opportunities of individuals. Key concerns include:

  • Algorithmic bias can systematically disadvantage students from certain socio-economic or ethnic backgrounds
  • Proctoring AI can disproportionately flag students with disabilities or non-standard testing environments
  • Admissions AI can perpetuate historical inequalities if trained on biased data
  • Right to explanation: students and parents have a legitimate interest in understanding how AI-based decisions were made

GDPR and AI Act interplay in schools

Education AI typically processes personal data of minors, which triggers enhanced GDPR protections. Schools and universities acting as deployers face dual compliance obligations:

  • GDPR Art. 8 — Consent for minors requires parental approval in many member states
  • GDPR Art. 22 — Right not to be subject to solely automated decision-making
  • EU AI Act Art. 14 — Human oversight mandatory for all high-risk education AI
  • EU AI Act Art. 27 — FRIA mandatory when public educational institutions deploy high-risk AI
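The dual-compliance logic above can be sketched as a simple decision helper: given a system's risk level, the deployer's legal status, and whether minors' data is processed, it lists which of the four obligations apply. The function name and inputs are hypothetical, chosen for illustration; an actual assessment involves many more factors.

```python
def deployer_obligations(high_risk: bool, public_body: bool,
                         processes_minor_data: bool) -> list[str]:
    """Hypothetical checklist of the obligations listed above for a
    school or university deploying an education AI system."""
    obligations = []
    if processes_minor_data:
        # Enhanced GDPR protections for minors' personal data
        obligations.append("GDPR Art. 8: parental consent in many member states")
        obligations.append("GDPR Art. 22: no solely automated decision-making")
    if high_risk:
        # High-risk classification under Annex III Nr. 3
        obligations.append("EU AI Act Art. 14: human oversight")
        if public_body:
            # FRIA applies to public-law deployers of high-risk AI
            obligations.append("EU AI Act Art. 27: fundamental rights impact assessment")
    return obligations
```

For example, a public university deploying AI proctoring on students' data would trigger all four obligations, while a private tutoring platform using a limited-risk chatbot would face only the GDPR items.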

Documentation obligations

For providers and deployers of high-risk AI in education:

  • Technical Documentation per Annex IV – with particular focus on bias testing, data representativeness and fairness metrics
  • Transparency notice – Students, parents and teachers must be informed about AI use in educational decisions (Art. 13 + Art. 26)
  • FRIA – Mandatory for public educational institutions deploying high-risk AI per Art. 27

Next steps for your organisation

  1. Inventory – Map all AI tools used in teaching, assessment, admissions and administration
  2. Classification – Classify each tool against Annex III Category 3 (a free risk check can support this)
  3. Documentation – Prepare the required compliance documentation (technical documentation, transparency notice and, where applicable, a FRIA)
