What Annex III Category 3 covers
The EU AI Act lists four high-risk sub-areas in education and vocational training (Annex III, point 3):
- Access and admission – AI systems that determine admission or access to educational institutions or programmes, or assign persons to them (e.g. university admissions, school placement)
- Evaluation of learning outcomes – AI used to assess students, including automated essay grading, exam scoring and competency evaluation, including where those outcomes are used to steer the learning process
- Assessing the appropriate level of education – AI used to determine the level of education an individual will receive or be able to access
- Monitoring and detection during examinations – AI-based proctoring systems that monitor and detect prohibited student behaviour during tests, including gaze tracking, movement detection and browser monitoring
Typical scenarios and risk levels
| Application | Risk level | EU AI Act basis |
|---|---|---|
| University admissions AI | High risk | Annex III, point 3(a) |
| Automated essay/exam grading | High risk | Annex III, point 3(b) |
| AI proctoring / exam surveillance | High risk | Annex III, point 3(d) |
| Adaptive learning platforms | Limited risk | Art. 50 |
| AI tutoring chatbot | Limited risk | Art. 50(1) |
| Learning management system (LMS) | Minimal risk | — |

Note that an adaptive learning platform can move into the high-risk category if it evaluates learning outcomes or steers the learning process (Annex III, point 3(b)).
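The mapping in the table above can be sketched as a simple lookup. This is an illustrative classification aid only – the application names and the dictionary structure are our own, not defined by the Act, and a real classification always requires a case-by-case legal assessment:

```python
# Illustrative sketch: maps the example applications from the table above
# to an indicative (risk_level, legal_basis) pair under the EU AI Act.
# The keys and the encoding are assumptions made for this example.

RISK_MAP = {
    "university_admissions": ("high", "Annex III, point 3(a)"),
    "automated_grading":     ("high", "Annex III, point 3(b)"),
    "exam_proctoring":       ("high", "Annex III, point 3(d)"),
    "adaptive_learning":     ("limited", "Art. 50"),
    "tutoring_chatbot":      ("limited", "Art. 50(1)"),
    "lms":                   ("minimal", None),
}

def classify(application):
    """Return (risk_level, legal_basis) for a known application type."""
    try:
        return RISK_MAP[application]
    except KeyError:
        raise ValueError(f"Unknown application type: {application!r}")
```

A lookup like `classify("exam_proctoring")` returns the indicative risk level and legal basis; unknown application types raise an error rather than silently defaulting to minimal risk.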
Why education AI is particularly sensitive
AI in education affects a uniquely vulnerable population – children and young people. The EU AI Act's Recital 56 specifically highlights that AI systems in education can determine the educational and professional course of a person's life and thereby affect their future opportunities. Key concerns include:
- Algorithmic bias can systematically disadvantage students from certain socio-economic or ethnic backgrounds
- Proctoring AI can disproportionately flag students with disabilities or non-standard testing environments
- Admissions AI can perpetuate historical inequalities if trained on biased data
- Right to explanation: under Art. 86 of the AI Act, persons affected by decisions based on high-risk AI can request an explanation – students and parents have a legitimate interest in understanding how AI-based decisions were made
GDPR and AI Act interplay in schools
Education AI typically processes personal data of minors, which triggers enhanced GDPR protections. Schools and universities acting as deployers face dual compliance obligations:
- GDPR Art. 8 — Consent for minors requires parental approval in many member states
- GDPR Art. 22 — Right not to be subject to solely automated decision-making
- EU AI Act Art. 14 — High-risk education AI must be designed for effective human oversight; deployers must assign that oversight to competent staff (Art. 26)
- EU AI Act Art. 27 — A fundamental rights impact assessment (FRIA) is mandatory when public educational institutions deploy high-risk AI
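The dual-compliance rules above can be sketched as a simple checklist generator. This is a hedged illustration, not legal advice – the field names and the rule encoding are assumptions made for this example, and the real obligations depend on the specific deployment:

```python
# Hedged sketch: derives an indicative obligations checklist for a school
# or university deploying an education AI system, based on the rules
# listed above. Field names and rule encoding are illustrative.

from dataclasses import dataclass

@dataclass
class Deployment:
    high_risk: bool            # falls under Annex III, point 3
    public_body: bool          # deployer is a body governed by public law
    processes_minor_data: bool # personal data of minors is processed

def obligations(d):
    """Return an indicative list of applicable obligations."""
    obs = []
    if d.high_risk:
        obs.append("Human oversight (AI Act Art. 14 / Art. 26)")
        obs.append("Transparency to students and parents (Art. 13 + Art. 26)")
        if d.public_body:
            obs.append("Fundamental rights impact assessment (Art. 27)")
    if d.processes_minor_data:
        obs.append("Parental consent where required (GDPR Art. 8)")
        obs.append("Safeguards against solely automated decisions (GDPR Art. 22)")
    return obs
```

For example, a public school deploying high-risk proctoring AI that processes minors' data would see all five items on its checklist, while a private provider of a minimal-risk LMS processing only adult data would see none.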
Documentation obligations
For providers and deployers of high-risk AI in education:
- Technical Documentation per Annex IV – with particular focus on bias testing, data representativeness and fairness metrics
- Transparency notice – Students, parents and teachers must be informed about AI use in educational decisions (Art. 13 + Art. 26)
- FRIA – Mandatory for public educational institutions deploying high-risk AI per Art. 27
Next steps for your organisation
- Inventory – Map all AI tools used in teaching, assessment, admissions and administration
- Classification – Use the free risk check
- Documentation – Generate your compliance documentation