
EU AI Act in the Financial Sector

Creditworthiness assessment, insurance pricing and fraud detection – the financial sector is particularly affected by the EU AI Act. Annex III No. 5(b) explicitly classifies AI-based credit scoring as a high-risk system.

Updated: February 2026 · 10 min read

Why the financial sector is in focus

Annex III No. 5(b) of Regulation (EU) 2024/1689 explicitly classifies AI systems for evaluating the creditworthiness of natural persons or establishing their credit score as high-risk. Only AI systems used to detect financial fraud are excluded.

Additionally, AI systems for risk assessment and pricing in relation to natural persons in life and health insurance fall under Annex III No. 5(c).

High-risk AI systems in the financial sector

| Application | Annex III | High-risk? |
|---|---|---|
| Credit scoring of natural persons | No. 5(b) | Yes |
| Creditworthiness assessment | No. 5(b) | Yes |
| Insurance pricing (life/health) | No. 5(c) | Yes |
| AML/KYC screening | – | No (fraud detection exception) |
| Algorithmic trading | – | No (no natural person affected) |
| Customer service chatbot | – | No (Art. 50 transparency obligation) |
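The triage above can be sketched as a small lookup helper. This is a hypothetical illustration, not legal advice: the use-case keys and the `Classification` type are our own naming choices, and the Annex III references follow the point numbering of the adopted Regulation (creditworthiness assessment = point 5(b), life/health insurance = point 5(c)). Anything not in the table needs individual legal review.

```python
# Illustrative sketch only: triage of financial AI use cases against
# Annex III point 5 EU AI Act. Keys and type names are assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Classification:
    high_risk: bool
    annex_iii_ref: Optional[str]  # e.g. "No. 5(b)", None if not listed
    note: str

ANNEX_III_TRIAGE = {
    "credit_scoring": Classification(True, "No. 5(b)", "Credit scoring of natural persons"),
    "creditworthiness": Classification(True, "No. 5(b)", "Creditworthiness assessment"),
    "insurance_pricing_life_health": Classification(True, "No. 5(c)", "Risk assessment and pricing in life/health insurance"),
    "aml_kyc_screening": Classification(False, None, "Fraud detection is excluded from the credit scoring entry"),
    "algorithmic_trading": Classification(False, None, "No natural person directly affected"),
    "customer_chatbot": Classification(False, None, "Transparency obligation under Art. 50 instead"),
}

def triage(use_case: str) -> Classification:
    """Look up a known use case; unknown cases require individual legal review."""
    try:
        return ANNEX_III_TRIAGE[use_case]
    except KeyError:
        raise ValueError(f"Unknown use case {use_case!r}: assess individually")
```

Such a lookup is only a first screening step; borderline systems (e.g. hybrid fraud/scoring models) still need a case-by-case assessment.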

Regulatory interplay

Financial service providers must implement the EU AI Act in the context of existing financial regulation:

  • MiFID II (Directive 2014/65/EU) – Algorithmic trading requirements, best execution for AI systems
  • CRD/CRR – Credit risk models and internal ratings
  • Solvency II – AI in the insurance sector (risk assessment, pricing)
  • DORA (Regulation (EU) 2022/2554) – Digital operational resilience, ICT risk management
  • GDPR – Automated individual decisions (Art. 22 GDPR) in credit decisions

Market surveillance: BaFin as primary authority

Under Art. 74 EU AI Act, market surveillance of the financial sector lies primarily with the sector-specific financial supervisory authorities:

| Country | Primary authority | Horizontal authority |
|---|---|---|
| Germany | BaFin | Bundesnetzagentur |
| Austria | FMA | RTR |
| Netherlands | AFM/DNB | ACM |
| France | ACPR/AMF | ARCOM |

FRIA obligation for credit scoring

Art. 27(1) EU AI Act explicitly obliges deployers of AI systems under Annex III No. 5(b) – i.e. credit scoring systems – to conduct a Fundamental Rights Impact Assessment (FRIA); the same obligation covers insurance systems under No. 5(c).

This applies regardless of whether the deployer is a public or private entity. The FRIA must specifically analyse discrimination risks in credit decisions.
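As a working aid, the content a FRIA must cover under Art. 27(1)(a)–(f) can be tracked as a simple completeness checklist. The field names below are our own illustrative choices, not official terminology, and the sketch does not replace the substantive assessment itself.

```python
# Sketch: the FRIA content elements of Art. 27(1)(a)-(f) EU AI Act as a
# completeness check. Dictionary keys are illustrative assumptions.
FRIA_ELEMENTS = {
    "process_description":  "(a) deployer processes in which the system is used",
    "period_and_frequency": "(b) period of time and frequency of intended use",
    "affected_persons":     "(c) categories of natural persons and groups likely affected",
    "risks_of_harm":        "(d) specific risks of harm to those persons or groups",
    "human_oversight":      "(e) implementation of human oversight measures",
    "mitigation_measures":  "(f) measures to be taken if the risks materialise",
}

def missing_fria_elements(fria: dict) -> list:
    """Return the descriptions of Art. 27(1) elements not yet covered by a draft."""
    return [desc for key, desc in FRIA_ELEMENTS.items() if not fria.get(key)]
```

For credit scoring, element (d) is where the discrimination risks mentioned above belong, e.g. proxy discrimination through correlated input features.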

Documentation obligations for financial AI

  • Technical Documentation – Providers of credit scoring AI must document all nine sections of Annex IV
  • Transparency notice – Borrowers must be informed that AI is involved in the decision
  • FRIA – Deployers must assess fundamental rights risks (non-discrimination under Art. 21 EU Charter)
  • Art. 22 GDPR disclosure – The GDPR safeguards for automated individual decisions (meaningful information, human intervention, contestation) continue to apply in parallel
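For an inventory, the four obligations listed above can be tracked per system. A minimal sketch, assuming a single entity acting as both provider and deployer; the class and document names are hypothetical, not taken from the Regulation.

```python
# Hypothetical sketch: per-system tracking of the four documentation
# obligations listed above. Names and structure are illustrative.
from dataclasses import dataclass, field

REQUIRED_DOCS = {
    "technical_documentation",  # Annex IV (provider obligation)
    "transparency_notice",      # information to affected borrowers
    "fria",                     # Art. 27 (deployer obligation)
    "gdpr_art22_disclosure",    # parallel GDPR safeguards
}

@dataclass
class FinancialAISystem:
    name: str
    annex_iii_ref: str                       # e.g. "No. 5(b)"
    documents: set = field(default_factory=set)

def open_obligations(system: FinancialAISystem) -> set:
    """Return the required documents not yet produced for this system."""
    return REQUIRED_DOCS - system.documents
```

A compliance review would then iterate over the inventory and flag every system with a non-empty result.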

Next steps

  1. Inventory – Identify all AI systems in credit, scoring and insurance
  2. Risk check – Use the free risk check
  3. BaFin requirements – Review current BaFin circulars on AI in financial services
  4. Documentation – Generate compliance drafts with industry-specific context

Generate compliance drafts for financial AI now →

Compliance drafts

FRIA, Technical Documentation and Transparency Notice – AI-generated drafts in minutes.

Generate drafts

Ready for your EU AI Act Compliance?

AI-generated compliance documents as a solid working basis for you or your lawyers.
