Table 3 Summary of EU AI Act requirements for general-purpose AI models

From: Navigating the European Union Artificial Intelligence Act for Healthcare

1. Technical documentation and transparency information for downstream providers (Art 53 (1a, b))

• General model description: tasks, integration capabilities, use policies, release date, distribution, architecture, parameters, input/output modalities, formats, licence

• Development and integration: technical requirements, design and training specifications, methodologies, key design choices, optimisation goals, details of training, testing and validation data (type, provenance, curation, bias detection), computational resources, training time, known or estimated energy consumption

• Open-source models may be exempted from these requirements if they do not pose a systemic risk

2. Copyright policy (Art 53 (1c))

• Implementation of an EU copyright-compliant policy

3. Training data transparency (Art 53 (1d))

• Publicly available summary of the training content, following a template to be provided by the AI Office

4. Authorised representatives (Art 54 (1–5))

• Providers established in third countries must appoint an authorised representative in the Union, who ensures that all necessary documentation and information are provided and that the provider complies with the AI Act

Additional requirements if the model is presumed to pose systemic risk:

5. Model evaluation, risk mitigation and management, cybersecurity (Art 55 (1a–d))

• Evaluation using public protocols, tools, or other methodologies

• Systemic risk assessment and mitigation, including adversarial testing

• Ensuring an adequate level of cybersecurity protection for the model and its physical infrastructure

AI artificial intelligence, Art article, EU European Union.