Documentation

Welcome to the High-Risk AI Systems Database documentation center. Here you'll find comprehensive guidance and resources to help you understand and comply with the AI Act requirements. Whether you're a provider looking to register an AI system or a deployer seeking to understand your obligations, this documentation will guide you through the process.

Getting Started

AI Act Overview

Learn about the European AI Act, its objectives, and its requirements for high-risk AI systems.

Provider Guidelines

Step-by-step instructions for providers to register and manage their AI systems in the database.

Deployer Guidelines

Essential information for deployers about their obligations and responsibilities under the AI Act.

Provider Guidelines

As a provider of high-risk AI systems, you have specific obligations under the AI Act. This section guides you through the registration process and ongoing compliance requirements.

Registration Process

  1. Create an Account: Register on the platform with your organization details.
  2. Provider Information: Complete your provider profile with legal name, contact details, and registration numbers.
  3. System Registration: Register each high-risk AI system with complete technical specifications.
  4. Conformity Assessment: Upload conformity assessment documentation and certificates.
  5. Submit for Review: Submit your registration for review by competent authorities.
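
These steps are completed through the platform's web interface; no public API is documented here. Purely to illustrate how the five steps fit together, the Python sketch below models them as calls to a hypothetical REST service. The base URL, endpoint paths, payload fields, and response fields are assumptions made for illustration only.

# Illustrative only: the base URL, endpoint paths, and payload and response
# fields below are hypothetical and do not describe the actual platform.
import requests

BASE_URL = "https://example.invalid/api"  # placeholder, not a real endpoint

session = requests.Session()

# Steps 1-2: create an account and complete the provider profile.
provider = session.post(f"{BASE_URL}/providers", json={
    "legal_name": "Example Analytics GmbH",
    "contact_email": "compliance@example.com",
    "registration_number": "HRB 000000",
}).json()

# Step 3: register a high-risk AI system with its technical specifications.
system = session.post(f"{BASE_URL}/providers/{provider['id']}/systems", json={
    "name": "Example Credit Scoring Model",
    "intended_purpose": "Creditworthiness assessment of natural persons",
}).json()

# Step 4: upload conformity assessment documentation.
with open("conformity_certificate.pdf", "rb") as doc:
    session.post(f"{BASE_URL}/systems/{system['id']}/documents", files={"file": doc})

# Step 5: submit the registration for review by the competent authority.
session.post(f"{BASE_URL}/systems/{system['id']}/submit")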

Required Information

When registering your AI system, you will need to provide:

  • Provider identification and contact information
  • AI system name, trade name, and unique reference
  • Intended purpose and area of application
  • Components, functions, and operating logic
  • Data requirements and processing information
  • Conformity assessment certificates
  • Instructions for use and deployment
  • Risk management documentation
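
To make the scope of this information concrete, the sketch below models a registration record covering the fields listed above. The field names and types are illustrative assumptions, not the platform's actual data model.

# Illustrative registration record; field names and types are assumptions,
# not the platform's actual schema.
from dataclasses import dataclass, field

@dataclass
class ProviderInfo:
    legal_name: str
    contact_email: str
    registration_number: str

@dataclass
class SystemRegistration:
    provider: ProviderInfo
    system_name: str
    trade_name: str
    unique_reference: str
    intended_purpose: str
    area_of_application: str
    components_and_logic: str   # components, functions, and operating logic
    data_requirements: str      # data and processing information
    instructions_for_use: str = ""
    conformity_certificates: list[str] = field(default_factory=list)  # document references
    risk_management_docs: list[str] = field(default_factory=list)     # document references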

Deployer Guidelines

As a deployer, you are responsible for using high-risk AI systems in accordance with the requirements of the AI Act and with the instructions for use issued by the provider.

Key Responsibilities

  • Due Diligence: Ensure the AI system is properly registered and compliant before deployment.
  • Instructions: Follow the provider's instructions for use and deployment.
  • Human Oversight: Implement appropriate human oversight measures as specified.
  • Monitoring: Monitor the AI system's operation and report any serious incidents.
  • Data Quality: Ensure input data is relevant, accurate, and appropriate for the intended purpose.
  • Impact Assessment: Conduct fundamental rights and data protection impact assessments where required.

Incident Reporting

Deployers must inform the provider and the relevant authorities about any serious incident or malfunctioning that constitutes a breach of EU law intended to protect fundamental rights. This must be done without undue delay after the deployer becomes aware of the incident.
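
As a purely illustrative sketch, a deployer might keep an internal record of the information needed for such a notification, for example along the lines below. This is an assumed internal structure, not an official reporting format defined by the AI Act or the database.

# Illustrative internal incident record; not an official reporting format.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SeriousIncidentRecord:
    system_reference: str    # unique reference of the affected AI system
    description: str         # what happened and which obligations may be breached
    became_aware_at: datetime
    reported_to_provider: bool
    reported_to_authority: bool

record = SeriousIncidentRecord(
    system_reference="EXAMPLE-REF-001",
    description="Systematic misclassification affecting access to a service",
    became_aware_at=datetime.now(timezone.utc),
    reported_to_provider=True,
    reported_to_authority=True,
)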

Have Questions?

Visit our comprehensive FAQ page for answers to common questions about the AI Act, registration requirements, and compliance obligations.

Frequently Asked Questions

Get answers to the most common questions about high-risk AI systems, registration, and compliance.

Additional Resources

Official AI Act Text

Access the complete text of the Artificial Intelligence Act as published in the Official Journal of the European Union.

Contact Support

Need help with your registration or have questions about compliance? Our support team is here to assist you.