The AI Act: the evolution of "trustworthy AI" from policy documents to mandatory regulation

Équipe Numérique, Organisation et Société
Preprint, Working Paper, 2024

Abstract

Given the dangers that artificial intelligence poses to individuals and society, and the rapid evolution of these technologies, Europe has decided to take the lead by imposing strict requirements for placing "AI systems" on the market. This new European law, adopted in the summer of 2024, is better known as "the AI Act". The AI Act is based on a hierarchy of risks, under which riskier systems are subject to stricter obligations. While the AI Act is not the first European law to be based on risk (the General Data Protection Regulation (GDPR) and subsequent laws on digital technologies had already started this trend), it is the first to take it to such a level. But the AI Act also draws on the concept of "trustworthy AI", a term coined by the policy documents that preceded it, according to which AI must notably be ethical and technically robust. In this work, we retrace the story of the AI Act in order to understand the origin of its main concepts and structure. We also take a look at the final version of the text, its hierarchy of AI systems and the corresponding obligations, as well as the governance ecosystem it puts in place to ensure that these rules are properly implemented. The picture we draw shows a regulation that is quite unique in the European legal landscape, despite its many roots and inspirations.
Main file

Gornet_trustworthinessAIAct.pdf (1.3 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04785519, version 1 (15-11-2024)

Identifiers

  • HAL Id: hal-04785519, version 1

Cite

Mélanie Gornet. The AI Act: the evolution of "trustworthy AI" from policy documents to mandatory regulation. 2024. ⟨hal-04785519⟩