09/01/25

New Luxembourg bill designates national authorities under the AI Act: the CNPD takes centre stage

On 23 December 2024, the Luxembourg government submitted bill of law n°8476 (the “Bill”). This Bill is a pivotal step in aligning Luxembourg’s legal framework with the Regulation (EU) 2024/1689 laying down harmonised rules on artificial intelligence (the “AI Act”). Its primary objective is to designate the national competent authorities responsible for the enforcement and oversight of the AI Act.

I. Context

Artificial Intelligence (AI) has emerged as a transformative force in recent years, driven by rapid advances in deep learning and computational power. With applications spanning healthcare, education, business operations, and decision-making, AI offers unprecedented opportunities for innovation and growth. However, its rise has also highlighted critical ethical challenges, including potential risks to fundamental rights and democratic processes.

To address these concerns while fostering innovation, the European Union adopted the AI Act, which came into force on 1 August 2024. The AI Act provides a framework to regulate the development and use of AI systems while ensuring the protection of fundamental rights. Among its key mandates, the AI Act requires each Member State to:

  • Designate at least one notifying authority, responsible for assessing, accrediting, and monitoring conformity assessment bodies tasked with certifying AI systems’ compliance with the regulation.
  • Designate at least one market surveillance authority, charged with overseeing compliance, investigating non-compliance, and ensuring only lawful AI systems are made available on the market.

Member States must also establish a conformity assessment body, known as a “notified body”.

Together, these authorities (referred to as national competent authorities) must operate independently and impartially, with sufficient resources and technical expertise to fulfil their duties.

Additionally, the AI Act mandates the establishment of AI regulatory sandboxes at the national level within 24 months of the AI Act’s entry into force. These sandboxes will allow businesses to test innovative AI systems in a controlled regulatory environment.

For further information on the AI Act, please refer to our Newsflash.

II. Main implementation measures of the Bill

1. Designation of national competent authorities under the AI Act

The Bill designates existing Luxembourg authorities as the national competent authorities under the AI Act. These authorities have been assigned sector-specific responsibilities to ensure effective regulation and oversight.  

  • Notifying authorities. Notifying authorities will oversee the assessment and designation of conformity assessment bodies to ensure they meet the required standards for certifying AI systems. The authorities include:
    • The Luxembourg Office of Accreditation and Surveillance (OLAS) [1]. The OLAS will accredit and oversee conformity assessment bodies tasked with evaluating general-purpose AI systems.
    • The Luxembourg Agency for Medicines and Health Products (ALMPS) [2]. The ALMPS will accredit and supervise conformity assessment bodies specific to AI systems in the healthcare and medical device sectors.
    • The Luxembourg State Data Protection Commissioner’s Office (CGP) [3].
  • Notified body. The notified body is a conformity assessment body which performs third-party conformity assessment activities, including testing, certification, and inspection. In Luxembourg, the National Commission for Data Protection (CNPD) [4] will be appointed as notified body for high-risk AI systems intended to be put into service by law enforcement, immigration or asylum authorities.
  • Market surveillance authorities. The market surveillance authorities will ensure AI systems comply with the AI Act’s requirements based on their respective areas of expertise. These authorities include:
    • The National Commission for Data Protection (CNPD). The CNPD will be the default market surveillance authority and serve as the primary point of contact in Luxembourg. It will also coordinate the activities of the other market surveillance authorities, facilitating collaboration between sector-specific authorities and EU institutions.

      The following other market surveillance authorities are designated as exceptions to the CNPD’s jurisdiction:
      • The Judicial Supervisory Authority (JSA) [5]. The JSA will monitor the compliance of AI systems used by the judicial courts, the prosecution services, and the administrative courts in the performance of their judicial functions.
      • The Financial Sector Supervisory Commission (CSSF) [6]. The CSSF will supervise the use of AI systems by entities under its supervision. It will additionally conduct risk assessments and audits to ensure compliance with financial regulations as well as the AI Act.
      • The Supervisory Authority for the Insurance Sector (CAA) [7]. The CAA will monitor AI systems used by entities under its supervision and investigate practices involving bias or discrimination in AI-driven insurance decisions.
      • The Luxembourg Institute for Standardization, Accreditation, Safety, and Quality (ILNAS) [8]. The ILNAS will oversee AI systems for the markets listed in points 1 to 10 of Annex I [9], provided the AI systems meet the conditions of Article 6(1) of the AI Act [10] and point 2 of Annex III, which concerns products used in critical infrastructure.
      • The Luxembourg Institute of Regulation (ILR) [11]. The ILR is designated as the market surveillance authority for compliance with Article 26 of the AI Act regarding the obligations of deployers of high-risk AI systems. This applies to operators of essential or important services under the future NIS 2 implementing law, without prejudice to the CSSF’s role under the same law.
      • The Luxembourg Agency for Medicines and Health Products (ALMPS). The ALMPS will supervise the use of AI in medical devices and in vitro diagnostic medical devices (the markets listed in points 11 and 12 of Annex I), provided the AI systems meet the conditions of Article 6(1) of the AI Act.
      • The Luxembourg Independent Audiovisual Authority (ALIA) [12]. ALIA will monitor compliance with Article 50(2) and 50(4) of the AI Act, i.e. the transparency obligations requiring (i) providers of AI systems, including general-purpose AI systems, that generate synthetic audio, image, video or text content to ensure that the outputs of the AI system are marked in a machine-readable format and detectable as artificially generated or manipulated, and (ii) deployers of an AI system that generates or manipulates image, audio or video content constituting a deep fake to disclose that the content has been artificially generated or manipulated.

2. Sanctions

Market surveillance authorities have the power to impose sanctions on operators, and notifying authorities may impose similar sanctions on notified bodies; in both cases, these include warnings, reprimands, and administrative fines.

Sanctions must be effective, proportionate, and dissuasive, taking into account the interests of small and medium-sized enterprises (SMEs) and startups.

In a nutshell, the following administrative fines can be imposed:

  • Non-compliance with the prohibition of the AI practices referred to in Article 5 of the AI Act can lead to fines up to €35 million or 7% of the company’s global turnover, whichever is higher;
  • Non-compliance with other obligations related to high-risk AI systems can result in fines up to €15 million or 3% of the company’s global turnover, whichever is higher;
  • Providing inaccurate or misleading information to authorities can lead to fines up to €7.5 million or 1% of a company’s global turnover, whichever is higher.

For SMEs and startups, fines are capped at the lower of the percentages or amounts mentioned in the above bullet points. Decisions made by national authorities are published on their websites, though confidentiality may be maintained for sensitive business information.
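
By way of illustration, the “whichever is higher” rule for the fine caps, and the reversed “whichever is lower” rule for SMEs and startups, can be sketched as follows (the function name and parameters are our own, purely hypothetical shorthand, not terminology from the Bill or the AI Act):

```python
def fine_cap(fixed_cap_eur: int, turnover_pct: float,
             global_turnover_eur: int, is_sme: bool = False) -> float:
    """Illustrative ceiling on an administrative fine.

    For most companies the cap is the HIGHER of the fixed amount and the
    percentage of worldwide annual turnover; for SMEs and startups it is
    the LOWER of the two.
    """
    pct_amount = global_turnover_eur * turnover_pct
    if is_sme:
        return min(fixed_cap_eur, pct_amount)
    return max(fixed_cap_eur, pct_amount)

# Prohibited AI practices (Article 5): EUR 35 million or 7% of turnover.
fine_cap(35_000_000, 0.07, 1_000_000_000)               # large company: 70,000,000 (7% is higher)
fine_cap(35_000_000, 0.07, 1_000_000_000, is_sme=True)  # SME: 35,000,000 (the lower of the two)
```

The same logic applies to the EUR 15 million / 3% and EUR 7.5 million / 1% tiers listed above.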

An appeal against decisions made by the competent authority may be filed with the Administrative Court, which will rule on the merits of the case.

3. Establishment of AI regulatory sandboxes

The Bill mandates the aforementioned Luxembourg market surveillance authorities to create AI regulatory sandboxes within their areas of oversight and to collaborate both among themselves and with other European market surveillance authorities, as applicable. These sandboxes will serve as innovation hubs, allowing businesses and developers to test AI systems in a controlled and supportive environment.

III. Conclusion

The Bill represents a significant step toward aligning Luxembourg’s legal framework with the EU AI Act.

It is important to note that the CNPD will be designated as the single point of contact for the AI Act and must be notified to the European Commission in accordance with Article 70(2) of the AI Act. This role requires the CNPD to coordinate with all national competent authorities, although these authorities remain obliged to cooperate with each other, the CNPD, the European Commission, and other European authorities.
