

The Chamber of Commerce would like to inform its members about the European Commission’s recently published draft guidance and incident report template for serious incidents under the AI Act[1].
The AI Act follows a risk-based approach, classifying AI systems into different risk categories, one of which is high-risk AI systems. For those systems, Article 73 AI Act imposes an obligation to report serious incidents[2].
To support providers in fulfilling this obligation[3], the Commission has published a first draft guidance together with a template for reporting to market surveillance authorities[4]. While the guidance aims to enable providers to determine clearly whether they are obliged to report, the template aims to standardise the reporting process and to harmonise the information transmitted to market surveillance authorities.
This reporting obligation aims to create an early warning system that allows market surveillance authorities to identify potentially harmful patterns or significant risks of high-risk AI systems at an early stage. It also establishes clear accountability for providers (and, to a certain extent, deployers), enables authorities to take swift corrective measures when incidents occur, and fosters transparency in how high-risk systems operate, ultimately building public trust in AI technologies.
The AI Act’s incident monitoring framework also seeks alignment with other international reporting regimes, such as the OECD’s AI Incident Monitor and Common Reporting Framework[5].
In parallel, the Commission has launched a targeted consultation to collect input from stakeholders on practical examples of AI system incidents that will have to be reported and on issues to be clarified in the guidance. Respondents are encouraged to provide explanations and concrete cases to support the practical usefulness of the guidelines.
The consultation is targeted at different categories of stakeholders, including (but not limited to) providers and deployers of AI systems, other industry organisations, academia, other independent experts, civil society organisations and public authorities.
Participation details
The consultation is available in English only and will be open until 7 November 2025.
The Chamber of Commerce invites interested members to share their views and participate directly in the initiative via the following link:
Targeted stakeholder consultation on the Commission’s guidance on incident reporting
Any company or person interested in sharing their opinion or questions with the Chamber of Commerce on the above-mentioned topic is invited to contact juridique@cc.lu.
[1] Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence and amending Regulations (EC) No 300/2008, (EU) No 167/2013, (EU) No 168/2013, (EU) 2018/858, (EU) 2018/1139 and (EU) 2019/2144 and Directives 2014/90/EU, (EU) 2016/797 and (EU) 2020/1828 (Artificial Intelligence Act)
[2] The reporting obligation of Article 73 AI Act applies to serious incidents (defined under Article 3(49) AI Act) and widespread infringements (defined under Article 3(61) AI Act).
[3] The serious incident reporting obligation will become applicable from August 2026.
[4] The European Commission is mandated to issue guidance on the topic under Article 73(7) AI Act.