The much-needed harmonisation between the AI Act and MDR/IVDR

Alejandro Bes, head of legal, Novartis, and Francisco Javier García Pérez, counsel, Uría Menéndez, look at the proposed EU AI Act. The proposal has raised many concerns, such as whether medical devices using AI will also have to be notified to other bodies and what liability regime will apply to medical device manufacturers.

The European Union is currently making considerable efforts to update its legislative framework and adapt it to the digital era. On 21 April 2021, the European Commission issued a Proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence and amending certain Union legislative acts (AI Act). The AI Act aims to improve the functioning of the internal market by establishing a uniform legal framework to govern the development, marketing, and use of artificial intelligence (AI) in conformity with European Union values.

The AI Act will not only affect European Union-based companies, given that Article 2 provides that it would apply to (a) providers placing on the market or putting into service AI systems in the Union, irrespective of whether they are established in the European Union or a third country; (b) users of AI systems located in the European Union; and (c) providers and users of AI systems located in a third country, where the system’s output is used in the European Union. In other words, the AI Act may affect many companies worldwide, provided they are in some way connected (even if only commercially) with the European Union.

AI systems (understood as software that is developed with one or more of the techniques and approaches listed in Annex I of the AI Act and that can, for a given set of human-defined objectives, generate outputs such as content, predictions, recommendations, or decisions influencing the environments they interact with) are becoming increasingly important in the life sciences field. Nowadays it is common for medical devices to incorporate AI components (medtech solutions) or to actually be an AI system per se (i.e. software as a medical device, an increasingly common phenomenon in the field of healthtech solutions). Several players in the medtech industry have been focusing on how AI fits into the current European Union legal framework (mainly the General Data Protection Regulation; Regulation (EU) 2017/745 of the European Parliament and of the Council of 5 April 2017 on medical devices (MDR); and Regulation (EU) 2017/746 of the European Parliament and of the Council of 5 April 2017 on in vitro diagnostic medical devices and repealing Directive 98/79/EC and Commission Decision 2010/227/EU (IVDR)); see, for example, the comprehensive report on this complex subject-matter that the European Coordination Committee of the Radiological, Electromedical and Healthcare IT Industry issued in September 2020.

It is therefore undeniable that the approval of the AI Act would have a direct – and substantial – impact on the medtech industry. For this reason, several respected voices in the sector have pointed out concerning misalignments between the AI Act and MDR/IVDR (e.g. MedTech Europe, in its 6 August 2021 response to the open public consultation on the AI Act), which would not only make the existing procedure for developing and commercialising medical devices more complex, but could also lead to irreconcilable inconsistencies between these pieces of legislation.

A first ground for criticism is that, under the AI Act’s current wording, most medical device software (placed on the market or put into service as a stand-alone product or as a component of a hardware medical device) would likely be classified as a “high-risk AI system”. Indeed, Article 6 of the AI Act, coupled with Annex II (paragraph 11), provides that all medical devices subject to a conformity assessment procedure by a Notified Body must be classified as high-risk AI systems. As medical devices consisting of AI systems usually qualify as software as a medical device, they are generally classified as class IIa or higher (Annex VIII, Chapter III, Rule 11 of the MDR) and are therefore subject to the abovementioned conformity assessment, meaning that they must be classified as high-risk AI systems. This is no minor issue, since medtech or healthtech service providers are subject to substantial obligations and formal requirements if the AI system is classified as high risk.

Moreover, this may lead to the contradictory situation whereby a medical device is not placed in the highest risk category under MDR/IVDR but is under the AI Act. This will most certainly result in additional administrative barriers to entry for new products, which could negatively affect both the incentives for developing innovative technologies and the timelines for those technologies to reach citizens.

The AI Act also establishes certification requirements that differ substantially from the procedures under MDR/IVDR (see Chapter 4, Articles 30 et seq., of the AI Act). The proposed text of the AI Act is unclear on how foreseeable simultaneous proceedings by the supervisory authorities – each applying its own, uncoordinated criteria – would be handled. If not properly addressed, this scenario would also increase uncertainty, complexity, legal costs, and delays for launching medical devices in European markets. Likewise, it is unclear whether the authorities referred to in the AI Act will be specialised enough in the field of medical devices to provide sector-specific assessments and controls, which is crucial given the complexity of the life sciences sector, or whether they will specialise only in AI systems.

Another controversial aspect is that the AI Act introduces a dual post-market control regime, carried out by two (potentially unconnected) supervisory authorities, for medical devices that incorporate or consist of AI systems. MDR/IVDR already establish strict and extensive vigilance and post-market controls on the safety, quality, and performance of medical devices, which are carried out by the manufacturers of the medical devices themselves and by the competent authorities ex officio. In this regard, the AI Act also establishes its own – and rather strict – post-marketing control framework (see Articles 65 and 67(1)). An uncoordinated multiplicity of post-marketing controls and authorities not only creates more legal uncertainty for medtech companies (in terms of applicable regulation, proceedings, reporting, etc.), but also hardly seems justifiable considering the high standards and severity of the mechanisms already in place under MDR/IVDR.

Finally, as MedTech Europe has correctly pointed out, some of the definitions in the AI Act proposal (e.g. “provider”, “importer”, “serious incident”, “putting into service”, “user”) do not match those in the MDR, while the definition of “risk” – which is essential for conformity assessments – is missing from the proposal altogether.

In conclusion, although a common and updated legal framework for AI systems in the European Union is undoubtedly needed, it must be tailored to the specificities of the medical device industry and its regulatory landscape. Irreconcilable or illogical regulatory contradictions in the field of medical technologies will only create legal uncertainty, overcomplicate matters and hamper innovation. Therefore, to avoid substantially (and unnecessarily) disrupting the medtech sector, the existing misalignments between the proposed AI Act and the current MDR/IVDR would need to be urgently addressed.
