
Leading MEPs make counter-proposal on AI rulebook’s law enforcement chapter


The EU lawmakers spearheading the AI law have circulated a possible compromise on the provisions related to law enforcement, one of the most sensitive areas of the file.

The AI Act is a legislative proposal to regulate Artificial Intelligence based on its potential to harm people and their fundamental rights. The bill is currently being negotiated between the EU Parliament, Council and Commission in so-called trilogues.

On Tuesday (21 November), the MEPs involved in the law-making process are meeting to discuss the three main parts of the file that are still open: obligations for General Purpose AI, governance and law enforcement.

Ahead of the meeting, the European Parliament’s co-rapporteurs Dragoș Tudorache and Brando Benifei circulated a compromise text, seen by Euractiv, on the sensitive law enforcement chapter.

EU countries have been pushing to give their police forces more room for manoeuvre, whilst most MEPs have pushed back against it in the name of fundamental rights protection.

National security exemption

France has been pushing for a broad national security exemption within the EU Council. According to the co-rapporteurs’ proposal, there is unanimity across political groups for a more restrictive wording.

“This Regulation shall not apply to AI systems developed or used exclusively for military purposes. This Regulation is without prejudice to the competences of the Member States with regard to their activities concerning military, defence, or national security,” reads the text.

Remote Biometric Identification

As Euractiv previously reported, the European Parliament seems open to dropping its ban on the use of remote biometric identification (RBI) systems in publicly accessible spaces in favour of narrow law enforcement exceptions.

In their text, the leading MEPs removed the reference to 'real-time', meaning the exceptions would also apply to ex-post usage. The exceptions include the targeted search for specific victims of serious crimes on a pre-determined list, and the localisation of suspects of those crimes.

The usage should “only be deployed to confirm the specifically targeted individual’s identity, it shall not include automated matching of real time or ex-post video footage with existing databases,” reads the text.

Moreover, the application must be limited to what is strictly necessary regarding time, geographical and personal scope and subject to prior judicial authorisation. The lawmakers removed the possibility that the authorisation might come from an administrative authority.

The co-rapporteurs removed the exception that would have allowed law enforcement authorities that had not yet completed a fundamental rights impact assessment to continue using RBI systems in justified emergencies.

EU countries would have to pass national legislation regulating the use of RBI in public spaces and inform the Commission within 30 days of its adoption. A supervisory authority would have to be notified of each use of the technology.

Supervisory authorities should provide the Commission with annual reports. In turn, the EU executive will publish annual reports with aggregate data and exercise “systemic oversight and control” on RBI usage.

Additional bans

In exchange for dropping the ban on RBI systems, MEPs are set to obtain the extension of the list of prohibited AI applications.

The EU Parliament is pushing to prohibit biometric categorisation systems that infer sensitive information about people, such as their political orientation. The office of Brando Benifei proposed a new text meant to keep the ban while putting commercial services out of its scope.

Predictive policing is still an open discussion, as a previous iteration of the text kept it as a high-risk use case. According to the note, most political groups rejected that compromise and insisted on the European Parliament’s mandate to ban this application.

Concerning emotion recognition, the Council has so far shown openness to accepting a ban in the areas of the workplace and education. Still, the leading MEPs insist on also including law enforcement and migration, in exchange for flexibility in other parts of the file.

High-risk use cases

The AI Act includes a list of use cases deemed to pose a significant risk of harm to people's safety and fundamental rights. The co-rapporteurs suggest broadening the category of RBI to cover all uses not falling under the ban, including in privately accessible spaces.

Similarly, the two lawmakers want all applications of biometric categorisation and emotion recognition technologies not covered under the bans to be deemed high-risk.

The MEPs agree to place AI-powered polygraphs and similar tools on the high-risk list rather than prohibiting them, but only if they are used directly by police forces and in accordance with EU and national law.

Law enforcement and migration uses of AI systems such as crime analytics, deep fake detection, travel document verification and migration trend predictions were removed from the high-risk list. Any use of AI for border control would be considered high-risk, except for the verification of travel documents.

Law enforcement exemptions

European governments have been asking for a law enforcement carve-out from the four-eyes principle, which requires that a decision based on a high-risk system be verified by at least two people. On this point, the co-rapporteurs, together with most political groups, are pushing back.

Similarly, most lawmakers oppose exempting law enforcement authorities from the obligation to register their high-risk AI systems in the EU database. A possible compromise could be to register them in a non-public section of the database, but some MEPs insist on having at least some information publicly available.

The derogation allowing an AI system that has not undergone the conformity assessment to be put into service might be accepted in exchange for concessions elsewhere, with the caveat that authorisation must be requested within 48 hours.

[Edited by Nathalie Weatherald]
