Police detain individuals assumed to be migrants in central Athens on Sunday, August 5, 2012. Between August 4, 2012, and February 22, 2013, Greek police detained almost 85,000 people of foreign origin on the streets of Athens to check their identification papers and legal status.
© 2012 Associated Press/Thanassis Stavrakis
(Athens) – Greece is planning a new police program to scan people’s faces and fingerprints that is inconsistent with international human rights standards on privacy and likely to amplify ongoing discrimination, Human Rights Watch and Homo Digitalis said today. Under the EU-funded program, the police would use hand-held devices to gather biometric information from people on a vast scale and cross check it against police, immigration, and private sector databases primarily for immigration purposes.
In recent years, Greek police have carried out abusive, and often discriminatory, stops and searches of migrants and other marginalized populations, including to enforce Covid-19 movement restrictions. This program would most likely facilitate and increase the unlawful practice of racial profiling. Greece should halt plans for the program.
“The European Commission is funding a program that will help Greek police to target and harass refugees, asylum seekers, and minority groups,” said Belkis Wille, senior crisis and conflict researcher at Human Rights Watch. “In a country where the police frequently demand to see documents without reasonable cause, this program would deliver a tech-driven tool to ramp up abuse.”
The Greek police signed a contract with Intracom Telecom, a global telecommunications company, sometime in the first half of 2019 to help create the “smart policing” program. The police announced the signing of the contract in December 2019.
The program will cost an estimated €4.5 million, 75 percent of it funded by the European Commission’s Internal Security Fund. The start of the program was initially planned for early 2021 and then delayed to August due to Covid-19-related restrictions. On September 15, the police paid Intracom in full, but by the end of the year there was no indication that the devices were being used in the streets.
Under the program, police will receive smart devices with integrated software to enable them to scan vehicle license plates, collect fingerprints, and scan faces. People’s biometric data can be immediately compared with data already stored in 20 databases held by national and international authorities. At least one of the databases and systems cited may already be collecting massive amounts of biometric data in public spaces.
Based on the technical specifications, the system will delete fingerprint scans immediately if there is no match but will store photographs for seven days. If the system finds a match for the fingerprints, photographs, or facial scans, the data will be retained for an unspecified amount of time.
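As a rough, purely illustrative sketch of the retention rules described in the technical specifications, the logic could look like the following; the data structures, field names, and function are hypothetical and do not come from the program’s actual software.

```python
# Hypothetical sketch of the retention rules described in the technical
# specifications; all names here are illustrative, not the program's code.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

PHOTO_RETENTION = timedelta(days=7)  # unmatched photographs kept for seven days


@dataclass
class Scan:
    kind: str                               # "fingerprint", "photograph", or "facial"
    captured_at: datetime
    expires_at: Optional[datetime] = None   # None means retained indefinitely


def apply_retention(scan: Scan, matched: bool) -> Optional[Scan]:
    """Return the scan with a retention deadline set, or None if it must be
    deleted immediately, following the rules described in the specifications."""
    if matched:
        # Matched fingerprints, photographs, or facial scans are retained
        # for an unspecified amount of time.
        scan.expires_at = None
        return scan
    if scan.kind == "fingerprint":
        # Unmatched fingerprint scans are deleted immediately.
        return None
    if scan.kind == "photograph":
        # Unmatched photographs are stored for seven days.
        scan.expires_at = scan.captured_at + PHOTO_RETENTION
        return scan
    return scan
```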
Greece is touting the new program as a more “efficient” way, among other things, to identify undocumented or improperly documented migrants.
The police are currently authorized to ask people for their identification documents. However, Human Rights Watch research in Greece has shown that even when migrants, asylum seekers, and other marginalized groups have documents, police often stop them and detain them in a police station for hours pending verification of their identity and legal status. The Greek police have used these powers in a discriminatory manner to target people based on their race, perceived nationality or ethnicity, or physical appearance.
Combined with police orders to target specific social groups, these powers have enabled repetitive, unjustified identity stops of migrants, asylum seekers, and marginalized groups such as homeless people, people who use drugs, and sex workers. The use of facial recognition technology, which some studies have found more likely to misidentify people of color, and the collection of biometrics could exacerbate these abusive police tactics, which constitute racial and other forms of profiling and harassment, the organizations said.
The new program also risks expanding these practices by allowing police to target these communities with stops on a wider scale. The Greek police themselves say that under the new program the number of daily police stops will increase.
Human Rights Watch has said that the Greek police should use their authority to stop people and require them to show identity documents only when based on a reasonable suspicion that the person is involved in an illegal activity. Human Rights Watch also has said that the police should develop and put in place systems to check the validity of identity documents without detaining people or gathering personal biometric data.
In its current form, the new program would not comply with Greek and European law. EU Directive 2016/680 – known as the Police and Criminal Justice Authorities Directive or the Law Enforcement Directive (LED) – is legislation parallel to the General Data Protection Regulation (GDPR), the EU’s data protection and privacy regime, and governs the processing of personal data for law enforcement purposes. Article 8 of the directive stipulates that personal data must be collected and processed in a lawful manner. Member states must enact laws governing data collection and processing that specify “the objectives of processing, the personal data to be processed and the purposes of the processing.”
Article 10 of the directive provides further protections for special categories of personal data, including data that reveals racial or ethnic origin. The collection of biometric data, such as fingerprints and facial scans, must be strictly necessary and subject to appropriate safeguards for the rights and freedoms of the data subject and have a legal basis. The only exceptions are data collection to protect a person’s vital interests, or when a person has made the data public.
Since August 2020, the Hellenic Data Protection Authority (DPA) has been investigating the lawfulness of this “smart policing” program, following a related request filed by the Greek digital rights organization Homo Digitalis.
On December 8, Human Rights Watch wrote to the Greek police, the European Commission, and Intracom, flagging its concerns with the program and asking questions about any measures taken to ensure that the program would not lead to human rights abuses. Only Intracom replied, stating that any questions about the program should be directed to the police.
Under international and European law, government collection or use of personal and sensitive data, including vehicle license plate numbers and biometrics, must comply with international human rights standards, specifically article 17 of the International Covenant on Civil and Political Rights (ICCPR) and article 8 of the European Convention on Human Rights on the right to privacy. Interference with the right to privacy is only permissible when it is neither unlawful nor arbitrary, that is, it must be both necessary and proportionate to the end sought.
Greek authorities have other tools at their disposal to enforce immigration laws and conduct policing, the groups said. Collecting this biometric information via the “smart policing” program – and the significant intrusion on privacy and threat to nondiscrimination rights that it represents – is neither necessary nor proportionate. Greece should not move forward with the program.
The European Commission should not fund any policing programs that collect personal and biometric data in ways that violate international human rights standards or the data protection standards enshrined in EU Directive 2016/680.
As Intracom is playing an integral role in the design and deployment of the “smart policing” program, the company should fulfill its due diligence responsibilities by suspending its support for the program, given the serious human rights implications, and by conducting and publishing a human rights impact assessment of the program.
Under the United Nations Guiding Principles on Business and Human Rights, technology companies have a responsibility to ensure that their products and services do not contribute to human rights abuses, including violations of privacy and nondiscrimination protections.
“This policing program is in fundamental conflict with the essence of human dignity and the protection of fundamental rights and freedoms in public spaces,” said Konstantinos Kakavoulis, co-founder of Homo Digitalis, a Greek digital rights organization. “The Greek government should not ignore the high risk this program will pose for enabling unchecked control if it is launched.”
For additional information, please see below.
“Smart Policing” Program
The “Smart Policing” program was announced in a 2017 publication that advertised upcoming programs of the European Commission’s Internal Security Fund, which existed from 2014 to 2020 to promote “the implementation of the Internal Security Strategy, law enforcement cooperation and the management of the Union’s external borders.” The commission said that the program was intended to “strengthen preventive policing” and “prevent unnecessary suffering for citizens.”
The Greek police published a 177-page technical specifications document about the program on April 18, 2021, stating that it aims, among other things, to facilitate the “identification and verification of citizens identity when stopped by the police” and the “effective control of third country nationals.” The Greek police say that the new policing tool will be a more “efficient” way to identify people with irregular immigration status than the current protocol of taking people without identification documents to the nearest police station.
The program’s smart devices will be deployed for use in everyday police work, offering thousands of officers access to biometric data from national and international databases. The program’s interface will allow police officers to collect fingerprints and take photos and facial scans, which the authorities can immediately compare with the other databases.
These databases are maintained by institutions that include Interpol, the US Federal Bureau of Investigation, the Greek Ministries of the Interior, Transport, and Foreign Affairs, and private firms including Teiresias, a Greek credit bureau.
Data Protection Authority Investigation
In July 2019, the Greek government passed data protection legislation, Law 4624/2019, to implement the GDPR and to transpose the provisions of EU Directive 2016/680 into national law. In December 2019, Homo Digitalis wrote to the Greek Minister of Citizen Protection inquiring whether the “smart policing” program had a legal basis and whether it complied with relevant data protection laws, including Greek Law 4624/2019. The police responded in February 2020, citing Law 3917/2011, which covers the police use of CCTV cameras in public spaces.
In June 2020, the Data Protection Authority issued a decision saying that Law 3917/2011 does not cover the use of facial recognition or other identification technologies in the cameras used in public spaces. The agency noted that facial recognition and identification technologies should be considered a separate data processing activity, and such a use could conflict with EU data protection standards.
Following a March 2020 Homo Digitalis request to examine and issue an opinion on the legality of the proposed new policing program, the Hellenic Data Protection Authority opened an investigation into the matter in August 2020 but has yet to issue an opinion. In its request, Homo Digitalis pointed out that Greece’s data protection law requires government bodies to conduct a data protection impact assessment and to consult the agency if they plan to initiate processing activities using new technologies with a high risk to the rights and freedoms of data subjects.
In their multiple engagements with Homo Digitalis, the police have failed to respond to questions about whether this impact assessment has taken place. On October 11, the Data Protection Authority confirmed to Homo Digitalis that it was reviewing the “smart policing” program.
EU Information Request
On October 14, based on the European Union’s rules on public access to documents, Human Rights Watch and Homo Digitalis requested access to all documents related to the Internal Security Fund decision to fund 75 percent of the program, any data protection impact assessment or other assessment carried out related to it, and any documents related to internal discussions or assessments of its legality under European and international law. On October 28, the European Commission responded to the request, stating that it did not hold any documents related to the program. On November 10, Human Rights Watch submitted a confirmatory application asking the commission to reconsider its position. As of January 18, the commission had not responded substantively to the application but informed Human Rights Watch that it was continuing to review it.
International Law
Article 17 of the International Covenant on Civil and Political Rights (ICCPR) affirms the right to privacy, which may not be subject to arbitrary or unlawful interference.
The UN Human Rights Committee, the authoritative body charged with interpreting the ICCPR, has held that “any interference with privacy must be proportional to the end sought and be necessary in the circumstances of any given case.” It has also stated that “gathering and holding of personal information in computers, data banks, and other devices, whether by public authorities or private individuals, must be regulated by law” and that every individual should have the right to know “what personal data is stored…and for what purposes” and “which public authorities or private individuals or bodies control or may control their files.” A person concerned that their data has been collected or used incorrectly should have recourse to have that information corrected or removed.
Article 8 of the European Convention on Human Rights protects the right to respect for private and family life, home, and correspondence. The European Court of Human Rights has held that protecting personal data is of fundamental importance to respect for the right to privacy. The court has developed a significant body of case law that treats the collection, retention, and use of personal data as an interference with a person’s private life; these standards are largely reflected in EU Directive 2016/680.
The court has stressed that it will exercise careful scrutiny of any measure that authorizes taking, retaining, or using personal data without consent, including to ensure that appropriate safeguards are in place to protect against compiling or using data “in a manner or degree beyond that normally foreseeable.”
The International Convention on the Elimination of All Forms of Racial Discrimination prohibits policies and practices that have either the purpose or effect of restricting rights on the basis of race. The Committee on the Elimination of Racial Discrimination, which interprets the convention, has specifically stated that “indirect – or de facto – discrimination occurs where an apparently neutral provision, criterion or practice would put persons of a particular racial, ethnic or national origin at a disadvantage compared with other persons, unless that provision, criterion or practice is objectively justified by a legitimate aim and the means of achieving that aim are appropriate and necessary.”
The UN Guiding Principles on Business and Human Rights say that companies should respect the right to privacy. To carry out this responsibility, companies should have “a policy commitment” to observe and respect human rights, a “due diligence process” to identify, prevent, and mitigate human rights-related impacts, and a remediation process to address any adverse human rights impacts they cause or contribute to.