Most approaches to facial recognition (“FR”) rest on the same mistake: a misunderstanding of what a fundamental right is and how it must be protected. Recital 4 of Regulation (EU) 2016/679 (“GDPR”) describes privacy, in the form of the protection of personal data, as a fundamental right whose scope is limited, shaped by all the other freedoms and rights colliding with it.
But perhaps the best way to understand this mistake is to take as an example a fundamental right not involved, prima facie, in FR apps, such as the freedom of expression. Let’s pose and answer a couple of questions regarding this right:
Is the freedom of expression an unlimited right? No, it is not. There are other fundamental rights at stake, such as the right to respect for private and family life, the right to truthful information and, of course, the freedom of expression of others, all of which may be impaired if we protect one person’s freedom without limits.
How, then, is it protected? In the same way as all other rights and freedoms: all of them are protected within the foggy boundaries marked by every other right acting at the same time and in the same place, as stated in article 29.2 of the Universal Declaration of Human Rights.
Coming back to FR apps, these reflections show that a legal and ethical approach must balance all the rights at stake, not only the right to privacy or to the protection of personal data. Therefore, FR must take into account the protection of, at least, the following fundamental rights:
⮚ The right to the protection of personal data. Known as informational self-determination, it belongs to the broader category of privacy, honour and reputation rights. It is protected by article 8 of the Charter of Fundamental Rights of the European Union (the “Charter”).
⮚ The right to liberty and security. It is protected by article 6 of the Charter.
⮚ The rights to life and to health. The right to life is protected by articles 2 and 3 of the Charter, the latter referring to respect for physical and mental integrity; the right to health is protected by article 34, which guarantees the right to social security, and by article 35 on health care.
⮚ The right to an effective remedy and to a fair trial. Protected by article 47 of the Charter, it includes the right to obtain evidence and to use it in a fair trial.
For instance, a security FR application within an airport may protect the liberty and security of travellers, or even their rights to life and to health, when there is a threat of a terrorist attack or a health risk (because of a pandemic or for other medical reasons). Meanwhile, forensic uses of FR, that is to say the application of FR techniques to already recorded footage within the limited time and space where a crime has been committed, will provide substantial evidence (the face and perhaps the identity of the perpetrator), thus protecting the victims’ right to an effective remedy and to a fair trial.
Article 9.2 of GDPR balances all the rights described above and sets out a series of cases where FR can be used. Therefore, under EU privacy law, implementing an FR system is not merely a matter of applying security measures, but of determining whether the controller is entitled to install such a system in the first place. Of course, the first exception allowing the use of FR is the explicit consent of the data subjects, but such consent is, in practice, only applicable to FR systems used for access control. For security solutions that identify people on the fly, other exceptions apply:
a) Substantial public interest, article 9.2.g of GDPR
The expression “substantial public interest” is not defined in the GDPR. Up to now, neither the European Data Protection Board nor any member state data protection authority has interpreted this concept. Despite this lack of guidance, the addition of the word “substantial” implies that serious reasons of public interest are required. In other words, from a public-security point of view, this kind of system should be available at least for the most threatened infrastructures and facilities.
Following this path, we could rely on Directive 2008/114/EC of 8 December 2008 on the identification and designation of European critical infrastructures, which requires member states to list these so-called critical infrastructures and to establish specific protective measures for them. The concrete list of such infrastructures is classified, but the criteria for designating a facility as critical are included in article 3.2 of Directive 2008/114, under which facilities related to public administration, food supply, chemicals, water, health, research, space, etc., may be considered critical.
Therefore, buildings and surroundings related to strategic industries or supply (energy, chemicals, water, etc.), transport, cultural heritage, or facilities and places where large numbers of people gather (such as stadiums, concert halls, fairs, shopping centres, etc.) are suitable for the use of FR, in order to protect people from intentional attacks such as terrorism, but also to prevent other kinds of crime, to gather evidence about them, to find missing or wanted persons, etc. This interpretation has recently been confirmed by the Agencia Española de Protección de Datos (AEPD), the Spanish data protection authority, in its report 0031/2019.
b) Reasons of public interest in the area of public health, article 9.2.i of GDPR
As mentioned above with regard to the rights to life and to health, article 9.2.i of GDPR is tailor-made for a pandemic. Under it, the so-called special categories of personal data (biometric, health or genetic data, etc.) may be processed in order to fight against “serious cross-border threats to health”: for instance, to search for persons not wearing a mask, or for individuals, masked or not, who are breaching a quarantine or are being sought by the security forces for any reason.
c) Establishment, exercise or defence of legal claims, article 9.2.f of GDPR
The aforementioned right to an effective remedy and to a fair trial may prevail over the right to the protection of personal data where the processing of biometric data (using FR, for instance) is proportionate and necessary to prepare, bring or defend against any kind of legal claim, “whether in court proceedings or in an administrative or out-of-court procedure”, as stated in recital 52 of GDPR.
But how can we decide whether FR is proportionate and necessary? Again, no data protection authority has interpreted article 9.2.f of GDPR. The forensic use of FR, that is to say the application of FR tools to already recorded footage, for instance to recover evidence in a criminal or civil case, can surely be deemed proportionate and necessary, given the limited amount of biometric data that will be processed. Using FR in real time to collect evidence of events that have not yet occurred will be much harder to justify.
But beyond the legal constraints at EU level, FR also entails ethical constraints that, in fact, relate to the same fundamental rights mentioned above. In other words, the ethical debate can also be framed as balancing the rights of all the people directly or indirectly affected by the data processing; however, it cannot be resolved merely by fulfilling legal requirements, but by applying the principles of proportionality and necessity.
FUTURE has developed several tools that allow the system manager not only to fulfil legal provisions, but to implement data protection by design; that is to say, an ethical perspective in FR apps that respects data subjects’ rights even before the solution is installed:
⮚ FR accuracy without demographic bias. FUTURE uses a curated and refined training dataset of more than 10 million images (from public and private databases) with a particular focus on collecting images of underrepresented groups. Furthermore, additional methods are applied during training to reduce ethnicity and gender bias.
Such training methods protect the proportionality and necessity principles by means of:
o Accuracy, article 5.1.d of GDPR.
o Limitation of the purposes for which data are collected, article 5.1.b of GDPR.
⮚ Not processing biometric data when not needed. Where facial identification is not needed, the system only handles images, without matching biometric data against the database. This protects the proportionality and necessity principles by means of:
o Data minimisation, article 5.1.c of GDPR.
o Limitation of the purposes for which data are collected, article 5.1.b of GDPR.
⮚ Face blurring or hiding. All faces detected by the system that do not correspond to individuals enrolled in the database are blurred or hidden in real time (a minimal sketch of this measure follows the list). This helps achieve the proportionality and necessity principles by means of:
o Data minimisation, article 5.1.c of GDPR.
o Limitation of storage and conservation of data, article 5.1.e of GDPR.
⮚ Deletion of detected or identified images. The user can choose a retention period for the faces detected or identified by the system (a retention job is sketched after the list). The following benefits are achieved:
o Limitation of the purposes for which data are collected, article 5.1.b of GDPR.
o Limitation of storage and conservation of data, article 5.1.e of GDPR.
⮚ Encryption. System communications (between the camera, the edge station, the server, etc.) and the database itself can be encrypted (an encryption-at-rest example is sketched after the list), which provides:
o Integrity and confidentiality of the processed data, article 5.1.f of GDPR.
o Security of processing, article 32.1.a of GDPR.
⮚ Centralisation of the enrolled subjects’ database. Keeping the database on a single server, instead of on several devices, achieves the following goals:
o Security of processing, article 32.1.a of GDPR.
o Integrity and confidentiality of the processed data, article 5.1.f of GDPR.
⮚ Limited access to data in peripherals. Peripherals connected to the system, such as mobile phones, tablets, PDAs, etc., receive alerts but cannot download or retrieve any data from the database. The following benefits are achieved:
o Data minimisation, article 5.1.c of GDPR.
o Limitation of the purposes for which data are collected, article 5.1.b of GDPR.
o Integrity and confidentiality of the processed data, article 5.1.f of GDPR.
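By way of illustration, the following minimal Python sketch shows how the face-blurring measure described above could work using OpenCV. It is a sketch only: the is_enrolled matcher is a hypothetical placeholder for the actual biometric comparison against the enrolled gallery, not FUTURE’s implementation.

import cv2

# Standard OpenCV frontal-face detector shipped with the library.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def is_enrolled(face_crop) -> bool:
    """Hypothetical placeholder for the biometric comparison against the enrolled gallery."""
    return False  # by default, treat every detected person as a non-enrolled passer-by

def anonymise_frame(frame):
    """Blur every detected face that does not belong to an enrolled subject (data minimisation, art. 5.1.c GDPR)."""
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(grey, scaleFactor=1.1, minNeighbors=5):
        face = frame[y:y + h, x:x + w]
        if not is_enrolled(face):
            # A heavy Gaussian blur makes the face unrecognisable in stored or displayed footage.
            frame[y:y + h, x:x + w] = cv2.GaussianBlur(face, (51, 51), 0)
    return frame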
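Similarly, the configurable retention window could be enforced with a periodic deletion job along the following lines; the face_events table and captured_at column are illustrative assumptions, not FUTURE’s actual schema.

import sqlite3
from datetime import datetime, timedelta, timezone

def purge_expired_faces(db_path: str, retention_days: int) -> int:
    """Delete every stored face record older than the operator-chosen retention period (storage limitation, art. 5.1.e GDPR)."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=retention_days)
    with sqlite3.connect(db_path) as conn:
        cursor = conn.execute(
            "DELETE FROM face_events WHERE captured_at < ?",
            (cutoff.isoformat(),),
        )
        return cursor.rowcount  # number of records removed in this run

# Example: run daily from a scheduler with the retention window chosen by the user.
# purge_expired_faces("faces.db", retention_days=30)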
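Finally, as a sketch of encryption at rest, symmetric encryption of stored face templates could look like this, using the cryptography package; transport encryption between camera, edge station and server, as well as key management, are left out of this illustration.

from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in practice, the key would be loaded from a secure key store
cipher = Fernet(key)

def protect_template(template: bytes) -> bytes:
    """Encrypt a face template before writing it to the central database (arts. 5.1.f and 32 GDPR)."""
    return cipher.encrypt(template)

def recover_template(token: bytes) -> bytes:
    """Decrypt a stored template when the matcher needs it; raises an error if the data was tampered with."""
    return cipher.decrypt(token)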