On the 22nd of last month, the São Paulo Court of Justice ordered the São Paulo Metro to stop operating the facial recognition system it has been implementing in its facilities. The decision came in response to a public civil action filed by the Public Defender’s Office of the State of São Paulo, the Federal Public Defender’s Office, and several civil society organizations (Idec, CADHu, Intervozes and Article 19), which demands that the Metro be barred from capturing users’ biometric data and be ordered to pay more than 42 million reais in damages for collective pain and suffering.
The plaintiffs argue that implementing a facial recognition system in public transport violates several fundamental rights of users, in addition to being incompatible with the rules the LGPD establishes for the processing of personal data.
The Metro, in turn, argues that using the technology to guarantee users’ safety is legitimate, and also raises the possibility of applying it to locate missing children.
Although these arguments may, at first glance, suggest that installing facial recognition cameras is consistent with protecting children and adolescents by ensuring greater safety and eventually locating the missing, a deeper analysis reveals that the consequences of adhering to vigilantist practices and mass monitoring of the population violate the most basic principles and rights guaranteed by Brazilian legislation and international standards.
Child protection, therefore, cannot be invoked in defense of these practices – it should, in fact, serve as an additional argument against them. To see why, it is important to note the very high discriminatory potential of facial recognition technologies, especially when applied to Black people or members of other ethnic-racial minorities.
It is widely documented that the development of these technologies is permeated by unconscious biases (generalizations based on stereotypes) that undermine their accuracy when applied to non-white faces. This is because the faces used to train the underlying artificial intelligence are, in general, precisely those of white men, greatly increasing the chances that these technologies will wrongly identify a Black person as an offender, for example.
Examples of this phenomenon, which has come to be called algorithmic racism, abound. Last year, a case gained notoriety in which a Black American teenager was barred by facial recognition technology from entering a skating rink for allegedly having previously fought at the establishment – which, in reality, she had never entered. Even more alarmingly, a number of people of color have been wrongly arrested in the US because of facial recognition failures.
The chances that deploying these technologies in public transport will aggravate the already intolerable racial discrimination and inequality that plague Brazil are therefore immense. It is no exaggeration to foresee, for example, an increase in unwarranted police stops of Black adolescents for offenses they did not commit – situations that are already routine in the Brazilian context and would certainly become even more frequent once legitimized by technologies mistakenly regarded as neutral and highly accurate.
It is worth remembering that non-discrimination is one of the structuring principles of the Convention on the Rights of the Child, to which Brazil is a signatory, and also one of the pillars of our Federal Constitution. The Constitution lists among the fundamental objectives of the Republic “to promote the good of all, without prejudice as to origin, race, sex, color, age and any other forms of discrimination” (art. 3, IV) and, in article 227, guarantees children, adolescents and young people protection against any form of discrimination and oppression – a provision echoed by art. 3, sole paragraph, of the ECA.
These same instruments guarantee children and adolescents a series of rights that are utterly incompatible with a social model built on ubiquitous monitoring of the population. Freedom of association, expression, development and movement are all prerogatives guaranteed by child protection rules which, even if not immediately and absolutely curtailed by the use of facial recognition technologies, certainly find less room to be fully exercised in a State increasingly marked by vigilantism.
Privacy and personal data protection also deserve emphasis as rights of children and adolescents that are strongly threatened by the installation of facial recognition systems in public transport. The leak or misuse of these individuals’ data can have harmful consequences for several of their rights, which is why the processing of such data must always follow the highest standards of protection and security.
Moreover, the lack of transparency about how the collected data is processed raises concerns that it may be used to serve commercial interests, in violation of what the LGPD determines for the processing of children’s and adolescents’ personal data.
Making matters worse, studies indicate that the accuracy of facial recognition technologies also drops when it comes to correctly identifying children and adolescents, whose faces are still changing over time. The argument that these technologies could be used to locate missing children therefore does not hold up.
For all these reasons, General Comment No. 25 of the UN Committee on the Rights of the Child – a document detailing how the Convention on the Rights of the Child should be interpreted and applied in relation to the digital environment, an annotated version of which was recently released in a partnership between Instituto Alana and the São Paulo Public Prosecutor’s Office – provides that “any digital surveillance of children, associated with any automated processing of personal data, must respect the child’s right to privacy and must not be carried out routinely, indiscriminately or without the knowledge of the child or, in the case of very young children, that of his mother, father or caretaker; nor shall it occur without the right to object to such surveillance”. As is evident, the surveillance intended by the Metro is completely incompatible with what the UN determines.
In short, although installing facial recognition cameras may seem tempting from the perspective of child protection, that conclusion does not withstand deeper reflection on its real impacts and consequences. Children and adolescents have the right to grow up in a free, egalitarian society that gives absolute priority to their rights and best interests – and the presence of such cameras on public transport, without any doubt, points in the diametrically opposite direction.
* João Francisco de Aguiar Coelho is a lawyer with Instituto Alana’s Child and Consumption program