
Racist Algorithms against Refugees

Today, global migration dynamics are permeated by the development and application of increasingly sophisticated and invasive digital technologies, which largely condition people’s destinies and shape strategies of power. Within the framework of social engineering, digital technologies amplify discrimination through tools for measuring biological traits and human behavior. They also make it possible to link political strategies ranging from the micro-reality of the individual to the macro-reality of transnational power relations.

The war between Russia and Ukraine has already displaced more than eight million people into the European Union. Faced with the looming disaster, the European Commission proposed in March 2022 to activate the Temporary Protection Directive, which aims to provide immediate protection to displaced persons. This Directive, created in 2001, had never been applied before, even though masses of refugees had already knocked on the EU’s doors. For some analysts, this reflects a political resistance to implementing certain legal mechanisms that is rooted in prejudice against people from non-European countries, as Professor Meltem İneli Ciğer of the Faculty of Law at Suleyman Demirel University has argued.

The decision to implement the Directive converges with the EU’s own information policy on border management, human mobility, and asylum, which rests on the permanent surveillance of groups deemed undesirable. This political-legal framework is embodied in the very technologies deployed to create barriers, instead of facilitating people’s access and guaranteeing the rights established by the international treaties regulating asylum and refuge.

Given the differences in how refugees are treated according to race, ethnicity, culture, and nationality, the rapid mobilization of EU members to receive Ukrainian refugees gives the impression that identification procedures were, to a certain degree, suspended for them.

On the other hand, the waves of people from Central American countries heading for the United States are fleeing violence, hunger, and climate change. The treatment accorded to Latin American migrants seeking refuge depends on the government in power. Still, they are invariably subject to containment policies through security surveillance devices that operate in silence and express contempt, social prejudice, and racism. In 2019, then-President Donald Trump reinforced the idea of building a wall between the U.S. and Mexico while presenting an alternative solution, a smart wall: “The walls we are building are not medieval walls. They are smart walls designed to meet the needs of front-line border agents.”

In fact, surveillance rules and procedures remain in place and reveal the unequal treatment of Ukrainian refugees compared with those from non-European countries. The same is true when it comes to the singling out of Latin American migrants. This is evident in the impact of algorithms on migration dynamics: what is presented as an essential step of any identification process is, in practice, the classification, segregation, privileging, or punishing of certain social groups.

Moreover, the bias of the code programmed to identify personal profiles, drawing on racial traits and behavioral characteristics and tied to the governance of migration as a whole, is evident. This trend encompasses a wide range of actions and institutions involved in surveillance for monitoring and control. It is a situation identified by experts and activist movements, notably the academic research of Joy Buolamwini of the Massachusetts Institute of Technology (MIT), presented in the documentary Coded Bias.

In research conducted at the MIT Media Lab, Buolamwini, who is Black, placed her face in front of facial recognition systems and was not detected. When she put on a white mask, however, her face was immediately recognized. The conclusion is that AI-based facial recognition systems carry algorithmic bias. That is, the algorithms are driven by classification processes that divide people into those the system deems worthy of recognition and those it literally excludes.
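
To make the notion of algorithmic bias concrete, the sketch below uses purely synthetic, invented numbers, not data from Buolamwini’s study or any real recognition system, to show how an audit can quantify a gap in recognition rates between demographic groups:

```python
# Illustrative only: the outcomes below are synthetic, made-up numbers,
# not results from Buolamwini's research or any real recognition system.

# 1 = the system correctly recognized the face, 0 = it failed to.
results = {
    "lighter-skinned subjects": [1, 1, 1, 0, 1, 1, 1, 1, 1, 1],
    "darker-skinned subjects":  [1, 0, 0, 1, 0, 1, 0, 0, 1, 0],
}

for group, outcomes in results.items():
    rate = sum(outcomes) / len(outcomes)
    print(f"{group}: recognition rate = {rate:.0%}")

# A gap such as 90% vs. 40% is exactly the kind of disparity an audit
# flags: the system systematically fails one group, i.e., algorithmic bias.
```

Audits of this kind, such as Buolamwini’s Gender Shades study, reported precisely such disparities, with error rates highest for darker-skinned women.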

There is, therefore, a bias in the EU’s political strategy that discriminates against entire groups, mostly from poor countries, often fleeing conflict and seeking refuge on European territory. On this issue, the EU articulates several legal, political, and technological devices to reinforce its external borders and prevent these groups from accessing the EU, including information systems dedicated to the collection, organization, and exchange of data, such as the European Asylum Dactyloscopy Database (EURODAC), the Schengen Information System (SIS II), and the Visa Information System (VIS). These systems link up with others that exercise effective coercive power, such as the European Border Surveillance System (EUROSUR) and Passenger Name Record (PNR) systems.

At the same time, refugee assistance programs depend on national policies and systems aimed at profiling the people entering host countries. In other words, algorithmic racialization becomes tied to national security interests.

There is thus a phenotypic and social classification that renders visible those individuals and groups allowed access to the common goods of citizenship, and renders the undesirables invisible. This concern was laid out in the White House report Big Data: Seizing Opportunities, Preserving Values, published in 2014 during the Obama administration, on the uses of personal data to privilege or exclude groups of people based on their racial and class status, primarily in relation to “housing, credit, employment, health, education, and the marketplace.”

According to Canada’s Centre for International Governance Innovation, this may extend to the fields of immigration, public safety, policing, and the justice system, “as additional contexts in which algorithmic big data processing affects civil rights and liberties.”

Skin color and other phenotypic traits anticipate, or enact, a pre-classification of the human beings who deserve privileges, distinguishing them from those who do not even have the possibility of being assisted and cared for, and this happens even before immigrants cross borders. The lightness of migrants’ skin, like their national origin, anticipates their detection even before their data appear in the databases.

Other, more sensitive personal data, such as political opinions and religious beliefs, are identified later, when the information is cross-checked against other databases, and may therefore enter the assessment procedure at a later stage. These databases are structured in a biased way, legitimizing differences between human beings and reproducing inequalities among individuals.
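
To illustrate the staged logic described above, here is a purely hypothetical sketch; every field, rule, and nationality code is invented for the sake of the example, and it does not depict any real system. It shows how a pre-classification by origin can determine who is even subjected to the later database checks:

```python
# Hypothetical sketch: every rule, field, and code here is invented to
# illustrate the staged screening the text describes, not any real system.

PRIVILEGED_ORIGINS = {"UA"}  # illustrative: one nationality is fast-tracked

def pre_classify(person):
    """Stage 1: pre-classification by national origin, before any database lookup."""
    return "fast_track" if person["nationality"] in PRIVILEGED_ORIGINS else "full_screening"

def assess(person, watchlists):
    """Stage 2: only people flagged at stage 1 are cross-checked against
    further databases, where more sensitive data enter the assessment."""
    if pre_classify(person) == "fast_track":
        return "admitted"                      # identification effectively suspended
    flagged = any(person["id"] in wl for wl in watchlists)
    return "detained" if flagged else "admitted"

watchlists = [{"X123"}, set()]                 # synthetic lookup tables
print(assess({"id": "A001", "nationality": "UA"}, watchlists))  # admitted without checks
print(assess({"id": "X123", "nationality": "HN"}, watchlists))  # detained after cross-check
```

The point is not the code itself but where the bias lives: the outcome is largely fixed at stage 1, before any data are consulted, which is what it means for detection to precede the databases.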

The current procedures are not exceptional measures; they permanently promote a division between privileged groups, on the one hand, and, on the other, the classification, segregation, isolation, punishment, and banishment of the undesirables, justified by the risk they supposedly pose to the State and society for not belonging to the white and Christian European social “club.”

It is essential to understand and rethink the role of the information policies behind surveillance systems for the control of human mobility. Because, contrary to what it may seem, even white-skinned, blue-eyed, Christian immigrants can also suffer, in one way or another, at some point, the discrimination of systems that bet, above all, on fear of the foreigner.

*Translated from Spanish by Janaína Ruviaro da Silva

Author

Sociologist and document manager with a doctorate in Information Science (IBICT-UFRJ). Researcher in the group Critical Studies in Information, Technology and Social Organization at IBICT.
