In 1956, the author Philip K. Dick imagined, in ‘Minority Report’, a world in which the police could predict crimes before they happen. In 2023, the European Union (EU) plans to begin adopting at its borders an artificial intelligence (AI) system to predict migrant flows towards the continent and the tensions that may derive from them, a controversial algorithmic tool being developed with prominent Catalan participation.
The project, known as ITFLOWS, seeks to gather information about migrant flows in order to better manage the arrival, relocation and integration of people trying to reach Europe while avoiding the dangers of the Mediterranean, where some 1,924 people drowned or disappeared last year, according to the UN. It has received a public subsidy of 4.87 million euros.
This technology will complete the deployment of the so-called ‘smart border’, a surveillance system being rolled out in Spain, Italy and Greece, the EU countries that border Africa. This modernization of European borders involves the use of facial recognition cameras, surveillance drones and the extraction of personal and biometric data.
Led by the UAB
To develop the project, Brussels assembled a consortium in September 2020 of 13 universities, economic institutes, humanitarian organizations and the Greek technology company Terracom, coordinated by the Universitat Autònoma de Barcelona (UAB), which was financed with approximately 761,000 euros. Nicknamed EUMigraTool, the system aims to predict future flows to Europe based on public immigration, economic and demographic data from organizations such as Eurostat, the International Monetary Fund (IMF) or the United Nations. It also aims to “identify tension risks arising from migration” by analyzing what is said on social networks.
When finished, this prediction tool will be accessible only to NGOs and to municipalities in migrant arrival areas, to help them speed up their response; in any case, “neither national governments nor European institutions will have access to it”, the EU project’s coordinator, Cristina Blasi, told EL PERIÓDICO. The estimates made by the machine will not be followed blindly: they will be supervised by a user and will assist in decision making.
In parallel, three NGOs participating in the project have conducted over a hundred interviews with immigrants and asylum seekers to identify the factors driving migration. This information will not be “used to predict flows” but will feed public reports in which these NGOs will periodically set out integration policy recommendations for national governments and the European Commission, as Irene Viti, ITFLOWS manager at the Open Cultural Center, told this newspaper. Based in Barcelona and Greece, this humanitarian organization received public aid of 116,800 euros.
Multiple risks
The system raises doubts and poses risks. “The consortium is fully aware of the possible risks of endangering human rights”, reads a January 2021 document prepared by the ITFLOWS ethics board and released by the NGO Disclosure. Other reports warn of the danger of Member States using the data provided to “create ghettos” of immigrants, as well as to “stigmatize, discriminate against, harass or intimidate” them.
The NGOs’ access to refugees’ personal information, which may include details of an ethnic, religious or sexual nature, is particularly sensitive. Measures have therefore been proposed to prevent misuse of the interviews, such as obtaining the immigrants’ consent and anonymizing the data so that they cannot be identified. An internal ethics committee document states that these in-person interviews will serve to “build or perfect predictive capacity”, while Emma Teodoro, head of the ITFLOWS ethics committee, says that “they will not be useful to the model, but useful information to consider when preparing reports”.
The use of AI to make predictions is also raising doubts in the field. “It will serve to analyze past migration patterns, not to predict the future,” Ana Valdivia, professor of Artificial Intelligence, Government and Policy at the Oxford Internet Institute, explains to this newspaper. The expert points out that it is problematic that ITFLOWS also uses Google searches for its supposed predictions. “Correlation does not imply causation; just because people search for migration issues does not mean they will migrate,” she adds.
Another doubt is what will happen when the project’s funding ends in August 2023. Its promoters intend for ITFLOWS and its forecasting tool to end up in the hands of NGOs so they can coordinate their actions. The interviewees do not agree on whether the tool will remain a prototype or be ready for use, but its fate is uncertain. “We have not been able to determine which is the best way forward, but what we are not going to do is hand it over to governments”, says Blasi.
Militarization of borders
Fear that ITFLOWS could be used for discriminatory purposes also stems from the interest shown in the tool by Frontex, which has already been used to test the data. The European Border and Coast Guard Agency, which spearheads the militarization of borders with the backing of Brussels, appears in the project’s internal documents as a member of its User Board. In addition to the European Asylum Agency (EUAA), that board is made up of representatives of the Member States and the European Commission, and involves them in the “design” and “verification” of the prediction tool. “We know they are interested in the project,” says Teodoro.
Frontex’s track record is worrying. Its former executive director, Fabrice Leggeri, was forced to resign in April after being accused of covering up human rights abuses against refugees and immigrants. A journalistic investigation published in August by the German magazine ‘Spiegel’ revealed that the Greek coast guard carried out illegal ‘hot returns’ (pushbacks) on the high seas, and that Frontex was aware of it and kept the scandal quiet, even lying to the European Parliament. All paid for with taxpayer money.
The project coordinator denies that Frontex will be a user of this migration forecasting system. “At most, we can give them reports every two years to explain the results we have found,” explains Blasi, a law professor and researcher at the UAB.
The head of the ethics committee points out other considerations. “Today, Frontex can only use it for humanitarian purposes (…) There are limits. If someone flirts with a security use, we have taken measures to limit the risks,” she says, warning that misuse of the tool would bring “legal problems” and pointing to the risk of “potentially harmful use of the technology once the project is finished”.