Policy brief
17.11.2025

Digital Evidence in Refugee Status Determination

The latest AFAR Policy Brief by William Hamilton Byrne and Professor Thomas Gammeltoft-Hansen examines the rapid rise of digital evidence in asylum procedures and calls for a balanced, human rights–compliant framework that recognises both the risks and the potential benefits for asylum seekers.

In this policy brief published under the Algorithmic Fairness for Asylum Seekers and Refugees (AFAR) Project, Assistant Professor William Hamilton Byrne and Professor Thomas Gammeltoft-Hansen from the MOBILE Centre of Excellence for Global Mobility Law at the University of Copenhagen explore how states are increasingly using digital forms of evidence in refugee status determination (RSD), ranging from mobile phone data and biometrics to open-source information. These practices are emerging in a decision-making environment already marked by uncertainty, limited verifiable documentation, and significant power asymmetries.

The authors show that governments across Europe have begun adopting digital tools for identification, credibility assessment, and fraud detection, including automated document checks, dialect recognition software, and large-scale data extraction from applicants’ devices. Yet these developments have outpaced the establishment of common standards or safeguards. Current research has focused primarily on the human rights implications, especially privacy concerns linked to phone searches and the opacity of algorithmic decision-support systems. Ongoing litigation before the European Court of Human Rights and the classification of asylum and migration as “high-risk” domains under the EU AI Act highlight the urgent need for clearer legal and procedural limits on digital evidence in RSD.

At the same time, the brief argues that digital evidence also holds important potential to support asylum seekers, their legal representatives, and civil society organisations. New uses of satellite imagery, geolocation, open-source investigations, and remote sensing can help document risks, contest internal protection assumptions, and challenge unjust decisions. Large-scale computational analyses of asylum decisions (what the authors term digital meta-evidence) offer new possibilities for identifying bias, structural inconsistencies, and patterns of unfairness in adjudication, with significant implications for individual cases and broader policy reform.

The authors conclude that policies on digital evidence must avoid simplistic assumptions that technological tools will solve evidentiary gaps or, conversely, that digitalisation is inherently harmful. A balanced, rights-based framework is required: one that regulates extraction and use, strengthens accountability and transparency, and recognises the constructive role that digital tools can play in levelling the evidentiary playing field for asylum applicants.

The Algorithmic Fairness for Asylum Seekers and Refugees (AFAR) project is a four-year collaborative research initiative hosted at the Centre for Fundamental Rights from 2021 to 2025. Funded by the Volkswagen Foundation through its Challenges for Europe programme, AFAR brings together six institutions across Europe to examine the fairness of automation and decision-making in migration and asylum governance.