Findings
1. Digital identity systems are ubiquitous in refugee camps. We are witnessing a trivialisation of biometrics, whose uses now extend well beyond the established processes of registration and resettlement. In Mae La camp, refugees need to authenticate themselves biometrically in order to access food assistance, basic resources (such as charcoal) and medical care.
2. NGOs introduce biometric technologies in order to comply with donors’ ‘zero tolerance to fraud’ policies.
3. The vast majority of our refugee participants report that they were not asked to consent when registering their biometric data. Crucially, our participants were not offered an alternative system in case they had concerns, nor were they offered a chance to opt out.
4. The vast majority of our refugee participants were not given opportunities to ask questions about the uses of their biometric data. Refugees expressed concerns during our fieldwork, but did not feel comfortable raising them during the process of their biometric registration with NGOs or the UN. These concerns related to the safeguarding of their data (and the data of their children) and to whether the biometric scanners could cause bodily harm. Refugees were also unclear why biometric registration and authentication are necessary, but again they reported that they had been unable to ask these questions during the registration process. Without a full understanding of digital systems, and without being given the possibility to opt out without detriment, we question whether consent – when it is taken – can be meaningful.
5. Several of our refugee participants reported significant errors, especially in the use of facial recognition technology, which is routinely used for food assistance. This is not surprising, as biometric technologies have known biases when measuring othered bodies, whether in terms of gender, race, ethnicity or age. Yet some of our participants internalise such errors as personal failure rather than machine error. This ‘internalisation of inferiority’ is an example of how biometric technologies produce race (Browne, 2015; Fanon, 1958) and gender, and therefore amplify existing inequities within the refugee camp and the humanitarian context more broadly.
6. Facial recognition technologies have a clear gendered dimension, with female participants told ‘not to wear makeup’ or to ‘stop making themselves look different’ in cases of authentication failure. We observe that digital systems compound feelings of humiliation that are already present in situations of encampment.
7. Digital identity is not simply a form of identity provided in digital format: the conversion of human beings into data has significant consequences for dignity, privacy and freedom.
8. The transactional nature of digital identity systems cannot substitute for UN Sustainable Development Goal 16.9, ‘a legal identity for all’. While some digital identity systems offer efficiencies, they do not respond to our participants’ needs for equality, recognition, mobility and dignity.
9. In the visual participatory research, our refugee interlocutors express the values on which their ideal identification systems should be based. This process of reimagining, articulated through artworks, will inform our policy recommendations to NGOs and other stakeholders.
Please note: The data presented here is part of an ongoing research project involving collaborative fieldwork. If you wish to cite or reproduce any part of this material, please contact us first for permission and guidance, in line with our ethical commitments to participants and data protection protocols.
