2020
DOI: 10.1080/1369118x.2020.1792530
The politics of deceptive borders: ‘biomarkers of deceit’ and the case of iBorderCtrl

Abstract: This paper critically examines a recently developed proposal for a border control system called iBorderCtrl, designed to detect deception based on facial recognition technology and the measurement of micro-expressions, termed 'biomarkers of deceit'. Funded under the European Commission's Horizon 2020 programme, the system is analysed in relation to the wider political economy of 'emotional AI' and the history of deception detection technologies. We then move on to interrogate the design of iBorderCtrl using pu…

Cited by 45 publications (14 citation statements)
References 33 publications
“…For example, the border control and immigration budget of the United States has increased by 6,000 per cent since 1980, with US$223 million of the 2019 budget set aside to Homeland Security for the development of border security technologies. The EU, by contrast, has allocated €34.9 billion to this field for the 2021–27 period, up from €13 billion for the 2014–20 period (Achiume, 2020; Daniels, 2018; Feldstein, 2019; Sánchez‐Monedero & Dencik, 2020).…”
Section: “Smart” Borders
“…For example, a lie detector algorithm was developed by QinetiQ, a British defence technology company, and funded by the UK Engineering and Physical Sciences Research Council. Another project is iBorderCtrl, conducted by Manchester Metropolitan University researchers, which benefited from a €4.5 million grant provided by the European Research Council (Sánchez‐Monedero & Dencik, 2020). This project made use of affect recognition technologies and was tested in Greece, Hungary and Latvia (Feldstein, 2019), while another highly criticized pilot study is the UK Border Agency's Human Provenance Pilot (HPP) Project, which aims to detect the nationality of asylum seekers through a DNA and isotope analysis (Privacy International & No Tech for Tyrants, 2020).…”
Section: Pilot Projects at the Intersection of Academia and Surveilla…
“…Furthermore, the reductionist categories of binary labels inferred by soft biometrics rely on criteria based on already available data that is unlikely to match the cultural and social diversity of the general population. Such uses of soft biometrics have been shown to lack scientific validation and often rely on deeply contested assumptions about physiological attributes and the relationship between people and data (Sánchez-Monedero and Dencik, 2020). Pre-selecting the digital representations of some voices collected from data infrastructures to inform the labelling of criminals or terrorists invokes a particular regime of recognition, in which institutional priorities shape which dispersed groups of individuals are recognized and algorithmically processed by law enforcement on the basis of sharing certain voice characteristics.…”
Section: The Politics of (Voice) Recognition
“…Application of emotion tracking has progressed from in-house research facilities (such as those used in neuromarketing to detect responses to adverts) to online and physical contexts, including emojis (Davies, 2016; Stark and Crawford, 2015), wearables (Lupton, 2016; Neff and Nafus, 2016; Picard, 1997), human-robot interaction (Bryson, 2018), education (Williamson, 2017), retail (Turow, 2017), employee behaviour (Davies, 2015; Grandey et al., 2013) and border control (Sánchez-Monedero and Dencik, 2019). The broader business strategy for emotional AI companies is ubiquitous usage of automated emotion detection in all personal, commercial and public contexts. To suggest a broad rule, if there is any form of value in understanding emotion in a given context, emotional AI has scope to be employed.…”
Section: On Emotion Sensing