2021
DOI: 10.1177/09717218211003411

Suspect AI: Vibraimage, Emotion Recognition Technology and Algorithmic Opacity

Abstract: Vibraimage is a digital system that quantifies a subject’s mental and emotional state by analysing video footage of the movements of their head. Vibraimage is used by police, nuclear power station operators, airport security and psychiatrists in Russia, China, Japan and South Korea, and has been deployed at two Olympic Games, a FIFA World Cup and a G7 Summit. Yet there is no reliable empirical evidence for its efficacy; indeed, many claims made about its effects seem unprovable. What exactly does vibraimage me…

Cited by 26 publications (15 citation statements)
References 30 publications
“…This finding implies that students who rated themselves to have more knowledge of AI might be unaware of the biases in and inaccuracy of emerging technologies. Taken together, these facts indicate many students might be ignorant of the ways in which social biases and privileges can lead to harmful uses of EAI in the workplace, as shown in various studies on algorithmic biases (Rhue 2019; Crawford 2021; Moore and Woodcock 2021; Buolamwini and Gebru 2018). Even though the problem of algorithmic bias has now moved to the center of public discourse in Western media (Singh 2020), when it comes to a multi-national sample, this study indicates a clear lack of knowledge, as 51% of the respondents rated themselves below average in AI knowledge (Table 3).…”
Section: Biases and Privileges
confidence: 96%
“…Without the ability to backstage, empathic surveillance demands that a worker's persona must always be authentic and positive (Moore and Woodcock 2021). Under such conditions the regulation of emotion becomes work itself (Woodcock 2016; Cabanas and Illouz 2019). Indregard et al. (2018) refer to this type of personal estrangement that occurs under empathetic control as 'emotional dissonance'.…”
Section: Philosophical Background: From Taylorism To Empathic Surveillance
confidence: 99%
“…Another example of dataveillance is VibraImage, software marketed by the Russian company ELSYS and capable of detecting various emotional states through the head's micro-movements (Wright, 2021). This software was deployed during the 2014 Sochi Olympics and the 2018 FIFA World Cup in Russia, and more recently in nuclear stations and convenience stores in Japan.…”
Section: Introduction
confidence: 99%
“…This software was deployed during the 2014 Sochi Olympics and the 2018 FIFA World Cup in Russia, and more recently in nuclear stations and convenience stores in Japan. The technology is being considered by the Korean National Police Agency for lie-detection purposes (Wright, 2021).…”
Section: Introduction
confidence: 99%