2022
DOI: 10.1007/s11846-021-00514-4

Can I show my skills? Affective responses to artificial intelligence in the recruitment process

Abstract: Companies increasingly use artificial intelligence (AI) and algorithmic decision-making (ADM) in their recruitment and selection processes for cost and efficiency reasons. However, there are concerns about applicants' affective responses to AI systems in recruitment, and knowledge about affective responses to the selection process is still limited, especially when AI supports different selection process stages (i.e., preselection, telephone interview, and video interview). Drawing on the affective respon…

Cited by 54 publications (33 citation statements)
References 72 publications
“…While applicant reactions to algorithmic selection assessments, particularly game-based assessments, are generally positive ( Atkins et al, 2014 ; Lieberoth, 2015 ; Georgiou and Nikolaou, 2020 ; Leutner et al, 2020 ), the same cannot be said for fairness perceptions. At a general level, fairness perceptions are mixed and are not consistent between assessment tools ( Suen et al, 2019 ; Georgiou and Nikolaou, 2020 ; Köchling et al, 2022 ) but when looking at specific aspects of procedural justice, algorithmic tools are perceived as less fair than human ratings. Indeed, candidates perceive that there is less opportunity for behavioral control when assessments are automated compared to when they are judged by humans, meaning that they feel they are given less chance to perform and manipulate the raters to influence them toward a positive judgment ( Lee, 2018 ; Kaibel et al, 2019 ).…”
Section: Discussion
confidence: 99%
“…In contrast, Suen et al (2019) report that although there is a preference for synchronous video interviews compared to asynchronous, fairness perceptions do not vary when asynchronous video interviews are judged by humans compared to an algorithm. More recent research reveals further contrasts, with algorithmic tools being seen as less fair when used in later stages of the recruitment funnel and equally as fair as human ratings when used in earlier stages, such as resume screening ( Köchling et al, 2022 ).…”
Section: Algorithmic Recruitment and Procedural Fairness
confidence: 99%
“…Several different rules of procedural justice contribute to an individual's assessment of fairness (Cropanzano et al, 2015; Leventhal, 1980), especially during selection processes (Gilliland, 1993; Köchling et al, 2022; Langer, König & Fitili, 2018). Besides others, the list of the procedural rules consists of consistency, bias‐suppression, ethicality, and accuracy (Cropanzano et al, 2015; Leventhal, 1980).…”
Section: Theory and Hypotheses
confidence: 99%
“…Previous research showed that applicant reactions to AI‐supported selection tools are predominantly negative in terms of justice perceptions (e.g., Acikgoz et al, 2020; Köchling et al, 2022; Langer & Landers, 2021; Langer et al, 2020; Newman et al, 2020; Wesche & Sonderegger, 2021). While taking advantage of AI‐supported selection tools and not discouraging applicants and keeping them in the selection process at the same time, it is paramount to examine the possible actions that organizations can take to improve applicant reactions in personnel selection.…”
Section: Introduction
confidence: 99%