2020 · DOI: 10.1007/s10606-020-09377-x
When the System Does Not Fit: Coping Strategies of Employment Consultants

Abstract: Case and knowledge management systems are widespread at the frontline of public agencies. However, such systems are designed for collaboration within the agency rather than for face-to-face interaction with clients. If used as a collaborative resource at the frontline, case and knowledge management systems might disturb the service provision by displaying unfiltered internal information, disclosing private data of other clients, or revealing the limits of frontline employees' competence (if they ca…

Cited by 15 publications (12 citation statements) · References 51 publications
“…All these applications bear risks of discrimination: these systems' accuracy may be higher for some types of crimes or for some ethnic groups. Similarly, digital technology was shown to restrict the freedom of public administration—the mythical ‘computer’, rather than every officer or the organization, was taking decisions about what a fair welfare subsidy is (Dolata, Schenk, Fuhrer, Marti, & Schwabe, 2020; Landsbergen, 2004). Although the publicity identified humans as the decision-makers accountable for fairness, in fact the work was and continues to be distributed between social and technical components.…”
Section: A Sociotechnical Perspective on Algorithmic Fairness (mentioning, confidence: 99%)
“…This measurement instrument is appropriate to determine what hinders individuals from learning in their organizations. In line with Dolata et al. (2021), the field of consultancy is a challenging work environment with highly complex tasks (O'Leary, 2020) and with a constant need for professional development. While studies often focus on the diverse facilitating factors for learning (Jeong et al., 2018), less is known about how to determine what hinders individuals from learning at the workplace.…”
Section: Practical Value of the Study (mentioning, confidence: 99%)
“…Consultancy is suitable in this regard due to its special focus on multiple, complex and challenging work environments; the complexity of its tasks; the high variety of solution strategies; and its diverse career development paths (Korster, 2022). These characteristics fall in line with a constant need for professional development and lifelong learning to keep up with the magnificent and ill-structured problems occurring at the workplace (O'Leary, 2020; Dolata et al., 2021). The required competences of consultants are diverse and dynamic as well (Wißhak and Hochholdinger, 2020; Helens-Hart and Engstrom, 2021; van der Baan et al., 2022).…”
Section: Introduction (mentioning, confidence: 99%)
“…AI-based decision-support tools require "considering digital data as reliable and complete representations of the phenomena" [23]; however, as mentioned before, research shows that classifications of people are not stable [30, 31] but are human constructs that arise through interpretative work [42]. Additional value may be added in the interaction between people, which could cause problems for computers, as they cannot experience the world as human beings do or grasp the dynamics involved in these contexts [28]. Examples from medical research include the development of ML tools to predict sepsis in patients.…”
Section: From Expert Systems to Machine Predictions (mentioning, confidence: 99%)