2023
DOI: 10.1111/1467-8551.12781
ChatGPT Undermines Human Reflexivity, Scientific Responsibility and Responsible Management Research

Dirk Lindebaum,
Peter Fleming

Abstract: With ChatGPT being promoted to and by academics for writing scholarly articles more effectively, we ask what kind of knowledge ChatGPT produces, what this means for our reflexivity as responsible management educators/researchers, and how an absence of reflexivity disqualifies us from shaping management knowledge in responsible ways. We urgently need to grasp what makes human knowledge distinct compared with knowledge generated by ChatGPT et al. Thus, we first explain how ChatGPT operates and unpack its intrinsic…


Cited by 21 publications (7 citation statements)
References 41 publications
“…As an editor, I see this as being where the real scholarly contribution lies. There is a significant difference between human-generated texts versus AI-generated texts (see Lindebaum & Fleming, 2023). I also fear that I could become disengaged without such unique human insights that both inspire and entertain me.…”
Section: Concluding Comments
confidence: 99%
“…Furthermore, even when a form of a theorizing style can be copied and can generate reasonably creative outputs to start with, many of these theorizing styles presuppose human reflexivity (Lindebaum & Fleming, 2023), the meta-representational ability of human beings to reflect on and interrogate the assumptions, biases and (consequential) privileges associated with particular styles of reasoning and knowing. ChatGPT has, as we have highlighted, no ability to reflect on, let alone question for itself, what it produces, or, for instance, its ethical implications.…”
Section: Abstract Reasoning and Creativity
confidence: 99%
“…So far, the debate in the field around LLMs has – perhaps unsurprisingly – been divided between critics and evangelists: those who, from a normative standpoint, are staunchly against LLMs based on what it would undermine and take away from our inherently human scholarship (e.g. Lindebaum & Fleming, 2023), and those who, convinced of its technological prowess, see its productive advantages and argue that, when adopted, it will augment, rather than undercut, human reasoning, creativity and innovation (e.g. Dwivedi et al, 2023).…”
confidence: 99%
“…Regarding research with AI, academics in journals across disciplines (including JMS) are vigorously debating whether, how, and when, AI may benefit research – and at what point it might damage academic integrity (see Cotton et al., 2023; Dupps Jr, 2023; Haenlein et al., 2019; Kulkarni et al., 2023; Leonardi, 2021; Raisch and Krakowski, 2021). Questions are raised regarding to what extent AI solutions may replace human reflexivity and how far AI might undermine responsible management research (Lindebaum and Fleming, 2023; Lindebaum et al. 2023; Moser et al., 2022).…”
Section: Introduction
confidence: 99%