2021
DOI: 10.31273/eirj.v8i3.788

Use of Artificial Intelligence in Legal Technologies

Abstract: The use of artificial intelligence in the legal sector has flourished in recent years. This development is often met with excitement and unease. In this critical reflection, we analyse how artificial intelligence functions in modern legal technologies, and what its future implications are for the legal sector and critical legal thinking. We first discuss how machine learning and ‘Narrow AI’ are pertinent in this discussion, and how misleading the ‘hype’ on robot lawyers is. We then show how legal technologies ar…


Cited by 4 publications (4 citation statements)
References 2 publications
“…In this context, the state of the art in NLP is represented by DL architectures such as GPT, XLNet, or BERT [61]. Among these, BERT has been found to be particularly widely used in the medical field in general, and for stroke in particular, along with specialized versions fitted to these applications that improve their performance [22,41].…”
Section: Discussion
confidence: 99%
“…Developed by OpenAI, GPT stands as a remarkable example of the transformative capabilities of large-scale neural language models [ 44 ]. At its core, GPT is founded upon the innovative Transformer architecture, a model that has revolutionized the field by effectively capturing long-term dependencies within sequences, making it exceptionally well-suited for tasks involving language understanding and generation [ 45 , 46 ]. The GPT family has multiple versions: The GPT-2 model, with 1.5 billion parameters, is capable of generating extensive sequences of text while adapting to the style and content of arbitrary inputs [ 47 ].…”
Section: Methods
confidence: 99%
“…Different transformer-based models are used in different natural language processing (NLP) tasks, including text generation tasks such as novel/story writing and customer service. These models produce state-of-the-art performance depending on their parameters (Topal et al, 2021). Transformers handle the weaknesses of RNNs better because they use an attention mechanism that eliminates the sequential, word-by-word processing an RNN performs (Bin et al, 2021).…”
Section: Related Work
confidence: 99%
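The Related Work excerpt above contrasts the attention mechanism with an RNN's sequential processing. As a minimal illustration (not the cited authors' code, and using toy hand-written vectors rather than a trained model), the following sketch implements scaled dot-product attention in plain Python: every query is compared against all keys at once, so the sequence is processed in parallel rather than token by token.

```python
import math

def softmax(xs):
    # Subtract the max before exponentiating for numerical stability.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def attention(queries, keys, values):
    """Scaled dot-product attention over small toy vectors.

    Each query attends to every key simultaneously; the attention
    weights then mix the value vectors. Nothing here is recurrent,
    which is the contrast with RNNs drawn in the excerpt above.
    """
    d_k = len(keys[0])
    outputs = []
    for q in queries:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [dot(q, k) / math.sqrt(d_k) for k in keys]
        weights = softmax(scores)
        # Output is the attention-weighted sum of the value vectors.
        out = [sum(w * v[i] for w, v in zip(weights, values))
               for i in range(len(values[0]))]
        outputs.append(out)
    return outputs

# Toy usage: one query over a two-token sequence of 2-d key/value pairs.
result = attention([[1.0, 0.0]],
                   [[1.0, 0.0], [0.0, 1.0]],
                   [[1.0, 0.0], [0.0, 1.0]])
```

In full Transformer models the queries, keys, and values are learned linear projections of the token embeddings, and many such attention "heads" run in parallel; this sketch shows only the core weighting step the excerpt refers to.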