2021 20th IEEE International Conference on Machine Learning and Applications (ICMLA)
DOI: 10.1109/icmla52953.2021.00041

KerasBERT: Modeling the Keras Language

Cited by 7 publications (2 citation statements)
References 22 publications
“…When using BERT-base-uncased as the pretrained model, we use the Keras-BERT (Shorten and Khoshgoftaar, 2021) module to implement the Tokenizer and import the model.…”
Section: Implementation Details
confidence: 99%
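The statement above cites the Keras-BERT module for tokenization. As a rough illustration of what a BERT tokenizer does internally, here is a minimal pure-Python sketch of greedy longest-match-first WordPiece splitting; the `wordpiece_tokenize` helper and the toy vocabulary are assumptions for illustration, not the Keras-BERT API (which ships the full 30k-piece BERT vocabulary):

```python
def wordpiece_tokenize(word, vocab):
    """Greedy longest-match-first WordPiece split, as BERT-style tokenizers do.

    Continuation pieces carry a '##' prefix; a word with no valid split
    maps to the unknown token.
    """
    tokens, start = [], 0
    while start < len(word):
        end = len(word)
        current = None
        # Try the longest remaining substring first, shrinking until a
        # vocabulary piece matches.
        while start < end:
            piece = word[start:end]
            if start > 0:
                piece = "##" + piece  # mark non-initial pieces
            if piece in vocab:
                current = piece
                break
            end -= 1
        if current is None:
            return ["[UNK]"]  # no piece matched: whole word is unknown
        tokens.append(current)
        start = end
    return tokens

# Toy vocabulary (an assumption; real BERT uses ~30,000 pieces).
vocab = {"un", "##aff", "##able", "play", "##ing"}
print(wordpiece_tokenize("unaffable", vocab))  # -> ['un', '##aff', '##able']
print(wordpiece_tokenize("playing", vocab))    # -> ['play', '##ing']
```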
“…The layers used and their function are described in the Keras API [11]. Rescaling layer: Preprocessing layer which rescales input values to a new range.…”
Section: Introduction
confidence: 99%
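The Rescaling layer referenced above applies the documented mapping output = input * scale + offset (e.g. scale = 1/255 to map 8-bit pixel values into [0, 1]). A plain-Python stand-in showing that arithmetic; the `rescale` helper is hypothetical and not the Keras layer itself:

```python
def rescale(values, scale, offset=0.0):
    """Map inputs to a new range via output = input * scale + offset,
    mirroring what a Keras Rescaling preprocessing layer computes."""
    return [v * scale + offset for v in values]

# Rescale 8-bit pixel intensities from [0, 255] into [0, 1].
pixels = [0, 128, 255]
print(rescale(pixels, scale=1.0 / 255))
```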