2021 20th IEEE International Conference on Machine Learning and Applications (ICMLA)
DOI: 10.1109/icmla52953.2021.00200

Transformer Based Bengali Chatbot Using General Knowledge Dataset

Cited by 5 publications (2 citation statements)
References 4 publications
“…Transformers [5] have emerged as the dominant architecture for natural language processing (NLP) tasks such as text generation and machine translation, outperforming convolutional and recurrent neural network (RNN) models [6]. The research conducted by Masum et al. [7] found that a Bengali-language chatbot using the Transformer model achieved a BLEU score of 85.00, surpassing the Seq2Seq and Bidirectional-RNN models, which achieved BLEU scores of 23.50 and 17.23, respectively. The generative pre-trained transformer (GPT) [8]-[10] is a transformer-based model that has emerged as the state of the art in language modeling tasks such as text generation and question answering.…”
mentioning
confidence: 99%
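The BLEU comparison in this excerpt can be reproduced in principle with a standard toolkit. Below is a minimal sketch using NLTK's sentence_bleu; the tokenized Bengali strings are hypothetical examples, not drawn from the paper's general-knowledge dataset.

```python
# Minimal sketch: sentence-level BLEU scoring of a chatbot response
# against a reference answer, as in the cited comparison.
# The Bengali example below is hypothetical, not from the paper's dataset.
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

reference = [["ঢাকা", "হলো", "বাংলাদেশের", "রাজধানী"]]  # tokenized reference answer
candidate = ["ঢাকা", "হলো", "বাংলাদেশের", "রাজধানী"]    # tokenized model output

# Smoothing avoids zero scores when higher-order n-grams are missing,
# which is common for short chatbot replies.
smooth = SmoothingFunction().method1
score = sentence_bleu(reference, candidate, smoothing_function=smooth)
print(f"BLEU: {score * 100:.2f}")  # reported on a 0-100 scale, as in the excerpt
```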
“…When integrated into other generative models, they introduce essential mechanisms and techniques such as attention, self-attention, multi-head attention, and positional encoding [101]. This integration has unlocked practical applications of transformers in various domains, including text generation for creative writing [102], chatbots [103], code generation [104], and programming assistance [105]. Notably, incorporating transformers into other GAI has led to significant advancements in image synthesis.…”
¹ https://ai.google/static/documents/google-about-bard.pdf, accessed Jan 3rd, 2024
Section: B. Typical GAI Models
mentioning
confidence: 99%
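To make the excerpt's terminology concrete, here is a minimal sketch of scaled dot-product self-attention, the core operation behind the multi-head attention it refers to. This is a pedagogical illustration in PyTorch, not code from either paper; the function name, tensor shapes, and toy inputs are all illustrative.

```python
# Minimal sketch of scaled dot-product attention (Vaswani et al., "Attention
# Is All You Need"); a pedagogical illustration, not the cited papers' code.
import math
import torch

def scaled_dot_product_attention(q, k, v):
    # q, k, v: tensors of shape (batch, seq_len, d_k)
    d_k = q.size(-1)
    # Compare every query against every key; scale by sqrt(d_k) to keep
    # softmax gradients stable as the dimension grows.
    scores = torch.matmul(q, k.transpose(-2, -1)) / math.sqrt(d_k)
    weights = torch.softmax(scores, dim=-1)  # attention distribution over positions
    return torch.matmul(weights, v)          # weighted sum of value vectors

# Toy usage: one batch, 4 token positions, 8-dimensional representations.
x = torch.randn(1, 4, 8)
out = scaled_dot_product_attention(x, x, x)  # self-attention: q = k = v
print(out.shape)  # torch.Size([1, 4, 8])
```

Multi-head attention runs several such operations in parallel over learned projections of q, k, and v, and positional encoding injects token-order information that the attention operation itself is blind to.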