2019
DOI: 10.1007/978-3-030-33110-8_1
Automatic Text Generation in Macedonian Using Recurrent Neural Networks

Cited by 6 publications (3 citation statements)
References 13 publications
“…The models achieved optimum results with 100 epochs and a batch size of 32 after testing various combinations of hyperparameters. The model was compiled with the Adam optimiser [ 53 ] and sparse categorical cross-entropy loss function [ 54 ]. All three classifiers included 80% data for training and 20% for testing the models.…”
Section: Methods (mentioning)
confidence: 99%
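The 80%/20% train/test split described in the excerpt above can be sketched in plain Python. This is a minimal illustration under stated assumptions, not the cited authors' code; the dataset and variable names are hypothetical:

```python
def train_test_split(samples, train_frac=0.8):
    """Split a dataset into training and testing portions.

    Mirrors the 80%/20% split described in the excerpt;
    an illustrative sketch, not the authors' implementation.
    """
    cut = int(len(samples) * train_frac)
    return samples[:cut], samples[cut:]

# Hypothetical dataset of 1000 labelled samples.
data = list(range(1000))
train, test = train_test_split(data)
print(len(train), len(test))  # → 800 200
```

In practice such a split is usually preceded by shuffling so that both portions reflect the overall class distribution; libraries such as scikit-learn provide this directly via `sklearn.model_selection.train_test_split`.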
“…The loss function computes the negative log probability of the true class, where it equals zero if the predicted probability of the true class is one, and as the predicted probability approaches zero, the loss function approaches infinity [60]. The sparse categorical cross-entropy loss function from the Tensorflow library [56] was used, which calculates the cross-entropy loss between the labels and predictions [56,61].…”
Section: Categorical Cross-entropy Loss Function (mentioning)
confidence: 99%
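The behaviour described in this excerpt — zero loss when the predicted probability of the true class is one, and loss growing without bound as that probability approaches zero — can be reproduced with a minimal pure-Python version of sparse categorical cross-entropy. This is an illustrative sketch, not the TensorFlow implementation the citing paper used:

```python
import math

def sparse_categorical_crossentropy(true_index, probs):
    """Negative log probability of the true class.

    `true_index` is an integer class label and `probs` a predicted
    probability distribution over classes. A pure-Python sketch of
    the loss described above, not TensorFlow's implementation.
    """
    return -math.log(probs[true_index])

# Perfectly confident, correct prediction: loss is zero.
print(sparse_categorical_crossentropy(2, [0.0, 0.0, 1.0]))

# Moderately confident prediction of the true class (index 1).
print(sparse_categorical_crossentropy(1, [0.1, 0.7, 0.2]))  # ≈ 0.357

# As the predicted probability of the true class shrinks,
# the loss grows toward infinity.
print(sparse_categorical_crossentropy(1, [0.45, 0.1, 0.45]))  # ≈ 2.303
```

The "sparse" variant takes integer labels directly, whereas plain categorical cross-entropy expects one-hot vectors; both compute the same negative log-likelihood of the true class.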
“…Papers by metrics group: Human-centric (HC): [50], [53], [100]. Machine-centric (MC): [60], [63]–[67], [83], [84], [86], [3], [69]–[71], [74], [95], [99], [122], [72], [73], [75]–[80], [21], [82], [90]–[92], [94], [102], [125], [6], [48], [49], [51], [52], [55], [56], [126], [18], [57], [58], [115], [119], [127], [128], [104], [112]–[114], [120], [129], [105], [106], [108], [110], [116], [117], [123]. Both: [5], [54], [59],…”
Section: Metrics Group (mentioning)
confidence: 99%