Findings of the Association for Computational Linguistics: NAACL 2022
DOI: 10.18653/v1/2022.findings-naacl.86
Speeding Up Entmax

Abstract: Softmax is the de facto standard for normalizing logits in modern neural networks for language processing. However, by producing a dense probability distribution, it gives each token in the vocabulary a nonzero chance of being selected at each generation step, which leads to a variety of reported problems in text generation. α-entmax of Peters et al. (2019) solves this problem, but is unfortunately slower than softmax. In this paper, we propose an alternative to α-entmax, which keeps its virtuous characteristics, but is a…
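To make the dense-vs-sparse contrast concrete, here is a minimal NumPy sketch (an illustration, not the paper's proposed method) comparing softmax, which assigns nonzero probability to every token, with sparsemax, the α = 2 member of the entmax family (Martins & Astudillo, 2016), which can assign exactly zero probability to low-scoring tokens. The function names and example logits are hypothetical.

import numpy as np

def softmax(z):
    # Dense: every entry of the output is strictly positive.
    e = np.exp(z - z.max())
    return e / e.sum()

def sparsemax(z):
    # Euclidean projection of z onto the probability simplex
    # (Martins & Astudillo, 2016); low-scoring entries become exactly 0.
    z_sorted = np.sort(z)[::-1]
    k = np.arange(1, z.size + 1)
    cumsum = np.cumsum(z_sorted)
    support = 1.0 + k * z_sorted > cumsum      # entries kept in the support
    k_max = k[support][-1]
    tau = (cumsum[support][-1] - 1.0) / k_max  # threshold subtracted from z
    return np.maximum(z - tau, 0.0)

logits = np.array([1.2, 0.9, 0.1, -1.0])
print(softmax(logits))    # ~[0.46 0.34 0.15 0.05] -- all tokens possible
print(sparsemax(logits))  # [0.65 0.35 0.   0.  ]  -- tail pruned to zero

α-entmax with 1 < α < 2 interpolates between these two behaviors. The speed concern the paper targets is visible in the sketch: computing the exact threshold τ for sparsemax/entmax requires a sort (or an equivalent threshold search) over the vocabulary, whereas softmax is a single vectorized exp-and-normalize pass.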

Cited by 0 publications
References 14 publications