Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop) 2023
DOI: 10.18653/v1/2023.acl-srw.31

Towards Efficient Dialogue Processing in the Emergency Response Domain

Cited by 1 publication (1 citation statement) | References 0 publications
“…Adapters represent a small number of additional parameters that can be added as trainable task-specific weights at each layer of the transformer architecture (Vaswani et al., 2017). They have been successful on a variety of tasks including speech recognition (Hou et al., 2021), cross-lingual transfer (Parovic et al., 2022) and classification tasks (Lee et al., 2022; Anikina, 2023; Metheniti et al., 2023), but there is very little research on using adapters for coreference resolution, and the only work that we are aware of uses parallel data for training (Tang and Hardmeier, 2023).…”
Section: Related Work
confidence: 99%
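The citation statement above describes adapters as small trainable modules inserted at each transformer layer while the backbone stays frozen. A minimal sketch of the bottleneck-adapter idea is shown below; the dimensions (768 hidden, 64 bottleneck) and the plain-numpy implementation are illustrative assumptions, not the setup used in the cited paper.

```python
import numpy as np


def adapter(hidden, W_down, W_up):
    """Bottleneck adapter: down-project, apply a nonlinearity,
    up-project, then add a residual connection back to the input."""
    z = np.maximum(hidden @ W_down, 0.0)  # ReLU after down-projection
    return hidden + z @ W_up              # residual connection


# Hypothetical sizes: hidden dim 768, bottleneck dim 64.
rng = np.random.default_rng(0)
d_model, d_bottleneck = 768, 64
W_down = rng.normal(scale=0.02, size=(d_model, d_bottleneck))
W_up = rng.normal(scale=0.02, size=(d_bottleneck, d_model))

x = rng.normal(size=(4, d_model))  # a batch of 4 token representations
out = adapter(x, W_down, W_up)
print(out.shape)  # (4, 768): same shape as the input, so it slots into a layer

# Only the adapter weights would be trained; per layer that is
# 2 * 768 * 64 = 98,304 parameters, tiny next to a frozen backbone.
adapter_params = W_down.size + W_up.size
print(adapter_params)
```

The residual connection means a zero-initialized up-projection leaves the frozen model's behavior unchanged at the start of training, which is one reason this design transfers well across tasks.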