2018 IEEE Symposium Series on Computational Intelligence (SSCI)
DOI: 10.1109/ssci.2018.8628625

Structured Prediction Networks through Latent Cost Learning

Cited by 2 publications (1 citation statement)
References 8 publications
“…While in optimization problems local solutions often produce optimal results, structured prediction represents a valid alternative to solve NLP tasks requiring complex output, such as syntactic parsing (Roth and Yih, 2004), co-reference resolution (Yu and Joachims, 2009; Fernandes et al., 2014), and clustering (Finley and Joachims, 2005; Haponchyk et al., 2018). Nonetheless, relatively few works extend structured prediction theory to deep learning (Durrett and Klein, 2015; Weiss et al., 2015; Kiperwasser and Goldberg, 2016; Peng et al., 2018; Milidiú and Rocha, 2018; Wang et al., 2019). In particular, when it comes to clustering, designing a differentiable loss function that captures the global characteristics of a good clustering is particularly hard; for this reason, when dealing with coreference resolution, a closely related task, Lee et al. (2017) use simple losses, which already perform well but do not strictly take into account the cluster structure.…”
Section: Structured Prediction
Citation type: Mentioning
Confidence: 99%