2012
DOI: 10.1007/978-3-642-33185-5_4
Metaheuristics for Tuning Model Parameters in Two Natural Language Processing Applications

Cited by 2 publications (4 citation statements)
References 6 publications
“…The value of τ 0 depends on the KSs, but it can be set to the smallest weight value that signals good triples in the KS of the biggest coverage. All threshold values can also be automatically optimised, e.g., as in [43].…”

Section: Algorithm
confidence: 99%
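The statement above refers to tuning acceptance thresholds automatically rather than fixing them by hand. A minimal sketch of that idea, using plain random search over a single threshold; the objective function `evaluate` and its shape are assumptions for illustration, not taken from the cited paper:

```python
import random

def tune_threshold(evaluate, low=0.0, high=1.0, iters=200, seed=0):
    """Random-search tuning of a single acceptance threshold.

    `evaluate(tau)` is assumed to return a quality score (e.g. F1)
    for a run using threshold `tau`; higher is better.
    """
    rng = random.Random(seed)
    best_tau, best_score = low, evaluate(low)
    for _ in range(iters):
        tau = rng.uniform(low, high)
        score = evaluate(tau)
        if score > best_score:
            best_tau, best_score = tau, score
    return best_tau, best_score

# Toy objective with a known optimum near tau = 0.3
best_tau, best_score = tune_threshold(lambda t: -(t - 0.3) ** 2)
```

A real setup would replace the toy objective with an end-to-end evaluation of the NLP pipeline, and could swap random search for a stronger metaheuristic (e.g. an evolutionary algorithm) without changing the interface.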
“…It can block some pre-defined junctions of links of selected types, or at least decrease the amount of activation going through link junctions of certain types, e.g. holonymy or antonymy. The value of was heuristically set to τ 0 /4, but it can be obtained during optimisation, as all other parameters, cf. [43]. The parameters µ and together control the maximal distance on which the initial activation of a node can influence its local subgraph.…”

Section: Activation Replication
confidence: 99%
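The mechanism described above — activation spreading that is damped per link type and cut off below a threshold, so a seed node only influences a bounded local subgraph — can be sketched as follows. The graph representation, the `DAMPING` table, and the parameter values are illustrative assumptions, not the cited paper's actual settings:

```python
from collections import defaultdict

# Hypothetical per-link-type damping: junctions of some link types
# (e.g. antonymy) pass less activation through than others.
DAMPING = {"synonymy": 1.0, "holonymy": 0.5, "antonymy": 0.25}

def spread_activation(graph, seeds, decay=0.25, min_act=0.05):
    """Breadth-first activation spreading over a typed link graph.

    `graph[node]` is a list of (neighbour, link_type) pairs. `decay`
    attenuates activation at each hop, and propagation stops once the
    passed activation falls below `min_act`, which bounds how far a
    seed's initial activation can reach into its local subgraph.
    """
    act = defaultdict(float)
    frontier = dict(seeds)
    for node, a in seeds.items():
        act[node] = a
    while frontier:
        nxt = {}
        for node, a in frontier.items():
            for nb, ltype in graph.get(node, []):
                passed = a * decay * DAMPING.get(ltype, 1.0)
                # Only propagate if above the cut-off and an improvement.
                if passed >= min_act and passed > act[nb]:
                    act[nb] = passed
                    nxt[nb] = passed
        frontier = nxt
    return dict(act)
```

Raising `decay` or lowering `min_act` widens each node's sphere of influence; both would be natural candidates for the same automatic optimisation as the thresholds.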
“…However, the state-of-the-art results for Polish suggest that further improvements of measures of semantic relatedness are still possible, for example by using a constraint-based approach, a dependency parser, and testing more measures with more parameters. Similarly, the attachment algorithm could be further improved by optimizing the parameters of the algorithms, for example by using meta-heuristics like in (Kłyk et al 2012).…”

Section: Conclusions
confidence: 99%