2014 IEEE 5th International Conference on Software Engineering and Service Science (ICSESS)
DOI: 10.1109/icsess.2014.6933696
Non-myopic active learning with performance guarantee

Cited by 1 publication (1 citation statement)
References 7 publications
“…However, using only a single sample selection strategy in batch active learning can lead to poor results, as the selected samples may have a high information similarity (e.g., using the N-best method). To select the optimal subset of samples that represent the overall dataset, we optimized the sample selection problem using submodular function theory [35]. Specifically, we investigated the objective function of the near-optimal set of pronunciation dictionary samples and showed that our function had the submodularity property, which allowed active learning to obtain a near-optimal subset of the corpus using a greedy algorithm.…”
Section: Submodular Function
Confidence: 99%
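The statement above describes the standard recipe: when the selection objective is monotone and submodular, greedily adding the sample with the largest marginal gain yields a subset whose value is within (1 − 1/e) of the optimum. A minimal sketch of that greedy procedure, using a facility-location utility over pairwise similarities as an assumed stand-in for the citing work's actual objective:

```python
def facility_location(sim):
    """Build a facility-location utility from a pairwise similarity
    matrix `sim`: the value of a subset is how well it 'covers' every
    sample, i.e. each sample's best similarity to the subset.
    This utility is monotone and submodular."""
    n = len(sim)

    def utility(subset):
        if not subset:
            return 0.0
        return sum(max(sim[i][j] for j in subset) for i in range(n))

    return utility


def greedy_select(candidates, utility, k):
    """Pick k candidates greedily by marginal gain. For a monotone
    submodular `utility`, the selected subset achieves at least
    (1 - 1/e) of the optimal subset's value."""
    selected = []
    for _ in range(k):
        best, best_gain = None, float("-inf")
        for c in candidates:
            if c in selected:
                continue
            gain = utility(selected + [c]) - utility(selected)
            if gain > best_gain:
                best, best_gain = c, gain
        selected.append(best)
    return selected
```

On a toy similarity matrix where samples 0 and 1 are near-duplicates, the greedy rule keeps only one of them and spends the remaining budget on the dissimilar sample 2, which is exactly the redundancy-avoidance behavior the statement attributes to submodular selection over an N-best strategy.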