2001
DOI: 10.1016/s0893-6080(01)00104-6

The enhanced LBG algorithm

Cited by 196 publications (119 citation statements)
References 21 publications
“…One can see that results are not regular across dictionary sizes. This is due to the learning of the dictionary: even though we use a K-Means algorithm, which gives almost stable dictionaries [31], they are still subject to some variation. However, one can see that there are emerging combinations.…”
Section: Global Dictionary (mentioning)
Confidence: 99%
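The run-to-run variation this statement describes is easy to reproduce. Below is a minimal Python sketch (the data `X`, dictionary size `k`, and all names are illustrative, not taken from the cited paper) that trains two K-Means codebooks from different random seeds; the resulting dictionaries typically differ slightly, which is exactly the instability noted above.

```python
import numpy as np

def kmeans_codebook(X, k, seed, iters=50):
    """Plain k-means: learn a k-entry codebook (dictionary) for data X."""
    rng = np.random.default_rng(seed)
    # Initialize the codewords as k distinct random samples from X.
    codebook = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each vector to its nearest codeword.
        d = ((X[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        # Move each codeword to the centroid of its cell.
        for j in range(k):
            members = X[labels == j]
            if len(members):
                codebook[j] = members.mean(axis=0)
    return codebook

# Hypothetical data: two seeds usually yield slightly different dictionaries.
X = np.random.default_rng(0).normal(size=(1000, 8))
cb_a = kmeans_codebook(X, k=16, seed=1)
cb_b = kmeans_codebook(X, k=16, seed=2)
```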
“…When each pixel of an MS image is mapped onto a color space partitioned into a set of mutually exclusive and totally exhaustive hyperpolyhedra, equivalent to a vocabulary of BC names, then a 2D multilevel color map (a 2D gridded dataset of a multilevel variable) is generated automatically (without human-machine interaction) in near real time (with computational complexity increasing linearly with image size), where the number k of 2D map levels (color strata, color names) belongs to the range {1, ColorVocabularyCardinality}. Popular synonyms of measurement space hyperpolyhedralization (discretization, partition) are vector quantization (VQ) in inductive machine learning from data (Cherkassky & Mulier, 1998; Elkan, 2003; Fritzke, 1997a, 1997b; Lee, Baek, & Sung, 1997; Linde, Buzo, & Gray, 1980; Lloyd, 1982; Patanè & Russo, 2001, 2002) and deductive fuzzification of a numeric variable into fuzzy sets in fuzzy logic (Zadeh, 1965). Typical inductive learning-from-data VQ algorithms aim at minimizing a known VQ error function, e.g., a root mean square vector quantization error (RMSE), given a number k of discretization levels selected by a user based on a priori knowledge and/or heuristic criteria.…”
Section: Introduction (mentioning)
Confidence: 99%
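For concreteness, the error function mentioned here can be written as the root mean square distance between each input vector and its nearest codeword. A minimal sketch of that measure (the function name and array shapes are illustrative):

```python
import numpy as np

def vq_rmse(X, codebook):
    """Root mean square VQ error: each vector in X (N x D) is mapped to
    its nearest codeword in codebook (k x D), i.e., to its cell of the
    partition, and the residual squared distances are averaged."""
    d = ((X[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    return np.sqrt(d.min(axis=1).mean())
```

The k nearest-codeword cells induced by the codebook play the role of the mutually exclusive, totally exhaustive hyperpolyhedra described in the statement.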
“…For example, in a bag-of-words model applied to CV tasks, a numeric color space is typically discretized into a categorical color variable (a codebook of codewords) by an inductive VQ algorithm, such as k-means; next, the categorical color variable is simplified by a first-order histogram representation, which disregards word grammar, semantics and even word order, but keeps multiplicity; finally, the frequency of each color codeword is used as a feature for training a supervised data learning classifier (Cimpoi et al, 2014). Unlike the k-means VQ algorithm, where the system’s free parameter k is user-defined based on heuristics and the VQ error is estimated from the unlabeled dataset at hand, a user can instead fix the target VQ error value, so that it is the free parameter k that is dynamically learned from the finite unlabeled dataset at hand by an inductive VQ algorithm (Patanè & Russo, 2001, 2002), such as ISODATA (Memarsadeghi, Mount, Netanyahu, & Le Moigne, 2007). This means there is no universal number k of static hyperpolyhedra in a vector data space suitable for satisfying any VQ problem of interest if no target VQ error is specified in advance.…”
Section: Introduction (mentioning)
Confidence: 99%
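The inversion described here (fix the target error, learn k) can be illustrated with a naive wrapper that grows the codebook until the target is met, reusing the hypothetical `kmeans_codebook` and `vq_rmse` from the sketches above. This is only an illustrative loop under those assumptions, not the actual algorithm of Patanè & Russo (2001) or ISODATA.

```python
def smallest_k_for_target(X, target_rmse, k_max=256):
    """Illustrative inversion of plain k-means: instead of fixing k and
    observing the error, fix a target error and grow k until it is met."""
    for k in range(1, k_max + 1):
        codebook = kmeans_codebook(X, k=k, seed=0)
        if vq_rmse(X, codebook) <= target_rmse:
            return k, codebook
    return k_max, codebook  # target not reached within the budget
```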
“…In VQ, the most important task is designing an efficient codebook. Several algorithms [6][7][8][9][10][11][12] have been published on how to generate a codebook. The LBG algorithm [6] is the most cited and most widely used algorithm for designing the VQ codebook.…”
Section: Introduction (mentioning)
Confidence: 99%
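A minimal Python sketch of the classical LBG design loop referenced here (binary codeword splitting followed by Lloyd refinement; the perturbation `eps`, the iteration counts, and all names are illustrative choices, not taken from [6]):

```python
import numpy as np

def lbg(X, k, eps=0.01, iters=50):
    """Classical LBG sketch: start from the global centroid, repeatedly
    split every codeword into a perturbed pair, and refine each stage
    with Lloyd (nearest-neighbor / centroid) iterations until the
    codebook holds at least k codewords."""
    codebook = X.mean(axis=0, keepdims=True)
    while len(codebook) < k:
        # Split: each codeword becomes two slightly perturbed copies.
        codebook = np.vstack([codebook * (1 + eps), codebook * (1 - eps)])
        for _ in range(iters):
            d = ((X[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
            labels = d.argmin(axis=1)
            for j in range(len(codebook)):
                members = X[labels == j]
                if len(members):
                    codebook[j] = members.mean(axis=0)
    return codebook[:k]  # splitting doubles the size, so trim to k
```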
“…Because the codebook initialization may be bad, the algorithm converges only to the nearest local minimum. This problem is called the local optimum problem [10]. In addition, it has been observed that the time required to complete the iterations depends on how good the initial codebook is.…”
Section: Introduction (mentioning)
Confidence: 99%
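This sensitivity is easy to observe with the sketches above: running the same design loop from different random initial codebooks on the same data usually ends at different distortions, each a different local minimum (`X`, `kmeans_codebook`, and `vq_rmse` are the hypothetical names defined earlier).

```python
# Different random initializations converge to different local minima,
# visible as different final quantization errors on identical data.
errors = [vq_rmse(X, kmeans_codebook(X, k=16, seed=s)) for s in range(5)]
print(errors)  # typically not all equal: each run found a nearby local optimum
```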