2006
DOI: 10.1007/11957959_9
Comparison of Generative and Discriminative Techniques for Object Detection and Classification

Cited by 27 publications (22 citation statements)
References 13 publications
“…Using this structure, the negatives will be rejected quickly when they flow through the network while the positives traverse through several branches. By combining the evidence from these branches, we are able to compute p(O|I) exactly using (11).…”
Section: Probabilistic Boosting Network (PBN)
confidence: 99%
“…The above integrating rule is an instance of combining the generative mixture model with discriminative classification models [11]. To train the multiclass classifier and the binary object detectors, PBN invokes a boosting algorithm to perform feature selection based on a common feature pool.…”
Section: Introduction
confidence: 99%
“…It has been shown that discriminative approaches such as SVM [1] outperform model-based approaches due to their flexibility in estimating decision boundaries. Conversely, since discriminative methods are concerned with boundaries, all the classes need to be estimated jointly [2]. By contrast, one interesting characteristic that generative models have over discriminative ones is that they are learnt independently for each class.…”
Section: Problem Statement
confidence: 99%
“…of Theorem 3.2. We simply need to show that such an f_0 satisfies the condition of Lemma 3.2. Let w(x) = exp(−x^η), x ≥ 0.…”
Section: Dirichlet Mixture of Normals
confidence: 99%
“…In this paper, we give attention to these aspects. In general, machine learning classifiers fall into two categories: generative classifiers and discriminative classifiers. The generative classifiers try to learn the density distribution of a class of data and understand its basic characteristics from the training data, and then find an optimal representation of that class, whereas the discriminative classifiers try to model the boundaries between different classes, or to model the class-membership probability distribution directly.…”
Section: Introduction
confidence: 99%
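The generative-versus-discriminative distinction drawn in the citation statements above can be illustrated with a minimal sketch. The data, class names, and both toy models below are illustrative assumptions, not taken from the cited papers: a Gaussian fitted per class (generative, each class learnt independently) versus a logistic regression trained on both classes jointly (discriminative, modelling the boundary directly).

```python
import math

# Hypothetical toy data: one 1-D feature, two classes (values are illustrative).
class_a = [1.0, 1.2, 0.8, 1.1, 0.9]
class_b = [3.0, 2.8, 3.2, 3.1, 2.9]

# --- Generative classifier: model each class density independently ---
# Fit a Gaussian to each class; classify by the larger class-conditional
# likelihood (equal class priors assumed).
def fit_gaussian(xs):
    mu = sum(xs) / len(xs)
    var = sum((x - mu) ** 2 for x in xs) / len(xs)
    return mu, var

def gaussian_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

mu_a, var_a = fit_gaussian(class_a)   # learnt without ever seeing class b
mu_b, var_b = fit_gaussian(class_b)   # learnt without ever seeing class a

def generative_predict(x):
    return "a" if gaussian_pdf(x, mu_a, var_a) > gaussian_pdf(x, mu_b, var_b) else "b"

# --- Discriminative classifier: model the boundary directly ---
# Logistic regression trained on both classes jointly by gradient descent;
# removing or changing either class changes the fitted boundary.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

w, b = 0.0, 0.0
data = [(x, 0.0) for x in class_a] + [(x, 1.0) for x in class_b]
for _ in range(2000):
    gw = gb = 0.0
    for x, y in data:
        err = sigmoid(w * x + b) - y  # gradient of the log-loss
        gw += err * x
        gb += err
    w -= 0.1 * gw / len(data)
    b -= 0.1 * gb / len(data)

def discriminative_predict(x):
    return "b" if sigmoid(w * x + b) > 0.5 else "a"

print(generative_predict(1.05), discriminative_predict(1.05))  # both favor class a
print(generative_predict(3.05), discriminative_predict(3.05))  # both favor class b
```

The sketch makes the contrast from the quoted statements concrete: the Gaussian for class a is unaffected if class b's data changes, while the logistic-regression boundary depends on both classes at once.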