2005 Seventh IEEE Workshops on Applications of Computer Vision (WACV/MOTION'05) - Volume 1 2005
DOI: 10.1109/acvmot.2005.15

Accurate 3D Tracking of Rigid Objects with Occlusion Using Active Appearance Models

Abstract: In this paper we present a new method for tracking rigid objects using a modified version of the Active Appearance Model. Unlike most of the other appearance-based methods in the literature, such as [3,5,6,9,11]…

Cited by 18 publications (16 citation statements)
References 14 publications
“…For example, some methods report good results, without giving actual numbers on accuracy, such as [1,14,21,22]. Approaches described by [21,23,34,35] are capable of handling partial occlusion or changing lighting conditions but cannot differentiate between deteriorating tracking conditions and lost tracks. Some methods are restricted in their degrees of freedom, e.g.…”
Section: Related Work
confidence: 99%
“…Online estimation is essentially a stochastic optimization problem, since only noisy objective function evaluations are available due to restrictions on the sample set size at each instance. There are a number of advantages of online/stochastic learning over batch learning⁴. For example, it has been widely reported that it can reduce training time [21], has the ability to escape from shallow local minima [22], minimizes the true risk rather than the empirical risk [23], requires far less memory, and it does not require the number of samples to be chosen a priori.…”
Section: Stochastic Gradient Descent
confidence: 99%
“…where n_b is the mini-batch size, stochastic optimization then proceeds by first…” (Footnote 4: Batch learning minimizes the empirical risk over the whole training set simultaneously.)
Section: Stochastic Gradient Descent
confidence: 99%
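
The two statements above describe mini-batch stochastic gradient descent only in prose. As a hedged illustration (not drawn from the cited paper or the citing work), the sketch below applies mini-batch SGD to a hypothetical linear least-squares problem; the function name minibatch_sgd, the data, the learning rate, and the batch size n_b are assumptions chosen purely for the example.

```python
# A minimal mini-batch SGD sketch for a linear least-squares model.
# Everything here (model, data, hyperparameters) is illustrative, not from the cited paper.
import numpy as np

def minibatch_sgd(X, y, n_b=32, lr=0.05, epochs=20, seed=0):
    """Minimize mean squared error of y ~ X @ w using noisy mini-batch gradients."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        order = rng.permutation(n)                 # reshuffle samples each epoch
        for start in range(0, n, n_b):
            batch = order[start:start + n_b]       # mini-batch of size at most n_b
            Xb, yb = X[batch], y[batch]
            grad = 2.0 * Xb.T @ (Xb @ w - yb) / len(batch)  # noisy gradient estimate
            w -= lr * grad                         # stochastic update step
    return w

# Usage with synthetic data: the recovered weights should approach true_w.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=500)
print(minibatch_sgd(X, y))
```

By contrast, batch learning (footnote 4 above) would compute the gradient over all samples at every step; the mini-batch update trades that exact gradient for cheaper, noisier estimates.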
“…Since its advent by Edwards et al. in [6] and their preliminary extension [5], the method has found applications in many image modelling, alignment and tracking problems, for example [7,8,11].…”
Section: Introduction
confidence: 99%