2014
DOI: 10.1016/j.measurement.2013.10.025
Optical flow background estimation for real-time pan/tilt camera object tracking

Cited by 53 publications (11 citation statements)
References: 23 publications
“…The methods are primarily distinguished according to application scenarios. For example, in [62], which could be one of the most influential works in this field, Doyle et al combine the methods of optical flow and Kalman filter.…”
Section: Kalman Filter
mentioning confidence: 99%
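The combination this statement attributes to [62], optical flow measurements fed into a Kalman filter, can be illustrated with a short sketch. This is not the implementation from the cited paper: the constant-velocity state model, the noise covariances, and the use of OpenCV's sparse Lucas-Kanade tracker and feature centroid as the measurement are all assumptions made here for illustration.

```python
import cv2
import numpy as np

# Constant-velocity Kalman filter over state (x, y, vx, vy); the measurement
# is the centroid of features tracked by sparse Lucas-Kanade optical flow.
kf = cv2.KalmanFilter(4, 2)
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], np.float32)
kf.processNoiseCov = 1e-2 * np.eye(4, dtype=np.float32)
kf.measurementNoiseCov = 1e-1 * np.eye(2, dtype=np.float32)
kf.errorCovPost = np.eye(4, dtype=np.float32)

cap = cv2.VideoCapture("pan_tilt_sequence.mp4")   # hypothetical video source
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=100,
                              qualityLevel=0.01, minDistance=7)

while ok and pts is not None:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Sparse optical flow gives the new positions of the tracked features.
    new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    good = new_pts[status.flatten() == 1]

    prediction = kf.predict()                 # a priori estimate of (x, y, vx, vy)
    if len(good) > 0:
        centroid = good.reshape(-1, 2).mean(axis=0).astype(np.float32)
        kf.correct(centroid.reshape(2, 1))    # a posteriori update from the flow measurement
        pts = good.reshape(-1, 1, 2)

    prev_gray = gray
```

The filter prediction can be queried even on frames where the flow measurement is unreliable, which is the usual motivation for pairing the two techniques.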
“…Many researchers have developed object tracking methods and systems that provide a visual representation to robustly describe the spatiotemporal characteristics of object appearance [2]. Object tracking methods using a global visual representation that reflects the global statistical characteristics of an image region to be tracked have been proposed on the basis of various global image features such as optical flows [3,4,5], color histograms [6,7,8], and texture histograms [9,10,11]. By encoding the object appearance information from the selected interest points in images, local-feature-based object tracking methods have also been proposed on the basis of local features such as scale invariant feature transform (SIFT) [12,13], Haar-like features [14,15], the histogram of oriented gradient (HOG) [16,17,18], and the local binary pattern (LBP) [19,20,21].…”
Section: Introduction
mentioning confidence: 99%
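Of the global visual representations listed in this statement, a colour-histogram tracker is the simplest to sketch. The video file, initial target window, and mean-shift parameters below are hypothetical; the sketch only illustrates the general idea of tracking with a global colour statistic, not any specific method from the cited works.

```python
import cv2
import numpy as np

# Hypothetical input video and initial target window (x, y, w, h).
cap = cv2.VideoCapture("sequence.mp4")
ok, frame = cap.read()
x, y, w, h = 200, 150, 80, 80
track_window = (x, y, w, h)

# Global representation: a hue histogram of the target region.
hsv_roi = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])
cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

while ok:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Back-project the target histogram and shift the window toward the
    # region whose colour statistics best match it.
    back = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
    _, track_window = cv2.meanShift(back, track_window, term)
    print("window:", track_window)
```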
“…Many tracking methods exist including kernel-based methods (e.g., optical flow) and contour-based methods (e.g., snakes-based tracking). 3,4 Various techniques have been demonstrated for tumor tracking on 2D images across image modalities, including template matching and manifold learning, among others. [5][6][7][8] In this work, we evaluate the feasibility of tracking user-specified regions across cine frames using a matching objective for a well-established feature vector in image processing called a scale-invariant feature transform (SIFT) descriptor that is constructed for every pixel.…”
Section: Introduction
mentioning confidence: 99%
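The descriptor-matching idea in this last statement can be sketched with OpenCV's SIFT implementation. The cine frame filenames, the user-selected pixel, the fixed keypoint size, and the coarse candidate grid (used here instead of a truly per-pixel descriptor field) are all assumptions for illustration, not the cited paper's procedure.

```python
import cv2
import numpy as np

# Hypothetical pair of consecutive cine frames and a user-selected pixel.
frame_a = cv2.imread("cine_000.png", cv2.IMREAD_GRAYSCALE)
frame_b = cv2.imread("cine_001.png", cv2.IMREAD_GRAYSCALE)
px, py = 120, 96                       # pixel picked by the user in frame_a

sift = cv2.SIFT_create()

# Descriptor at the selected pixel: wrap it in a KeyPoint with a fixed scale.
kp_ref = [cv2.KeyPoint(float(px), float(py), 16)]
_, desc_ref = sift.compute(frame_a, kp_ref)

# Candidate descriptors in the next frame on a coarse grid, standing in
# for the per-pixel descriptor field described in the citing paper.
grid = [cv2.KeyPoint(float(gx), float(gy), 16)
        for gy in range(8, frame_b.shape[0] - 8, 4)
        for gx in range(8, frame_b.shape[1] - 8, 4)]
grid, desc_grid = sift.compute(frame_b, grid)

# The nearest descriptor under L2 distance gives the tracked position.
dists = np.linalg.norm(desc_grid - desc_ref, axis=1)
best = grid[int(np.argmin(dists))]
print("tracked to", best.pt)
```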