1998
DOI: 10.1007/bf01408559
Tracking for augmented reality on wearable computers

Abstract: Wearable computers afford a degree of mobility that makes tracking for augmented reality difficult. This paper presents a novel object-centric tracking architecture for presenting augmented reality media in spatial relationships to objects, regardless of the objects' positions or motions in the world. The advance this system provides is the ability to sense and integrate new features into its tracking database, thereby extending the tracking region automatically. A "lazy evaluation" of the structure from motio…

Cited by 2 publications (2 citation statements)
References 30 publications
“…For tracking systems, many technologies require the installation of sourcing devices, on which tracking devices physically depend. Although vision-based tracking systems do not require sourcing devices, they rely on pre-installed markers [4], [14]. Inertial trackers work as stand-alone tracking systems, but the tracking information they provide cannot be used to construct geometrical relations between the real and the virtual environments, which is required for image overlays.…”
Section: Lack of Stand-alone Systems (mentioning)
confidence: 99%
“…5, these 3D positions are the intersections of the rays from the laser DOE starting point to the DOE pattern features and the rays from the camera focal point to the image features [14]. To calculate the intersection (more precisely, the closest point between the rays), the rays must be represented in a common coordinate system (in our implementation, the camera coordinate system was used).…”
Section: Fig. 5 AR Pointer Coordinate Systems (mentioning)
confidence: 99%
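The closest-point computation mentioned in this citation statement is the standard triangulation of two skew rays: solve for the parameters that minimize the distance between the two lines, then take the midpoint of the shortest connecting segment. A minimal sketch of that geometry follows; the function name and the use of NumPy are this example's assumptions, not the cited paper's implementation.

```python
import numpy as np

def closest_point_between_rays(p1, d1, p2, d2):
    """Midpoint of the shortest segment between rays p1 + t*d1 and p2 + s*d2.

    Returns None when the rays are (near-)parallel, since no unique
    closest point exists in that degenerate case.
    """
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    r = p1 - p2
    a = d1 @ d1          # = 1 after normalization
    b = d1 @ d2
    c = d2 @ d2          # = 1 after normalization
    d = d1 @ r
    e = d2 @ r
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        return None      # parallel rays: distance is constant along them
    # Parameters of the closest points on each ray (from grad = 0 conditions)
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    q1 = p1 + t * d1     # closest point on ray 1
    q2 = p2 + s * d2     # closest point on ray 2
    return 0.5 * (q1 + q2)
```

With perfect measurements the two rays would intersect and the midpoint coincides with the intersection; with noisy image features the rays are skew, and the midpoint serves as the triangulated 3D position.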