2015
DOI: 10.1134/s1995080215030063

Invariants of objects and their images under surjective maps

Abstract: We examine the relationships between the differential invariants of objects and of their images under a surjective map. We analyze both the case when the underlying transformation group is projectable and hence induces an action on the image, and the case when only a proper subgroup of the entire group acts projectably. In the former case, we establish a constructible isomorphism between the algebra of differential invariants of the images and the algebra of fiber-wise constant (gauge) differential invariants …
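
To make the abstract's setting concrete, the following is a minimal sketch, in LaTeX notation, of the projectable case; the symbols $\pi$, $M$, $N$, $G$ and $\mathcal{I}$ are illustrative choices and need not match the paper's own notation.

\[
  \pi \colon M \twoheadrightarrow N \qquad \text{(surjective map, e.g. projection of 3D objects onto 2D images)},
\]
\[
  \text{the action of } G \text{ on } M \text{ is projectable if it maps fibers of } \pi \text{ to fibers, so it descends to an action on } N \text{ with } \pi(g \cdot z) = g \cdot \pi(z),
\]
\[
  \text{projectable case:} \quad \mathcal{I}_{\mathrm{images}}(N) \;\cong\; \mathcal{I}_{\mathrm{gauge}}(M),
\]

a constructible isomorphism between the algebra of differential invariants of the images and the algebra of fiber-wise constant (gauge) differential invariants on $M$, as stated in the abstract.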

Cited by 4 publications (2 citation statements). References: 44 publications.

Citation statements:
“…The computation of projective invariants is being investigated. Related works connect the differential invariants of 3D curves and their 2D projections through the method of moving frames [30,31].…”
Section: Future Directions (mentioning; confidence: 99%)
“…Then, the feature matching algorithms usually apply distance measurement to estimate the relationship between two images. The point feature can be the line intersection points (Liu and An, 2012), local curvature discontinuity points, curve inflection points (Kogan and Olver, 2015), wavelet transform local extremum and corner points (Serej et al, 2015). The segmentation line or object contour of the image are the line feature.…”
Section: Introduction (mentioning; confidence: 99%)