2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition
DOI: 10.1109/cvpr.2018.00293

Decoupled Networks

Abstract: Inner product-based convolution has been a central component of convolutional neural networks (CNNs) and the key to learning visual representations. Inspired by the observation that CNN-learned features are naturally decoupled with the norm of features corresponding to the intra-class variation and the angle corresponding to the semantic difference, we propose a generic decoupled learning framework which models the intra-class variation and semantic difference independently. Specifically, we first reparametriz…
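The decoupling the abstract describes can be sketched numerically: any inner product ⟨w, x⟩ factors into a magnitude term and an angular term, and the framework replaces each factor with a learnable function. A minimal NumPy sketch of that factorization (function names are illustrative, not from the paper):

```python
import numpy as np

def decoupled_response(w, x, h=None, g=None):
    """Decoupled operator f(w, x) = h(||w||, ||x||) * g(theta).

    With h(a, b) = a * b and g(theta) = cos(theta) this reduces to the
    ordinary inner product used by standard convolution; other choices of
    h and g model intra-class variation (norms) and semantic difference
    (angle) independently.
    """
    norm_w = np.linalg.norm(w)
    norm_x = np.linalg.norm(x)
    cos_theta = float(w @ x) / (norm_w * norm_x)
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
    h = h or (lambda a, b: a * b)   # default magnitude function
    g = g or (lambda t: np.cos(t))  # default angular function
    return h(norm_w, norm_x) * g(theta)

w = np.array([1.0, 2.0, 2.0])
x = np.array([2.0, 0.0, 1.0])
# With the default h and g, the standard inner product is recovered:
assert np.isclose(decoupled_response(w, x), w @ x)
```

Passing a different `h` (e.g. a bounded or saturating function of the norms) or `g` (e.g. a sharpened angular activation) is what makes the operator "decoupled": the two sources of variation can then be shaped separately.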

Cited by 50 publications (40 citation statements)
References 27 publications
“…This opens a new research space for artificial neural networks, although its performance on large datasets is still relatively weak. Decoupled network [38] interprets convolution as the product of the norms and the cosine of the angle between the weight and input vectors, resulting in explicit geometrical modeling of the intra-class and extra-class variation. To process graph inputs, Spline-CNN [14] extends convolution by using a continuous B-spline basis, which is parametrized by a constant number of trainable control values.…”
Section: Related Work
confidence: 99%
“…Liu et al [28] formulated these as a deep hyperspherical learning framework that forces the norms to 1 so that the outputs reflect only the angles. Liu et al [29] took the idea further, showing that the norms need not all equal 1, by decoupling the convolutional operator. In this way, different kinds of functions of the norm and the angle are considered as substitutes for the magnitude and angular components.…”
Section: Related Work
confidence: 99%
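The norm-to-1 constraint described in [28] can be illustrated with a small sketch (a generic angular response on the unit hypersphere, not the paper's exact operators):

```python
import numpy as np

def hyperspherical_response(w, x, eps=1e-12):
    # Project both vectors onto the unit hypersphere so the response
    # depends only on the angle between them, not on their magnitudes.
    w_unit = w / (np.linalg.norm(w) + eps)
    x_unit = x / (np.linalg.norm(x) + eps)
    return float(w_unit @ x_unit)  # cos(theta), in [-1, 1]

w = np.array([1.0, 0.0])
x = np.array([3.0, 4.0])
# Rescaling either vector leaves the response unchanged:
assert np.isclose(hyperspherical_response(w, x),
                  hyperspherical_response(10.0 * w, 0.5 * x))
```

This is the sense in which forcing norms to 1 makes the output "reflect only the angles"; [29]'s contribution, as the excerpt notes, is to relax that constraint by decoupling the norm and angle terms rather than discarding the norm.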
“…Even though the tradeoff between invariance and discriminativeness is unknown, we still have a chance to balance the two properties above by means of deep metric learning. According to recent work [22], [25], [28], [29], inter-class variation and intra-class variation can be decoupled by relating them to the angle and the norm within an inner product operation, respectively. They argue that the angle corresponds to the inter-class variation and the norm accounts for the intra-class variation.…”
Section: Related Work
confidence: 99%
“…For the third difficulty, to pursue higher running speed, we also drop the decoder network and compress the high-dimensional feature map as the above works do. However, unlike those works, which directly use 1 × 1 convolutions to compress features, we utilize the idea of disentangled representation learning [21, 22] to retain more information in the compressed features, and name this module the Disentangled Feature Compressor (DFC). Specifically, DFC divides the high-dimensional feature map into three groups and integrates the features of each group through a low-dimensional 1 × 1 convolution.…”
Section: Introduction
confidence: 99%
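The group-wise compression the excerpt describes can be sketched as follows. This is an illustrative sketch only: the excerpt does not give DFC's exact grouping or dimensions, so the shapes and the function name here are assumptions. A 1 × 1 convolution is just a linear map applied independently at every spatial position:

```python
import numpy as np

def grouped_1x1_compress(feat, weights):
    """Compress a (C, H, W) feature map group-wise.

    Channels are split into len(weights) equal groups; each group is
    compressed by its own 1x1 convolution, i.e. a per-pixel linear map
    from that group's channels to a smaller number of output channels.
    """
    groups = np.split(feat, len(weights), axis=0)
    outs = [np.einsum('oc,chw->ohw', wg, g) for wg, g in zip(weights, groups)]
    return np.concatenate(outs, axis=0)

rng = np.random.default_rng(0)
feat = rng.normal(size=(12, 8, 8))                      # 12 channels, 8x8 spatial
weights = [rng.normal(size=(2, 4)) for _ in range(3)]   # 3 groups: 4 -> 2 channels each
out = grouped_1x1_compress(feat, weights)
assert out.shape == (6, 8, 8)  # 12 channels compressed to 6
```

Compared with a single 1 × 1 convolution over all channels, the grouped form keeps each group's compression independent, which is the mechanism the excerpt credits with retaining more information in the compressed features.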