2017
DOI: 10.2355/isijinternational.isijint-2016-478

Learning a Class-specific and Shared Dictionary for Classifying Surface Defects of Steel Sheet

Abstract: An approach to class-specific and shared dictionary learning (CDSDL) for sparse representation is proposed to classify surface defects of steel sheet. The proposed CDSDL algorithm is modelled as a unified objective function covering reconstruction error, sparsity, and discrimination-promoting constraints. With the resulting high-quality dictionary, a compact, reconstructive and discriminative feature representation of an image can be extracted. Classification can then be performed efficiently using the discriminative inf…
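
The abstract above describes the CDSDL model as a unified objective that balances reconstruction error, sparsity, and a discrimination-promoting constraint. As a rough illustration only (the paper's exact terms and weights are not reproduced here, and the symbols Y, D, X, f, lambda_1, lambda_2 are assumed notation), such an objective typically has the form:

```latex
% Illustrative (assumed) form of a unified dictionary-learning objective:
% Y    - matrix of training samples (one column per sample)
% D    - learned dictionary, X - sparse coding matrix
% f(.) - a discrimination-promoting term on the codes (e.g. a Fisher-like scatter criterion)
\min_{D,\,X}\;
\underbrace{\lVert Y - DX \rVert_F^2}_{\text{reconstruction error}}
\;+\; \lambda_1 \underbrace{\lVert X \rVert_1}_{\text{sparsity}}
\;+\; \lambda_2 \underbrace{f(X)}_{\text{discrimination promotion}}
```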

Cited by 8 publications (6 citation statements)
References 40 publications
“…In this way, the corresponding coding vectors will be more representative, discriminative and robust, which may directly improve subsequent classification performance. This paper is an extension of our previous work on class-specific and shared dictionary learning (CDSDL) 32) with significant new proposals and more experiments and comparisons: firstly, we separate the coding vector over a shared sub-dictionary and class-specific sub-dictionaries, and develop a class-specific and shared discriminative dictionary learning (CASDDL) method for classifying surface defects of steel sheet, which not only encourages intra-class samples to deliver similar feature representation vectors but also minimizes inter-class sample correlations; secondly, we constrain the coding vectors corresponding to the shared dictionary to be low-rank, which improves the adaptivity of defect classification and greatly reduces computation time; finally, more experiments and comprehensive comparisons are reported. The remainder of this paper is organized as follows.…”
Section: Jointly Class-specific and Shared Discriminative Dictionary Learning for Classifying Surface Defects of Steel Sheet
confidence: 93%
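
The excerpt above splits each coding vector over a shared sub-dictionary and class-specific sub-dictionaries and constrains the shared-part codes to be low-rank. A common way to write such a model, given here only as a hedged sketch (the cited CASDDL objective may differ; D_0, D_c, X_0, X_c and the penalties are assumed notation), uses the nuclear norm to promote low rank:

```latex
% Assumed illustrative structure:
% D_0     - shared sub-dictionary, D_c - sub-dictionary of class c
% Y_c     - training samples of class c
% X_{0,c} - codes of class-c samples over D_0 (stacked into X_0), X_c - codes over D_c
% ||.||_* - nuclear norm, promoting low rank of the shared-part codes
\min_{D_0,\{D_c\},X_0,\{X_c\}}\;
\sum_{c=1}^{C} \lVert Y_c - D_0 X_{0,c} - D_c X_c \rVert_F^2
\;+\; \lambda_1 \sum_{c=1}^{C} \lVert X_c \rVert_1
\;+\; \lambda_2 \lVert X_0 \rVert_*
```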
“…With the explainable classification results and corresponding defect segmentation, JCS largely simplifies and accelerates the detection process for quality experts. This paper is an extension of our previous works [36,37] with significant new proposals and more experiments. Our main contributions are summarized as follows:…”
confidence: 88%
“…For this reason, Chu et al. developed the Relief-F to solve the multi-class classification problem in feature optimization for strip steel surface defect recognition [39]. Other effective algorithms have also been applied, such as suboptimal feature selection algorithms, which give better results than simple sequential methods [68], and recursive feature elimination (RFE), which ranks all features in descending order so that an appropriate number of top-ranked features can be selected [114]. Both are less commonly used methods and are briefly presented here for reference.…”
Section: Feature Optimization
confidence: 99%
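
To illustrate the RFE strategy mentioned in the excerpt above, the sketch below ranks features with scikit-learn's RFE wrapped around a linear SVM; the random feature matrix, the four defect classes, and the choice of 20 retained features are placeholders rather than values from the cited studies.

```python
# Sketch: recursive feature elimination (RFE) for defect-feature selection.
# X (n_samples, n_features) and y are placeholder defect descriptors/labels.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.feature_selection import RFE

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))      # e.g. 64 texture/shape features per defect image (placeholder)
y = rng.integers(0, 4, size=200)    # e.g. 4 defect classes (placeholder)

# Rank all features by repeatedly removing the least important ones,
# then keep an "appropriate number" of top-ranked features (here: 20, assumed).
selector = RFE(estimator=LinearSVC(C=1.0, max_iter=5000),
               n_features_to_select=20, step=1)
selector.fit(X, y)

top_features = np.where(selector.support_)[0]   # indices of retained features
print("Selected feature indices:", top_features)
print("Full feature ranking (1 = kept):", selector.ranking_)
```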
“…For example, Masci et al. [105] emulated a standard dictionary-based encoding strategy as an encoding layer to improve the recognition rate for generic steel defects. Furthermore, Zhou et al. [114] utilized a high-quality learned dictionary to extract compact, reconstructive and discriminative features of the test images. Test results show that this scheme efficiently improves classification performance, owing to the discriminative information obtained from the reconstruction error or the sparse coding vector.…”
Section: Classifiers Based on Sparse Representation
confidence: 99%
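
The excerpt above describes classifying a test image from the discriminative information in its sparse code and reconstruction error. A minimal sketch of that residual-based decision rule, using scikit-learn's SparseCoder with OMP (the random dictionary, the per-class grouping of atoms, and the sparsity level are illustrative assumptions, not the authors' implementation), looks like this:

```python
# Sketch: sparse-representation classification by class-wise reconstruction residual.
# The dictionary stacks class-specific atoms; a test sample is coded over the whole
# dictionary, then assigned to the class whose atoms reconstruct it with the smallest residual.
import numpy as np
from sklearn.decomposition import SparseCoder

n_features, atoms_per_class, n_classes = 64, 10, 4
rng = np.random.default_rng(0)

# Placeholder dictionary: rows are atoms, grouped by class and l2-normalized.
D = rng.normal(size=(n_classes * atoms_per_class, n_features))
D /= np.linalg.norm(D, axis=1, keepdims=True)
atom_class = np.repeat(np.arange(n_classes), atoms_per_class)   # class label of each atom

coder = SparseCoder(dictionary=D, transform_algorithm="omp",
                    transform_n_nonzero_coefs=5)                # sparsity level is assumed

def classify(sample):
    """Return the class with the smallest class-wise reconstruction residual."""
    code = coder.transform(sample.reshape(1, -1))[0]            # sparse code over all atoms
    residuals = []
    for c in range(n_classes):
        mask = atom_class == c
        recon = code[mask] @ D[mask]                            # reconstruction from class-c atoms only
        residuals.append(np.linalg.norm(sample - recon))
    return int(np.argmin(residuals))

test_sample = rng.normal(size=n_features)
print("Predicted defect class:", classify(test_sample))
```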