2023
DOI: 10.1177/00405175221149450

Attention-Gate-based U-shaped Reconstruction Network (AGUR-Net) for color-patterned fabric defect detection

Abstract: Color-patterned fabrics possess changeable patterns, low probability of defective samples, and various forms of defects. Therefore, the unsupervised inspection of color-patterned fabrics has gradually become a research hotspot in the field of fabric defect detection. However, due to the redundant information of skip connections in the network and the limitation of post-processing, the current reconstruction-based unsupervised fabric defect detection methods have difficulty in detecting some defects of color-pa…
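For context on the reconstruction-based detection that the abstract refers to, the following is a minimal sketch of the generic pipeline: an autoencoder trained on defect-free samples reconstructs the input, and the per-pixel residual is thresholded into a defect mask. The toy model, the residual measure and the threshold value are illustrative assumptions, not the paper's AGUR-Net.

# Minimal sketch of a generic reconstruction-based defect detection pipeline.
# The autoencoder, residual measure and threshold are illustrative assumptions,
# not the exact AGUR-Net formulation from the paper.
import torch
import torch.nn as nn

class TinyAutoencoder(nn.Module):
    """Toy encoder-decoder used only to illustrate the reconstruction idea."""
    def __init__(self, channels: int = 3):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(channels, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, channels, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def detect_defects(model: nn.Module, image: torch.Tensor, thresh: float = 0.1):
    """Reconstruct an image and threshold the per-pixel residual map."""
    model.eval()
    with torch.no_grad():
        recon = model(image)
    residual = (image - recon).abs().mean(dim=1, keepdim=True)  # per-pixel error
    return (residual > thresh).float()  # binary defect mask

# Usage: mask = detect_defects(TinyAutoencoder(), torch.rand(1, 3, 256, 256))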

Cited by 20 publications (14 citation statements). References: 44 publications.

Citation statements:
“…However, it is difficult to determine an appropriate pruning strategy when facing different datasets and models. Jing [6], Zhang [7] and Wang [8] et al. improve U-Net and DeepLabV3+ by adopting a lightweight network, typified by MobileNetV2 [9], as the backbone for feature extraction, which effectively improves detection speed while compressing the model size. However, a lightweight backbone implies weaker feature extraction capability, which can easily degrade segmentation accuracy in semantic segmentation tasks where defects resemble fabric patterns.…”
Section: Introduction (mentioning, confidence: 99%)
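The backbone swap described in this statement can be sketched as follows: a MobileNetV2 feature extractor feeding a simple segmentation head. The 1x1 head and bilinear upsampling below are placeholder assumptions, not the exact U-Net or DeepLabV3+ variants used in the cited works.

# Sketch of a lightweight MobileNetV2 backbone as the feature extractor of a
# segmentation network; the classifier head and upsampling are placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import mobilenet_v2

class MobileNetSeg(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.backbone = mobilenet_v2(weights=None).features  # lightweight encoder
        self.head = nn.Conv2d(1280, num_classes, kernel_size=1)  # simple 1x1 head

    def forward(self, x):
        h, w = x.shape[-2:]
        feats = self.backbone(x)                  # low-resolution feature map
        logits = self.head(feats)
        return F.interpolate(logits, size=(h, w), mode="bilinear", align_corners=False)

# Usage: out = MobileNetSeg()(torch.rand(1, 3, 224, 224))  # -> (1, 2, 224, 224)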
“…Cheng and Yu 10 proposed a DEA_RetinaNet model that embeds a channel attention mechanism module and an adaptive spatial feature fusion (ASFF) module into the RetinaNet framework, and it performed the steel surface defect detection task better than other methods. Zhang et al. 11 proposed an attention-based feature fusion generative adversarial network framework for unsupervised defect detection of yarn-dyed fabrics, which can fully extract shallow, high-frequency and high-level information. Liang et al. 12 utilized the Faster R-CNN framework to detect defects in transmission lines more effectively and accurately.…”
Mentioning (confidence: 99%)
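A squeeze-and-excitation style block is one common form of the channel attention module mentioned in this statement; the reduction ratio and its placement inside RetinaNet are assumptions made purely for illustration, not the DEA_RetinaNet design itself.

# Sketch of a squeeze-and-excitation style channel attention block.
# Reduction ratio and where it is inserted in the detector are assumptions.
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels), nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(x.mean(dim=(2, 3)))           # squeeze: global average pool
        return x * w.view(b, c, 1, 1)             # excite: reweight channels

# Usage: y = ChannelAttention(256)(torch.rand(1, 256, 32, 32))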
“…Experiments show that this method can effectively improve the classification accuracy of defect samples even when the number of defect samples is small. Zhang et al. 17 proposed an attention-gate-based U-shaped reconstruction network (AGUR-Net) together with a dual-threshold segmentation post-processing method. AGUR-Net consists of an encoder, an atrous spatial pyramid pooling module and an attention gate weighted fusion residual decoder.…”
Mentioning (confidence: 99%)
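One plausible reading of the dual-threshold segmentation post-processing mentioned above is hysteresis-style thresholding on a reconstruction residual map: a high threshold seeds confident defect pixels and a low threshold grows connected regions around those seeds. This is a sketch under that assumption; the paper's actual thresholds and region rule may differ.

# Plausible sketch of dual-threshold post-processing on a residual map
# (hysteresis-style); the paper's exact rule is not reproduced here.
import numpy as np
from scipy import ndimage

def dual_threshold_segment(residual: np.ndarray, low: float, high: float) -> np.ndarray:
    """Return a binary defect mask from a per-pixel residual map."""
    strong = residual >= high                     # confident defect pixels
    weak = residual >= low                        # candidate defect pixels
    labels, n = ndimage.label(weak)               # connected candidate regions
    keep = np.zeros_like(strong)
    for i in range(1, n + 1):
        region = labels == i
        if strong[region].any():                  # keep regions containing a seed
            keep |= region
    return keep

# Usage: mask = dual_threshold_segment(np.random.rand(256, 256), low=0.05, high=0.2)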