Defects on the surface of yarn-dyed fabrics are one of the key factors affecting fabric quality, and defect detection is a core step in quality control. Due to the diversity of yarn-dyed fabric patterns and the scarcity of defect samples, reconstruction-based unsupervised deep learning algorithms have received extensive attention in the field of fabric defect detection. However, most existing deep learning algorithms cannot fully extract shallow, high-frequency, and high-level information, which limits their ability to reconstruct yarn-dyed fabric images. In this article, we propose an Attention-based Feature Fusion Generative Adversarial Network framework for unsupervised defect detection of yarn-dyed fabrics. The framework utilizes a modified Feature Pyramid Network to fuse multi-level information and an attention mechanism to enhance the model's feature representation capability. The Attention-based Feature Fusion Generative Adversarial Network consists of an attention fusion generator and a patch-level discriminator. In the attention fusion generator, a Feature Pyramid Network with EfficientNetV2 as the backbone serves as the core building block, and different feature fusion methods are used to avoid information loss as the network deepens. The attention mechanism strengthens the channel-wise and spatial correlations of features, helping the model focus on more meaningful information by recalibrating the feature maps. The patch-level discriminator computes the similarity between the reconstructed image and the original image from a local perspective, thereby improving the model's attention to texture details. Experimental results on public datasets demonstrate the effectiveness of the proposed method compared with other methods.
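The abstract above describes recalibrating feature maps along both the channel and spatial dimensions. The sketch below illustrates one common way to implement such a block in PyTorch (a CBAM-style module: squeeze-and-excitation channel attention followed by a convolutional spatial gate); the paper's exact module, layer sizes, and reduction ratio are not given in the abstract, so all of those choices here are illustrative assumptions.

```python
import torch
import torch.nn as nn


class ChannelSpatialAttention(nn.Module):
    """Illustrative channel + spatial attention (CBAM-style).

    Channel attention: pool spatially, pass through a small MLP, and
    rescale each channel. Spatial attention: pool across channels and
    apply a 7x7 conv to produce a per-pixel gate.
    """

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.channel_mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )
        self.spatial_conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        # --- channel attention: avg- and max-pooled descriptors share the MLP ---
        avg = x.mean(dim=(2, 3))                   # (B, C)
        mx = x.amax(dim=(2, 3))                    # (B, C)
        ca = torch.sigmoid(self.channel_mlp(avg) + self.channel_mlp(mx))
        x = x * ca.view(b, c, 1, 1)
        # --- spatial attention: pool over channels, then a conv gate ---
        avg_map = x.mean(dim=1, keepdim=True)      # (B, 1, H, W)
        max_map = x.amax(dim=1, keepdim=True)      # (B, 1, H, W)
        sa = torch.sigmoid(self.spatial_conv(torch.cat([avg_map, max_map], dim=1)))
        return x * sa


# Recalibrate a dummy feature map; shape is preserved.
attn = ChannelSpatialAttention(channels=16)
feat = torch.randn(2, 16, 32, 32)
out = attn(feat)
```

Because both gates are multiplicative and bounded in (0, 1), the module can only suppress or pass features, which is what lets it act as a "recalibration" rather than a transformation of the feature maps.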
Defect detection is an essential step in the fabric production process. Due to the diversity of patterns and the scarcity of defect samples for colour-patterned fabrics, reconstruction-based unsupervised deep learning algorithms have received extensive attention in the field of fabric defect detection. Among them, unsupervised reconstruction models based on variational autoencoders (VAEs) have been shown to be effective. However, VAEs suffer from posterior collapse when modelling parametric distributions of continuous latent variables. As a result, VAE-based defect detection methods for colour-patterned fabrics usually produce blurry reconstructions, which degrades defect detection performance. In this article, an attention-based vector quantisation variational autoencoder (AVQ-VAE) is proposed for colour-patterned fabric defect detection. The method adopts autoregressive modelling of discrete variables to avoid the posterior collapse problem of traditional VAEs, and utilises an attention mechanism to enhance the feature representation ability of the model. AVQ-VAE consists of an encoder, an embedding space, a decoder, and an attention mechanism. The encoder maps the input image into multiple feature vectors. Vector quantisation in the embedding space discretises the feature vectors for autoregressive modelling. The decoder decodes the discrete variables into an image of the same size as the original input. Furthermore, the attention mechanism captures channel and spatial correlations, which helps the model focus on important information by adaptively recalibrating feature maps. Experimental results on public datasets demonstrate that the proposed method is robust and effective for colour-patterned fabric defect detection.
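The vector-quantisation step described above, where encoder outputs are snapped to their nearest codebook vector, can be sketched as follows. This is a minimal PyTorch implementation of the standard VQ-VAE quantiser (nearest-neighbour lookup, codebook and commitment losses, straight-through gradient estimator); the codebook size, embedding dimension, and commitment weight here are placeholder values, not the paper's settings.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class VectorQuantizer(nn.Module):
    """Minimal VQ-VAE quantisation layer (a sketch, not the paper's code).

    Maps each spatial feature vector of the encoder output to the
    nearest entry of a learned embedding space (codebook).
    """

    def __init__(self, num_embeddings: int = 64, dim: int = 8, beta: float = 0.25):
        super().__init__()
        self.codebook = nn.Embedding(num_embeddings, dim)
        self.codebook.weight.data.uniform_(-1 / num_embeddings, 1 / num_embeddings)
        self.beta = beta  # commitment-loss weight

    def forward(self, z_e: torch.Tensor):
        # z_e: (B, C, H, W) encoder output; flatten to (B*H*W, C) vectors.
        b, c, h, w = z_e.shape
        flat = z_e.permute(0, 2, 3, 1).reshape(-1, c)
        # Squared L2 distance from every vector to every codebook entry.
        d = (flat.pow(2).sum(1, keepdim=True)
             - 2 * flat @ self.codebook.weight.t()
             + self.codebook.weight.pow(2).sum(1))
        idx = d.argmin(dim=1)  # discrete code index per spatial position
        z_q = self.codebook(idx).view(b, h, w, c).permute(0, 3, 1, 2)
        # Codebook loss pulls embeddings toward encoder outputs;
        # commitment loss keeps the encoder close to its chosen codes.
        loss = F.mse_loss(z_q, z_e.detach()) + self.beta * F.mse_loss(z_e, z_q.detach())
        # Straight-through estimator: gradients bypass the argmin.
        z_q = z_e + (z_q - z_e).detach()
        return z_q, idx.view(b, h, w), loss


# Quantise a dummy encoder output; the index grid is what an
# autoregressive prior would model.
vq = VectorQuantizer()
z_e = torch.randn(2, 8, 4, 4)
z_q, indices, vq_loss = vq(z_e)
```

The grid of discrete indices is what replaces the continuous Gaussian latent of a standard VAE, which is how this construction sidesteps posterior collapse: there is no parametric continuous posterior to collapse.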