Most existing Re-ID studies focus on the short-term cloth-consistent setting and are thus dominated by the visual appearance of clothing. In reality, however, the same person may wear different clothes and different people may wear the same clothes, which invalidates these methods. To tackle the challenge of clothing change, we propose a Universal Clothing Attribute Disentanglement network (UCAD), which effectively weakens the influence of identity-unrelated clothing attributes and forces the model to learn identity-related features that are independent of the worn clothing. For further study of Re-ID in cloth-changing scenarios, we construct a large-scale dataset called CSCC with the following unique features: (1) Severe clothing change: a large number of people change clothes across four seasons. (2) High definition: camera resolutions range from 1920×1080 to 3840×2160, ensuring that the recorded people are clearly visible. Furthermore, we provide two variants of CSCC with different degrees of clothing change, namely moderate and severe, so that researchers can evaluate their models from various aspects. Experiments on several cloth-changing datasets, including our CSCC, as well as the short-term dataset Market-1501 demonstrate the superiority of UCAD. The dataset is available at https://github.com/yomin-y/UCAD.
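To make the notion of clothing-attribute disentanglement concrete, the sketch below shows one common way such an objective can be set up: a shared feature extractor feeds an identity classifier trained normally and a clothing classifier trained through a gradient-reversal layer, so the shared features are discouraged from encoding clothing cues. This is a generic illustration of adversarial disentanglement, not the paper's actual UCAD architecture; all module names, dimensions, and the gradient-reversal choice are assumptions.

```python
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; flips the gradient sign in the backward pass."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None

class DisentangledReID(nn.Module):
    """Toy two-branch model (hypothetical, not UCAD): an identity head trained
    normally and a clothing head trained through gradient reversal, pushing the
    shared features to carry identity cues rather than clothing cues."""
    def __init__(self, feat_dim=512, num_ids=1000, num_clothes=200, lam=1.0):
        super().__init__()
        self.lam = lam
        self.backbone = nn.Sequential(nn.Flatten(), nn.LazyLinear(feat_dim), nn.ReLU())
        self.id_head = nn.Linear(feat_dim, num_ids)
        self.clothes_head = nn.Linear(feat_dim, num_clothes)

    def forward(self, x):
        feat = self.backbone(x)
        id_logits = self.id_head(feat)
        # The clothing classifier sees the shared features, but its gradient is
        # reversed, discouraging the backbone from encoding clothing attributes.
        clothes_logits = self.clothes_head(GradReverse.apply(feat, self.lam))
        return id_logits, clothes_logits

# Example joint objective: identity supervision plus the adversarial clothing term.
model = DisentangledReID()
images = torch.randn(8, 3, 64, 32)            # dummy batch with assumed shapes
id_labels = torch.randint(0, 1000, (8,))
clothes_labels = torch.randint(0, 200, (8,))
id_logits, clothes_logits = model(images)
loss = (nn.functional.cross_entropy(id_logits, id_labels)
        + nn.functional.cross_entropy(clothes_logits, clothes_labels))
loss.backward()
```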