Writer identification has progressed steadily in recent decades owing to its widespread applications. Scenarios with extensive handwriting data, such as page-level or sentence-level identification, have achieved satisfactory accuracy; however, word-level offline writer identification remains challenging because good feature representations are difficult to learn from scant handwriting data. This paper proposes a new Residual Swin Transformer Classifier (RSTC) that comprehensively aggregates local and global handwriting styles and yields robust feature representations from single-word images. Local information is modeled by the Transformer Block through interacting strokes; global information is featurized by holistic encoding using the Identity Branch and Global Block. Moreover, a pre-training technique transfers reusable knowledge learned from a task similar to writer identification, strengthening the model's representation of handwriting features. The proposed method is tested on the IAM and CVL benchmark datasets and achieves state-of-the-art performance, demonstrating RSTC's superior modeling capability for word-level writer identification.
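The local-plus-global aggregation described above can be illustrated with a minimal sketch. This is not the paper's actual RSTC implementation (which uses Swin Transformer blocks); the function names `local_block` and `global_block`, the dimensions, and the random weights are hypothetical, shown only to convey the idea of attention-based local stroke interaction followed by a residual connection and holistic pooling.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def local_block(patches, w_q, w_k, w_v):
    # Hypothetical stand-in for the Transformer Block:
    # self-attention lets stroke patches interact, with a
    # residual connection preserving the original features.
    q, k, v = patches @ w_q, patches @ w_k, patches @ w_v
    attn = softmax(q @ k.T / np.sqrt(k.shape[-1]))
    return patches + attn @ v

def global_block(features):
    # Hypothetical stand-in for the Identity Branch / Global
    # Block: pool patch features into one holistic embedding.
    return features.mean(axis=0)

rng = np.random.default_rng(0)
patches = rng.standard_normal((16, 32))  # 16 stroke patches, 32-dim each
w_q, w_k, w_v = (rng.standard_normal((32, 32)) * 0.1 for _ in range(3))
local = local_block(patches, w_q, w_k, w_v)
embedding = global_block(local)
print(embedding.shape)  # one fixed-size descriptor per word image
```

In this toy version the final embedding would feed a classifier over writer identities; the residual term mirrors the "Residual" in RSTC's name, keeping raw stroke information alongside the attended features.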