This work explores human intervention to improve Automatic Signature Verification (ASV). Significant efforts have been made over recent decades to improve the performance of ASV algorithms. This work analyzes how human actions can be used to complement automatic systems. The final aim of this research line is to determine which actions to take and to what extent those actions can help state-of-the-art ASV systems. The analysis at the classification level comprises experiments with responses from 500 people based on crowdsourced signature authentication tasks. The results allow us to establish a human baseline performance and a comparison with automatic systems. Intervention at the feature extraction level is evaluated using a self-developed tool for the manual annotation of signature attributes, inspired by Forensic Document Experts' analysis. We analyze the performance of attribute-based human signature authentication and its complementarity with automatic systems. The experiments are carried out over a public database covering the two most popular signature authentication scenarios, based on online (dynamic time sequences including position and pressure) and offline (static images) information. The results demonstrate the potential of human intervention at the feature extraction level (by manually annotating signature attributes) and encourage further research into its capabilities to improve the performance of ASV.
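The abstract does not detail the automatic systems used, but online signature verification commonly compares dynamic time sequences (e.g., x, y position and pressure) with dynamic time warping (DTW). The following is a minimal illustrative sketch under that assumption; the function names and the decision threshold are hypothetical, not the authors' implementation:

```python
import numpy as np

def dtw_distance(seq_a, seq_b):
    """DTW distance between two multivariate time sequences.
    Each sequence is an array with one row per time sample and one
    column per channel (e.g., x position, y position, pressure)."""
    n, m = len(seq_a), len(seq_b)
    # cost[i, j] = DTW distance between seq_a[:i] and seq_b[:j]
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(seq_a[i - 1] - seq_b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def verify(reference, questioned, threshold):
    """Accept the questioned signature if its DTW distance to the
    enrolled reference signature falls below the threshold."""
    return dtw_distance(reference, questioned) <= threshold
```

In this setup, a human intervention at the feature extraction level could, for example, supply additional manually annotated attributes alongside the automatically computed distance before the final accept/reject decision.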