Title DaGAM-Trans: dual graph attention module-based transformer for offline signature forgery detection
Authors Tehsin, Sara ; Hassan, Ali ; Riaz, Farhan ; Nasir, Inzamam Mashood
DOI 10.1016/j.rineng.2025.106425
Is Part of Results in Engineering. Amsterdam: Elsevier, 2025, vol. 27, art. no. 106425, p. 1-16. ISSN 2590-1230
Keywords [eng] Attention visualization ; Channel attention ; Graph attention networks ; Offline signature verification ; Signature forgery detection ; Vision transformers ; Writer-independent verification
Abstract [eng]
Objective: The detection of signature forgeries is a critical challenge in domains such as forensic investigations, financial security, and biometric authentication. This study aims to improve offline signature forgery detection, especially for skilled forgeries, where traditional methods relying on handcrafted features and statistical models often fail to distinguish genuine from forged signatures.
Material and Methods: To address these limitations, we propose DaGAM-Trans, a Dual Attention Graph Attention Module-based Transformer model that integrates Graph Attention Networks (GAT) with a Transformer backbone. The Transformer architecture plays a key role in modeling global contextual dependencies across the entire signature image, enabling the system to capture long-range structural information crucial for distinguishing genuine signatures from skilled forgeries. The architecture comprises a Graph Attention Module (GAM), which captures spatial dependencies using multi-head graph attention and graph convolution layers, and a Channel Attention Module (CAM), which amplifies discriminative features and suppresses irrelevant information. Additionally, a Graph Multi-Scale Adaptive Pooling (GMSAPool) layer is introduced to optimize feature aggregation by preserving salient details and reducing redundancy. The model is evaluated on four publicly available signature datasets (SigComp2011, BHSig260, CEDAR, and UTSig) within a cross-language verification setting.
Results: DaGAM-Trans demonstrates superior performance across all datasets. Notably, it achieves 100% accuracy on the CEDAR dataset, with a False Acceptance Rate (FAR) as low as 0.001 and an Equal Error Rate (EER) reduced to 9.5%. Comparative experiments confirm that DaGAM-Trans outperforms existing state-of-the-art methods in terms of accuracy, FAR, False Rejection Rate (FRR), and EER. Visualization through attention maps further validates the model's capacity to localize and highlight discriminative regions in signature images.
Conclusion: The proposed DaGAM-Trans model effectively enhances offline signature forgery detection by leveraging dual attention mechanisms and adaptive graph-based pooling. Its outstanding performance, especially in reducing error rates and handling cross-linguistic signature variability, demonstrates its robustness and applicability for real-world biometric and forensic authentication tasks.
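To make the described pipeline concrete, the sketch below illustrates the dual-attention idea from the abstract (spatial attention over patch nodes, channel re-weighting, a Transformer encoder, and a pooled classification head) in PyTorch. It is purely illustrative and is not the authors' published implementation: the class names, embedding size, fully connected patch graph, scaled dot-product attention (GAT proper uses additive attention), and the mean/max stand-in for GMSAPool are all assumptions made for this example.

# Minimal, illustrative sketch of a dual-attention transformer for signature
# verification. Assumed names/shapes; NOT the DaGAM-Trans reference code.
import torch
import torch.nn as nn


class GraphAttentionModule(nn.Module):
    """Attention over a fully connected patch graph (spatial dependencies)."""
    def __init__(self, dim):
        super().__init__()
        self.query = nn.Linear(dim, dim)
        self.key = nn.Linear(dim, dim)
        self.value = nn.Linear(dim, dim)

    def forward(self, x):                     # x: (batch, nodes, dim)
        q, k, v = self.query(x), self.key(x), self.value(x)
        attn = torch.softmax(q @ k.transpose(-2, -1) / x.size(-1) ** 0.5, dim=-1)
        return x + attn @ v                   # residual message passing


class ChannelAttentionModule(nn.Module):
    """Squeeze-and-excitation style channel re-weighting."""
    def __init__(self, dim, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(dim, dim // reduction), nn.ReLU(),
            nn.Linear(dim // reduction, dim), nn.Sigmoid(),
        )

    def forward(self, x):                     # x: (batch, nodes, dim)
        weights = self.fc(x.mean(dim=1))      # global average over nodes
        return x * weights.unsqueeze(1)       # amplify discriminative channels


class DualAttentionTransformerSketch(nn.Module):
    """GAM -> CAM -> Transformer encoder -> pooling -> genuine/forged logits."""
    def __init__(self, dim=128, num_classes=2):
        super().__init__()
        self.gam = GraphAttentionModule(dim)
        self.cam = ChannelAttentionModule(dim)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)
        self.head = nn.Linear(dim * 2, num_classes)

    def forward(self, x):                     # x: (batch, nodes, dim) patch embeddings
        x = self.cam(self.gam(x))
        x = self.encoder(x)
        # crude stand-in for multi-scale adaptive pooling: mean + max over nodes
        pooled = torch.cat([x.mean(dim=1), x.max(dim=1).values], dim=-1)
        return self.head(pooled)


if __name__ == "__main__":
    model = DualAttentionTransformerSketch()
    patches = torch.randn(2, 49, 128)         # e.g., a 7x7 patch grid per signature
    print(model(patches).shape)               # torch.Size([2, 2])

In this sketch the graph and channel attention stages act as feature refiners before the Transformer encoder models long-range structure, mirroring the division of labor the abstract attributes to GAM, CAM, and the Transformer backbone.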
Published Amsterdam: Elsevier
Type Journal article
Language English
Publication date 2025