
Abstract

In modern software development, characterized by increasing complexity and compressed development cycles, traditional static vulnerability detection methods face prominent challenges, including high false-positive rates and missed detections of complex logic, because they over-rely on rule templates. This paper proposes a Syntax-Aware Hierarchical Attention Network (SAHAN) model, which achieves high-precision vulnerability detection through grammar-rule-driven multi-granularity code slicing and a hierarchical semantic fusion mechanism. SAHAN first generates Syntax Independent Units (SIUs) by slicing the code according to its Abstract Syntax Tree (AST) and predefined grammar rules, retaining vulnerability-sensitive context. It then applies a hierarchical attention mechanism: a local syntax-aware layer encodes fine-grained patterns within each SIU, while a global semantic correlation layer captures vulnerability chains across SIUs, achieving synergistic modeling of syntax and semantics. Experiments on benchmark datasets such as QEMU show that SAHAN improves detection performance by 4.8% to 13.1% on average over baseline models such as Devign and VulDeePecker.
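The record does not include the paper's code, so the following is a minimal sketch, assuming a PyTorch implementation, of the two-level attention idea the abstract describes: a local layer attends over tokens within each SIU, and a global layer attends across the resulting SIU embeddings. Every name, dimension, pooling choice, and hyperparameter below is invented for illustration and is not the authors' architecture.

```python
# Illustrative sketch only: all layer choices and sizes are assumptions,
# not the SAHAN model from the paper. It shows the general shape of a
# hierarchical attention network: a local syntax-aware layer over tokens
# inside each Syntax Independent Unit (SIU), then a global semantic
# correlation layer over the per-SIU vectors.
import torch
import torch.nn as nn


class HierarchicalAttentionSketch(nn.Module):
    def __init__(self, vocab_size: int, d_model: int = 128, n_heads: int = 4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Local layer: self-attention over the tokens of a single SIU.
        self.local_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Global layer: self-attention across SIU vectors, intended to
        # capture dependencies ("vulnerability chains") between SIUs.
        self.global_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.classifier = nn.Linear(d_model, 2)  # vulnerable / not vulnerable

    def forward(self, siu_token_ids: torch.Tensor) -> torch.Tensor:
        # siu_token_ids: (batch, n_sius, siu_len) token ids, one row per SIU.
        b, n_sius, siu_len = siu_token_ids.shape
        tokens = self.embed(siu_token_ids.view(b * n_sius, siu_len))
        local_out, _ = self.local_attn(tokens, tokens, tokens)
        # Mean-pool each SIU's tokens into one vector (pooling choice assumed).
        siu_vecs = local_out.mean(dim=1).view(b, n_sius, -1)
        global_out, _ = self.global_attn(siu_vecs, siu_vecs, siu_vecs)
        return self.classifier(global_out.mean(dim=1))  # function-level logits


# Usage: 2 functions, each sliced into 4 SIUs of 16 tokens (sizes invented).
model = HierarchicalAttentionSketch(vocab_size=5000)
logits = model(torch.randint(0, 5000, (2, 4, 16)))
print(logits.shape)  # torch.Size([2, 2])
```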

Details

Title
Syntax-Aware Hierarchical Attention Networks for Code Vulnerability Detection
Author
Jiang, Yongbo¹; Huang, Shengnan¹; Feng, Tao¹; Duan, Baofeng¹
Affiliation
¹ School of Computer and Communication, Lanzhou University of Technology, Lanzhou, 730050, China
Publication title
Computers, Materials & Continua
Volume
86
Issue
1
Pages
1-22
Number of pages
22
Publication year
2026
Publication date
2026
Section
ARTICLE
Publisher
Tech Science Press
Place of publication
Henderson
Country of publication
United States
ISSN
1546-2218
e-ISSN
1546-2226
Source type
Scholarly Journal
Language of publication
English
Document type
Journal Article
Publication history
Online publication date
2025-11-10
Milestone dates
2025-06-23 (Received); 2025-09-24 (Accepted)
First posting date
2025-11-10
ProQuest document ID
3280657471
Document URL
https://www.proquest.com/scholarly-journals/syntax-aware-hierarchical-attention-networks-code/docview/3280657471/se-2?accountid=208611
Copyright
© 2026. This work is licensed under https://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Last updated
2025-12-09
Database
ProQuest One Academic