Title:
Implementation of Gated Graph Neural Network (GGNN) on Bug Prediction in Java, Python, and C++ Programming Languages.
Authors:
Juwono, Elroy (AUTHOR), Novarino Phoa, Matthew Farrell (AUTHOR) matthew.phoa@binus.ac.id, Wijaya, Michael (AUTHOR), Suryaningrumi, Kristien Margi (AUTHOR), Siswanto, Ricky Reynardo (AUTHOR)
Source:
Procedia Computer Science. 2025, Vol. 269, p1171-1180. 10p.
Database:
Supplemental Index

A software bug is a defect in a system that can lead to incorrect results or failure, and it can make development more costly. Software defect prediction can help the development process find bugs and prioritize testing efforts. Traditional bug prediction approaches such as CVDP (Cross-Version Defect Prediction) rely mostly on structure-based features and predefined patterns, which struggle to capture complex interactions across source code. This creates gaps in accurately identifying bugs that arise from deep relationships between code elements. To this end, we need a better approach to bug prediction that can capture long-range dependencies and code context in more detail. Therefore, we use the GNN (Graph Neural Network) approach, a type of neural network designed to learn over graph-structured data. This approach represents code structure as nodes and uses a message-passing mechanism on every node to precisely capture relationships between code lines and complex details. There are many GNN variants, but in this paper we specifically use the GGNN (Gated Graph Neural Network). The GGNN has a GRU mechanism that can control the information gained from neighbouring nodes and filter out unrelated information as message passing proceeds. We use this model to predict bugs in multiple programming languages with different structures: Python, Java, and C++. As a result, our model achieves significantly better performance than the CVDP approach, with an F1-score of 0.9079, a loss value of 0.2919, a precision of 0.9175, and a recall of 0.9058. [ABSTRACT FROM AUTHOR]
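To illustrate the mechanism the abstract describes, here is a minimal numpy sketch of one GGNN propagation step: neighbouring nodes exchange messages over the code graph, and a GRU-style gated update decides how much of that information to keep and how much to discard. The weight shapes, the toy chain graph, and all parameter values are illustrative assumptions, not the authors' actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ggnn_step(h, adj, W, Wz, Uz, Wr, Ur, Wh, Uh):
    """One round of message passing followed by a GRU-style gated update.

    h:   (num_nodes, dim) current node states
    adj: (num_nodes, num_nodes) adjacency matrix of the code graph
    All weight matrices are (dim, dim); shapes are a simplifying assumption.
    """
    # Aggregate messages from neighbouring nodes.
    a = adj @ h @ W
    # Update gate: how much new information to take in.
    z = sigmoid(a @ Wz + h @ Uz)
    # Reset gate: filters unrelated information out of the old state.
    r = sigmoid(a @ Wr + h @ Ur)
    # Candidate state built from messages and the gated previous state.
    h_tilde = np.tanh(a @ Wh + (r * h) @ Uh)
    # Blend previous and candidate states.
    return (1.0 - z) * h + z * h_tilde

num_nodes, dim = 4, 8
# Toy code graph: a chain of four statements (hypothetical example).
adj = np.zeros((num_nodes, num_nodes))
for i in range(num_nodes - 1):
    adj[i, i + 1] = adj[i + 1, i] = 1.0

h = np.tanh(rng.standard_normal((num_nodes, dim)))
params = [rng.standard_normal((dim, dim)) * 0.1 for _ in range(7)]
h_new = ggnn_step(h, adj, *params)
print(h_new.shape)  # (4, 8)
```

In practice this step is repeated for a fixed number of iterations, and the final node states are pooled into a graph-level representation for the buggy/clean classification.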