Publication Type
Conference Proceeding Article
Version
publishedVersion
Publication Date
10-2022
Abstract
Deep learning models have been successfully applied to a variety of software engineering tasks, such as code classification, summarisation, and bug and vulnerability detection. In order to apply deep learning to these tasks, source code needs to be represented in a format that is suitable for input into the deep learning model. Most approaches to representing source code, such as tokens, abstract syntax trees (ASTs), data flow graphs (DFGs), and control flow graphs (CFGs), focus only on the code itself and do not take into account additional context that could be useful for deep learning models. In this paper, we argue that it is beneficial for deep learning models to have access to additional contextual information about the code being analysed. We present preliminary evidence that encoding context from the call hierarchy along with information from the code itself can improve the performance of a state-of-the-art deep learning model on two software engineering tasks. We outline our research agenda for adding further contextual information to source code representations for deep learning.
Keywords
additional context, deep learning, source code representation
Discipline
Software Engineering
Research Areas
Software and Cyber-Physical Systems
Publication
Proceedings of the 38th International Conference on Software Maintenance and Evolution, Limassol, Cyprus, 2022 October 3-7
First Page
374
Last Page
378
ISBN
9781665479561
Identifier
10.1109/ICSME55016.2022.00042
Publisher
IEEE
City or Country
Los Alamitos, CA
Citation
TIAN, Fuwei and TREUDE, Christoph.
Adding context to source code representations for deep learning. (2022). Proceedings of the 38th International Conference on Software Maintenance and Evolution, Limassol, Cyprus, 2022 October 3-7. 374-378.
Available at: https://ink.library.smu.edu.sg/sis_research/8824
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Additional URL
https://doi.org/10.1109/ICSME55016.2022.00042