Publication Type
PhD Dissertation
Version
publishedVersion
Publication Date
5-2025
Abstract
Software is increasingly pervasive in modern society, making the effective translation of human intent into code essential. Novice programmers often struggle with domain-specific code due to limited background knowledge, while experienced developers face challenges in maintaining evolving large-scale codebases. Traditional pattern-based approaches address these issues, but they are task-specific and require significant adaptation for each new task. Transformer-based models offer a more flexible alternative, as the same architecture can be tailored for diverse programming tasks.
This dissertation investigates how Transformer-based models can be customized for various code generation and translation tasks. First, it introduces Transformer-based approaches that assist end-users with limited domain-specific knowledge in writing trigger-action and Arduino programs. Second, it addresses the automation of code evolution in large codebases, a process that is often time-consuming and error-prone when done manually. This includes an empirical study on deep learning models for generating Linux kernel semantic patches, followed by the development of a dual learning framework that improves how Transformer-based models learn code-to-code transformation patterns from change examples. Finally, this dissertation presents an efficient method that leverages the graph modality to enhance the adaptability of Transformer-based models across different code generation and translation tasks.
These contributions demonstrate the versatility of Transformer-based models in code generation and translation, reducing barriers for novices while enhancing productivity for experienced developers. The findings open new opportunities for broader applications in software engineering.
Degree Awarded
PhD in Computer Science
Discipline
Programming Languages and Compilers | Software Engineering
Supervisor(s)
JIANG, Lingxiao
First Page
1
Last Page
230
Publisher
Singapore Management University
City or Country
Singapore
Citation
YUSUF, Imam Nur Bani.
Tailoring transformer-based deep learning for code generation and translation. (2025). 1-230.
Available at: https://ink.library.smu.edu.sg/etd_coll/776
Copyright Owner and License
Author
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-No Derivative Works 4.0 International License.