Publication Type
Conference Proceeding Article
Version
acceptedVersion
Publication Date
11-2024
Abstract
Knowledge graphs (KGs) are instrumental in various real-world applications, yet they often suffer from incompleteness due to missing relations. To predict instances for novel relations with limited training examples, few-shot relation learning approaches have emerged, utilizing techniques such as meta-learning. However, these approaches assume that novel relations in meta-testing and base relations in meta-training are independently and identically distributed, which may not hold in practice. To address this limitation, we propose RelAdapter, a context-aware adapter for few-shot relation learning in KGs designed to enhance the adaptation process in meta-learning. First, RelAdapter is equipped with a lightweight adapter module that facilitates relation-specific, tunable adaptation of meta-knowledge in a parameter-efficient manner. Second, RelAdapter is enriched with contextual information about the target relation, enabling enhanced adaptation to each distinct relation. Extensive experiments on three benchmark KGs validate the superiority of RelAdapter over state-of-the-art methods.
Keywords
Knowledge graphs, Few-shot relation learning, Meta-learning, Meta-training, Context-aware adapter
Discipline
Artificial Intelligence and Robotics | Computer Sciences
Areas of Excellence
Digital transformation
Publication
Proceedings of the 19th Conference on Empirical Methods in Natural Language Processing (EMNLP 2024): Miami, Florida, USA, November 12-16
First Page
17525
Last Page
17537
Identifier
10.18653/v1/2024.emnlp-main.970
Publisher
Association for Computational Linguistics
City or Country
Miami, Florida
Citation
LIU, Ran; LIU, Zhongzhou; LI, Xiaoli; and FANG, Yuan.
Context-aware adapter tuning for few-shot relation learning in knowledge graphs. (2024). Proceedings of the 19th Conference on Empirical Methods in Natural Language Processing (EMNLP 2024): Miami, Florida, USA, November 12-16. 17525-17537.
Available at: https://ink.library.smu.edu.sg/sis_research/9687
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Additional URL
https://doi.org/10.18653/v1/2024.emnlp-main.970
Comments
PDF provided by faculty