Publication Type

Conference Proceeding Article

Version

publishedVersion

Publication Date

12-2023

Abstract

The discovery of new intent categories from user utterances is a crucial task in expanding agent skills. The key lies in how to efficiently solicit semantic evidence from utterances and properly transfer knowledge from existing intents to new intents. However, previous methods place too much emphasis on relations among utterances or clusters for transfer learning, while paying too little attention to semantics. As a result, these methods suffer from in-domain over-fitting and often generate meaningless new intent clusters due to data distortion. In this paper, we present a novel approach called Cluster Semantic Enhanced Prompt Learning (CsePL) for discovering new intents. Our method leverages two-level contrastive learning with label semantic alignment to learn meaningful representations of intent clusters. These learned intent representations are then used as soft prompt initializations for discriminating new intents, reducing the dominance of existing intents. Extensive experiments conducted on three public datasets demonstrate the superiority of our proposed method. It not only outperforms existing methods but also suggests meaningful intent labels and enables early detection of new intents.
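To make the two ideas in the abstract concrete, below is a minimal PyTorch sketch, not the authors' implementation, of (a) an instance-level contrastive loss over utterance embeddings and (b) initializing a soft prompt from cluster centroids. All names (info_nce, soft_prompt_from_clusters) and the specific loss formulation are illustrative assumptions; CsePL's actual two-level objective and label semantic alignment are described in the paper itself.

```python
# Illustrative sketch only (hypothetical helper names); not the CsePL code.
import torch
import torch.nn.functional as F

def info_nce(z, labels, temperature=0.07):
    """Instance-level contrastive loss: utterances in the same cluster are positives."""
    z = F.normalize(z, dim=-1)
    sim = z @ z.T / temperature                        # pairwise cosine similarities
    pos = labels.unsqueeze(0) == labels.unsqueeze(1)   # positive-pair mask
    pos.fill_diagonal_(False)                          # exclude self-pairs
    logits = sim - torch.eye(len(z)) * 1e9             # mask self-similarity
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    return -(log_prob * pos).sum(1).div(pos.sum(1).clamp(min=1)).mean()

def soft_prompt_from_clusters(embeddings, cluster_ids, num_clusters):
    """Average each cluster's utterance embeddings into one soft-prompt vector."""
    prompt = torch.zeros(num_clusters, embeddings.size(-1))
    for c in range(num_clusters):
        members = embeddings[cluster_ids == c]
        if len(members):
            prompt[c] = members.mean(0)
    return torch.nn.Parameter(prompt)  # trainable soft prompt initialization

# Toy usage: 8 utterance embeddings assigned to 3 clusters.
z = torch.randn(8, 16)
ids = torch.tensor([0, 0, 1, 1, 2, 2, 0, 1])
loss = info_nce(z, ids)
prompt = soft_prompt_from_clusters(z, ids, num_clusters=3)
```

The sketch reflects the abstract's claim that cluster representations seed the soft prompt, so discovered intents start from semantically meaningful embeddings rather than random vectors dominated by existing intents.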

Keywords

prompt learning, large language model

Discipline

Artificial Intelligence and Robotics

Research Areas

Intelligent Systems and Optimization

Publication

Findings of the Association for Computational Linguistics: EMNLP 2023

First Page

10468

Last Page

10481

Identifier

10.18653/v1/2023.findings-emnlp.702

Publisher

Association for Computational Linguistics

City or Country

Singapore

Additional URL

https://doi.org/10.18653/v1/2023.findings-emnlp.702
