ISBN

9781450392037

Publication Type

Conference Proceeding Article

Version

publishedVersion

Publication Date

10-2022

Abstract

Multimodal dialogue systems have attracted much attention recently, but they still lack key capabilities: 1) automatically generating context-specific responses instead of safe but generic ones; 2) naturally coordinating the different information modalities (e.g., text and image) in a response; 3) intuitively explaining the reasons behind a generated response and improving a specific response without re-training the whole model. To approach these goals, we propose a different angle on the task - Reflecting Experiences for Response Generation (RERG). This is motivated by the observation that generating a response from scratch can be hard, but becomes much easier if we can access similar dialogue contexts and their corresponding responses. In particular, RERG first uses a multimodal contrastive-learning-enhanced retrieval model to solicit similar dialogue instances. It then employs a cross-copy-based reuse model that simultaneously explores the current dialogue context (vertical) and the retrieved instances' responses (horizontal) for response generation. Experimental results demonstrate that our model outperforms other state-of-the-art models on both automatic metrics and human evaluation. Moreover, RERG naturally provides supporting dialogue instances for better explainability. It also adapts well to unseen dialogue settings by simply adding related samples to the retrieval datastore without re-training the whole model.
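
To illustrate the retrieve-then-reuse idea at a high level, the sketch below implements a toy pipeline: embed the current dialogue context, fetch the most similar (context, response) pairs from a datastore, and splice material from both the current context and the retrieved responses into a reply. The encoder, datastore format, and copy heuristic are illustrative assumptions made for this sketch, not the RERG implementation or its APIs.

```python
# Toy retrieve-then-reuse pipeline in the spirit of the abstract above.
# All names (embed, retrieve, reuse) and the scoring/copy heuristics are
# illustrative assumptions, not the authors' RERG code.
import numpy as np


def embed(text: str, dim: int = 64) -> np.ndarray:
    """Stand-in for a contrastively trained (multimodal) context encoder."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    vec = rng.standard_normal(dim)
    return vec / np.linalg.norm(vec)


def retrieve(context: str, datastore: list, k: int = 2) -> list:
    """Return the k (context, response) pairs whose context is most similar
    to the query context under cosine similarity of unit-norm embeddings."""
    query = embed(context)
    return sorted(datastore, key=lambda pair: -float(query @ embed(pair[0])))[:k]


def reuse(context: str, retrieved: list) -> str:
    """Crude surrogate for cross copy: combine phrases from the current
    context (vertical) and the retrieved responses (horizontal)."""
    vertical = " ".join(context.split()[:4])
    horizontal = " ".join(" ".join(resp.split()[:6]) for _, resp in retrieved)
    return f"[about: {vertical}] {horizontal}"


if __name__ == "__main__":
    # Appending new pairs here mirrors the no-retraining adaptation
    # mentioned in the abstract: the datastore grows, the model does not change.
    datastore = [
        ("user shares a photo of a beach sunset",
         "That sunset looks gorgeous, where did you take it?"),
        ("user asks for a quick pasta recipe",
         "Try aglio e olio: garlic, olive oil, chili and spaghetti."),
    ]
    ctx = "user shares a photo of a mountain sunrise"
    print(reuse(ctx, retrieve(ctx, datastore, k=1)))
```

Because the datastore is just a list of (context, response) pairs in this sketch, extending it to an unseen dialogue setting only requires adding related samples, which is the adaptation property the abstract highlights.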

Keywords

Case-based reasoning, Response generation, Contrastive learning

Discipline

Computer Engineering

Research Areas

Data Science and Engineering

Publication

Proceedings of the 30th ACM International Conference on Multimedia, Lisboa, Portugal, October 10-14, 2022

First Page

5265

Last Page

5273

Identifier

10.1145/3503161.3548305

Publisher

Association for Computing Machinery

City or Country

Lisbon

Additional URL

https://doi.org/10.1145/3503161.3548305
