Publication Type

Conference Proceeding Article

Version

acceptedVersion

Publication Date

10-2023

Abstract

Large language models (LLMs), such as GPT-3 and ChatGPT, have demonstrated remarkable results in various natural language processing (NLP) tasks with in-context learning, which involves inference based on a few demonstration examples. Despite their successes in NLP tasks, no investigation has been conducted to assess the ability of LLMs to perform document information extraction (DIE) using in-context learning. Applying LLMs to DIE poses two challenges: the modality gap and the task gap. To this end, we propose a simple but effective in-context learning framework called ICL-D3IE, which enables LLMs to perform DIE with different types of demonstration examples. Specifically, we extract the most difficult and distinct segments from hard training documents as hard demonstrations to benefit all test instances. We design demonstrations that describe positional relationships, enabling LLMs to understand document layout. We introduce formatting demonstrations for easy answer extraction. Additionally, the framework improves diverse demonstrations by updating them iteratively. Our experiments on three widely used benchmark datasets demonstrate that the ICL-D3IE framework enables Davinci-003/ChatGPT to achieve superior performance compared to previous pre-trained methods fine-tuned with full training data, in both the in-distribution (ID) and out-of-distribution (OOD) settings. Code is available at https://github.com/MAEHCM/ICL-D3IE.
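The following is a minimal, hypothetical Python sketch (not the authors' implementation; all function and variable names are illustrative) of the prompt-assembly idea described in the abstract: hard, positional-relationship, and formatting demonstrations are concatenated with the test document's text segments into a single in-context prompt for an LLM.

```python
# Hypothetical sketch of ICL-D3IE-style prompt assembly for in-context DIE.
# Names (build_prompt, *_demos) are illustrative, not from the released code.
def build_prompt(hard_demos, layout_demos, format_demos, test_segments):
    """Concatenate the three kinds of demonstrations and the test document's
    text segments into one prompt string for in-context labeling."""
    parts = []
    parts.append("Hard demonstrations (difficult, distinct segments):")
    parts.extend(hard_demos)
    parts.append("Positional-relationship demonstrations:")
    parts.extend(layout_demos)
    parts.append("Formatting demonstrations (expected answer format):")
    parts.extend(format_demos)
    parts.append("Test document segments to label:")
    parts.extend(test_segments)
    parts.append("Label each test segment with its entity type.")
    return "\n".join(parts)


if __name__ == "__main__":
    prompt = build_prompt(
        hard_demos=["'Total: 12.50' -> TOTAL"],
        layout_demos=["'Date' appears left of '01/02/2020', so '01/02/2020' -> DATE"],
        format_demos=["Answer as: <segment> -> <label>"],
        test_segments=["'Invoice No. 42'", "'Paris'"],
    )
    print(prompt)
```

In the paper's framework these demonstrations are further refined iteratively; this sketch only illustrates how the different demonstration types might be composed into a single prompt.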

Discipline

Artificial Intelligence and Robotics | Numerical Analysis and Scientific Computing

Publication

2023 IEEE/CVF International Conference on Computer Vision (ICCV): Paris, October 2-6: Proceedings

First Page

19428

Last Page

19437

ISBN

9798350307184

Identifier

10.1109/ICCV51070.2023.01785

Publisher

IEEE Computer Society

City or Country

Washington, DC

Embargo Period

4-15-2024

Copyright Owner and License

Authors

Additional URL

https://doi.org/10.1109/ICCV51070.2023.01785
