Publication Type

Conference Proceeding Article

Version

publishedVersion

Publication Date

8-2024

Abstract

The digital landscape is rapidly evolving with an ever-increasing volume of online news, emphasizing the need for swift and precise analysis of complex events. We refer to a complex event composed of many news articles over an extended period as a Temporal Complex Event (TCE). This paper proposes a novel approach that uses Large Language Models (LLMs) to systematically extract and analyze the event chain within a TCE, characterized by its key points and timestamps. We establish a benchmark, named TCELongBench, to evaluate the proficiency of LLMs in handling temporal dynamics and understanding extensive text. This benchmark encompasses three distinct tasks - reading comprehension, temporal sequencing, and future event forecasting. In the experiments, we leverage the retrieval-augmented generation (RAG) method and LLMs with long context windows to deal with the lengthy news articles of TCEs. Our findings indicate that models with suitable retrievers exhibit performance comparable to those utilizing long context windows.
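The abstract contrasts a RAG setup with long-context models for handling the many articles of a TCE. Below is a minimal sketch of what such a retrieval step could look like: articles are split into fixed-size chunks, scored against a question with a simple bag-of-words similarity, and the top-k chunks are packed into a prompt. The chunk size, scoring function, and prompt template are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal RAG-style sketch for TCE question answering (illustrative only;
# chunk size, similarity measure, and prompt format are assumptions, not
# the paper's implementation).
from collections import Counter
import math


def chunk(text, size=200):
    """Split an article into fixed-size word chunks."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]


def score(query, passage):
    """Bag-of-words cosine similarity between a query and a passage."""
    q, p = Counter(query.lower().split()), Counter(passage.lower().split())
    num = sum(q[w] * p[w] for w in set(q) & set(p))
    denom = (math.sqrt(sum(v * v for v in q.values()))
             * math.sqrt(sum(v * v for v in p.values())))
    return num / denom if denom else 0.0


def retrieve(question, articles, top_k=5):
    """Return the top-k most relevant chunks across all articles of a TCE."""
    passages = [c for a in articles for c in chunk(a)]
    return sorted(passages, key=lambda c: score(question, c), reverse=True)[:top_k]


def build_prompt(question, articles):
    """Assemble an LLM prompt from retrieved context instead of the full corpus."""
    context = "\n\n".join(retrieve(question, articles))
    return f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
```

In this sketch, only the retrieved chunks are passed to the model, which is the trade-off the paper evaluates against feeding the entire article set to a long-context LLM.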

Keywords

Temporal complex events, Large language models, LLMs, Extensive text processing

Discipline

Artificial Intelligence and Robotics | Computer Sciences

Research Areas

Data Science and Engineering; Intelligent Systems and Optimization

Areas of Excellence

Digital transformation

Publication

Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (ACL 2024): Bangkok, Thailand, August 11-16

Volume

1

First Page

1588

Last Page

1606

Identifier

10.18653/v1/2024.acl-long.87

Publisher

Association for Computational Linguistics

City or Country

Bangkok, Thailand

Additional URL

https://doi.org/10.18653/v1/2024.acl-long.87
