FROM CONSTRAINED EVENT SEQUENCES GENERATION TO TEXT GENERATION

Date Presented: August 26, 2021
Speaker: Shanxiu He (UCLA)

Abstract
Understanding events is a critical component of natural language understanding (NLU). A key challenge lies in the fact that events can be described at different granularities. A coarse-grained event (e.g., publishing a paper) can often be decomposed into a fine-grained process of events (e.g., writing the paper, passing the peer review, and presenting at the conference). In this work, we tackle the problem of goal-oriented event process generation: given a goal event, a process that completes this goal is automatically generated. We approach this task with constrained generation, inferring unobserved event chains from existing sequences. To leverage prior knowledge and facilitate commonsense reasoning, we employ pre-trained LMs to generate event sequences and to retrieve the original stories.
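To make the idea concrete, here is a toy sketch of goal-oriented constrained generation. The transition table, event names, and `generate_process` function are all hypothetical stand-ins: in the talk's setting, a pre-trained LM would supply the next-event scores, and decoding would be constrained so that a generated chain must terminate at the goal event.

```python
import heapq

# Hypothetical next-event scores standing in for a pre-trained LM's
# P(next event | current event); not from the actual system.
TRANSITIONS = {
    "start":            {"write the paper": 0.6, "watch TV": 0.4},
    "write the paper":  {"pass peer review": 0.7, "watch TV": 0.3},
    "pass peer review": {"present at the conference": 0.9, "watch TV": 0.1},
}

GOAL = "present at the conference"

def generate_process(goal, max_len=4, beam_width=3):
    """Beam search over event chains, keeping only chains that reach `goal`."""
    beams = [(1.0, ["start"])]          # (probability, event chain so far)
    completed = []
    for _ in range(max_len):
        candidates = []
        for prob, chain in beams:
            for event, p in TRANSITIONS.get(chain[-1], {}).items():
                new_chain = chain + [event]
                if event == goal:       # constraint: chain must end at the goal
                    completed.append((prob * p, new_chain))
                else:
                    candidates.append((prob * p, new_chain))
        beams = heapq.nlargest(beam_width, candidates)
    return max(completed) if completed else None

prob, chain = generate_process(GOAL)
print(chain[1:])  # fine-grained process that completes the coarse goal
```

The constraint is enforced at decoding time by separating goal-reaching chains from the open beam, so only processes that actually complete the goal are returned.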

Speaker Bio:
Shanxiu He is an undergraduate at UCLA and a member of the UCLANLP lab. Prior to the internship, her research focused on pre-trained vision-and-language models such as VisualBERT and ClipBert and their applications to various structured learning tasks. During this internship, she researched event-centric knowledge representation, specifically event sequence generation.

