Cheng-Zen Yang


2025

Multi-party conversation is a popular form of online group chat. However, the interweaving of utterance threads complicates participants' understanding of the dialogue. Many conversation disentanglement models have been proposed that build on transformer-based pre-trained language models (PrLMs), yet advanced transformer-based PrLMs have not been extensively studied for this task. This paper investigates the effectiveness of five PrLMs: BERT, XLNet, ELECTRA, RoBERTa, and ModernBERT. The experimental results show that ELECTRA and RoBERTa outperform the other PrLMs on the conversation disentanglement task.