Ho-Lam Chung
2024
Codec-SUPERB: An In-Depth Analysis of Sound Codec Models
Haibin Wu | Ho-Lam Chung | Yi-Cheng Lin | Yuan-Kuei Wu | Xuanjun Chen | Yu-Chi Pai | Hsiu-Hsuan Wang | Kai-Wei Chang | Alexander Liu | Hung-yi Lee
Findings of the Association for Computational Linguistics: ACL 2024
The sound codec’s dual roles in minimizing data transmission latency and serving as tokenizers underscore its critical importance. Recent years have witnessed significant developments in codec models. The ideal sound codec should preserve content, paralinguistics, speakers, and audio information. However, the question of which codec achieves optimal sound information preservation remains unanswered, as in different papers, models are evaluated on their selected experimental settings. This study introduces Codec-SUPERB, an acronym for Codec sound processing Universal PERformance Benchmark. It is an ecosystem designed to assess codec models across representative sound applications and signal-level metrics rooted in sound domain knowledge. Codec-SUPERB simplifies result sharing through an online leaderboard, promoting collaboration within a community-driven benchmark database, thereby stimulating new development cycles for codecs. Furthermore, we undertake an in-depth analysis to offer insights into codec models from both application and signal perspectives, diverging from previous codec papers mainly concentrating on signal-level comparisons. Finally, we will release codes, the leaderboard, and data to accelerate progress within the community.
2022
Keyword Provision Question Generation for Facilitating Educational Reading Comprehension Preparation
Ying-Hong Chan | Ho-Lam Chung | Yao-Chung Fan
Proceedings of the 15th International Conference on Natural Language Generation
2020
A BERT-based Distractor Generation Scheme with Multi-tasking and Negative Answer Training Strategies.
Ho-Lam Chung | Ying-Hong Chan | Yao-Chung Fan
Findings of the Association for Computational Linguistics: EMNLP 2020
In this paper, we investigate the following two limitations of existing distractor generation (DG) methods. First, the quality of existing DG methods is still far from practical use; there remains considerable room for improvement. Second, existing DG designs mainly target single-distractor generation, whereas practical MCQ preparation requires multiple distractors. Aiming at these goals, we present a new distractor generation scheme with multi-tasking and negative answer training strategies for effectively generating multiple distractors. The experimental results show that (1) our model advances the state-of-the-art result from 28.65 to 39.81 (BLEU-1 score) and (2) the generated distractors are diverse and show strong distracting power for multiple-choice questions.
Co-authors
- Ying-Hong Chan 2
- Yao-Chung Fan 2
- Haibin Wu 1
- Yi-Cheng Lin 1
- Yuan-Kuei Wu 1