Huy Nguyen

BCL Technologies Inc.

Other people with similar names: Huy Nguyen (ex-liulishuo), Huy Nguyen (UPitt, Amazon), Huy Nguyen (Stanford)


2023

A Spectral Viewpoint on Continual Relation Extraction
Huy Nguyen | Chien Nguyen | Linh Ngo | Anh Luu | Thien Nguyen
Findings of the Association for Computational Linguistics: EMNLP 2023

Continual Relation Extraction (CRE) aims to continuously train a model to learn new relations while preserving its ability on previously learned relations. As in other continual learning problems, models in CRE suffer from representation shift: the learned representation space changes during continual learning, which degrades performance on old tasks. In this work, we provide insight into this phenomenon from a spectral viewpoint. Our key argument is that, for each class shape, if its eigenvectors (or spectral components) do not change much, the shape is well preserved. We then conduct a spectral experiment and show that, for the shape of each class, eigenvectors with larger eigenvalues are better preserved after learning new tasks, meaning these vectors are good at keeping class shapes. Based on this analysis, we propose a simple yet effective class-wise regularization that increases the eigenvalues during representation learning, and we observe that the proposed regularization indeed leads to larger eigenvalues. Extensive experiments on two benchmark datasets, FewRel and TACRED, show the effectiveness of our method, with significant improvements in performance over state-of-the-art models. Further analyses also verify our hypothesis that larger eigenvalues lead to better performance and vice versa.
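
The abstract does not give the exact form of the regularizer, so the following is only a minimal PyTorch sketch of what a class-wise eigenvalue penalty along these lines could look like. The choice of negating the sum of the top-k covariance eigenvalues per class is an assumption for illustration, not the paper's formulation:

```python
import torch

def classwise_eigenvalue_regularizer(features: torch.Tensor,
                                     labels: torch.Tensor,
                                     top_k: int = 5) -> torch.Tensor:
    """Hypothetical penalty encouraging larger eigenvalues of each
    class's feature covariance (an assumed loss form, for illustration).

    features: (N, D) batch of encoder representations.
    labels:   (N,) integer relation labels.
    Returns a scalar added to the task loss; minimizing it pushes the
    top-k eigenvalues of every class shape upward.
    """
    penalty = features.new_zeros(())
    for c in labels.unique():
        feats_c = features[labels == c]
        if feats_c.shape[0] < 2:
            continue  # need at least 2 samples to estimate a covariance
        centered = feats_c - feats_c.mean(dim=0, keepdim=True)
        cov = centered.T @ centered / (feats_c.shape[0] - 1)
        # eigvalsh returns eigenvalues of a symmetric matrix in ascending order
        eigvals = torch.linalg.eigvalsh(cov)
        # maximizing the largest eigenvalues == minimizing their negation
        penalty = penalty - eigvals[-top_k:].sum()
    return penalty
```

In use, such a term would be weighted and added to the main objective, e.g. `loss = task_loss + lam * classwise_eigenvalue_regularizer(h, y)`, where `lam` is a hypothetical trade-off hyperparameter.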

Transitioning Representations between Languages for Cross-lingual Event Detection via Langevin Dynamics
Chien Nguyen | Huy Nguyen | Franck Dernoncourt | Thien Nguyen
Findings of the Association for Computational Linguistics: EMNLP 2023

Cross-lingual transfer learning (CLTL) for event detection (ED) aims to develop models in high-resource source languages that can be directly applied to achieve effective performance in lower-resource target languages. Previous research in this area has focused on representation matching methods that develop a language-universal representation space into which source- and target-language example representations are mapped to achieve cross-lingual transfer. However, because this approach modifies the representations of the source-language examples, the models might lose discriminative features for ED that were learned over the source-language training data, hindering effective prediction. To this end, our work introduces a novel approach for cross-lingual ED in which we only transition the representations of the target-language examples into the source-language space, thus preserving the source-language representations and their discriminative information. Our method uses Langevin Dynamics to perform the representation transition and a semantic preservation framework to retain event-type features during the transition process. Extensive experiments over three languages demonstrate state-of-the-art performance for ED in CLTL.
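
As a rough sketch of the sampling step the abstract refers to, unadjusted Langevin dynamics iteratively moves a representation downhill on an energy landscape while injecting Gaussian noise. Everything here is an assumption for illustration: the `energy_fn` placeholder (low energy for source-like representations), the step size, and the number of steps are not taken from the paper:

```python
import torch

def langevin_transition(z: torch.Tensor,
                        energy_fn,
                        n_steps: int = 20,
                        step_size: float = 0.01) -> torch.Tensor:
    """Move target-language representations z toward low-energy regions
    of the source-language space via unadjusted Langevin dynamics:

        z <- z - (step/2) * grad_z E(z) + sqrt(step) * eps,  eps ~ N(0, I)

    energy_fn: maps an (N, D) tensor to per-example scalar energies that
    are low for source-like representations (a placeholder assumption).
    """
    z = z.detach().clone()
    for _ in range(n_steps):
        z.requires_grad_(True)
        energy = energy_fn(z).sum()
        (grad,) = torch.autograd.grad(energy, z)
        with torch.no_grad():
            # gradient step plus Gaussian noise, per the update rule above
            z = z - 0.5 * step_size * grad \
                + (step_size ** 0.5) * torch.randn_like(z)
    return z.detach()
```

One hypothetical way to instantiate the energy is from a classifier trained on source-language representations, e.g. `energy_fn = lambda z: -source_classifier(z).logsumexp(dim=-1)`, so that confidently classifiable (source-like) points have low energy; the paper's actual energy and semantic preservation framework are not specified in the abstract.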

2002

Automatic Semantic Grouping in a Spoken Language User Interface Toolkit
Hassan Alam | Hua Cheng | Rachmat Hartono | Aman Kumar | Paul Llido | Crystal Nakatsu | Huy Nguyen | Fuad Rahman | Yuliya Tarnikova | Timotius Tjahjadi | Che Wilcox
COLING 2002: The 19th International Conference on Computational Linguistics