Jia Wang
2024
SensoryT5: Infusing Sensorimotor Norms into T5 for Enhanced Fine-grained Emotion Classification
Yuhan Xia | Qingqing Zhao | Yunfei Long | Ge Xu | Jia Wang
Proceedings of the Workshop on Cognitive Aspects of the Lexicon @ LREC-COLING 2024
Sensory perception and emotion classification have traditionally been treated as separate research domains, yet the influence of sensory experiences on emotional responses is undeniable. The natural language processing (NLP) community has largely missed the opportunity to merge sensory knowledge with emotion classification. To address this gap, we propose SensoryT5, a neurocognitive approach that integrates sensory information into the T5 (Text-to-Text Transfer Transformer) model, designed specifically for fine-grained emotion classification. The method incorporates sensory cues into T5's attention mechanism, balancing contextual understanding with sensory awareness and enriching the resulting emotional representations. In rigorous tests across several fine-grained emotion classification datasets, SensoryT5 shows improved performance, surpassing both the foundational T5 model and current state-of-the-art approaches. Notably, SensoryT5's success signals a pivotal change in the NLP domain, highlighting the potential influence of neurocognitive data in refining machine learning models' emotional sensitivity.
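The abstract says sensory cues are injected into T5's attention mechanism but does not spell out the architecture. The sketch below is a hypothetical illustration (not the authors' released code) of one way token-level sensorimotor norm features, e.g. the eleven Lancaster norm dimensions, could bias a T5-style attention layer; the class and variable names are assumptions made for the example.

```python
# Hypothetical sketch: coupling per-token sensorimotor norms with attention.
import torch
import torch.nn as nn

class SensoryBiasedAttention(nn.Module):
    def __init__(self, d_model: int, n_heads: int, sensory_dim: int = 11):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Project sensorimotor norm ratings to a scalar salience per token.
        self.sensory_proj = nn.Linear(sensory_dim, 1)

    def forward(self, hidden: torch.Tensor, sensory: torch.Tensor) -> torch.Tensor:
        # hidden:  (batch, seq_len, d_model) contextual encoder states
        # sensory: (batch, seq_len, sensory_dim) sensorimotor norm features
        bias = self.sensory_proj(sensory).squeeze(-1)            # (batch, seq_len)
        # Re-weight key/value inputs by sensory salience so perceptually
        # grounded tokens contribute more strongly to the attended output.
        weighted = hidden * (1.0 + torch.tanh(bias)).unsqueeze(-1)
        out, _ = self.attn(hidden, weighted, weighted, need_weights=False)
        return out

# Toy usage with random tensors standing in for real encoder states and norms.
layer = SensoryBiasedAttention(d_model=512, n_heads=8)
h = torch.randn(2, 16, 512)
s = torch.rand(2, 16, 11)
print(layer(h, s).shape)  # torch.Size([2, 16, 512])
```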
Document Set Expansion with Positive-Unlabeled Learning Using Intractable Density Estimation
Haiyang Zhang | Qiuyi Chen | Yanjie Zou | Jia Wang | Yushan Pan | Mark Stevenson
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
The Document Set Expansion (DSE) task involves identifying relevant documents from large collections based on a limited set of example documents. Previous research has highlighted Positive and Unlabeled (PU) learning as a promising approach for this task. However, most PU methods rely on the unrealistic assumption of knowing the class prior for positive samples in the collection. To address this limitation, this paper introduces a novel PU learning framework that utilizes intractable density estimation models. Experiments conducted on PubMed and Covid datasets in a transductive setting showcase the effectiveness of the proposed method for DSE. Code is available from https://github.com/Beautifuldog01/Document-set-expansion-puDE.
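The abstract does not detail the density model itself. As a rough, hypothetical illustration of the prior-free ranking idea, the sketch below stands in for the paper's intractable density estimator with a simple Gaussian kernel density estimate over document embeddings and scores the unlabeled collection by density under the positive seed set alone, so no class prior is required; the embedder here is a random placeholder.

```python
# Hypothetical density-based DSE ranking loop (not the paper's actual model).
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(0)

def embed(docs):
    # Placeholder embedder: swap in any sentence/document encoder.
    return rng.normal(size=(len(docs), 32))

positive_docs = ["seed doc 1", "seed doc 2", "seed doc 3"]
unlabeled_docs = [f"candidate {i}" for i in range(100)]

pos_emb = embed(positive_docs)
unl_emb = embed(unlabeled_docs)

# Fit a density model on the positive seed documents only.
kde = KernelDensity(kernel="gaussian", bandwidth=1.0).fit(pos_emb)

# Rank unlabeled documents by log-density under the positive model;
# the top of the ranking becomes the expanded document set.
scores = kde.score_samples(unl_emb)
ranking = np.argsort(-scores)
expanded_set = [unlabeled_docs[i] for i in ranking[:10]]
print(expanded_set)
```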
2010
Recommendation in Internet Forums and Blogs
Jia Wang | Qing Li | Yuanzhu Peter Chen | Zhangxi Lin
Proceedings of the 48th Annual Meeting of the Association for Computational Linguistics