Tra My Nguyen
Also published as: Tra-my Nguyen
2024
Guidelines for the Annotation of Intentional Linguistic Metaphor
Stefanie Dipper | Adam Roussel | Alexandra Wiemann | Won Kim | Tra-my Nguyen
Proceedings of the 4th Workshop on Figurative Language Processing (FigLang 2024)
This paper presents guidelines for the annotation of intentional (i.e., non-conventionalized) linguistic metaphors. Expressions that contribute to the same metaphorical image are annotated as a chain; additionally, a semantically contrasting expression from the target domain is marked as an anchor. So far, a corpus of ten TEDx talks totaling 20k tokens has been annotated according to these guidelines; 1.25% of the tokens are intentional metaphorical expressions.
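The chain-and-anchor scheme described in the abstract can be modeled with a simple data structure. The sketch below is illustrative only: the `Span`, `MetaphorChain`, and `metaphor_token_ratio` names are hypothetical, not from the paper; it assumes chains are sets of token spans plus an optional anchor span, and shows how the reported 1.25% token ratio could be computed from such annotations.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Span:
    start: int  # token offset, inclusive
    end: int    # token offset, exclusive

@dataclass
class MetaphorChain:
    # spans whose expressions contribute to the same metaphorical image
    expressions: List[Span] = field(default_factory=list)
    # a semantically contrasting target-domain expression (may be absent)
    anchor: Optional[Span] = None

def metaphor_token_ratio(chains: List[MetaphorChain], n_tokens: int) -> float:
    """Fraction of corpus tokens covered by metaphorical expressions."""
    annotated = set()
    for chain in chains:
        for span in chain.expressions:
            annotated.update(range(span.start, span.end))
    return len(annotated) / n_tokens

# Toy corpus of 240 tokens with 3 annotated tokens -> ratio 0.0125 (1.25%)
chains = [MetaphorChain(expressions=[Span(0, 2), Span(5, 6)], anchor=Span(10, 11))]
print(metaphor_token_ratio(chains, 240))
```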
2022
SLATE: A Sequence Labeling Approach for Task Extraction from Free-form Inked Content
Apurva Gandhi | Ryan Serrao | Biyi Fang | Gilbert Antonius | Jenna Hong | Tra My Nguyen | Sheng Yi | Ehi Nosakhare | Irene Shaffer | Soundararajan Srinivasan
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing: Industry Track
We present SLATE, a sequence labeling approach for extracting tasks from free-form content such as digitally handwritten (or “inked”) notes on a virtual whiteboard. Our approach allows us to create a single, low-latency model to simultaneously perform sentence segmentation and classification of these sentences into task/non-task sentences. SLATE greatly outperforms a baseline two-model approach (a sentence segmentation model followed by a classification model), achieving a task F1 score of 84.4%, a sentence segmentation (boundary similarity) score of 88.4%, and three times lower latency compared to the baseline. Furthermore, we provide insights into tackling the challenges of performing NLP in the inking domain. We release both our code and dataset for this novel task.
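The joint segmentation-and-classification idea in the abstract can be illustrated with a per-token label scheme and a decoder. This is a minimal sketch, not the paper's implementation: the `B-`/`I-` prefixes crossed with `TASK`/`NOT` classes and the `decode_segments` function are assumptions about how a single label sequence could encode both sentence boundaries and task status at once.

```python
# Hypothetical label scheme: "B-TASK" opens a task sentence, "I-TASK"
# continues it; "B-NOT"/"I-NOT" do the same for non-task sentences.
# A single sequence labeler emitting these labels performs segmentation
# and classification in one pass.

def decode_segments(tokens, labels):
    """Group tokens into sentences and flag each as task/non-task.

    A 'B-*' label starts a new sentence; 'I-*' extends the current one.
    Returns a list of (sentence_text, is_task) pairs.
    """
    segments = []
    for tok, lab in zip(tokens, labels):
        prefix, cls = lab.split("-", 1)
        if prefix == "B" or not segments:
            segments.append({"toks": [tok], "is_task": cls == "TASK"})
        else:
            segments[-1]["toks"].append(tok)
    return [(" ".join(s["toks"]), s["is_task"]) for s in segments]

tokens = ["buy", "milk", "today", "great", "meeting", "everyone"]
labels = ["B-TASK", "I-TASK", "I-TASK", "B-NOT", "I-NOT", "I-NOT"]
print(decode_segments(tokens, labels))
# -> [('buy milk today', True), ('great meeting everyone', False)]
```

Compared with a two-model pipeline, a single labeler like this avoids running segmentation and classification as separate inference passes, which is consistent with the latency advantage the abstract reports.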