Devin Conathan


2022

Assessing Resource-Performance Trade-off of Natural Language Models using Data Envelopment Analysis
Zachary Zhou | Alisha Zachariah | Devin Conathan | Jeffery Kline
Proceedings of the 3rd Workshop on Evaluation and Comparison of NLP Systems

2021

Low Resource Quadratic Forms for Knowledge Graph Embeddings
Zachary Zhou | Jeffery Kline | Devin Conathan | Glenn Fung
Proceedings of the Second Workshop on Simple and Efficient Natural Language Processing

We address the problem of link prediction between entities and relations of knowledge graphs. State-of-the-art techniques that address this problem, while increasingly accurate, are computationally intensive. In this paper we cast link prediction as a sparse convex program whose solution defines a quadratic form that is used as a ranking function. The structure of our convex program is such that it can be solved by standard support vector machine software packages, which are numerically robust and efficient. We show that on benchmark data sets, our model’s performance is competitive with state-of-the-art models, while training times can be reduced by a factor of 40 using only CPU-based (and not GPU-accelerated) computing resources. This approach may be suitable for applications where trading some graph-completion performance for computational efficiency is desirable.
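As a rough illustration of the quadratic-form idea (a minimal sketch, not the paper's actual formulation or code), a linear SVM trained on flattened outer-product features is equivalent to learning a quadratic form x^T Q x, which can then be used to rank candidate links. The entity features, negative-sampling scheme, and hyperparameters below are hypothetical placeholders.

# Sketch: learning a per-relation quadratic-form scoring function with a
# standard SVM package. A linear model on the flattened outer product x x^T
# is equivalent to the quadratic form x^T Q x, where Q is the learned weight
# vector reshaped into a matrix. All data and settings here are illustrative
# assumptions, not taken from the paper.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
d = 8                                            # entity feature dimension (assumed)
entities = {e: rng.normal(size=d) for e in range(50)}

def pair_features(h, t):
    # Flattened outer product of concatenated head/tail features.
    x = np.concatenate([entities[h], entities[t]])
    return np.outer(x, x).ravel()

# Toy positive triples for a single relation, plus corrupted negatives.
positives = [(i, (i + 1) % 50) for i in range(50)]
negatives = [(i, int(rng.integers(50))) for i in range(50)]

X = np.array([pair_features(h, t) for h, t in positives + negatives])
y = np.array([1] * len(positives) + [0] * len(negatives))

svm = LinearSVC(C=1.0, max_iter=10_000).fit(X, y)

# The learned weights define the quadratic form Q used as the ranking function.
Q = svm.coef_.reshape(2 * d, 2 * d)

def score(h, t):
    x = np.concatenate([entities[h], entities[t]])
    return float(x @ Q @ x)                      # rank candidate tails by this score

print(score(0, 1), score(0, 37))

Because the heavy lifting is a standard linear SVM fit, this kind of formulation runs comfortably on CPU-only hardware, which is the trade-off the abstract highlights.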