ACL Anthology
Jacob Steinhardt
2021
Are Larger Pretrained Language Models Uniformly Better? Comparing Performance at the Instance Level
Ruiqi Zhong | Dhruba Ghosh | Dan Klein | Jacob Steinhardt
Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021