FANTAstic SEquences and Where to Find Them: Faithful and Efficient API Call Generation through State-tracked Constrained Decoding and Reranking
Zhuoer Wang | Leonardo Ribeiro | Alexandros Papangelis | Rohan Mukherjee | Tzu-Yen Wang | Xinyan Zhao | Arijit Biswas | James Caverlee | Angeliki Metallinou
Findings of the Association for Computational Linguistics: EMNLP 2024
API call generation is the cornerstone of large language models’ tool-use ability, which provides access to the larger world. However, existing supervised and in-context learning approaches suffer from high training costs, poor data efficiency, and generated API calls that can be unfaithful to the API documentation and the user’s request. To address these limitations, we propose an output-side optimization approach called FANTASE. Two of the unique contributions of FANTASE are its State-Tracked Constrained Decoding (SCD) and Reranking components. SCD dynamically incorporates appropriate API constraints in the form of a Token Search Trie for efficient generation with guaranteed faithfulness to the API documentation. The Reranking component efficiently brings in the supervised signal by leveraging a lightweight model as a discriminator to rerank the beam-searched candidate generations of the large language model. We demonstrate the superior performance of FANTASE in API call generation accuracy, inference efficiency, and context efficiency on the DSTC8 and API Bank datasets.
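The abstract's core decoding idea, constraining generation with a token-level trie built from the API documentation, can be sketched as follows. This is an illustrative example of trie-constrained decoding in general, not the authors' implementation; the token ids, the `TokenTrie` class, and the `mask_logits` helper are all hypothetical.

```python
# Minimal sketch of trie-constrained decoding, assuming documented API strings
# have already been tokenized into token-id sequences. At each decoding step,
# only tokens that can still complete a valid sequence keep their logits.

import math
from typing import Dict, List, Sequence


class TokenTrie:
    """Prefix tree over token-id sequences of valid API strings."""

    def __init__(self) -> None:
        self.children: Dict[int, "TokenTrie"] = {}
        self.is_end = False

    def insert(self, token_ids: Sequence[int]) -> None:
        node = self
        for tok in token_ids:
            node = node.children.setdefault(tok, TokenTrie())
        node.is_end = True

    def allowed_next(self, prefix: Sequence[int]) -> List[int]:
        """Return the tokens that may follow `prefix` without leaving the trie."""
        node = self
        for tok in prefix:
            if tok not in node.children:
                return []  # prefix has already left the constrained space
            node = node.children[tok]
        return list(node.children.keys())


def mask_logits(logits: List[float], allowed: List[int]) -> List[float]:
    """Set the logit of every disallowed token to -inf before argmax/sampling."""
    masked = [-math.inf] * len(logits)
    for tok in allowed:
        masked[tok] = logits[tok]
    return masked


if __name__ == "__main__":
    # Hypothetical token-id sequences for two documented API call prefixes.
    trie = TokenTrie()
    trie.insert([7, 3, 9])  # e.g. "GetWeather("
    trie.insert([7, 3, 5])  # e.g. "GetWeatherForecast("

    prefix = [7, 3]                      # tokens generated so far
    logits = [0.1] * 10                  # toy next-token logits
    allowed = trie.allowed_next(prefix)  # -> [9, 5]
    print(mask_logits(logits, allowed))
```

In the paper's setting, the state tracking determines which trie (API names, argument names, or argument values) applies at the current decoding position; the sketch above only shows the masking step shared by such approaches.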