Lecheng Ruan
2024
AutoDSL: Automated domain-specific language design for structural representation of procedures with constraints
Yu-Zhe Shi | Haofei Hou | Zhangqian Bi | Fanxu Meng | Xiang Wei | Lecheng Ruan | Qining Wang
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Accurate representation of procedures in restricted scenarios, such as non-standardized scientific experiments, requires precise depiction of constraints. Unfortunately, a Domain-specific Language (DSL), while an effective tool for expressing constraints structurally, often requires case-by-case hand-crafting, necessitating customized, labor-intensive efforts. To overcome this challenge, we introduce the AutoDSL framework to automate DSL-based constraint design across various domains. Utilizing domain-specified experimental protocol corpora, AutoDSL optimizes syntactic constraints and abstracts semantic constraints. Quantitative and qualitative analyses of the DSLs designed by AutoDSL across five distinct domains highlight its potential as an auxiliary module for language models, aiming to improve procedural planning and execution.
2023
PersLEARN: Research Training through the Lens of Perspective Cultivation
Yu-Zhe Shi | Shiqian Li | Xinyi Niu | Qiao Xu | Jiawen Liu | Yifan Xu | Shiyu Gu | Bingru He | Xinyang Li | Xinyu Zhao | Zijian Zhao | Yidong Lyu | Zhen Li | Sijia Liu | Lin Qiu | Jinhao Ji | Lecheng Ruan | Yuxi Ma | Wenjuan Han | Yixin Zhu
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 3: System Demonstrations)
Scientific research is inherently shaped by its authors’ perspectives, influenced by various factors such as their personality, community, or society. Junior researchers often face challenges in identifying the perspectives reflected in the existing literature and struggle to develop their own viewpoints. In response to this issue, we introduce PersLEARN, a tool designed to facilitate the cultivation of scientific perspectives, starting from a basic seed idea and progressing to a well-articulated framework. By interacting with a prompt-based model, researchers can develop their perspectives explicitly. Our human study reveals that scientific perspectives developed by students using PersLEARN exhibit a superior level of logical coherence and depth compared to those developed without it. Furthermore, our pipeline outperforms baseline approaches across multiple domains of literature from various perspectives. These results suggest that PersLEARN could help foster a greater appreciation of diversity in scientific perspectives as an essential component of research training.