%0 Conference Proceedings
%T TIE: Topological Information Enhanced Structural Reading Comprehension on Web Pages
%A Zhao, Zihan
%A Chen, Lu
%A Cao, Ruisheng
%A Xu, Hongshen
%A Chen, Xingyu
%A Yu, Kai
%Y Carpuat, Marine
%Y de Marneffe, Marie-Catherine
%Y Meza Ruiz, Ivan Vladimir
%S Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
%D 2022
%8 July
%I Association for Computational Linguistics
%C Seattle, United States
%F zhao-etal-2022-tie
%X Recently, the structural reading comprehension (SRC) task on web pages has attracted increasing research interest. Although previous SRC work has leveraged extra information such as HTML tags or XPaths, the informative topology of web pages is not effectively exploited. In this work, we propose a Topological Information Enhanced model (TIE), which transforms the token-level task into a tag-level task by introducing a two-stage process (i.e. node locating and answer refining). Based on that, TIE integrates Graph Attention Network (GAT) and Pre-trained Language Model (PLM) to leverage the topological information of both logical structures and spatial structures. Experimental results demonstrate that our model outperforms strong baselines and achieves state-of-the-art performances on the web-based SRC benchmark WebSRC at the time of writing. The code of TIE will be publicly available at https://github.com/X-LANCE/TIE.
%R 10.18653/v1/2022.naacl-main.132
%U https://aclanthology.org/2022.naacl-main.132
%U https://doi.org/10.18653/v1/2022.naacl-main.132
%P 1808-1821