Bernd Kiefer

Also published as: B. Kiefer


2024

To Clarify or not to Clarify: A Comparative Analysis of Clarification Classification with Fine-Tuning, Prompt Tuning, and Prompt Engineering
Alina Leippert | Tatiana Anikina | Bernd Kiefer | Josef van Genabith
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 4: Student Research Workshop)

Misunderstandings occur all the time in human conversation, but deciding when to ask for clarification is a challenging task for conversational systems, requiring a balance between asking too many unnecessary questions and running the risk of providing incorrect information. This work investigates clarification identification based on the task and data of Xu et al. (2019), reproducing their Transformer baseline and extending it by comparing pre-trained language model fine-tuning, prompt tuning, and manual prompt engineering on the task of clarification identification. Our experiments show strong performance for prompt tuning with BERT and RoBERTa, outperforming standard LM fine-tuning, while manual prompt engineering with GPT-3.5 proved less effective, although informative prompt instructions have the potential to steer the model towards generating more accurate explanations of why clarification is needed.
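A minimal sketch of the prompt-tuning setup compared in the paper, under illustrative assumptions: a frozen BERT encoder with a small set of trainable soft-prompt embeddings prepended to the input and a linear classification head. Prompt length, checkpoint, and head are assumptions for illustration, not the paper's exact configuration.

```python
# Sketch: frozen BERT + trainable soft prompt + linear head (illustrative values).
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class SoftPromptClassifier(nn.Module):
    def __init__(self, model_name="bert-base-uncased", prompt_len=10, num_labels=2):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        for p in self.encoder.parameters():           # freeze the pre-trained LM
            p.requires_grad = False
        hidden = self.encoder.config.hidden_size
        self.prompt = nn.Parameter(torch.randn(prompt_len, hidden) * 0.02)
        self.head = nn.Linear(hidden, num_labels)     # only prompt + head are trained

    def forward(self, input_ids, attention_mask):
        tok_emb = self.encoder.get_input_embeddings()(input_ids)
        batch = input_ids.size(0)
        prompt = self.prompt.unsqueeze(0).expand(batch, -1, -1)
        embeds = torch.cat([prompt, tok_emb], dim=1)  # prepend soft prompt tokens
        prompt_mask = torch.ones(batch, self.prompt.size(0),
                                 dtype=attention_mask.dtype,
                                 device=attention_mask.device)
        mask = torch.cat([prompt_mask, attention_mask], dim=1)
        out = self.encoder(inputs_embeds=embeds, attention_mask=mask)
        return self.head(out.last_hidden_state[:, 0])  # classify from the first position

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = SoftPromptClassifier()
batch = tokenizer(["I need a flight to Berlin.", "Do you mean Berlin, Germany?"],
                  padding=True, return_tensors="pt")
logits = model(batch["input_ids"], batch["attention_mask"])  # clarify vs. no-clarify scores
```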

2019

Multi-Task Learning of System Dialogue Act Selection for Supervised Pretraining of Goal-Oriented Dialogue Policies
Sarah McLeod | Ivana Kruijff-Korbayova | Bernd Kiefer
Proceedings of the 20th Annual SIGdial Meeting on Discourse and Dialogue

This paper describes the use of Multi-Task Neural Networks (NNs) for system dialogue act selection. These models leverage the representations learned by the Natural Language Understanding (NLU) unit to enable robust initialization/bootstrapping of dialogue policies from medium-sized initial data sets. We evaluate the models on two goal-oriented dialogue corpora in the travel booking domain. Results show the proposed models improve over models trained without knowledge of NLU tasks.
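A minimal sketch of the multi-task idea, with invented layer sizes and label counts: a shared utterance encoder feeds both an auxiliary NLU head and a system dialogue-act selection head, so the policy head can benefit from representations learned on the NLU task.

```python
# Sketch: shared GRU encoder with an NLU head and a system dialogue-act head.
import torch
import torch.nn as nn

class MultiTaskPolicyNet(nn.Module):
    def __init__(self, vocab_size=5000, emb_dim=100, hidden=128,
                 n_user_acts=20, n_system_acts=30):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.encoder = nn.GRU(emb_dim, hidden, batch_first=True)   # shared encoder
        self.nlu_head = nn.Linear(hidden, n_user_acts)             # auxiliary NLU task
        self.policy_head = nn.Linear(hidden, n_system_acts)        # system act selection

    def forward(self, token_ids):
        _, h = self.encoder(self.embed(token_ids))
        rep = h.squeeze(0)                        # shared utterance representation
        return self.nlu_head(rep), self.policy_head(rep)

model = MultiTaskPolicyNet()
tokens = torch.randint(1, 5000, (4, 12))          # dummy batch of 4 encoded utterances
nlu_logits, act_logits = model(tokens)
# Training sums one cross-entropy loss per task, e.g.
#   loss = ce(nlu_logits, user_act_labels) + ce(act_logits, system_act_labels)
```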

2013

An Efficient Typed Feature Structure Index: Theory and Implementation
Bernd Kiefer | Hans-Ulrich Krieger
Proceedings of the 13th International Conference on Parsing Technologies (IWPT 2013)

2011

The ACL Anthology Searchbench
Ulrich Schäfer | Bernd Kiefer | Christian Spurk | Jörg Steffen | Rui Wang
Proceedings of the ACL-HLT 2011 System Demonstrations

2008

Some Fine Points of Hybrid Natural Language Parsing
Peter Adolphs | Stephan Oepen | Ulrich Callmeier | Berthold Crysmann | Dan Flickinger | Bernd Kiefer
Proceedings of the Sixth International Conference on Language Resources and Evaluation (LREC'08)

Large-scale grammar-based parsing systems nowadays increasingly rely on independently developed, more specialized components for pre-processing their input. However, different tools make conflicting assumptions about very basic properties such as tokenization. To make linguistic annotation gathered in pre-processing available to “deep” parsing, a hybrid NLP system needs to establish a coherent mapping between the two universes. Our basic assumption is that tokens are best described by attribute value matrices (AVMs) that may be arbitrarily complex. We propose a powerful resource-sensitive rewrite formalism, “chart mapping”, that allows us to mediate between the token descriptions delivered by shallow pre-processing components and the input expected by the grammar. We furthermore propose a novel treatment of unknown words, in which all generic lexical entries licensed by a particular token AVM are instantiated. Again, chart mapping is used to give the grammar writer full control over which items (e.g. native vs. generic lexical items) enter syntactic parsing. We discuss several further uses of the original idea and report on early experiences with the new machinery.
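A toy illustration (not the actual chart-mapping formalism) of the idea that tokens are attribute-value matrices which rewrite rules can inspect and transform before parsing: the rule below merges adjacent tokens carrying the same named-entity annotation from a shallow component into one multiword token AVM that licenses a generic lexical entry. All attribute names are invented for the example.

```python
# Toy token-AVM rewriting: merge adjacent tokens of the same named entity.
def merge_named_entity(chart):
    """Rewrite adjacent tokens carrying the same NE id into one multiword token AVM."""
    out, i = [], 0
    while i < len(chart):
        tok = chart[i]
        if (tok.get("NE-ID") is not None and i + 1 < len(chart)
                and chart[i + 1].get("NE-ID") == tok["NE-ID"]):
            nxt = chart[i + 1]
            out.append({
                "FORM": tok["FORM"] + " " + nxt["FORM"],
                "FROM": tok["FROM"], "TO": nxt["TO"],    # character span of the new token
                "CLASS": "generic-proper-name",          # licenses a generic lexical entry
                "NE-ID": tok["NE-ID"],
            })
            i += 2
        else:
            out.append(tok)
            i += 1
    return out

chart = [
    {"FORM": "New",  "FROM": 0, "TO": 3,  "NE-ID": 1},
    {"FORM": "York", "FROM": 4, "TO": 8,  "NE-ID": 1},
    {"FORM": "is",   "FROM": 9, "TO": 11, "NE-ID": None},
]
print(merge_named_entity(chart))
```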

2006

Preprocessing and Tokenisation Standards in DELPH-IN Tools
Benjamin Waldron | Ann Copestake | Ulrich Schäfer | Bernd Kiefer
Proceedings of the Fifth International Conference on Language Resources and Evaluation (LREC’06)

We discuss preprocessing and tokenisation standards within DELPH-IN, a large-scale open-source collaboration providing multiple independent multilingual shallow and deep processors. We discuss (i) a component-specific XML interface format which has been used for some time to interface preprocessor results to the PET parser, and (ii) our implementation of a more generic XML interface format influenced heavily by the (ISO working draft) Morphosyntactic Annotation Framework (MAF). Our generic format encapsulates the information which may be passed from the preprocessing stage to a parser: it uses standoff annotation, a lattice for the representation of structural ambiguity, and intra-annotation dependencies, and allows for highly structured annotation content. This work builds on the existing Heart of Gold middleware system and previous work on Robust Minimal Recursion Semantics (RMRS) as part of an inter-component interface. We give examples of usage with a number of the DELPH-IN processing components and deep grammars.
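A rough sketch, loosely modelled on a MAF-style standoff format, showing how a tokenisation ambiguity can be represented as a small lattice of standoff token annotations; element and attribute names are simplified assumptions, not the exact DELPH-IN or MAF schema.

```python
# Sketch of a MAF-style standoff token lattice for an ambiguous tokenisation.
import xml.etree.ElementTree as ET

text = "don't"
maf = ET.Element("maf", document="example.txt")
fsm = ET.SubElement(maf, "fsm", init="s0", final="s2")
for s in ("s0", "s1", "s2"):
    ET.SubElement(fsm, "state", id=s)

# Reading 1: a single token spanning the whole string.
t1 = ET.SubElement(fsm, "transition", source="s0", target="s2")
ET.SubElement(t1, "token", id="t1", **{"from": "0", "to": "5"}).text = text[0:5]

# Reading 2: two tokens ("do" + "n't"), turning the annotation into a lattice.
t2 = ET.SubElement(fsm, "transition", source="s0", target="s1")
ET.SubElement(t2, "token", id="t2", **{"from": "0", "to": "2"}).text = text[0:2]
t3 = ET.SubElement(fsm, "transition", source="s1", target="s2")
ET.SubElement(t3, "token", id="t3", **{"from": "2", "to": "5"}).text = text[2:5]

print(ET.tostring(maf, encoding="unicode"))
```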

2003

Integrated Shallow and Deep Parsing: TopP Meets HPSG
Anette Frank | Markus Becker | Berthold Crysmann | Bernd Kiefer | Ulrich Schäfer
Proceedings of the 41st Annual Meeting of the Association for Computational Linguistics

2002

A Novel Disambiguation Method for Unification-Based Grammars Using Probabilistic Context-Free Approximations
Bernd Kiefer | Hans-Ulrich Krieger | Detlef Prescher
COLING 2002: The 19th International Conference on Computational Linguistics

An Integrated Architecture for Shallow and Deep Processing
Berthold Crysmann | Anette Frank | Bernd Kiefer | Stefan Müller | Günter Neumann | Jakub Piskorski | Ulrich Schäfer | Melanie Siegel | Hans Uszkoreit | Feiyu Xu | Markus Becker | Hans-Ulrich Krieger
Proceedings of the 40th Annual Meeting of the Association for Computational Linguistics

2000

A Context-free Approximation of Head-driven Phrase Structure Grammar
Bernd Kiefer | Hans-Ulrich Krieger
Proceedings of the Sixth International Workshop on Parsing Technologies

We present a context-free approximation of unification-based grammars, such as HPSG or PATR-II. The theoretical underpinning is established through a least fixpoint construction over a certain monotonic function. In order to reach a finite fixpoint, the concrete implementation can be parameterized in several ways: either by specifying a finite iteration depth, by using different restrictors, or by making the symbols of the CFG more complex, adding annotations à la GPSG. We also present several methods that speed up the approximation process and help to limit the size of the resulting CF grammar.
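A toy illustration of the least-fixpoint construction on an invented two-feature agreement grammar (not HPSG): starting from the lexical feature structures, the grammar rules are applied to already-derivable structures, each restricted result becomes a CFG symbol, and iteration stops once no new symbols or CFG rules appear.

```python
# Toy least-fixpoint CFG approximation of a tiny agreement "grammar".
def restrict(fs):
    """Keep only the features the approximation distinguishes (here: cat and num)."""
    return (fs["cat"], fs["num"])

def s_rule(np, vp):
    """Crude stand-in for rule unification: S -> NP VP with number agreement."""
    if np["cat"] == "NP" and vp["cat"] == "VP" and np["num"] == vp["num"]:
        return {"cat": "S", "num": np["num"]}
    return None

lexicon = [{"cat": "NP", "num": "sg"}, {"cat": "NP", "num": "pl"},
           {"cat": "VP", "num": "sg"}, {"cat": "VP", "num": "pl"}]
rules = [s_rule]

derivable = {restrict(fs): fs for fs in lexicon}   # CFG symbol -> representative structure
cfg_rules = set()
changed = True
while changed:                                     # least-fixpoint iteration
    changed = False
    items = list(derivable.values())
    for rule in rules:
        for d1 in items:
            for d2 in items:                       # all rules in this toy grammar are binary
                mother = rule(d1, d2)
                if mother is None:
                    continue
                sym, rhs = restrict(mother), (restrict(d1), restrict(d2))
                if sym not in derivable or (sym, rhs) not in cfg_rules:
                    derivable.setdefault(sym, mother)
                    cfg_rules.add((sym, rhs))
                    changed = True

for lhs, rhs in sorted(cfg_rules):
    print(lhs, "->", rhs[0], rhs[1])
```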

An HPSG-to-CFG Approximation of Japanese
Bernd Kiefer | Hans-Ulrich Krieger | Melanie Siegel
COLING 2000 Volume 2: The 18th International Conference on Computational Linguistics

1999

Charting the Depths of Robust Speech Parsing
W. Kasper | B. Kiefer | H.-U. Krieger | C. J. Rupp | K. L. Worm
Proceedings of the 37th Annual Meeting of the Association for Computational Linguistics

A Bag of Useful Techniques for Efficient and Robust Parsing
Bernd Kiefer | Hans-Ulrich Krieger | John Carroll | Rob Malouf
Proceedings of the 37th Annual Meeting of the Association for Computational Linguistics

1995

Compilation of HPSG to TAG
Robert Kasper | Bernd Kiefer | Klaus Netter | K. Vijay-Shanker
33rd Annual Meeting of the Association for Computational Linguistics

1994

DISCO: An HPSG-based NLP System and its Application for Appointment Scheduling (Project Note)
Hans Uszkoreit | Rolf Backofen | Stephan Busemann | Abdel Kader Diagne | Elizabeth A. Hinkelman | Walter Kasper | Bernd Kiefer | Hans-Ulrich Krieger | Klaus Netter | Gunter Neumann | Stephan Oepen | Stephen P. Spackman
COLING 1994 Volume 1: The 15th International Conference on Computational Linguistics