Martin Braschler


2020

Database Search vs. Information Retrieval: A Novel Method for Studying Natural Language Querying of Semi-Structured Data
Stefanie Nadig | Martin Braschler | Kurt Stockinger
Proceedings of the Twelfth Language Resources and Evaluation Conference

The traditional approach to querying a relational database is via a formal language, namely SQL. Recent developments in the design of natural language interfaces to databases show promising results for querying either with keywords or with full natural language queries and thus render relational databases more accessible to non-tech-savvy users. Such enhanced relational databases essentially adopt a search paradigm that is commonly used in the field of information retrieval. However, the way systems are evaluated in the database and information retrieval communities often differs due to a lack of common benchmarks. In this paper, we provide an adapted benchmark data set that is based on a test collection originally used to evaluate information retrieval systems. The data set contains 45 information needs developed on the Internet Movie Database (IMDb), including corresponding relevance assessments. By mapping this benchmark data set to a relational database schema, we enable a novel way of directly comparing database search techniques with information retrieval. To demonstrate the feasibility of our approach, we present an experimental evaluation that compares SODA, a keyword-enabled relational database system, against the Terrier information retrieval system, and thus lays the foundation for a future discussion of evaluating database systems that support natural language interfaces.
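To illustrate the contrast the abstract draws between formal SQL querying and keyword-style search over the same data, the following is a minimal, self-contained sketch in Python. The IMDb-style table and column names, the sample rows, and the keyword string are hypothetical illustrations only; they do not reproduce the paper's actual schema mapping, the SODA query syntax, or the benchmark's 45 information needs.

```python
import sqlite3

# Hypothetical IMDb-style schema; the benchmark's real schema mapping
# described in the paper may differ.
SCHEMA = """
CREATE TABLE movies (id INTEGER PRIMARY KEY, title TEXT, year INTEGER);
CREATE TABLE genres (movie_id INTEGER, genre TEXT);
"""

# Formal SQL: the user must know the schema, the join path, and the query language.
SQL_QUERY = """
SELECT m.title
FROM movies m
JOIN genres g ON g.movie_id = m.id
WHERE g.genre = 'Sci-Fi' AND m.year >= 1990;
"""

# Keyword-style query: closer to what a keyword-enabled database system or an
# IR engine accepts; interpreting joins and filters is left to the system.
KEYWORD_QUERY = "sci-fi movies since 1990"

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript(SCHEMA)
    conn.executemany("INSERT INTO movies VALUES (?, ?, ?)",
                     [(1, "The Matrix", 1999), (2, "Casablanca", 1942)])
    conn.executemany("INSERT INTO genres VALUES (?, ?)",
                     [(1, "Sci-Fi"), (2, "Drama")])
    print("SQL result:", [row[0] for row in conn.execute(SQL_QUERY)])
    print("Keyword query handed to a search system:", KEYWORD_QUERY)
```

The SQL version requires explicit knowledge of the schema, whereas the keyword version leaves that interpretation to the search system; measuring how well systems bridge that gap on a shared collection is what the adapted benchmark enables.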

2018

Overcoming the Long Tail Problem: A Case Study on CO2-Footprint Estimation of Recipes using Information Retrieval
Melanie Geiger | Martin Braschler
Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018)

2008

From Research to Application in Multilingual Information Access: the Contribution of Evaluation
Carol Peters | Martin Braschler | Giorgio Di Nunzio | Nicola Ferro | Julio Gonzalo | Mark Sanderson
Proceedings of the Sixth International Conference on Language Resources and Evaluation (LREC’08)

The importance of evaluation in promoting research and development in the information retrieval and natural language processing domains has long been recognised, but is this sufficient? In many areas there is still a considerable gap between the results achieved by the research community and their implementation in commercial applications. This is particularly true for the cross-language or multilingual retrieval areas. Despite the strong demand for and interest in multilingual IR functionality, there are still very few operational systems on offer. The Cross Language Evaluation Forum (CLEF) is now taking steps aimed at changing this situation. The paper provides a critical assessment of the main results achieved by CLEF so far and discusses plans now underway to extend its activities in order to have a more direct impact on the application sector.

2004

The Future of Evaluation for Cross-Language Information Retrieval Systems
Carol Peters | Martin Braschler | Khalid Choukri | Julio Gonzalo | Michael Kluck
Proceedings of the Fourth International Conference on Language Resources and Evaluation (LREC’04)

The objective of the Cross-Language Evaluation Forum (CLEF) is to promote research in the multilingual information access domain. In this short paper, we list the achievements of CLEF during its first four years of activity and describe how the range of tasks has been considerably expanded during this period. The aim of the paper is to demonstrate the importance of evaluation initiatives for system research and development, and to show how essential it is for such initiatives to keep abreast of, and even anticipate, the emerging needs of both system developers and application communities if they are to have a future.

2002

The Importance of Evaluation for Cross-Language System Development: the CLEF Experience
Carol Peters | Martin Braschler
Proceedings of the Third International Conference on Language Resources and Evaluation (LREC’02)

2000

The Evaluation of Systems for Cross-language Information Retrieval
Martin Braschler | Donna Harman | Michael Hess | Michael Kluck | Carol Peters | Peter Schäuble
Proceedings of the Second International Conference on Language Resources and Evaluation (LREC’00)