
Assessing user simulation for dialog systems using human judges and automatic evaluation measures

Ai, H and Litman, D (2011) Assessing user simulation for dialog systems using human judges and automatic evaluation measures. Natural Language Engineering, 17 (4). 511 - 540. ISSN 1351-3249


While different user simulations are built to assist dialog system development, there is an increasing need to assess the quality of these simulations quickly and reliably. Previous studies have proposed several automatic evaluation measures for this purpose, but the validity of these measures has not been fully established. We present an assessment study in which human judgments of user simulation quality are collected as the gold standard for validating the automatic evaluation measures. We show that a ranking model can be built from the automatic measures to predict the rankings of the simulations in the same order as the human judgments. We further show that this ranking model can be improved by a simple feature based on time-series analysis. © 2011 Cambridge University Press.




Item Type: Article
Status: Published
Creators: Ai, H; Litman, D (Email: dlitman@pitt.edu; Pitt Username: DLITMAN)
Centers: University Centers > Learning Research and Development Center (LRDC)
Date: 1 October 2011
Date Type: Publication
Journal or Publication Title: Natural Language Engineering
Volume: 17
Number: 4
Page Range: 511 - 540
DOI or Unique Handle: 10.1017/S1351324910000318
Schools and Programs: Dietrich School of Arts and Sciences > Computer Science
Dietrich School of Arts and Sciences > Intelligent Systems
Refereed: Yes
ISSN: 1351-3249
Date Deposited: 16 Oct 2014 17:57
Last Modified: 19 Jan 2019 15:55

