Cascaded deep neural network models for dialog state tracking

G Yang, X Wang - Multimedia Tools and Applications, 2019 - Springer
Abstract
Dialog state tracking (DST) maintains and updates dialog states at each time step as the dialog progresses, so it must incorporate dialog history. Previous word-based DST models treated the historical utterances as a single word sequence and used n-grams from that sequence as model inputs, an approach that suffers from data sparseness. This paper proposes a cascaded deep neural network framework for DST that alleviates data sparseness by exploiting the hierarchical structure of dialog. The bottom layer of the cascade, implemented as a Long Short-Term Memory (LSTM) network or a Convolutional Neural Network (CNN), encodes the word sequence of each dialog turn into a sentence embedding, and the upper layer, an LSTM, integrates the turn representations step by step to produce the dialog state. The cascaded models integrate natural language understanding into DST, and the entire network is trained as a whole. Experimental results on the DSTC2 dataset indicate that the proposed models, LSTM+LSTM and CNN+LSTM, achieve better performance than existing models.
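The LSTM+LSTM cascade described above can be sketched as follows. This is an illustrative reconstruction from the abstract only, not the authors' code: the layer sizes, the embedding layer, and the linear output head are assumptions, and the DSTC2-specific slot-value output structure is simplified to a single logit vector per turn.

```python
import torch
import torch.nn as nn

class CascadedLSTM(nn.Module):
    """Sketch of the LSTM+LSTM cascade: a bottom LSTM encodes each turn's
    word sequence into a sentence embedding, and an upper LSTM integrates
    the turn embeddings into a dialog-state representation.
    All dimensions and the output head are illustrative assumptions."""

    def __init__(self, vocab_size, embed_dim=64, sent_dim=128,
                 dialog_dim=128, num_labels=10):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Bottom layer: word sequence -> sentence embedding (per turn)
        self.word_lstm = nn.LSTM(embed_dim, sent_dim, batch_first=True)
        # Upper layer: sequence of turn embeddings -> dialog state
        self.turn_lstm = nn.LSTM(sent_dim, dialog_dim, batch_first=True)
        self.head = nn.Linear(dialog_dim, num_labels)

    def forward(self, dialog):
        # dialog: (batch, turns, words) of token ids
        b, t, w = dialog.shape
        words = self.embed(dialog.view(b * t, w))   # (b*t, w, embed_dim)
        _, (h, _) = self.word_lstm(words)           # h: (1, b*t, sent_dim)
        sents = h.squeeze(0).view(b, t, -1)         # (b, t, sent_dim)
        out, _ = self.turn_lstm(sents)              # (b, t, dialog_dim)
        return self.head(out)                       # per-turn state logits

model = CascadedLSTM(vocab_size=1000)
batch = torch.randint(0, 1000, (2, 3, 5))  # 2 dialogs, 3 turns, 5 words each
logits = model(batch)
print(logits.shape)  # torch.Size([2, 3, 10])
```

Because the whole cascade is a single `nn.Module`, the two levels are trained jointly with one loss, which matches the abstract's point that the network is trained as a whole rather than with a separately trained language-understanding stage.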