Compositional Generalization in Spoken Language Understanding

Abstract

State-of-the-art spoken language understanding (SLU) models have shown tremendous success on benchmark SLU datasets, yet they still fail in many practical scenarios due to a lack of compositionality when trained on limited data. In this paper, we study two types of compositionality: novel slot combination and length generalization. We first conduct an in-depth analysis and find that state-of-the-art SLU models often learn spurious slot correlations during training, which leads to poor performance in both compositional cases. To mitigate these limitations, we create the first compositional splits of benchmark SLU datasets and propose the first compositional SLU model, including a compositional loss and paired training that tackle each compositional case respectively. On both the benchmark and compositional splits of ATIS and SNIPS, we show that our compositional SLU model significantly outperforms a state-of-the-art BERT SLU model.
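To illustrate the idea of a compositional split by novel slot combination, here is a minimal sketch (not the paper's released code; the function name, data shape, and hold-out strategy are all assumptions): it partitions utterances so that every slot-type combination appearing in the test set never occurs in training.

```python
def compositional_split(examples, test_fraction=0.2):
    """Partition (utterance, slot_types) pairs so that the set of slot
    types of every test example is unseen in the training set.

    examples: list of (utterance, slot_types) pairs, where slot_types is
    the set of slot labels annotated in the utterance.
    """
    # Collect all distinct slot-type combinations, in a deterministic order.
    combos = sorted({frozenset(st) for _, st in examples}, key=sorted)
    # Hold out a fraction of combinations (at least one) for the test side.
    n_test = max(1, int(len(combos) * test_fraction))
    test_combos = set(combos[-n_test:])
    # Bucket each example by whether its combination was held out.
    train = [ex for ex in examples if frozenset(ex[1]) not in test_combos]
    test = [ex for ex in examples if frozenset(ex[1]) in test_combos]
    return train, test
```

A split like this leaves no overlap in slot combinations between the two sides, so a model can only succeed on the test set by composing slots it has seen individually, rather than by memorizing slot co-occurrences.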

Authors: Yilin Shen, Hongxia Jin

Published: Annual Conference of the International Speech Communication Association (INTERSPEECH)

Date: Aug 24, 2023