Zero-Shot Intent Classification Using a Semantic Similarity Aware Contrastive Loss and Large Language Model

Abstract

Zero-shot systems can reduce the cost of data collection and training in a new domain, since they can be applied directly to test data without further training. In this paper, we build zero-shot intent classification systems based on a Semantic Similarity-aware Contrastive Loss (SSCL), which addresses a shortcoming of the original contrastive loss (CL): it treats all non-corresponding pairs indiscriminately. We confirm through experiments that SSCL outperforms CL. We then explore how including in-domain text or speech data during SSCL training affects out-of-domain intent classification. During zero-shot classification, embeddings are generated for the set of classes in the new domain; the similarity between each class embedding and an input utterance embedding is computed, and the most similar class is predicted as the utterance's intent. Although manually collected text sentences per class can be used to generate the class embeddings, such data collection can be costly. We therefore explore how to generate better class embeddings without human-collected text data in the target domain. The best proposed method, which employs an instruction-tuned Llama2, a public large language model, performs comparably to the case where human-collected text data is used, implying the importance of accurate class embedding generation.
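The similarity-based prediction step described in the abstract can be sketched as follows. This is a minimal illustration only: the toy bag-of-words `embed` function and the fixed `VOCAB` are hypothetical stand-ins for the paper's SSCL-trained neural encoder, and the example class descriptions are invented; in the paper, class descriptions can be human-written or generated by an instruction-tuned Llama2.

```python
import numpy as np

# Toy fixed vocabulary (hypothetical); the paper instead uses a neural
# text/speech encoder trained with the semantic-similarity-aware
# contrastive loss (SSCL).
VOCAB = ["play", "music", "song", "set", "alarm", "wake", "weather", "rain"]

def embed(text: str) -> np.ndarray:
    """Stand-in embedder: L2-normalized bag-of-words over VOCAB."""
    words = text.lower().split()
    vec = np.array([float(words.count(w)) for w in VOCAB])
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

def class_embedding(sentences: list) -> np.ndarray:
    """Average the embeddings of a few description sentences per class."""
    mean = np.mean([embed(s) for s in sentences], axis=0)
    norm = np.linalg.norm(mean)
    return mean / norm if norm > 0 else mean

def classify_zero_shot(utterance: str, class_embeddings: dict) -> str:
    """Predict the class whose embedding is most similar (cosine)
    to the utterance embedding."""
    u = embed(utterance)
    scores = {label: float(u @ c) for label, c in class_embeddings.items()}
    return max(scores, key=scores.get)

# Example class descriptions (invented for illustration).
classes = {
    "play_music": class_embedding(["play some music", "play a song for me"]),
    "set_alarm": class_embedding(["set an alarm", "wake me up at seven"]),
}
print(classify_zero_shot("please play a song", classes))  # -> play_music
```

Because the classifier only compares embeddings, adding a new domain reduces to supplying new class embeddings; no retraining is needed, which is the zero-shot property the paper exploits.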

Authors: Jaejin Cho, Rakshith Sharma Srinivasa, Ching-Hua Lee, Yashas Malur Saidutta, Chouchang Yang, Yilin Shen, Hongxia Jin

Published: ICASSP 2024 - 2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)

Date: Apr 14, 2024