ISCA Archive Interspeech 2023

Improving Zero-shot Cross-domain Slot Filling via Transformer-based Slot Semantics Fusion

Yuhang Li, Xiao Wei, Yuke Si, Longbiao Wang, Xiaobao Wang, Jianwu Dang

Slot filling is an essential component of task-oriented dialogue systems. Due to the scarcity of annotated data, zero-shot slot filling has been studied to transfer knowledge from source domains to a target domain. Previous methods adopt slot descriptions or questions as slot semantics, either utilizing slot descriptions to compute similarity scores or reformulating the task as a question-answering problem. However, these methods do not fully exploit the token-level dependencies between the slot semantics and utterances. In this study, we propose a Transformer-based Slot semantics fusion method for Slot Filling (TSSF). We first adopt two encoders with shared weights to obtain the representations of utterances and slot semantics. Then, we design a transformer-based fusion module that effectively integrates slot semantics into utterances. Experimental results on the public SNIPS benchmark show that our model significantly outperforms the state-of-the-art model by 6.09% in terms of slot F1.
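The abstract describes a two-stage architecture: a shared-weight encoder produces token-level representations of the utterance and the slot semantics, and a transformer-based fusion module integrates the two at the token level. The paper does not give implementation details here, so the following is only a minimal PyTorch sketch of that general idea, with cross-attention standing in for the fusion module; all module names, dimensions, and layer counts are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class SlotSemanticsFusion(nn.Module):
    """Illustrative fusion block (assumption, not the paper's exact module):
    utterance tokens attend to slot-semantic tokens via cross-attention,
    capturing token-level dependencies between the two sequences."""

    def __init__(self, d_model: int = 256, n_heads: int = 4):
        super().__init__()
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.ffn = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.ReLU(),
            nn.Linear(4 * d_model, d_model),
        )
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, utt: torch.Tensor, slot: torch.Tensor) -> torch.Tensor:
        # Each utterance token queries the slot-semantics tokens.
        fused, _ = self.cross_attn(query=utt, key=slot, value=slot)
        h = self.norm1(utt + fused)          # residual + norm
        return self.norm2(h + self.ffn(h))   # position-wise feed-forward

# "Two encoders with shared weights" can be realized by reusing the
# same encoder module for both input sequences.
layer = nn.TransformerEncoderLayer(256, 4, batch_first=True)
encoder = nn.TransformerEncoder(layer, num_layers=2)
fusion = SlotSemanticsFusion()

utt = encoder(torch.randn(1, 12, 256))    # utterance token representations
slot = encoder(torch.randn(1, 5, 256))    # slot-semantics token representations
out = fusion(utt, slot)                   # slot-aware utterance representations
print(out.shape)
```

The fused output keeps the utterance's sequence length, so a standard token-level classification head (e.g. BIO tagging) can be applied on top, which is the usual final step in slot-filling models.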