ISCA Archive Interspeech 2015

Latent words recurrent neural network language models

Ryo Masumura, Taichi Asami, Takanobu Oba, Hirokazu Masataki, Sumitaka Sakauchi, Akinori Ito

This paper proposes a novel language modeling approach called the latent words recurrent neural network language model (LWRNNLM), which addresses shortcomings of both recurrent neural network language models (RNNLMs) and latent words language models (LWLMs). The proposed model has a soft class structure based on a latent variable space, as in LWLMs, but the latent variable space is modeled with an RNNLM. From the viewpoint of RNNLMs, the proposed model can be regarded as a soft-class RNNLM with a vast latent variable space. From the viewpoint of LWLMs, it can be regarded as an LWLM that uses an RNN structure, rather than an n-gram structure, for latent variable modeling. This paper also details the parameter inference method and two ways of using the model in natural language processing tasks. Our experiments demonstrate the effectiveness of the proposed model in a perplexity evaluation on the Penn Treebank corpus and an automatic speech recognition evaluation on Japanese spontaneous speech tasks.
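To make the generative structure concrete, the following is a minimal sketch of the kind of model the abstract describes: a latent word sequence generated by a simple recurrent network, with each observed word emitted conditionally on the current latent word. All names, dimensions, and the plain Elman-style RNN with a fixed emission table are illustrative assumptions, not the authors' implementation or inference procedure.

```python
# Hypothetical sketch of an LWRNNLM-style generative story:
#   P(w_1..T) marginalizes over latent words h_1..T, with
#   P(h_t | h_<t) given by an RNN and P(w_t | h_t) a per-latent-word
#   categorical emission. Parameters are random, for illustration only.
import numpy as np

rng = np.random.default_rng(0)
V, L, H = 50, 20, 16            # observed vocab, latent vocab, hidden size

E = rng.normal(0, 0.1, (L, H))  # latent-word embeddings
W = rng.normal(0, 0.1, (H, H))  # recurrent weights
U = rng.normal(0, 0.1, (H, L))  # projection to latent-word logits
emit = rng.dirichlet(np.ones(V), size=L)  # P(w | h), one row per latent word

def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()

def sample_sentence(T=8):
    """Draw a (latent, observed) word sequence from the generative model."""
    state = np.zeros(H)
    latents, words = [], []
    h = 0                                        # initial latent word
    for _ in range(T):
        state = np.tanh(E[h] + state @ W)        # RNN runs over latent words
        h = rng.choice(L, p=softmax(state @ U))  # next latent word
        w = rng.choice(V, p=emit[h])             # observed word given latent
        latents.append(h)
        words.append(w)
    return latents, words

print(sample_sentence())
```

The soft class interpretation is visible in the emission step: each latent word acts as a class from which many observed surface words can be generated, while the RNN, unlike the n-gram transition in a standard LWLM, conditions each latent word on the full latent history.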