ISCA Archive Interspeech 2005

The hidden vector state language model

Vidura Seneviratne, Steve Young

The Hidden Vector State (HVS) model extends the basic Hidden Markov Model (HMM) by encoding each state as a vector of stack states with restricted stack operations. The model uses a right-branching stack automaton to assign valid stochastic parses to a word sequence, from which the language model probability can be estimated. The model is completely data-driven and is able to learn classes from the data that reflect the hierarchical structures found in natural language. This paper describes the design and implementation of the HVS language model [1], focusing on the practical issues of initialisation and training using Baum-Welch re-estimation whilst accommodating a large and dynamic state space. Results of experiments conducted using the ATIS corpus [2] show that the HVS language model reduces test set perplexity compared to standard class-based language models.
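To make the state representation concrete, the following is a minimal sketch (not the authors' implementation) of the core HVS idea: each state is a bounded stack of category labels, and transitions are restricted to "pop zero or more labels, then push one new label", which is what keeps the right-branching automaton tractable. The labels (`SS`, `CITY`, `AIRLINE`) and the depth bound `MAX_DEPTH` are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch of Hidden Vector State transitions: a state is a
# bounded tuple (stack) of labels, and each word transition pops n
# labels (0 <= n <= depth) then pushes exactly one new label.
from typing import List, Tuple

MAX_DEPTH = 4  # assumed bound on stack depth, for illustration only

def successor_states(state: Tuple[str, ...], new_label: str) -> List[Tuple[str, ...]]:
    """Enumerate vector states reachable from `state` by popping
    0..len(state) labels and then pushing `new_label`."""
    successors = []
    for n_pop in range(len(state) + 1):
        base = state[:len(state) - n_pop]        # stack after popping n_pop labels
        if len(base) + 1 <= MAX_DEPTH:           # respect the depth bound
            successors.append(base + (new_label,))
    return successors

# Example: from the state (SS, CITY), pushing AIRLINE after popping 0, 1
# or 2 labels yields three candidate successor states.
print(successor_states(("SS", "CITY"), "AIRLINE"))
# → [('SS', 'CITY', 'AIRLINE'), ('SS', 'AIRLINE'), ('AIRLINE',)]
```

In a full model each such stack operation carries a probability estimated by Baum-Welch re-estimation, and the language model probability of a word sequence sums over all valid parse (state-sequence) paths.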

doi: 10.21437/Interspeech.2005-5

Cite as: Seneviratne, V., Young, S. (2005) The hidden vector state language model. Proc. Interspeech 2005, 9-12, doi: 10.21437/Interspeech.2005-5

@inproceedings{seneviratne05_interspeech,
  author={Vidura Seneviratne and Steve Young},
  title={{The hidden vector state language model}},
  booktitle={Proc. Interspeech 2005},
  year={2005},
  pages={9--12},
  doi={10.21437/Interspeech.2005-5}
}