ISCA Archive Interspeech 2014

Distributed learning of multilingual DNN feature extractors using GPUs

Yajie Miao, Hao Zhang, Florian Metze

Multilingual deep neural networks (DNNs) can act as deep feature extractors and have been applied successfully to cross-language acoustic modeling. Learning these feature extractors becomes an expensive task, because of the enlarged multilingual training data and the sequential nature of stochastic gradient descent (SGD). This paper investigates strategies to accelerate the learning process over multiple GPU cards. We propose the DistModel and DistLang frameworks, which distribute feature extractor learning by models and by languages, respectively. The time-synchronous DistModel has the nice property of tolerating infrequent model averaging. With 3 GPUs, DistModel achieves a 2.6× speed-up with no loss in word error rate. When using DistLang, we observe better acceleration but worse recognition performance. Further evaluations are conducted to scale DistModel to more languages and GPU cards.
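The time-synchronous model-averaging idea behind DistModel can be illustrated with a toy sketch: each of K workers holds a replica of the parameters, runs local SGD on its own data shard, and replicas are averaged at a fixed synchronization interval. All names, the quadratic toy loss, and the single-process simulation below are illustrative assumptions, not the paper's actual multi-GPU implementation.

```python
import numpy as np

def local_sgd_step(w, batch, lr=0.1):
    # Toy gradient for the quadratic loss ||w - mean(batch)||^2.
    grad = 2.0 * (w - batch.mean(axis=0))
    return w - lr * grad

def distmodel_train(shards, w0, sync_every=10, steps=100):
    """Simulate DistModel-style training: one replica per data shard,
    with replicas averaged every `sync_every` mini-batches."""
    replicas = [w0.copy() for _ in shards]
    rng = np.random.default_rng(0)
    for t in range(1, steps + 1):
        # Each worker takes one local SGD step on its own shard.
        for k, shard in enumerate(shards):
            batch = shard[rng.integers(0, len(shard), size=8)]
            replicas[k] = local_sgd_step(replicas[k], batch)
        # Time-synchronous averaging point (infrequent if sync_every is large).
        if t % sync_every == 0:
            avg = np.mean(replicas, axis=0)
            replicas = [avg.copy() for _ in shards]
    return np.mean(replicas, axis=0)

# Three hypothetical shards with different means, standing in for
# per-GPU partitions of the multilingual training data.
shards = [np.random.default_rng(k).normal(k, 1.0, size=(256, 4)) for k in range(3)]
w = distmodel_train(shards, w0=np.zeros(4))
```

Because averaging happens only every `sync_every` steps, communication cost can be traded against how far replicas are allowed to drift apart, which is the tolerance to infrequent averaging that the abstract highlights.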

doi: 10.21437/Interspeech.2014-211

Cite as: Miao, Y., Zhang, H., Metze, F. (2014) Distributed learning of multilingual DNN feature extractors using GPUs. Proc. Interspeech 2014, 830-834, doi: 10.21437/Interspeech.2014-211

@inproceedings{miao14_interspeech,
  author={Yajie Miao and Hao Zhang and Florian Metze},
  title={{Distributed learning of multilingual DNN feature extractors using GPUs}},
  year={2014},
  booktitle={Proc. Interspeech 2014},
  pages={830--834},
  doi={10.21437/Interspeech.2014-211}
}