In media technology, comparing one data sequence with another is a fundamental technique in many applications, and dynamic time warping (DTW) is often used for this purpose. Conventionally, two sequences of raw features were compared; more recently, more abstract representations obtained with deep neural networks have been used. One such representation is the posteriorgram, in which each frame consists of n-dimensional class posteriors, and frame-to-frame distance is often calculated with a divergence metric such as the Bhattacharyya distance. In this study, we propose a novel method that distills the class-to-class distances encoded in any classifier used to compute the posteriors and uses these distances to accelerate posteriorgram DTW by approximating it as DTW between two sequences of most likely classes. The proposed method is much faster and even more stable, and it removes the need to calculate frame-to-frame distances on the fly during DTW.
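The core idea above can be sketched in code. The following is a minimal illustration, not the paper's implementation: each posteriorgram frame is reduced to its most likely class, and frame-pair costs become lookups in a small precomputed class-to-class distance table instead of per-pair divergence computations during DTW. The prototype posteriors used to build the table are an assumption for illustration; the paper distills the class-to-class distances from the classifier itself.

```python
import numpy as np

def bhattacharyya(p, q):
    # Bhattacharyya distance between two discrete distributions.
    return -np.log(np.sum(np.sqrt(p * q)) + 1e-12)

def dtw(cost):
    # Classic DTW accumulation over a precomputed frame-pair cost matrix.
    n, m = cost.shape
    acc = np.full((n + 1, m + 1), np.inf)
    acc[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            acc[i, j] = cost[i - 1, j - 1] + min(
                acc[i - 1, j], acc[i, j - 1], acc[i - 1, j - 1])
    return acc[n, m]

# Hypothetical data: two posteriorgrams over K = 4 classes.
rng = np.random.default_rng(0)
K = 4
A = rng.dirichlet(np.ones(K), size=30)   # 30 frames
B = rng.dirichlet(np.ones(K), size=40)   # 40 frames

# Baseline: frame-to-frame divergences computed on the fly (O(n*m) divergences).
cost_full = np.array([[bhattacharyya(a, b) for b in B] for a in A])
d_full = dtw(cost_full)

# Approximation in the spirit of the proposal: precompute a K x K
# class-to-class distance table once. Here it is built from smoothed
# one-hot prototype posteriors (an illustrative assumption).
eps = 0.05
prototypes = np.full((K, K), eps / (K - 1))
np.fill_diagonal(prototypes, 1.0 - eps)
table = np.array([[bhattacharyya(p, q) for q in prototypes] for p in prototypes])

# Reduce each frame to its most likely class; costs become table lookups.
ca = A.argmax(axis=1)
cb = B.argmax(axis=1)
cost_lut = table[np.ix_(ca, cb)]
d_lut = dtw(cost_lut)
```

The table is symmetric with a (near-)zero diagonal, so matched most-likely classes contribute almost no cost, while mismatches contribute a fixed class-pair penalty; only K*(K-1)/2 divergences are ever evaluated, regardless of sequence length.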