Based on the concept of multiple-stream prior evolution and posterior pooling, we propose a new incremental adaptive Bayesian learning framework for efficient on-line adaptation of continuous density hidden Markov model (CDHMM) parameters. As a first step, we apply affine transformations to the mean vectors of the CDHMMs to control the evolution of their prior distribution. This new stream of prior distribution can be combined with another stream of prior distribution evolved without any constraints. In a series of comparative experiments on continuous Mandarin speech recognition, we show that the new adaptation algorithm achieves fast-adaptation performance similar to that of incremental MLLR (maximum likelihood linear regression) when the amount of adaptation data is small, while maintaining the good asymptotic convergence property of our previously proposed quasi-Bayes adaptation algorithms.
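To make the constrained-prior idea concrete, the following is a minimal sketch of the affine transformation step applied to CDHMM mean vectors, i.e. mapping each Gaussian mean mu to A mu + b with a shared transform. The function name `adapt_means` and the toy matrices `A` and `b` are illustrative assumptions, not the paper's implementation; in practice A and b would be estimated from adaptation data.

```python
import numpy as np

def adapt_means(means, A, b):
    """Apply a shared affine transform mu' = A @ mu + b to every
    Gaussian mean vector of a CDHMM.  Such transforms are used to
    constrain how the prior distribution of the means evolves."""
    return np.array([A @ mu + b for mu in means])

# Toy example: two 2-D mean vectors, identity rotation plus a bias shift.
means = np.array([[1.0, 0.0],
                  [0.0, 2.0]])
A = np.eye(2)                    # no rotation/scaling in this toy case
b = np.array([0.5, -0.5])        # global bias applied to every mean
adapted = adapt_means(means, A, b)
# adapted is [[1.5, -0.5], [0.5, 1.5]]
```

Because a single (A, b) pair is shared by all mean vectors, only a few parameters need to be estimated, which is what enables fast adaptation from small amounts of data.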