Non-negative Matrix Factorisation (NMF) has been applied successfully to learn the meaning of a small set of vocal commands without any prior knowledge of the language. This kind of learning is useful when flexibility in the acoustic and language models is required, for example in assistive technologies for dysarthric speakers, whose speech does not comply with common models. Vocal commands are grounded by adding semantic labels that represent the action corresponding to each command. The Kullback-Leibler divergence (KLD) is used to evaluate the acoustic model. The KLD is optimal for Poisson-distributed data, making it an appropriate metric for the acoustic features, which are counts of acoustic events. The semantic labels, however, are mutually exclusive activations, so a multinomial likelihood function seems more appropriate for them. In this paper, a cost function for the semantic model based on the multinomial likelihood is proposed, aiming to better match the distribution of the labels. To minimise the proposed cost function, a new set of update rules and a new normalisation scheme are also proposed.
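To make the contrast between the two costs concrete, the sketch below writes them out in standard NMF notation, assuming a non-negative data matrix approximated as $V \approx WH$, with the acoustic rows scored by the generalised KLD and a label matrix $L$ scored by a multinomial negative log-likelihood over column-normalised reconstructions; the symbols, the split of the data into acoustic and label parts, and the exact form of the multinomial cost are illustrative assumptions rather than the paper's definitions.

\[
D_{\mathrm{KL}}\!\left(V \,\middle\|\, WH\right)
  = \sum_{i,j} \left( V_{ij} \log \frac{V_{ij}}{(WH)_{ij}} - V_{ij} + (WH)_{ij} \right),
\]
\[
C_{\mathrm{mult}}\!\left(L \,\middle\|\, WH\right)
  = - \sum_{j} \sum_{i} L_{ij} \log \frac{(WH)_{ij}}{\sum_{k} (WH)_{kj}},
\]

where the first expression is the generalised KLD that is optimal for Poisson-distributed counts, and the second treats each column of $L$ as a draw from a multinomial over mutually exclusive labels, with the column sums of $WH$ providing the normalisation.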