In this paper, we present a database of emotional children's speech in a human-robot scenario: the children gave instructions to Sony's pet robot dog AIBO, with AIBO showing both obedient and disobedient behaviour. In such a scenario, a specific type of partner-centered interaction can be observed. We aimed at finding prosodic correlates of children's emotional speech and were interested in which speech registers children use when talking to AIBO. For interpretation, we left the weighting and categorization of prosodic features to a statistical classifier. The parameters found to be most important were word duration, average energy, variation in pitch and energy, and harmonics-to-noise ratio. The data moreover suggest that the children used a register resembling mostly child-directed and pet-directed speech and, to some extent, computer-directed speech.