We are developing multimodal man-machine interfaces through which users can communicate by integrating speech, gaze, facial expressions, and gestures such as nodding and finger pointing. When communicating with computers, users may display a variety of emotions in their faces and voices. We therefore consider that, to realize more flexible and natural communication between humans and computers, computers need to know their user's emotional state: in particular, whether the user is interested or not. In this paper, we study two discrete emotions, interest and disinterest, and present work performed in our lab toward analyzing the modalities that may encode a user's interest or disinterest in conversation.