ISCA Archive Eurospeech 1997

Integration of eye fixation information with speech recognition systems

Ramesh R. Sarukkai, Craig Hunter

In this paper, a semi-tight coupling between the visual and auditory modalities is proposed: in particular, eye fixation information is used to enhance the output of speech recognition systems. This is achieved by treating natural human eye fixations as deictic references to symbolic objects and passing this information to the speech recognizer, which biases its search toward this set of symbols/words during the best-word-sequence search. As an illustrative example, the TRAINS interactive planning assistant system has been used as a test-bed; eye fixations provide important cues to the city names the user sees on the map. Experimental results indicate that eye fixations help reduce speech recognition errors. This work suggests that integrating information from different interfaces to bootstrap each other would enable the development of reliable and robust interactive multi-modal human-computer systems.
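The paper does not give an implementation, but the core idea, boosting the scores of words that correspond to recently fixated objects before picking the best hypothesis, can be sketched roughly as follows. This is a minimal illustration, not the authors' system: the function name, the fixed multiplicative boost, and the example city names are all assumptions for the sake of the sketch.

```python
import math

def bias_word_scores(word_logprobs, fixated_words, boost=math.log(3.0)):
    """Add a fixed log-probability boost to word hypotheses whose
    referents the user recently fixated on, then renormalize so the
    scores remain a proper distribution.  (Illustrative only; the
    boost value is an arbitrary assumption.)"""
    boosted = {
        w: lp + (boost if w in fixated_words else 0.0)
        for w, lp in word_logprobs.items()
    }
    # Renormalize in log space.
    total = math.log(sum(math.exp(lp) for lp in boosted.values()))
    return {w: lp - total for w, lp in boosted.items()}

# Hypothetical recognizer scores for competing city-name hypotheses.
scores = {"avon": math.log(0.4), "akron": math.log(0.35),
          "elmira": math.log(0.25)}
# Suppose the eye tracker reports a recent fixation on "elmira" on the map.
rescored = bias_word_scores(scores, fixated_words={"elmira"})
best = max(rescored, key=rescored.get)  # fixation cue flips the decision
```

In this toy case the acoustically third-ranked hypothesis wins once the fixation evidence is folded in, which mirrors the paper's "semi-tight coupling": the gaze signal does not replace the recognizer's search, it only reweights it.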


doi: 10.21437/Eurospeech.1997-468

Cite as: Sarukkai, R.R., Hunter, C. (1997) Integration of eye fixation information with speech recognition systems. Proc. 5th European Conference on Speech Communication and Technology (Eurospeech 1997), 1639-1643, doi: 10.21437/Eurospeech.1997-468

@inproceedings{sarukkai97_eurospeech,
  author={Ramesh R. Sarukkai and Craig Hunter},
  title={{Integration of eye fixation information with speech recognition systems}},
  year=1997,
  booktitle={Proc. 5th European Conference on Speech Communication and Technology (Eurospeech 1997)},
  pages={1639--1643},
  doi={10.21437/Eurospeech.1997-468},
  issn={1018-4074}
}