ISCA Archive Interspeech 2023

Head movements in two- and four-person interactive conversational tasks in noisy and moderately reverberant conditions

Alan Archer-Boyd, Rainer Martin

Multi-modal processing schemes are of increasing importance for adaptive hearing devices. However, more data are required to understand interactions in complex application scenarios. In this study, the speech and head movements of eight normal-hearing participants were recorded during two- and four-person interactive conversational tasks, with and without 4-talker babble noise at 75 dB(A) and at reverberation times of 0.25 s and 0.6 s. Two-person conversations showed a head-movement (yaw) interquartile range of 11.6°, while four-person conversations showed a statistically significantly larger interquartile range of 21.9°. No effect of acoustic condition was observed. The recorded data were also used successfully to test a previously published hearing-device direction-of-arrival estimation algorithm that combines head-movement information with the correlation lag between the acoustic signals at the left and right ears.
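The abstract does not specify the details of the direction-of-arrival algorithm, but the core idea it names, estimating azimuth from the correlation lag between the left- and right-ear signals, can be illustrated with a minimal sketch. The function below is hypothetical (not the authors' implementation): it finds the interaural lag via cross-correlation, converts it to an interaural time difference, and maps that to an azimuth under a simple far-field model with an assumed ear spacing of 0.18 m.

```python
import numpy as np

def estimate_doa(left, right, fs, ear_distance=0.18, c=343.0):
    """Hypothetical sketch: estimate source azimuth (degrees, positive = left)
    from the cross-correlation lag between left- and right-ear signals.

    Uses the far-field relation sin(theta) = c * ITD / d, where the ITD is
    the arrival-time difference between the ears. Not the paper's algorithm.
    """
    corr = np.correlate(left, right, mode="full")
    # With numpy's convention, the peak index maps to lag = index - (len(right) - 1);
    # a negative lag here means the left-ear signal leads the right-ear signal.
    lag = int(np.argmax(corr)) - (len(right) - 1)
    itd = -lag / fs  # positive ITD: sound reaches the left ear first
    sin_theta = np.clip(c * itd / ear_distance, -1.0, 1.0)
    return float(np.degrees(np.arcsin(sin_theta)))

# Synthetic demo: a broadband source reaching the left ear 4 samples earlier.
rng = np.random.default_rng(0)
fs = 16000
sig = rng.standard_normal(4096)
delay = 4
left = sig[delay:]    # left ear hears each sample 4 samples sooner
right = sig[:-delay]
theta = estimate_doa(left, right, fs)  # roughly 28 degrees to the left
```

In practice, head movement helps resolve the front-back ambiguity inherent in a single ITD estimate, which is presumably why the tested algorithm combines the correlation lag with head-movement information.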