ISCA Archive IberSPEECH 2024

#neural2speech: Decoding Speech and Language from the Human Brain

Xabier de Zuazo, Vincenzo Verbeni, Li-Chuan Ku, Ekain Arrieta, Ander Barrena, Anastasia Klimovich-Gray, Ibon Saratxaga, Eva Navas, Eneko Agirre, Nicola Molinaro

Can speech be decoded from brain activity? The #neural2speech project will leverage breakthroughs in cognitive neuroscience and natural language processing to address this question by means of robust neural decoders. Specifically, brain-to-speech decoders will be designed to reconstruct both perceived and produced speech from non-invasive brain recordings, namely functional magnetic resonance imaging (fMRI) and magnetoencephalography (MEG) data. By integrating deep learning techniques and large language models, #neural2speech seeks not only to deepen our understanding of language processing in the human brain, with a particular focus on multilingual processing, but also to pave the way for innovative communication aids that can help individuals affected by speech impairments. The potential applications are vast, promising to revolutionize clinical neuroscience and human-computer interaction.
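
As a purely illustrative sketch, and not the project's actual pipeline, the snippet below shows one simple way a baseline brain-to-speech decoder can be framed: a ridge regression mapping MEG-like sensor features to speech embeddings, scored by per-dimension correlation on held-out data. All data, dimensions, and the choice of ridge regression are assumptions made here for illustration.

    # Illustrative sketch only: a linear baseline decoder from simulated
    # brain-recording features to speech representations. Shapes, names, and
    # the ridge-regression choice are assumptions, not the project's method.
    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_samples, n_sensors, n_speech_dims = 1000, 306, 128  # 306 ~ typical MEG sensor count (assumed)

    # Synthetic stand-ins for brain recordings and time-aligned speech embeddings.
    brain_features = rng.standard_normal((n_samples, n_sensors))
    true_mapping = 0.1 * rng.standard_normal((n_sensors, n_speech_dims))
    speech_embeddings = brain_features @ true_mapping + 0.5 * rng.standard_normal((n_samples, n_speech_dims))

    X_train, X_test, y_train, y_test = train_test_split(
        brain_features, speech_embeddings, test_size=0.2, random_state=0
    )

    # Ridge regression as a simple multi-output brain-to-speech decoder.
    decoder = Ridge(alpha=1.0)
    decoder.fit(X_train, y_train)

    # Evaluate with Pearson correlation per speech dimension, a common decoding metric.
    pred = decoder.predict(X_test)
    corrs = [np.corrcoef(pred[:, d], y_test[:, d])[0, 1] for d in range(n_speech_dims)]
    print(f"mean decoding correlation: {np.mean(corrs):.3f}")

In practice, such linear baselines are typically replaced or augmented with deep networks and representations from pretrained speech or language models, as the abstract indicates; the sketch only fixes the input-to-output framing of the decoding task.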