ISCA Archive Interspeech 2024

Quantifying Unintended Memorization in BEST-RQ ASR Encoders

Virat Shejwalkar, Om Thakkar, Arun Narayanan

Self-supervised ASR encoders are increasingly being adopted in real-world applications as they enable downstream ASR tasks with impressive performance. This raises concerns about the privacy of the data used to train such encoders, especially since neural networks are known to unintentionally memorize rare or unique samples from their training data. To this end, we perform the first systematic audit of unintended memorization in ASR encoders. Specifically, we focus on a state-of-the-art Conformer-based ASR encoder pre-trained using the BEST-RQ technique, which forms the foundation of many real-world ASR applications. We propose a novel auditing method that successfully demonstrates such memorization in ASR encoders, even for samples occurring just once in their training data. Finally, we show the promise of pre-training with per-sample gradient clipping for mitigating such memorization in ASR encoders without significantly impacting downstream model quality.
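The mitigation mentioned above, per-sample gradient clipping, bounds each individual training sample's influence on a model update by rescaling any per-sample gradient whose L2 norm exceeds a fixed threshold before averaging. A minimal NumPy sketch of that idea (the function name, clip threshold, and toy gradients are illustrative assumptions, not details from the paper):

```python
import numpy as np

def clip_per_sample_gradients(grads, clip_norm):
    """Rescale each per-sample gradient to an L2 norm of at most
    clip_norm, then average the clipped gradients into one update."""
    clipped = []
    for g in grads:
        norm = np.linalg.norm(g)
        # Scale down only gradients whose norm exceeds the threshold.
        scale = min(1.0, clip_norm / (norm + 1e-12))
        clipped.append(g * scale)
    return np.mean(clipped, axis=0)

# Toy example: three per-sample gradients, one with an outsized norm.
grads = [
    np.array([3.0, 4.0]),  # norm 5.0 -> rescaled to norm 1.0
    np.array([0.6, 0.8]),  # norm 1.0 -> effectively unchanged
    np.array([0.0, 0.5]),  # norm 0.5 -> unchanged
]
update = clip_per_sample_gradients(grads, clip_norm=1.0)  # -> [0.4, 0.7]
```

Because no single sample can contribute a gradient of norm larger than the clip threshold, rare or unique utterances cannot dominate an update, which is the intuition behind using this during pre-training to curb memorization.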