Decoding inner speech from brain activity holds transformative promise for brain-computer interfaces and speech rehabilitation. This study examines EEG-based neural patterns across speech production, imagination, and perception. Signal-level visualizations reveal distinct, phase-specific cortical signatures that emerge within precise temporal windows in spatially localized regions. These insights motivate the introduction of a non-activity (NA) state to account for resting and transitional EEG periods, akin to pauses in natural speech. Capturing and excluding these "cognitive pauses" denoises the EEG signal, sharpening the focus on task-relevant neural activity. Comparative evaluation across continuous and isolated speech datasets demonstrates that NA modeling improves syllabic recognition accuracy by 1.8% and 1.15%, respectively. These results underscore the role and generalizability of cognitive pauses in imagined-speech EEG decoding, enabling more robust brain-computer interfaces.