In this work we present a Language Model (LM) that accounts for the effects of speaker workload, drawing on recent findings in cognitive psychology. Under workload, speakers tend to shorten their utterances while still aiming to convey their message; hence they use more informative words. Inspired by the Perception and Action Cycle Method (PACM), we model the workload LM as a baseline LM constrained to have higher entropy. We show that the resulting LM has a power law relation to the baseline LM; i.e., a word's log-probability under workload is linearly related to its baseline log-probability. We then test for this relation in transcriptions of text messages (SMS) dictated while driving under different workload conditions. We conduct significance tests using Monte Carlo simulations, modeling the data with principal component analysis (PCA) and linear regression (LR). Based on this power law, we propose a simple algorithm for LM adaptation under workload. Experiments show encouraging perplexity improvements for the adapted LM under workload, providing empirical support for our model.
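The power law can be stated explicitly. As a sketch in our own notation (the abstract fixes no symbols; $P_b$, $P_w$, $\alpha$, and $\beta$ are assumptions here), with $P_b$ the baseline LM and $P_w$ the LM under workload:
\[
\log P_w(w) = \alpha \log P_b(w) + \beta
\quad\Longleftrightarrow\quad
P_w(w) \propto P_b(w)^{\alpha},
\]
where a slope $\alpha < 1$ yields a flatter, higher-entropy distribution that shifts relative probability mass toward rarer, more informative words, consistent with the behavior described above.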
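The adaptation algorithm itself is not spelled out here; the following is a minimal Python sketch under the assumption that adaptation amounts to exponentiating the baseline probabilities by a fitted slope alpha and renormalizing. All names (adapt_lm, perplexity, the toy vocabulary) are hypothetical illustrations, not taken from the paper.

```python
import math

def adapt_lm(baseline_probs, alpha):
    """Adapt a baseline unigram LM to workload conditions.

    Assumes the power law P_w(word) ~ P_b(word)**alpha, where the
    slope alpha (< 1 flattens the distribution, raising entropy)
    would be estimated from workload data, e.g. by linear regression
    of workload log-probabilities on baseline log-probabilities.
    """
    powered = {w: p ** alpha for w, p in baseline_probs.items()}
    z = sum(powered.values())  # renormalize so probabilities sum to 1
    return {w: p / z for w, p in powered.items()}

def perplexity(probs, words):
    """Per-word perplexity of a word sequence under a unigram LM."""
    log_prob = sum(math.log(probs[w]) for w in words)
    return math.exp(-log_prob / len(words))

# Toy usage: with alpha < 1 the adapted LM gives more relative mass
# to rare, informative words, as the model predicts under workload.
baseline = {"the": 0.5, "car": 0.3, "brake": 0.15, "swerve": 0.05}
adapted = adapt_lm(baseline, alpha=0.7)
sample = ["brake", "swerve", "car"]
print(perplexity(baseline, sample), perplexity(adapted, sample))
```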