The draconian regulation required by the inherent instability of cognitive phenomena -- gene expression, immune function, cancer suppression, wound healing, animal consciousness, machine intelligence, network stabilization, institutional cognition, and their many and varied composites -- can be viewed through the lens of the asymptotic limit theorems of information and control theories. Here, we explore the dynamics, and the sometimes highly punctuated failures, of the regulation of cognition under increasing `noise'. The approach parallels, and indeed generalizes, the Data Rate Theorem of control theory, extending that theorem's requirement of a minimum channel capacity for the stabilization of an inherently unstable system. Various models are explored across the different underlying probability distributions characteristic of the system under study, and across different hierarchical scales, finding that the addition of adaptive -- learned -- regulation greatly extends the reach of innate -- e.g., AI-driven or otherwise pre-programmed -- regulation. This work points toward the construction of new statistical tools for the analysis of observational and empirical data across the broad spectrum of inherent pathologies and adversarial challenges afflicting cognitive processes of all kinds.
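As a minimal sketch of the Data Rate Theorem invoked above (an illustration, not the paper's own formalism): for a scalar unstable plant x_{t+1} = a x_t + u_t with |a| > 1, feedback over a channel of R bits per step can keep the state bounded only if R > log2|a|. The quantized-feedback scheme below is a standard textbook construction chosen here for illustration; the function names and parameters are hypothetical.

```python
# Hypothetical illustration of the Data Rate Theorem for a scalar plant
# x_{t+1} = a*x_t + u_t with |a| > 1: an R-bit channel shrinks the
# controller's uncertainty about x by a factor 2^R per step, while the
# dynamics expand it by |a|; the state stays bounded iff 2^R > |a|,
# i.e., iff R > log2|a|.

import math

def stabilizable(a: float, rate_bits: float) -> bool:
    """Data Rate Theorem condition for a scalar plant with pole a."""
    return rate_bits > math.log2(abs(a))

def simulate(a: float, rate_bits: int, steps: int = 50, x0: float = 1.0) -> float:
    """Quantized feedback: the controller knows x lies in [-L, L]; each
    step it receives the index of the 2^R-cell containing x and applies
    u = -a * (cell midpoint), cancelling its best estimate.  Returns the
    final uncertainty radius L."""
    L = abs(x0)          # current uncertainty radius
    x = x0
    for _ in range(steps):
        cells = 2 ** rate_bits
        width = 2 * L / cells
        idx = min(int((x + L) // width), cells - 1)
        midpoint = -L + (idx + 0.5) * width
        x = a * (x - midpoint)   # residual error, amplified by the dynamics
        L = abs(a) * width / 2   # uncertainty: grows by |a|, shrinks by 2^R
    return L
```

For a = 2 the threshold is 1 bit per step: `simulate(2.0, 1)` leaves the uncertainty radius constant, while `simulate(2.0, 3)` drives it toward zero, matching `stabilizable`.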