A coarse-grained representation of neuronal network dynamics is developed in terms of kinetic equations, which are derived by a moment closure, directly from the original large-scale integrate-and-fire (I&F) network. This powerful kinetic theory captures the full dynamic range of neuronal networks, from the mean-driven limit (a limit, such as the number of neurons N → ∞, in which the fluctuations vanish) to the fluctuation-dominated limit (such as in small-N networks). Comparison with full numerical simulations of the original I&F network establishes that the reduced dynamics is very accurate and numerically efficient over all dynamic ranges. Both analytical insights and scale-up of numerical representation can be achieved by this kinetic approach. Here, the theory is illustrated by a study of the dynamical properties of networks of various architectures, including excitatory and inhibitory neurons of both simple and complex type, which exhibit rich dynamic phenomena, such as transitions to bistability and hysteresis, even in the presence of large fluctuations. Possible connections between the structure of these bifurcations and the behavior of complex cells are discussed. Finally, I&F networks and kinetic theory are used to discuss the orientation selectivity of complex cells for "ring-model" architectures, which characterize how the responses of neurons change from near "orientation pinwheel centers" to far from them.

Neuronal networks, whether real cortical networks (1, 2) or computer models (3, 4), frequently operate in a regime in which spiking is caused by irregular temporal fluctuations of the membrane potential. At this "cortical operating point," the mean membrane potential (e.g., obtained by averaging over many voltage traces under the same stimulus condition, or by averaging locally in time) does not reach firing threshold. Thus, the spiking process is fluctuation-driven.

A theoretical challenge is to construct efficient and effective representations of such fluctuation-driven networks, which are needed both to "scale up" computational models to large enough regions of the cortex to capture interesting cortical processing (such as optical illusions related to "contour completion") and to gain qualitative understanding of the cortical mechanisms underlying this level of cortical processing. In this article, we develop such a construction: Starting with large-scale model networks of integrate-and-fire (I&F) neurons, which are sufficiently detailed for modeling neuronal computation of large systems but are difficult to scale up, we tile the cortex with coarse-grained (CG) patches. Each CG patch is sufficiently small that the cortical architecture does not change systematically across it, yet it is sufficiently large to contain many (hundreds of) neurons. We then derive an effective dynamics to capture the statistical behavior of the many neurons within each CG patch in their interaction with other CG patches. This representation is achieved by a kinetic theory, accomplished by ...
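To make the starting point of this construction concrete, the following is a minimal sketch of the kind of fluctuation-driven I&F population that a single CG patch is meant to summarize: conductance-based leaky integrate-and-fire neurons driven by external Poisson spikes, with the mean membrane potential held below threshold so that spikes arise from fluctuations. The parameter values, the all-to-all recurrent coupling, and the forward-Euler integration are illustrative assumptions, not specifications taken from the text.

```python
import numpy as np

# Illustrative parameters (assumed for this sketch, not taken from the paper)
N = 100             # neurons in one coarse-grained patch
dt = 1e-4           # time step (s)
T = 1.0             # simulated time (s)
tau_m = 20e-3       # membrane time constant (s)
V_rest, V_thr, V_reset = 0.0, 1.0, 0.0   # normalized rest/threshold/reset
V_E = 14.0 / 3.0    # excitatory reversal potential (normalized units)
tau_g = 5e-3        # synaptic conductance decay time (s)
nu_ext = 800.0      # external Poisson rate per neuron (Hz)
f_ext = 0.05        # conductance jump per external spike
S_EE = 0.02 / N     # recurrent excitatory coupling strength

rng = np.random.default_rng(0)
V = rng.uniform(V_reset, V_thr, N)   # membrane potentials
g = np.zeros(N)                      # excitatory conductances (units of leak)
spike_counts = np.zeros(N)

for _ in range(int(T / dt)):
    # External Poisson drive: each arriving spike kicks the conductance up.
    g += f_ext * rng.poisson(nu_ext * dt, N)

    # Conductance decay and leaky, conductance-based integration (forward Euler).
    g += dt * (-g / tau_g)
    V += dt * (-(V - V_rest) - g * (V - V_E)) / tau_m

    # Threshold crossing: record spike, reset, and deliver recurrent excitation.
    fired = V >= V_thr
    n_fired = int(fired.sum())
    if n_fired:
        spike_counts[fired] += 1
        V[fired] = V_reset
        g += S_EE * n_fired   # all-to-all recurrent kick (illustrative)

print("mean firing rate (Hz):", spike_counts.mean() / T)
```

With these assumed parameters, the time-averaged membrane potential sits below threshold and firing is driven by conductance fluctuations; population-averaged quantities of exactly this kind (firing rate, distribution of membrane potentials within the patch) are what the kinetic equations derived below are intended to reproduce without tracking individual neurons.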