Intrinsic computation refers to how dynamical systems store, structure, and transform historical and spatial information. By graphing a measure of structural complexity against a measure of randomness, complexity-entropy diagrams display the range and different kinds of intrinsic computation across an entire class of systems. Here we use complexity-entropy diagrams to analyze intrinsic computation in a broad array of deterministic nonlinear and linear stochastic processes, including maps of the interval, cellular automata and Ising spin systems in one and two dimensions, Markov chains, and probabilistic minimal finite-state machines. Since complexity-entropy diagrams are a function only of observed configurations, they can be used to compare systems without reference to system coordinates or parameters. It has been known for some time that, in special cases, complexity-entropy diagrams reveal that high degrees of information processing are associated with phase transitions in the underlying process space, the so-called "edge of chaos". Generally, though, complexity-entropy diagrams differ substantially in character, demonstrating a genuine diversity of distinct kinds of intrinsic computation.

Discovering organization in the natural world is one of science's central goals. Recent innovations in nonlinear mathematics and physics, in concert with analyses of how dynamical systems store and process information, have produced a growing body of results on quantitative ways to measure natural organization. These efforts grew out of earlier investigations into the origins of randomness. Eventually, however, it was realized that measures of randomness do not capture the property of organization. This realization led to recent efforts to develop measures that are, on the one hand, as generally applicable as the randomness measures but that, on the other, capture a system's complexity: its organization, structure, memory, regularity, symmetry, and pattern.
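To make the randomness-complexity pairing concrete, the following is a minimal sketch (not from this work) of how a single point on a complexity-entropy diagram might be computed for a symmetric two-state Markov chain: the entropy rate serves as the randomness coordinate and the Shannon entropy of the stationary state distribution as the complexity coordinate. The transition parameter `p` and the identification of the chain's states with its causal states are illustrative assumptions.

```python
import numpy as np

# Two-state Markov chain; rows of T sum to 1.
# p is an illustrative parameter, not a value from the text.
p = 0.3
T = np.array([[1 - p, p],
              [p, 1 - p]])

# Stationary distribution: left eigenvector of T with eigenvalue 1.
evals, evecs = np.linalg.eig(T.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

# Randomness coordinate: entropy rate h (bits per symbol),
# h = -sum_i pi_i sum_j T_ij log2 T_ij.
h = -sum(pi[i] * T[i, j] * np.log2(T[i, j])
         for i in range(2) for j in range(2) if T[i, j] > 0)

# Complexity coordinate: Shannon entropy of the state distribution
# (for this chain the Markov states are assumed to be the causal states).
C = -sum(pi[i] * np.log2(pi[i]) for i in range(2) if pi[i] > 0)

print(h, C)  # one (randomness, complexity) point on the diagram
```

Sweeping `p` over (0, 1) would trace out a curve of such points, which is the kind of object a complexity-entropy diagram collects for an entire class of processes.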
Here, analyzing processes from dynamical systems, statistical mechanics, stochastic processes, and automata theory, we show that measures of structural complexity are a necessary and useful complement to describing natural systems only in terms of their randomness. The result is a broad appreciation of the kinds of information processing embedded in nonlinear systems. This, in turn,