A recent model of intrinsic plasticity coupled to Hebbian synaptic plasticity proposes that adapting a neuron's threshold and gain in a sigmoidal response function to achieve a sparse, exponential output firing rate distribution facilitates the discovery of heavy-tailed or super-Gaussian sources in the neuron's inputs. We show that the exponential output distribution is irrelevant to these dynamics and, furthermore, that while sparseness is sufficient, it is not necessary. The intrinsic plasticity mechanism drives the neuron's threshold large and positive, and we prove that in such a regime the neuron will find super-Gaussian sources; equally, however, if the threshold is large and negative (an "anti-sparse" regime), it will also find super-Gaussian sources. Away from such extremes, the neuron can also discover sub-Gaussian sources. By examining a neuron with a fixed sigmoidal non-linearity and considering the synaptic strength fixed point structure in the two-dimensional parameter space defined by the neuron's threshold and gain, we show that this space is carved up into sub- and super-Gaussian-input-finding regimes, possibly with regimes of simultaneous stability of sub- and super-Gaussian sources or regimes of instability of all sources; a single Gaussian source may also be stabilised by the presence of a non-Gaussian source. A neuron's "operating point" (essentially its threshold and gain coupled with its input statistics) therefore critically determines its computational repertoire. Intrinsic plasticity mechanisms induce trajectories in this parameter space but do not fundamentally modify it. Unless the trajectories cross critical boundaries in this space, intrinsic plasticity is irrelevant and the neuron's non-linearity may be frozen with identical receptive field refinement dynamics.
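As a concrete illustration of the setting described above, the following is a minimal sketch (not the paper's implementation; the source distributions, the sigmoid parameterisation, the learning rate, and the normalisation scheme are all assumptions) of a sigmoidal neuron with a fixed threshold and gain, trained with a plain Hebbian rule on inputs containing one super-Gaussian (Laplacian) source and one sub-Gaussian (uniform) source:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(u, gain, theta):
    """Sigmoidal response with gain and threshold (assumed parameterisation)."""
    return 1.0 / (1.0 + np.exp(-gain * (u - theta)))

# Two independent, unit-variance sources: one super-Gaussian (Laplacian,
# excess kurtosis > 0) and one sub-Gaussian (uniform, excess kurtosis < 0).
n_steps = 20000
s_super = rng.laplace(scale=1.0 / np.sqrt(2.0), size=n_steps)
s_sub = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=n_steps)
X = np.column_stack([s_super, s_sub])

gain, theta = 1.0, 2.0   # a "large, positive threshold" operating point
eta = 1e-3               # learning rate (assumption)

w = rng.normal(size=2)
w /= np.linalg.norm(w)

for x in X:
    y = sigmoid(w @ x, gain, theta)   # output through the frozen non-linearity
    w += eta * y * x                  # plain Hebbian update
    w /= np.linalg.norm(w)            # multiplicative normalisation, |w| = 1

print("final weight vector:", w)
```

Repeating the same update at different (gain, theta) operating points, and comparing which component of w comes to dominate, gives a numerical feel for the sub- and super-Gaussian-finding regimes that the fixed point analysis maps out.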