Cortical networks exhibit complex stimulus-response patterns. Previous work has identified the balance between excitatory and inhibitory currents as a central component of cortical computation, but has not considered how the required synaptic connectivity emerges from biologically plausible plasticity rules. Using theory and modeling, we demonstrate how a wide range of cortical response properties can arise from Hebbian learning that is stabilized by synapse-type-specific competition for synaptic resources. In fully plastic recurrent circuits, this competition enables the development and decorrelation of inhibition-balanced receptive fields. Networks develop an assembly structure with stronger connections between similarly tuned neurons and exhibit response normalization and surround suppression. These results demonstrate how neurons can self-organize into functional circuits and provide a foundational understanding of plasticity in recurrent networks.
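To make the core mechanism concrete, the following is a minimal sketch (not the model from this work) of Hebbian potentiation stabilized by synapse-type-specific competition: excitatory and inhibitory weights onto a single rectified-linear neuron each grow with correlated pre- and postsynaptic activity, and a multiplicative normalization holds the summed weight of each synapse type at a fixed resource budget. The learning rate, Poisson inputs, and budget values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_exc, n_inh = 8, 2                      # presynaptic excitatory / inhibitory inputs
w_exc = rng.uniform(0.1, 1.0, n_exc)     # excitatory weights (positive)
w_inh = rng.uniform(0.1, 1.0, n_inh)     # inhibitory weight magnitudes (positive)
W_E, W_I = w_exc.sum(), w_inh.sum()      # fixed "resource" budget per synapse type

eta = 0.05                               # illustrative learning rate
for _ in range(500):
    x_exc = rng.poisson(2.0, n_exc).astype(float)   # presynaptic rates
    x_inh = rng.poisson(2.0, n_inh).astype(float)
    y = max(w_exc @ x_exc - w_inh @ x_inh, 0.0)     # rectified postsynaptic rate

    # Hebbian potentiation: growth proportional to pre * post activity
    w_exc += eta * x_exc * y
    w_inh += eta * x_inh * y

    # synapse-type-specific competition: multiplicative normalization
    # returns each type's summed weight to its fixed budget, so synapses
    # of the same type compete while the two types are rescaled separately
    w_exc *= W_E / w_exc.sum()
    w_inh *= W_I / w_inh.sum()
```

Because normalization is applied per type, an excitatory synapse that strengthens does so at the expense of other excitatory synapses only, leaving the overall excitatory-inhibitory balance of the total input intact.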