It is widely believed that sensory and motor processing in the brain is based on simple computational primitives rooted in cellular and synaptic physiology. However, many gaps remain in our understanding of the connections between neural computations and biophysical properties of neurons. Here, we show that synaptic spike-time-dependent plasticity (STDP) combined with spike-frequency adaptation (SFA) in a single neuron together approximate the well-known perceptron learning rule. Our calculations and integrate-and-fire simulations reveal that delayed inputs to a neuron endowed with STDP and SFA precisely instruct neural responses to earlier arriving inputs. We demonstrate this mechanism on a developmental example of auditory map formation guided by visual inputs, as observed in the external nucleus of the inferior colliculus (ICX) of barn owls. The interplay of SFA and STDP in model ICX neurons precisely transfers the tuning curve from the visual modality onto the auditory modality, demonstrating a useful computation for multimodal and sensory-guided processing.

delta learning rule | Hebbian learning | sensory fusion | synaptic potentiation | supervised

Many of the sensory and motor tasks solved by the brain can be captured in simple equations or minimization criteria. For example, minimization of errors made during reconstruction of natural images using sparse priors leads to linear filters reminiscent of simple cells (1, 2), minimization of retinal slip or visual error leads to emergence and maintenance of neural integrator networks (3-5), and optimality criteria derived from information theory can model the remapping dynamics of receptive fields in the barn owl midbrain (6). Despite these advances, little is known about cellular physiological properties that could serve as primitives for solving such computational tasks.
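For concreteness, the perceptron (delta) learning rule referred to in the abstract can be sketched in a few lines: weights change in proportion to the error between a desired and an actual binary response. The toy data, learning rate, and parameter names below are purely illustrative, not the model used in this study.

```python
import numpy as np

def perceptron_update(w, x, target, eta=0.1):
    # Delta rule: adjust weights in proportion to the error
    # between the desired and the actual binary response.
    output = 1.0 if np.dot(w, x) > 0 else 0.0
    return w + eta * (target - output) * x

# Hypothetical toy data: two linearly separable input patterns.
samples = [(np.array([1.0, 0.2]), 1.0),
           (np.array([0.1, 1.0]), 0.0)]
w = np.zeros(2)
for _ in range(20):
    for x, t in samples:
        w = perceptron_update(w, x, t)
```

Because the update vanishes once the output matches the target, the weights stop changing as soon as both patterns are classified correctly.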
Among the known primitives are short-term synaptic depression, which can give rise to multiplicative gain control (7), and spike-frequency adaptation (SFA), which may provide high-pass filtering of sensory inputs (8, 9).

Here, we explore biophysical mechanisms and computational primitives for instructive coding. Instructive coding is a computation that allows the brain to constrain its sensory representations adaptively by exploiting intrinsic properties of the physical world. The example we consider here is that sound sources and salient visual stimuli often co-localize (e.g., when a dried branch cracks under the footstep of an animal). In the barn owl, a highly efficient predator, this auditory-visual co-localization is well reflected by the registration of auditory and visual maps in the external nucleus of the inferior colliculus (ICX) and the optic tectum (OT). The instructive aspect of this registration is that it is actively maintained by plasticity mechanisms: when the visual field of owls is chronically shifted by prisms, neurons in ICX and OT develop a shift in their auditory receptive fields that corresponds to the visual field displacement (10, 11). Hence, visual inputs to these areas are ...
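The high-pass filtering attributed to SFA above can be illustrated with a minimal adaptive integrate-and-fire neuron: each spike increments an adaptation current that subtracts from the drive and decays slowly, so a sustained input evokes dense spiking at onset and sparser spiking later. All parameter values below are generic textbook choices for illustration, not fits to ICX neurons.

```python
import numpy as np

def lif_with_sfa(I, dt=1e-3, tau_m=20e-3, v_th=1.0,
                 tau_a=200e-3, b=0.2):
    # Leaky integrate-and-fire with spike-frequency adaptation:
    # each spike increments the adaptation variable `a`, which
    # opposes the input drive and decays with time constant tau_a.
    v, a = 0.0, 0.0
    spikes = []
    for step, drive in enumerate(I):
        v += dt / tau_m * (-v + drive - a)
        a += dt / tau_a * (-a)
        if v >= v_th:
            v = 0.0          # reset membrane potential
            a += b           # strengthen adaptation after a spike
            spikes.append(step * dt)
    return spikes

# One second of constant suprathreshold drive: the firing rate
# is high at stimulus onset and lower once adaptation builds up.
I = np.full(1000, 2.0)
spikes = lif_with_sfa(I)
early = sum(s < 0.5 for s in spikes)   # spikes in first half
late = len(spikes) - early             # spikes in second half
```

The onset-heavy spike count is the high-pass signature: transients are transmitted strongly, while sustained components are attenuated.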