I have reanalyzed the data presented by Hallem and Carlson [Hallem EA, Carlson JR (2006) Cell 125(1):143-160] and shown that the combinatorial odor code supplied by the fruit fly antenna is a very simple one in which nearly all odors produce, statistically, the same neuronal response; i.e., the probability distribution of sensory neuron firing rates across the population of odorant sensory neurons is exponential for nearly all odors and odor mixtures, with the mean rate dependent on the odor concentration. Between odors, then, the response differs in which sensory neurons are firing, at what individual rates, and with what mean population rate, but not in the probability distribution of firing rates. This conclusion is independent of adjustable parameters and holds both for monomolecular odors and complex mixtures. Because the circuitry in the antennal lobe constrains the mean firing rate to be the same for all odors and concentrations, the odor code is what is known as maximum entropy.

fly | olfaction | theory | odor code

The projection neurons of the fly antennal lobe present odor information to the Kenyon cells of the mushroom body in the form of a combinatorial code: each odor is specified by a particular pattern of firing rates across the population of projection neurons. A recent paper (1), using data published in ref. 2, provided preliminary evidence that this odor code is what information theorists call maximum entropy (3). To understand what a maximum entropy code is, suppose that we record the firing rates from, say, 10 different projection neurons, each presented with, say, 10 different odors, to give a total of 100 firing rates. Now make a histogram of these firing rates. If the odor code is maximum entropy, this histogram would have nearly the same shape no matter which projection neurons and which odors were chosen. Moreover, the larger the sample of projection neurons and/or odors, the closer the shapes of the histograms would be.
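The histogram test described above can be illustrated with a small simulation. The sketch below is not the Hallem-Carlson data: it simply draws a hypothetical 10-neuron by 10-odor table of firing rates from an exponential distribution (the maximum entropy distribution for a fixed mean, as discussed below) and shows that histograms taken over the whole table, over one neuron, or over one odor all approximate the same shape. The mean rate of 20 Hz is an arbitrary choice for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
mean_rate = 20.0  # hypothetical mean firing rate (Hz); arbitrary choice

# Simulated firing rates for 10 projection neurons x 10 odors, drawn
# from an exponential distribution (a sketch, not the published data).
rates = rng.exponential(mean_rate, size=(10, 10))

# Histograms over all 100 rates, over one neuron (a row), and over
# one odor (a column), in 10 Hz bins:
bins = np.linspace(0.0, 100.0, 11)
h_all, _ = np.histogram(rates, bins=bins, density=True)
h_one_neuron, _ = np.histogram(rates[0, :], bins=bins, density=True)
h_one_odor, _ = np.histogram(rates[:, 0], bins=bins, density=True)

# Each histogram approximates the same decaying exponential shape;
# the larger the sample, the closer the approximation becomes.
```

With only 10 samples per row or column the single-neuron and single-odor histograms are noisy, which is why the text emphasizes that agreement improves as the sample grows.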
Remarkably, if only a single odor and many projection neurons, or a single projection neuron and many odors, are used to generate the rates, the histogram shape is still nearly the same.

Many different maximum entropy codes have been studied, and the type of code is defined by the shape of the histogram that results from a sample of rates. The histogram in each case is an approximation of a probability distribution of rates. For example, if the mean rate is always the same, the maximum entropy code is associated with an exponential distribution of rates, and if both the mean and variance are always the same, the associated distribution is Gaussian (3). For the fly projection neurons, the associated probability distribution of rates is proposed to be an exponential (1), always with the same mean. As pointed out earlier (1), a maximum entropy code would be advantageous to the fly because it would permit the greatest number of odors to be discriminated with the available number of odorant receptors. This characterization of the odor code used by antennal lobe proje...
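The claim that the exponential is the maximum entropy distribution for a fixed mean can be checked numerically. The sketch below compares the differential entropy of an exponential distribution against a uniform distribution on [0, 2m], both nonnegative with the same mean m; the mean of 20 Hz is again an arbitrary illustrative value, and the closed-form entropies used are standard results.

```python
import numpy as np

mean_rate = 20.0  # hypothetical mean firing rate m (Hz); any m > 0 works

# Differential entropies (in nats) of two nonnegative distributions
# sharing the same mean m:
#   exponential with mean m: 1 + ln(m)   (the maximum entropy choice)
#   uniform on [0, 2m]:      ln(2m)
h_exponential = 1.0 + np.log(mean_rate)
h_uniform = np.log(2.0 * mean_rate)

# The exponential wins for every m, since 1 > ln(2); here
# 1 + ln(20) is about 4.00 nats vs ln(40) at about 3.69 nats.
```

The same comparison holds against any other nonnegative distribution with mean m, which is the sense in which an exponential rate code, with the mean pinned by antennal lobe circuitry, carries the most information per sample.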