Brains are composed of networks of an enormous number of neurons interconnected by synapses. Neural information is carried by electrical signals within neurons and chemical signals among neurons. Generating these electrical and chemical signals is metabolically expensive. The fundamental issue raised here is whether brains have evolved ways of developing an energy-efficient neural code from the molecular level to the circuit level. Here, we summarize the factors and biophysical mechanisms that could contribute to an energy-efficient neural code for processing input signals. These factors include ion channel kinetics, body temperature, axonal propagation of action potentials, low-probability release of synaptic neurotransmitters, optimal input and noise, the size of neurons and neuronal clusters, excitation/inhibition balance, coding strategy, cortical wiring, and the organization of functional connectivity. Both experimental and computational evidence suggests that neural systems may use these factors to maximize the efficiency of energy consumption in processing neural signals. Studies indicate that efficient energy utilization may be universal in neuronal systems as an evolutionary consequence of the pressure of limited energy. As a result, neuronal connections may be wired in a highly economical manner to lower both energy and space costs. Individual neurons within a network may encode independent stimulus components so that a minimal number of neurons can represent whole stimulus characteristics efficiently. This basic principle may fundamentally change our view of how billions of neurons organize themselves into complex circuits to operate and generate the most powerful intelligent cognition in nature. © 2017 Wiley Periodicals, Inc.
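As a concrete illustration of the efficiency argument summarized above (a minimal sketch, not taken from the review itself), the following Python snippet computes the information a binary model neuron transmits per unit of metabolic cost, assuming a fixed maintenance cost plus a per-spike cost; the parameter values and function names are illustrative assumptions. Under these assumptions, efficiency peaks at a firing probability well below the entropy-maximizing value of 0.5, consistent with the sparse, energy-efficient coding strategies discussed in this review.

import numpy as np

def binary_entropy(p):
    """Shannon entropy (bits) of a Bernoulli(p) spike/no-spike signal."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def efficiency(p, rest_cost=0.1, spike_cost=1.0):
    """Bits transmitted per unit metabolic cost (illustrative cost model)."""
    expected_cost = rest_cost + spike_cost * p   # cost per coding symbol
    return binary_entropy(p) / expected_cost

probs = np.linspace(0.001, 0.999, 999)
best = probs[np.argmax(efficiency(probs))]
print(f"Efficiency peaks at firing probability ~{best:.2f}, well below p = 0.5")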
SIGNIFICANCE

The morphology and physiology of our brains have been shaped by selective pressures through evolution to improve fitness for survival. The massive information-processing capacity of the brain, along with its tremendous metabolic energy consumption, requires the brain to be energy efficient, which could be achieved through trade-offs between the benefits accrued and the costs incurred by particular morphological and physiological parameters. Eventually, our brains evolved to be highly energy efficient in neural computation and cognition. Understanding the mechanisms underlying the energy-efficient operation of the brain will not only help us better understand its computational and organizational design principles but will also have wide implications for the development of the next generation of energy-efficient artificial intelligence devices. This review presents examples, from the gating of ion channels to whole-brain functional connectivity, of how an energy-efficient, nature-made computer arises from the optimized design of neural systems.