The human brain accounts for about 20% of the body's resting oxygen consumption. Roughly half of that is spent by the sodium-potassium pumps, which must hydrolyze ATP to maintain the ionic concentration gradients across cell membranes from which the resting potential derives. This metabolic cost almost certainly constrains how neural codes are implemented.
To estimate how much energy it takes for neurons to transmit information, Laughlin et al. recorded from five blowfly photoreceptors stimulated with random light sequences delivering 10^6 effective photons per second (i.e., daylight intensity). They determined that under these conditions each photoreceptor consumes 7 x 10^9 ATP molecules per second, equivalent to 6 x 10^-5 milliliters of oxygen per minute. Given an information transmission rate of 1000 bits per second, this corresponds to 7 x 10^6 (7 million) ATP molecules consumed per bit of information.
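The per-bit figure follows directly from dividing the consumption rate by the information rate; a minimal sketch of that arithmetic, using the values reported above:

```python
# Energy cost per bit for the blowfly photoreceptor,
# using the figures from Laughlin et al. 1998.
atp_per_second = 7e9     # ATP molecules hydrolyzed per second
bits_per_second = 1000   # measured information transmission rate

atp_per_bit = atp_per_second / bits_per_second
print(f"{atp_per_bit:.0e} ATP molecules per bit")  # 7e+06
```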
For the second-order large monopolar cell (an interneuron), they calculated a lower bound of 9 x 10^5 ATP molecules consumed per second, even though this cell transmits more information (1600 bits per second) at higher reliability. The authors attribute the discrepancy to the interneuron's lower membrane conductance, due in part to the fact that it does not need the large membrane area a photoreceptor requires to capture and transduce photons.
The fact that neurons use so much energy may reflect the stochastic nature of receptor activation. Signaling depends on many noisy steps, including molecular collisions, diffusion, and vesicle release, each of which limits reliable information transfer by introducing noise. By spending more ATP than the theoretical minimum, neurons may buy useful redundancy that stabilizes their operation.
Laughlin SB, de Ruyter van Steveninck RR, Anderson JC. 1998. The metabolic cost of neural information. Nature Neuroscience 1:36-41. doi:10.1038/236.