
Res-2: Energy-efficient synaptic plasticity

The brain makes up only about 2% of body mass yet accounts for roughly 20% of the body's resting metabolism, so energy efficiency is likely an important driving force in shaping neural design. In addition to the well-known costs of spiking and synaptic transmission, synaptic plasticity is itself metabolically very demanding. Experimental research suggests that the late phase of changing synaptic strength between neurons, late long-term potentiation (l-LTP), requires particularly large amounts of energy. We ask how synaptic plasticity rules can be made more energy efficient while remaining powerful. To this end we implement a perceptron and quantify how much metabolic energy is needed to successfully learn a binary classification problem, defining the metabolic energy as the sum of the absolute changes in the synaptic weights. We find that, compared to the theoretical minimum, the classic perceptron learning rule is highly inefficient. However, efficiency can be boosted by postponing consolidation: for example, batch learning, a well-known method from machine learning in which many patterns are presented and tested before the synapses are updated, is four times more energy efficient than online learning. We determine an energy-saving scheme that optimizes energy efficiency with respect to task performance. Finally, we find that these results carry over to multi-layer networks trained with back-propagation. Energy considerations thus suggest a functional role for synaptic consolidation and for the distinction between early and late LTP.
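To make the energy accounting concrete, the following Python snippet is a minimal illustrative sketch (not the authors' implementation): it trains a perceptron on a synthetic, linearly separable task and counts metabolic energy as the accumulated sum of absolute synaptic weight changes, once with online updates and once with one consolidation per epoch (batch learning). The network size, learning rate, stopping criterion, and data generation are assumptions chosen for illustration only.

import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_patterns, lr = 100, 50, 0.1  # illustrative sizes and learning rate

# Random +/-1 patterns; a random "teacher" perceptron guarantees linear separability.
X = rng.choice([-1.0, 1.0], size=(n_patterns, n_inputs))
labels = np.sign(X @ rng.standard_normal(n_inputs))

def train(batch, max_epochs=500):
    """Train a perceptron; return the metabolic energy sum(|delta w|) spent."""
    w = np.zeros(n_inputs)
    energy = 0.0
    for _ in range(max_epochs):
        errors = 0
        dw_accum = np.zeros(n_inputs)
        for x, t in zip(X, labels):
            if np.sign(x @ w) != t:      # classic perceptron rule: update only on errors
                errors += 1
                dw = lr * t * x
                if batch:
                    dw_accum += dw       # postpone consolidation
                else:
                    w += dw              # consolidate immediately (online learning)
                    energy += np.abs(dw).sum()
        if batch and errors:
            w += dw_accum                # single consolidation per epoch
            energy += np.abs(dw_accum).sum()
        if errors == 0:                  # all patterns classified correctly
            break
    return energy

print("online energy:", train(batch=False))
print("batch  energy:", train(batch=True))

Intuitively, postponing consolidation lets opposing per-pattern updates partially cancel before they are committed to the synapses, so the net absolute weight change, and hence the metabolic cost under this definition, is smaller.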