Researchers at the Salk Institute recently made an astonishing discovery while studying the brain. Their data shows that the brain’s memory capacity is in the petabyte range, roughly as much as the entire World Wide Web! If the brain can store that much information, we had better learn how to optimize this critical organ.
To understand the Salk findings, published in the journal eLife, a basic explanation of the brain is necessary. Essentially, our memories and thoughts are the result of patterns of electrical and chemical activity in the brain. An important part of this process happens when branches of neurons interact at certain junctions, called synapses. An output ‘wire’, called the axon, connects from one neuron to an input ‘wire’, called a dendrite, of a second neuron. Brain signals then travel across the synapse as chemicals called neurotransmitters, telling the receiving neuron whether to convey an electrical signal to other neurons.
The research team behind this study created a 3D reconstruction of every dendrite, axon, glial process and synapse in a small section of brain tissue, and were surprised by what they found: an incredible complexity and diversity among the synapses.
Synapses have been a cloudy area of study for some time, though their dysfunction has been connected to a range of neurological diseases. It is known that larger synapses, which have more surface area and more vesicles of neurotransmitters, are stronger, likely making them more apt to activate their neighboring neurons than smaller synapses.
While building a 3D reconstruction of rat hippocampus tissue, the researchers noticed something odd: in some cases, a single axon from one neuron formed two synapses reaching out to a single dendrite of another neuron, meaning the first neuron appeared to be sending duplicate messages to the receiving neuron. The researchers almost brushed off this finding, but one scientist, Tom Bartol, persuaded the team that they could use it to gain insight into synaptic sizes. Previously, synapses had only been classified into three sizes: small, medium and large.
The research team hypothesized that these duplicate synapses would be roughly similar in size, and were shocked to find that they were nearly identical, differing by only about 8% on average. Since the memory capacity of neurons depends on synapse size, this 8% figure turned out to be an important number, which the team plugged into their algorithmic models of the brain to measure how much information could be stored in these synaptic connections.
With this new data, the team found that there are about 26 distinct size categories of synapses in the brain. This suggests there are roughly 10 times more discrete sizes of synapses than neuroscientists had previously assumed! In computer terms, this translates to about 4.7 “bits” of information per synapse. Before this research, it was thought that each synapse was capable of storing only 1-2 bits for short- and long-term memory. The study also found that every 2 to 20 minutes, synapses move up or down to the next size, adjusting themselves according to the signals they receive.
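The jump from 26 size categories to 4.7 bits is just information theory: a synapse that can sit in any one of N distinguishable states can encode log₂(N) bits. A minimal sketch of that arithmetic (the 26 and the earlier 3-category assumption come from the article; the code itself is only an illustration of the formula):

```python
import math

def bits_per_synapse(num_states: int) -> float:
    """Information capacity, in bits, of a unit with num_states
    distinguishable states: log2(num_states)."""
    return math.log2(num_states)

# 26 discrete synapse sizes, as reported in the Salk study
print(round(bits_per_synapse(26), 1))  # about 4.7 bits

# the old three-size assumption (small / medium / large)
print(round(bits_per_synapse(3), 1))   # about 1.6 bits, i.e. the 1-2 bit range
```

This also shows why the capacity estimate grew so dramatically: capacity scales with the logarithm of the number of distinguishable states, so ten times more size categories yields a few extra bits per synapse, multiplied across trillions of synapses.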
This study opens up new possibilities in research on learning and memory mechanisms, provides a logical explanation for the brain’s surprising efficiency, and could even help computer scientists build ultra-precise yet energy-efficient computers, much like the brain itself!
Now, if the brain can store that much information, we had better learn to optimize it! How can you boost your memory? Try the Bright Bundle, which consists of Piracetam and Centrophenoxine. Both of these supplements are reputed to optimize memory, improve focus and concentration, and help prevent mental decline.