
This time, the sensational-sounding headlines are accurate: The human brain can store up to 10 times more information than previously thought, according to one team of neuroscientists. (That capacity, Inverse points out, is large enough to hold the entire World of Warcraft; for non-gamers, that's enough memory for 4.7 billion books). The machines inside our skulls, it appears, run far more efficiently than the ones on our desks. And this revelation not only enhances our understanding of the brain, but could also contribute to the development of bigger, better digital devices and, potentially, pave the way to artificially intelligent machines.

The Big Idea

Brains are plastic, which means external experiences — learning, exercise, playing music — can alter neural wiring to enable the storage of more information. Researchers have long tried to calculate the storage capacity of the three-pound organ that runs our lives. The key to understanding brain power lies in figuring out the strength of its synapses, those junctions that connect neurons and regulate the flow of information between them. A highly interconnected web of neurons makes for a strong brain.

“The type of information being stored in the brain consists of not just facts that we consciously recall at will, but also information related to how our brain controls our body and perceives the world through our senses,” said study author Tom Bartol, a scientist at the Salk Institute. “These abilities in particular are the very substance of our human intelligence and require vast amounts of learned information to be put into action. I suspect that this is where much of the storage capacity is put to use.”

Bartol, along with researchers from MIT, the University of California San Diego and the University of Texas at Austin, sought to recalculate brain storage capacity based on a 3D reconstruction of a chunk of rat brain. Specifically, they looked at the rodent's hippocampus, a region integral to memory formation, and published their findings in the journal eLife this month.

This knowledge could help us build better artificially intelligent machines in the future.

They devised a more precise method of measuring synaptic strength, and landed on a number that dwarfed previous estimates: 4.7 bits of information per synapse. And the human brain has trillions of synapses.

The recalculated estimate puts human memory around a petabyte, comparable to that of the internet. The number, Bartol said, is nothing short of a bombshell. "We expected to perhaps confirm or refine the previous estimates and were caught off guard by such a massive leap."
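To get a feel for the arithmetic, here is a rough back-of-envelope sketch in Python. The 4.7 bits per synapse is the study's figure; the total synapse count is our assumption, since published estimates for the human brain range from roughly 10^14 to 10^15.

```python
# Back-of-envelope memory estimate: bits per synapse times synapse count.
# 4.7 bits/synapse is the study's figure; the synapse counts are assumed,
# spanning the range of commonly published estimates (~1e14 to ~1e15).
BITS_PER_SYNAPSE = 4.7

for synapse_count in (1e14, 1e15):
    total_bits = BITS_PER_SYNAPSE * synapse_count
    total_petabytes = total_bits / 8 / 1e15   # bits -> bytes -> petabytes
    print(f"{synapse_count:.0e} synapses ~ {total_petabytes:.2f} PB")
```

At the upper end of that synapse-count range, the total lands around a petabyte, which is the ballpark the team reported.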

Confirmation awaits — the team needs to repeat the experiment on a larger scale and in other parts of the brain. If they replicate the findings, it "likely means the whole brain can store 10 times more information,” said Bartol, who added that the research “gives important clues about how the brain is so much more energy-efficient than digital computers.”

In other words, Bartol said the brain could become a blueprint for upgraded computers, hastening the journey to artificial intelligence.


Bit by Bit

Scientists measure the storage capacity of both brains and computers using the same unit of information: a bit — short for "binary digit." But the brain-as-computer framework offers an imperfect analogy. Mental and man-made machines store information in entirely different ways. Here’s how Bartol explained it:

“In a modern computer, each piece of specific information is uniquely localized to a specific memory address. This is very unlike how information is stored in the brain. Our current understanding of how things are learned and retrieved in the brain tells us that even very specific information is widely distributed across many neurons and synapses. This makes it more durable and harder to disrupt and easier to rebuild if disruption does occur.”
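A toy sketch makes the contrast concrete (this is our own illustration of redundancy, not a model of actual neural coding): a bit stored at a single address dies with that address, while the same bit spread across many units survives scattered corruption.

```python
import random

random.seed(0)

# Localized storage: the bit lives at exactly one memory address.
memory = {0x2A: 1}
memory[0x2A] ^= 1                 # corrupt that one location...
print(memory[0x2A])               # ...and the bit is simply gone -> 0

# Distributed storage (toy redundancy): the same bit is copied across
# many "synapses" and read back by majority vote.
copies = [1] * 101
for i in random.sample(range(101), 40):   # corrupt 40 of 101 copies
    copies[i] ^= 1
recovered = int(sum(copies) > len(copies) // 2)
print(recovered)                  # the majority vote still recovers 1
```

Real brains use something far subtler than a repetition code, but the durability Bartol describes rests on the same basic principle: no single point of failure.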


Figuring out the amount of information a whole brain could hold, however, hinges on calculating the potential strength of each synapse. How do scientists determine synaptic strength? By measuring the physical size of bumps on neurons, called dendritic spines, where synapses form — it's like calculating a person's strength based on the size of their shoe.

Neural Itinerary

Chemical messages travel through the axon of one neuron to the dendrite of a second one. The axon and the dendrite meet at the synapse. A single axon can form multiple synapses on the same dendrite, but only on different spines of that dendrite, like ships docking at two points in the same harbor. Synapses on different spines of the same dendrite should have equal strength because they share identical activity histories. (In the diagram below, outbound information travels through the axon of the first neuron (on the left), and is received by the green-gray dendrites of the second neuron. The information hand-off happens at synapses where the neurons meet. The diagram isn't detailed enough to show dendritic spines, which would line dendrites like tiny scales on a tail.)

[Diagram: two neurons joined at synapses, with the first neuron's axon meeting the second neuron's dendrites]
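That equal-strength prediction is what made the pairs so useful: if two synapses share an axon, a dendrite, and therefore a history, any difference in their measured spine-head sizes reflects measurement noise, not biology. Here is a minimal sketch of that comparison (the volumes below are made-up placeholders, not data from the paper):

```python
import statistics

# Hypothetical spine-head volumes (in cubic micrometers) for synapse pairs
# that share the same axon and the same dendrite. Illustrative values only.
pairs = [(0.105, 0.112), (0.310, 0.298), (0.052, 0.049), (0.780, 0.811)]

# Within-pair coefficient of variation: spread of the two measurements
# relative to their own mean. Small values mean the pairs really do match.
cvs = [statistics.stdev(pair) / statistics.mean(pair) for pair in pairs]
print(f"median within-pair CV: {statistics.median(cvs):.3f}")
```

In the study itself, such pairs reportedly differed by only about eight percent on average, which is what justified slicing synaptic strength into such fine gradations.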

Measuring dendritic spines to determine synaptic strength isn't brand new. Each dendritic spine has a head and a neck, and other researchers had relied on total spine volume as an indicator of strength. But where earlier work calculated the combined volume of head and neck, Bartol's group realized they only needed, and only wanted, the volume of the head.

“It turns out that the volume of the neck is actually not at all correlated with the synaptic strength,” said Bartol. “So measuring the volume of the whole spine (clean measurements from the head, plus noisy measurements from the neck) just adds random noise to the volume estimate.”

"Almost every mental disorder can be traced to some malfunction of synapses"

After dropping the neck, they found that spine heads came in 26 different sizes, which translates to 26 distinguishable synaptic strengths. Pinpointing this range of strengths enabled more precise calculations: they measured synaptic strength in bits, and then scaled their estimate to the human brain.
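The jump from 26 sizes to 4.7 bits is straight information theory: n distinguishable levels can encode log2(n) bits of information. A quick check of the arithmetic (a sketch of the conversion, not the paper's full analysis):

```python
import math

# 26 distinguishable synaptic strengths -> bits of information per synapse
print(math.log2(26))   # ~ 4.70

# For comparison, a coarse small/medium/large scale would carry far less
print(math.log2(3))    # ~ 1.58
```

That order-of-magnitude jump in per-synapse precision is what drives the tenfold revision of whole-brain capacity.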

The Implications

So here we are: By measuring a smaller section of an infinitesimal piece of a neuron, neuroscientists realized we had been underestimating brain storage.

“This discovery is giving us a huge clue about the biochemistry in synapses that must be operating to make the synapses so precise,” said Bartol. “How does it work? Almost every mental disorder can be traced to some malfunction of synapses — does disease perturb the precision of information storage at synapses?”

It’s an interesting question. And there’s an equally interesting one, of course, when it comes to computers: What will this mean for building the next supercomputer?

“Our discovery gives clues about how to build more powerful computers that use less energy,” said Bartol. “For machine intelligence we need new kinds of computing hardware that emulate how neurons and synapses work and move away from binary digital hardware.”
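As one illustration of what "moving away from binary digital hardware" could mean in practice, the sketch below holds a toy neuron's weights at 26 discrete strengths, echoing the spine-head sizes, instead of full 64-bit floating point. This is our own example of low-precision weights, not hardware proposed by the study.

```python
LEVELS = 26  # discrete strengths, echoing the 26 spine-head sizes

def quantize(w, lo=-1.0, hi=1.0):
    """Snap a weight to the nearest of LEVELS evenly spaced strengths."""
    w = min(max(w, lo), hi)
    step = (hi - lo) / (LEVELS - 1)
    return lo + round((w - lo) / step) * step

weights = [0.8312, -0.4421, 0.0297, 0.6158]
inputs  = [1.0, 0.5, -1.0, 0.25]

quantized = [quantize(w) for w in weights]
full = sum(w * x for w, x in zip(weights, inputs))
low  = sum(q * x for q, x in zip(quantized, inputs))
print(f"64-bit weights: {full:.4f}   26-level weights: {low:.4f}")
```

Each 26-level strength needs fewer than five bits to store, yet the neuron's output barely moves, a crude hint at how synapse-like precision could buy energy efficiency.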

One giant step for brain science is, essentially, one giant leap for singularity.