The human brain’s memory-storage capacity is an order of magnitude greater than previously thought, researchers at the Salk Institute for Biological Studies reported last week. The findings, detailed in eLife, are significant not only for what they say about storage space but also because they nudge us toward a better understanding of how, exactly, information is encoded in our brains.
The question of just how much information our brains can hold is a longstanding one. We know that the human brain is made up of about 100 billion neurons, and that each one makes 1,000 or more connections to other neurons, adding up to some 100 trillion in total. We also know that the strengths of these connections, or synapses, are regulated by experience. When two neurons on either side of a synapse are active simultaneously, that synapse becomes more robust; the dendritic spine (the antenna on the receiving neuron) also becomes larger to support the increased signal strength. These changes in strength and size are believed to be the molecular correlates of memory. The different antenna sizes are often compared with bits of computer code, only instead of 1s and 0s they can assume a range of values. Until last week scientists had no idea how many values, exactly. Based on crude measurements, they had identified just three: small, medium and large.
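To get a rough feel for what those counts imply, consider a deliberately naive back-of-the-envelope sketch (our illustration, not a calculation from the study): treat each of the brain’s roughly 100 trillion synapses as a storage element holding one of three distinguishable sizes.

```python
import math

# Approximate figures quoted above (order-of-magnitude estimates,
# not precise measurements).
neurons = 100e9                     # ~100 billion neurons
connections_per_neuron = 1_000      # ~1,000 synapses per neuron
synapses = neurons * connections_per_neuron  # ~100 trillion synapses

# Old three-size model: each synapse is small, medium or large,
# i.e. one of three distinguishable values.
bits_per_synapse = math.log2(3)     # ~1.58 bits

total_bits = synapses * bits_per_synapse
print(f"{synapses:.0e} synapses")
print(f"{bits_per_synapse:.2f} bits per synapse")
print(f"~{total_bits / 8 / 1e12:.0f} TB under the three-size model")
```

Even this toy model lands in the tens of terabytes; the point of the new study is that the “three sizes” assumption badly undercounts the states each synapse can hold.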
But a curious observation led the Salk team to refine those measurements. In the course of reconstructing a rat hippocampus, an area of the mammalian brain involved in memory storage, they noticed some neurons would form two connections with each other: the axon (or sending cable) of one neuron would connect with two dendritic spines (or receiving antennas) on the same neighboring neuron, suggesting that duplicate messages were being passed from sender to receiver. Because both spines were receiving identical information, the researchers suspected they would be similar in size and strength. But they also realized that if there were significant differences between the two, it could point to a whole new layer of complexity. If the spines differed in shape or size, they reasoned, the messages they passed along would also differ slightly, even though those messages were coming from the same axon.
So they decided to measure the synapse pairs. And sure enough, they found an 8 percent size difference between dendritic spines connected to the same axon of a signaling neuron. That difference might seem small, but when they plugged the value into their algorithms, they calculated a total of 26 distinguishable synapse sizes. A greater number of synapse sizes means more capacity for storing information, which in this case translated into a 10-fold greater storage capacity in the hippocampus as a whole than the previous three-size model had indicated. “It’s an order of magnitude more capacity than we knew was there,” says Tom Bartol, a staff scientist at the Salk Institute and the study’s lead author.
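One way to see where the order-of-magnitude framing comes from is to compare the information content of the two models. The sketch below is our gloss on the numbers quoted above (the study itself derives the 26 categories from a detailed statistical analysis of the reconstructed spines):

```python
import math

# Information per synapse under each model (illustrative only).
bits_old = math.log2(3)    # small/medium/large: ~1.6 bits
bits_new = math.log2(26)   # 26 distinguishable sizes: ~4.7 bits

print(f"old model: {bits_old:.1f} bits per synapse")
print(f"new model: {bits_new:.1f} bits per synapse")
print(f"{26 / 3:.1f}x more distinguishable sizes per synapse")
```

Going from three sizes to 26 is close to a tenfold jump in distinguishable states per synapse, or roughly 4.7 bits of information where the old model allowed about 1.6.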
But if our memory capacity is so great, why do we forget things? Because capacity is not really the issue, says Paul Reber, a memory researcher at Northwestern University who was not involved in the study. “Any analysis of the number of neurons will lead to a sense of the tremendous capacity of the human brain. But it doesn’t matter, because our storage process is slower than our experience of the world. Imagine an iPod with infinite storage capacity. Even if you can store every song ever written, you still have to buy and upload all that music and then pull individual songs up when you want to play them.”
Reber says that it is almost impossible to quantify the amount of information in the human brain, in part because the brain holds so much more than we’re consciously aware of: not only facts and faces and measurable skills but also basic functions like how to speak and move, and higher-order ones like how to feel and express emotions. “We take in much more information from the world than ‘what do I remember from yesterday?’” Reber says. “And we still don’t really know how to scale up from computing synaptic strength to mapping out these complex processes.”
The Salk study brings us a bit closer, though. “They’ve done an amazing reconstruction,” Reber says. “And it adds significantly to our understanding of not only memory capacity but more importantly of how complex memory storage actually is.” The findings might eventually pave the way toward all manner of advances: more energy-efficient computers that mimic the human brain’s data-transmission strategies, for example, or a better understanding of brain diseases that involve dysfunctional synapses.
But first scientists will have to see if the patterns found in the hippocampus hold for other brain regions. Bartol’s team is already working to answer this question. They hope to map the chemicals that pass from neuron to neuron, which have an even greater capacity than the variable synapses to store and transmit information. As for a precise measurement of whole-brain capacity, “we are still a long way off,” Bartol says. “The brain still holds many, many more mysteries for us to discover.”