A single neuron in a normal adult brain likely has more than a thousand genetic mutations that are not present in the cells that surround it, according to new research from Howard Hughes Medical Institute (HHMI) scientists. The majority of these mutations appear to arise while genes are in active use, after brain development is complete.
"We found that the genes that the brain uses most of all are the genes that are most fragile and most likely to be mutated," says Christopher Walsh, an HHMI investigator at Boston Children's Hospital who led the research. Walsh, Peter Park, a computational biologist at Harvard Medical School, and their colleagues reported their findings in the October 2, 2015, issue of the journal Science.
Ahhh, so that's where molecular computing went. It's us.
Related Stories
Forty years ago, a New York University graduate student named Arieh Aviram opened his Ph.D. dissertation with a bold suggestion: "Taking a clue from nature, [which] utilizes molecules for the carrying out of many physical phenomena, it may be possible to miniaturize electronic components down to molecular size." What Aviram was proposing was revolutionary: leapfrogging the ongoing miniaturization trend of Moore's Law by substituting single organic molecules for silicon transistors and diodes.
...
Aviram and Ratner's bold idea sank into obscurity.
...
Bulk ensembles of molecular electronics have made their way into commercial displays, and recent high-profile breakthroughs include single-molecule light-emitting diodes and carbon nanotube transistors coupled to silicon in a monolithic integrated circuit. Other, less flashy but more technically relevant results have come, for example, from Danny Porath and his colleagues at Hebrew University in Jerusalem, who have measured electrical transport in wires made of DNA; such wires are a self-assembled alternative to copper interconnects. Latha Venkataraman's group at Columbia University has measured single-molecule diodes to a rectification ratio of more than 200 times—a critical step for maintaining a high signal gain as devices shrink. And Christian Nijhuis and his coworkers at the National University of Singapore were able to measure the rectification changes that occurred when they replaced an individual functional group—just a handful of atoms—in a nanometer-size molecule. This is exactly the type of control dreamed of by Aviram and Ratner.
There's Plenty of Room at the Bottom
(Score: 2) by takyon on Wednesday October 07 2015, @11:12AM
And the chemistry Nobel was just awarded for the discovery of DNA repair mechanisms [npr.org].
Genome sequencing level 2: now with better error correction.
They don't know if the point mutations are positive or negative. I wonder what would happen if all of the mutations in all of the neurons were corrected. Or all of the mutations in all of the cells of the human body.
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 0) by Anonymous Coward on Wednesday October 07 2015, @11:31AM
Maybe neurons store long-term memories inside their DNA as mutations?
(Score: 2) by Subsentient on Wednesday October 07 2015, @11:35AM
Sounds a lot like write-only memory.
"It is no measure of health to be well adjusted to a profoundly sick society." -Jiddu Krishnamurti
(Score: 4, Interesting) by SunTzuWarmaster on Wednesday October 07 2015, @12:13PM
I agree that it is probably a feature. You want the brain to be plastic and respond to new situations quickly. Here is a recipe which might get you there:
- high neuron mutation rate, mentioned in article (investigate new solutions)
- high neuron repair rate, mentioned above (optimize towards solutions)
- high neuron creation/death/pruning rate (prune bad solutions), mentioned here (http://www.ncbi.nlm.nih.gov/pubmed/9700393)
We've known for a while that the brain is plastic and can reorganize itself. The above (and the recent discovery) point to some processes which could create a plastic brain. That said, the whole problem just got a lot harder. Things like "make an AI by making a really accurate model and shoving it into a robot body" (http://www.smithsonianmag.com/ist/?next=/smart-news/weve-put-worms-mind-lego-robot-body-180953399/) become somewhat more difficult with these findings.
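The three-part recipe above can be sketched as a toy search loop. This is a minimal sketch, not a brain model: the neuron count, mutation/repair/prune rates, the scalar "weights", and the TARGET value are all invented for illustration.

```python
import random

random.seed(0)

TARGET = 0.7  # toy stand-in for the unknown "solution" the network is adapting toward

def fitness(w):
    return -abs(w - TARGET)  # higher is better (closer to the target)

# Start with a population of identical "neurons" (toy scalar weights).
neurons = [0.0] * 50

for generation in range(200):
    # 1) High mutation rate: every neuron drifts a little (investigate new solutions).
    neurons = [w + random.gauss(0, 0.05) for w in neurons]
    # 2) Repair: nudge each neuron toward the current best (optimize toward solutions).
    best = max(neurons, key=fitness)
    neurons = [w + 0.1 * (best - w) for w in neurons]
    # 3) Prune: kill the worst 10% and respawn them near the best (prune bad solutions).
    neurons.sort(key=fitness)
    n_prune = len(neurons) // 10
    neurons[:n_prune] = [best + random.gauss(0, 0.05) for _ in range(n_prune)]

best = max(neurons, key=fitness)
```

After a couple hundred generations the population clusters near TARGET, even though no individual step "knows" the answer; mutation supplies variation, repair and pruning exploit it.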
(Score: 0) by Anonymous Coward on Wednesday October 07 2015, @12:17PM
Why? Who says an AI needs to mimic the human brain? Is the human brain the only way to achieve 'true intelligence'?
(Score: 2, Touché) by Wodan on Wednesday October 07 2015, @01:35PM
Maybe, maybe not, but it's the only way we know that works.
(Score: 2) by takyon on Wednesday October 07 2015, @04:02PM
Easier to copy than build from scratch.
(Score: 2) by takyon on Wednesday October 07 2015, @01:57PM
I'm not so sure these mutations are a feature:
That doesn't sound like a byproduct of a memory process or anything that would lead to increased plasticity. During childhood, when the brain is at its most plastic, there would be fewer of these mutations.
(Score: 0) by Anonymous Coward on Wednesday October 07 2015, @12:13PM
Then there only needs to be a way to transfer those memory genes from the neurons to the gonads, and voila, inheritable knowledge!
(Score: 0) by Anonymous Coward on Wednesday October 07 2015, @04:55PM
This sounds like homework, but I am really just wondering about these things and couldn't figure them out:
1) Who introduced the term "mutation calling"?
2) They report ~1,500 mutations per neuron (~1.5% exonic) and cite papers claiming this rate is ~6,000-18,000 in normal skin cells and ~20 per exome (20/0.015 ≈ 1,300 total) in normal kidney cells. How do we get from that to the expected number of cells that are mutant for an arbitrary gene? I.e., how many cells do we need to put in a culture dish to expect to find at least one mutant for any gene we check?
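One way to frame question 2, as a back-of-the-envelope calculation. Only the ~1,500 figure comes from the article; the uniform mutation placement, ~3.2 Gb genome, and ~30 kb "average" gene footprint are assumptions made here for illustration (real genes range from ~1 kb to over 2 Mb, and somatic mutations are not uniformly distributed).

```python
import math

MUTATIONS_PER_CELL = 1_500   # from the article
GENOME_BP = 3.2e9            # assumed haploid genome size
GENE_BP = 3.0e4              # assumed genomic footprint of an "average" gene

# Probability that any one mutation lands inside our gene of interest:
p_hit = GENE_BP / GENOME_BP

# Probability that a given cell carries >= 1 mutation in that gene:
p_mutant_cell = 1 - (1 - p_hit) ** MUTATIONS_PER_CELL  # ~1.4% under these assumptions

# Expected number of cells you'd screen before finding one mutant:
expected_cells = 1 / p_mutant_cell

# Cells needed so that P(at least one mutant in the dish) >= 95%:
n_95 = math.ceil(math.log(0.05) / math.log(1 - p_mutant_cell))
```

Under these assumptions you would expect a mutant roughly once per ~70 cells, and a dish of a few hundred cells would almost certainly contain at least one mutant for any given average-sized gene.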