Neuroscience





Bipolar patient develops mHealth app to help track mood disorders [1396]

by System Administrator - Tuesday, September 8, 2015, 20:33



Blind Woman Receives Bionic Eye, Reads a Clock With Elation [1638]

by System Administrator - Tuesday, January 19, 2016, 23:46




Recently, the BBC broadcast the reaction of Rhian Lewis, a 49-year-old blind mother of two, as she read a clock correctly using her right eye for the first time in 16 years.

Lewis was understandably emotional, and as the first patient in the UK to receive one of the world’s most advanced bionic eye implants—it was one for the books.

At the age of five, Lewis was diagnosed with retinitis pigmentosa, a condition that impairs the light-detecting cells—known as photoreceptors—in the retina, leaving it unable to absorb and process light. As a result of the condition, Lewis’s right eye is completely blind and her left eye has next to zero vision.

Though the condition has no cure in the traditional sense, Lewis’s optic nerve and the brain circuitry necessary for vision have remained undamaged. The photoreceptors, then, are the only elements needing replacement.

The solution?

A tiny 3x3mm retinal implant chip developed by the German engineering firm Retina Implant AG. The chip, part of an NHS study of the technology, was implanted into the back of Lewis’s right eye during a daylong procedure at Oxford’s John Radcliffe Hospital.

Retina Implant AG’s tiny chip contains 1,600 electrodes—equivalent to less than one percent of one megapixel—which capture light as it enters the eye and activate the nerve cells of the inner retina. These then send electrical signals to the brain through the optic nerve. A small computer is placed underneath the skin behind the ear, with an exterior magnetic coil sitting outside the skin to power the computer.

Though the technology is certainly improving, the device is not yet perfect, nor does it grant perfect vision.

Patients see flashes of light when the implant is turned on, and after a few weeks, the brain begins to make sense of these flashes, forming them into meaningful shapes. With a small handheld wireless device, Lewis uses dials to modify the sensitivity, contrast, and frequency of the implant.

The images she sees are not seamless—objects are grainy and appear only in black and white—though the implant is indeed life altering for the blind.

“It’s been maybe eight years that I’ve had any sort of idea of what my children look like,” Lewis told The Guardian. “Now, when I locate something, especially like a spoon or a fork on the table, it’s pure elation. I just get so excited that I’ve got something right.”

Bionic eye technology isn’t new, though this is one of the best versions to date.

In 2012, Robin Miller was one of the first patients to receive an implant; yet his was built to last for 18 months, while Lewis’s implant may last up to 5 years. Also in 2012, Australian Dianne Ashworth, who suffered from the same condition as Lewis, received a retinal implant. While the technology was groundbreaking at the time, it contained a mere 24 electrodes, compared to the 1,600 in Lewis’s.

In 2013, the Argus II artificial retina designed at the Lawrence Livermore National Laboratory was the first of its kind in medical visual prosthetics to receive FDA approval. Yet, like the Australian implant, this device only contained 20 electrodes and required patients to wear a sunglasses-like visor.

The Retina Implant AG device Lewis tested out was itself version two. This version has 100 more electrodes, better resolution, lasts longer, and consumes less power.

With 285 million people estimated to be visually impaired worldwide, there is no question that this technology is in high demand and holds massive potential to change the lives of millions. As the technology matures in coming years, the implants are likely to provide better sight even as they become less invasive and longer lasting. And hopefully, they’ll become more affordable and accessible too.

Most significantly, the possibility of full facial recognition for those who haven’t seen loved ones in years no longer seems like a miracle, but a possibility in a not too distant future.


Alison E. Berman

  • Staff Writer at Singularity University
  • Alison tells the stories of purpose-driven leaders and is fascinated by various intersections of technology and society. When not keeping a finger on the pulse of all things Singularity University, you'll likely find Alison in the woods sipping coffee and reading philosophy (new book recommendations are welcome).





by System Administrator - Tuesday, October 7, 2014, 18:31



Written By: Peniel M. Dimberu

With the recent and highly publicized death of actor Robin Williams, depression is once again making national headlines. And for good reason. Usually, the conversation about depression turns to the search for effective treatments, which currently include cognitive behavioral therapy and drugs such as selective serotonin reuptake inhibitors (SSRIs).

However, an equally important issue is the timely and proper diagnosis of depression.

Currently, depression is diagnosed by a physical and psychological examination, but it mostly depends on self-reporting of subjective symptoms like depressed mood, lack of motivation, and changes in appetite and sleep patterns. Many people who might want to avoid a depression diagnosis for various reasons can fake their way through this self-reporting, making it likely that depression is actually under-diagnosed.

Therefore, an objective test could be an important development in properly diagnosing and treating depression. Scientists at Northwestern University may have developed such a diagnostic tool, one that requires no more than a simple test tube of blood.

A team of researchers, led by Dr. Eva Redei and Dr. David Mohr, found that blood levels of nine biomarkers—molecules found in the body and associated with particular conditions—were significantly different in depressed adults when compared to non-depressed adults.

The new study, published in the current issue of Translational Psychiatry, builds upon earlier work by Dr. Redei, who found the levels of 11 other biomarkers to be different in depressed versus non-depressed teenagers.

While this work is still in the early stages, it represents an important advancement in the correct diagnosis and treatment of major depressive disorder, which currently has a lifetime prevalence of 17% among adults in the US.

The new test looks at the levels of nine RNA markers in the patient’s blood. In case you forgot your molecular biology, RNA is the messenger molecule made using DNA as a template. In turn, cells then use RNA as a template to make proteins, which are the actual machines that do the work specified by the DNA.

The idea is that some genes can be overexpressed or underexpressed during depression and if those genes can be identified by looking at the RNA made from them, an objective way of identifying depression could be developed.

The researchers from Northwestern initially looked at the RNA levels of 26 genes in adolescents and young adults (ages 15-19). They found that 11 of them were significantly different in patients that had previously been diagnosed with depression versus those who were never diagnosed. They then tried to apply the same approach to adults and found significant differences in nine biomarkers.
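The screening logic described above—compare marker levels between previously diagnosed and never-diagnosed groups, and keep the markers that differ significantly—can be sketched in a few lines. This is an illustrative toy, not the authors' actual pipeline: the data are synthetic, and a Welch's t-test with a Bonferroni correction stands in for whatever statistics the study used.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic RNA levels: 26 candidate genes x 20 subjects per group.
# Gene 0 is simulated as underexpressed in the depressed group.
n_genes, n_per_group = 26, 20
control = rng.normal(loc=10.0, scale=1.0, size=(n_genes, n_per_group))
depressed = rng.normal(loc=10.0, scale=1.0, size=(n_genes, n_per_group))
depressed[0] -= 2.0  # simulated underexpression of gene 0

# Welch's t-test per gene; Bonferroni correction for the 26 tests.
_, p_values = stats.ttest_ind(control, depressed, axis=1, equal_var=False)
significant = sorted(np.flatnonzero(p_values < 0.05 / n_genes).tolist())
print("genes flagged as candidate biomarkers:", significant)
```

The multiple-testing correction matters here: screening 26 genes at an uncorrected 0.05 threshold would flag one or two by chance alone.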

Interestingly, the biomarkers that were found to differ in depressed versus non-depressed patients were actually different in young people and adults, implying that depression is genetically different when comparing different age groups.

Furthermore, the test was also able to identify differences between those who went through cognitive behavioral therapy and showed improvement versus those who showed no improvement after therapy. This kind of information can help predict the proper treatment regimen for patients, increasing their chance of going into remission.

While this work is still in the early stages and will take more studies to establish its accuracy and clinical usefulness, it represents an exciting development in diagnostics.

As medicine and genomics continue to change with the development of ever-increasingly powerful and cost-effective technology, we expect to get better at identifying and treating diseases before they take a significant toll.

These new blood tests have the potential to change our approach to diagnosing depression as well as selecting the proper treatment.

Sadly, many depression patients are resistant to the most common treatments so the need for new, more effective treatments is great. Finding differences in the expression of certain genes during depression could even lead to a clearer understanding of the disease, which in turn could lead to improved treatments down the road.

Depression is a thief that steals a productive and enjoyable life from far too many people around the world. Hopefully, this line of research will one day make it a thing of the past like smallpox.




Boom in gene-editing studies amid ethics debate over its use [1511]

by System Administrator - Monday, October 12, 2015, 20:43

In this photo provided by UC Berkeley Public Affairs, taken June 20, 2014 Jennifer Doudna, right, and her lab manager, Kai Hong, work in her laboratory in Berkeley, Calif. The hottest tool in biology has scientists using words like revolutionary as they describe the long-term potential: wiping out certain mosquitoes that carry malaria, treating genetic diseases like sickle-cell, preventing babies from inheriting a life-threatening disorder. "We need to try to get the balance right," said Doudna. She helped develop new gene-editing technology and hears from desperate families, but urges caution in how it's eventually used in people. (Cailey Cotner/UC Berkeley via AP) 


by Lauran Neergaard

The hottest tool in biology has scientists using words like revolutionary as they describe the long-term potential: wiping out certain mosquitoes that carry malaria, treating genetic diseases like sickle cell, preventing babies from inheriting a life-threatening disorder. 

It may sound like sci-fi, but research into genome editing is booming. So is a debate about its boundaries, what's safe and what's ethical to try in the quest to fight disease.

Does the promise warrant experimenting with human embryos? Researchers in China already have, and they're poised to in Britain.

Should we change people's genes in a way that passes traits to future generations? Beyond medicine, what about the environmental effects if, say, altered mosquitoes escape before we know how to use them?

"We need to try to get the balance right," said Jennifer Doudna, a biochemist at the University of California, Berkeley. She helped develop new gene-editing technology and hears from desperate families, but urges caution in how it's eventually used in people.

The U.S. National Academies of Sciences, Engineering, and Medicine will bring international scientists, ethicists and regulators together in December to start determining that balance. The biggest debate is whether it ever will be appropriate to alter human heredity by editing an embryo's genes.

"This isn't a conversation on a cloud," but something that families battling devastating rare diseases may want, Dr. George Daley of Boston Children's Hospital told specialists meeting this week to plan the ethics summit. "There will be a drive to move this forward."

Laboratories worldwide are embracing a technology to precisely edit genes inside living cells—turning them off or on, repairing or modifying them—like a biological version of cut-and-paste software. Researchers are building stronger immune cells, fighting muscular dystrophy in mice and growing human-like organs in pigs for possible transplant. Biotech companies have raised millions to develop therapies for sickle cell disease and other disorders.

The technique has a wonky name—CRISPR-Cas9—and a humble beginning.


In this photo taken Sept. 9, 2015, Kevin Esvelt poses for a photo at Harvard University's Wyss Institute in Boston. The hottest tool in biology has scientists using words like revolutionary as they describe the long-term potential: wiping out certain mosquitoes that carry malaria, treating genetic diseases like sickle-cell, preventing babies from inheriting a life-threatening disorder. Esvelt's projects include genetically manipulating a mosquito species to fight malaria. (AP Photo/Rodrique Ngowi) 

Doudna was studying how bacteria recognize and disable viral invaders, using a protein she calls "a genetic scalpel" to slice DNA. That system turned out to be programmable, she reported in 2012, letting scientists target virtually any gene in many species using a tailored CRISPR recipe.

There are older methods to edit genes, including one that led to an experimental treatment for the AIDS virus, but the CRISPR technique is faster and cheaper and allows altering of multiple genes simultaneously.

"It's transforming almost every aspect of biology right now," said National Institutes of Health genomics specialist Shawn Burgess.

CRISPR's biggest use has nothing to do with human embryos. Scientists are engineering animals with human-like disorders more easily than ever before, to learn to fix genes gone awry and test potential drugs.

Engineering rodents to harbor autism-related genes once took a year. It takes weeks with CRISPR, said bioengineer Feng Zhang of the Broad Institute at MIT and Harvard, who also helped develop and patented the CRISPR technique. (Doudna's university is challenging the patent.)

A peek inside an NIH lab shows how it works. Researchers inject a CRISPR-guided molecule into microscopic mouse embryos, to cause a gene mutation that a doctor suspects of causing a patient's mysterious disorder. The embryos will be implanted into female mice that wake up from the procedure in warm blankets to a treat of fresh oranges. How the resulting mouse babies fare will help determine the gene defect's role.

Experts predict the first attempt to treat people will be for blood-related diseases such as sickle cell, caused by a single gene defect that's easy to reach. The idea is to use CRISPR in a way similar to a bone marrow transplant, but to correct someone's own blood-producing cells rather than implanting donated ones.

"It's like a race. Will the research provide a cure while we're still alive?" asked Robert Rosen of Chicago, who has one of a group of rare bone marrow abnormalities that can lead to leukemia or other life-threatening conditions. He co-founded the MPN Research Foundation, which has begun funding some CRISPR-related studies.



So why the controversy? CRISPR made headlines last spring when Chinese scientists reported the first-known attempt to edit human embryos, working with unusable fertility clinic leftovers. They aimed to correct a deadly disease-causing gene but it worked in only a few embryos and others developed unintended mutations, raising fears of fixing one disease only to cause another.

If ever deemed safe enough to try in pregnancy, that type of gene change could be passed on to later generations. Then there are questions about designer babies, altered for other reasons than preventing disease.

In the U.S., the NIH has said it won't fund such research in human embryos.

In Britain, regulators are considering researchers' request to gene-edit human embryos—in lab dishes only—for a very different reason, to study early development.


Medicine aside, another issue is environmental: altering insects or plants in a way that ensures they pass genetic changes through wild populations as they reproduce. These engineered "gene drives" are in very early stage research, too, but one day might be used to eliminate invasive plants, make it harder for mosquitoes to carry malaria or even spread a defect that gradually kills off the main malaria-carrying species, said Kevin Esvelt of Harvard's Wyss Institute for Biologically Inspired Engineering.

No one knows how that might also affect habitats, Esvelt said. His team is calling for the public to weigh in and for scientists to take special precautions. For example, Esvelt said colleagues are researching a tropical mosquito species unlikely to survive cold Boston even if one escaped locked labs.

"There is no societal precedent whatsoever for a widely accessible and inexpensive technology capable of altering the shared environment," Esvelt told a recent National Academy of Sciences hearing.



CRISPR/Cas-derived technology offers the ability to dive into the genome and make a very precise change.

Chinese team performs gene editing on human embryo

by Bob Yirka

A team of researchers in China has announced that they have performed gene editing on human embryos. In their paper, uploaded to the open-access site Protein & Cell (after being rejected by Nature and Science), the researchers defended their research by pointing out that the embryos used were non-viable.

Only recently have scientists had a tool that allows for directly editing a genome. Called CRISPR, it makes it possible to remove a single (defective) gene from a genome and replace it with another in order to prevent genetic disease. CRISPR has been used to edit animal embryos and adult stem cells, but until now no one had used the technique to edit the genome of human embryos because of ethical concerns—or, if they had, they had not acknowledged it publicly. This effort by the team in China has crossed that ethical line, and the announcement will likely incite condemnation from some and stir a new round of debate regarding the ethics of conducting such research.

The researchers report that their aim was to see how well CRISPR would work on human embryos. To find out, they collected 86 doubly fertilized embryos from a fertility clinic—such embryos have been fertilized by two sperm, cannot mature beyond a tiny clump of cells, and die naturally before growing into anything. The team reports that 71 of the embryos survived long enough for use in the CRISPR experiment. Unfortunately, the technique worked properly on just a fraction of the total, and only a small percentage of those relayed the new gene properly when the cells divided. The researchers also found that the procedure sometimes spliced the wrong gene segment, inserting new genes in the wrong places—which in normal embryos could lead to a new disease. Additionally, of the embryos that were spliced correctly, many were mosaic, a term describing a mix of old and new genes, which could both lead to a new disease and cause doctors to misidentify gene-splicing results in normal embryos.

The researchers conclude by suggesting that the problems they encountered should be investigated further before any type of clinical application is begun.                                                                                                    

Explore further:  'CRISPR' science: Newer genome editing tool shows promise in engineering human stem cells

More information: CRISPR/Cas9-mediated gene editing in human tripronuclear zygotes, Protein & Cell, April 2015. DOI: 10.1007/s13238-015-0153-5.

ABSTRACT Genome editing tools such as the clustered regularly interspaced short palindromic repeat (CRISPR)-associated system (Cas) have been widely used to modify genes in model systems including animal zygotes and human cells, and hold tremendous promise for both basic research and clinical applications. To date, a serious knowledge gap remains in our understanding of DNA repair mechanisms in human early embryos, and in the efficiency and potential off-target effects of using technologies such as CRISPR/Cas9 in human pre-implantation embryos. In this report, we used tripronuclear (3PN) zygotes to further investigate CRISPR/Cas9-mediated gene editing in human cells. We found that CRISPR/Cas9 could effectively cleave the endogenous β-globin gene (HBB). However, the efficiency of homologous recombination directed repair (HDR) of HBB was low and the edited embryos were mosaic. Off-target cleavage was also apparent in these 3PN zygotes as revealed by the T7E1 assay and whole-exome sequencing. Furthermore, the endogenous delta-globin gene (HBD), which is homologous to HBB, competed with exogenous donor oligos to act as the repair template, leading to untoward mutations. Our data also indicated that repair of the HBB locus in these embryos occurred preferentially through the non-crossover HDR pathway. Taken together, our work highlights the pressing need to further improve the fidelity and specificity of the CRISPR/Cas9 platform, a prerequisite for any clinical applications of CRISPR/Cas9-mediated editing.

See also: The ISSCR has responded to the publication of gene editing research in human embryos

via Nature                                        



Brain Activity Identifies Individuals [1514]

by System Administrator - Tuesday, October 13, 2015, 13:23

This image shows the functional connections in the brain that tend to be most discriminating of individuals. Many of them are between the frontal and parietal lobes, which are involved in complex cognitive tasks. EMILY FINN


By Kerry Grens

Neural connectome patterns differ enough between people to use them as a fingerprint.

Neuroscientists have developed a method to pick out an individual solely by his or her connectome—a pattern of synchronized neural activity across numerous brain regions. Researchers had observed previously that brain connectivity is a unique trait, but a new study, published today (October 12) in Nature Neuroscience, demonstrates that neural patterns retain an individual’s signature even during different mental activities.

“What’s unique here is they were able to show it’s not just the functional connectivity—which is how different brain regions are communicating over time when you’re not doing a specific task—but even how the brain is activated during a specific task that is also very fingerprint-like,” said Damien Fair, who uses neuroimaging to study psychopathologies at Oregon Health and Science University but wasn’t involved with the study.

Fair and others said individuated brain scans could be applied to better understand the diversity of mental illnesses often lumped into the same diagnosis. “We don’t need to keep going at the average. We have the power to look at individuals,” said Todd Braver of Washington University, who did not participate in the study. “To me I find that really exciting.”

The research team, based at Yale School of Medicine, extracted data from the Human Connectome Project, which includes functional MRI (fMRI) data from about 1,200 people so far. The Yale team analyzed imaging data from 268 different brain regions in 126 participants. To create a connectome profile for each individual, the researchers measured how strongly the activity of each brain region correlated with the activity of every other brain region, creating an activity correlation matrix.

Each person, it turned out, had a unique activity correlation matrix. The team then used this profile to predict the identity of an individual in fMRI scans from another session.
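The identification scheme—build a correlation-matrix profile per scanning session, then match a new scan to the stored profile it most resembles—can be sketched with synthetic data. This is an illustrative toy, not the Yale pipeline: each simulated subject gets a fixed random mixing matrix so that both "sessions" share correlation structure, and matching simply picks the highest Pearson correlation between vectorized profiles.

```python
import numpy as np

rng = np.random.default_rng(1)
n_subjects, n_regions, n_timepoints = 10, 30, 200

def connectivity_profile(timeseries):
    """Vectorized upper triangle of the region-by-region correlation matrix."""
    corr = np.corrcoef(timeseries)
    rows, cols = np.triu_indices_from(corr, k=1)
    return corr[rows, cols]

# Each subject gets a stable "signature" mixing matrix, so two simulated
# sessions share correlation structure plus independent noise.
signatures = rng.normal(size=(n_subjects, n_regions, n_regions))

def simulate_session(subject):
    latent = rng.normal(size=(n_regions, n_timepoints))
    noise = 0.5 * rng.normal(size=(n_regions, n_timepoints))
    return signatures[subject] @ latent + noise

session1 = np.array([connectivity_profile(simulate_session(s)) for s in range(n_subjects)])
session2 = np.array([connectivity_profile(simulate_session(s)) for s in range(n_subjects)])

# Identify each session-2 scan as the session-1 subject whose stored
# profile correlates most strongly with it.
hits = 0
for s in range(n_subjects):
    similarity = [np.corrcoef(session2[s], session1[t])[0, 1] for t in range(n_subjects)]
    hits += int(np.argmax(similarity)) == s
print(f"identified {hits} of {n_subjects} subjects correctly")
```

Because each subject's correlation structure is stable across sessions while the noise is not, the nearest-profile match recovers identity, which is the intuition behind calling the connectome a "fingerprint."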

Depending on the type of fMRI scan assessed, the researchers could nail someone’s identity with up to 99 percent accuracy. Scans taken during mental tasks, rather than resting, made it more difficult, and the accuracy dropped to below 70 percent.

“Even though brain function is always changing, and we saw it’s slightly harder to identify people when they are doing different things, people always looked most similar to themselves” than to another participant, said Emily Finn, a graduate student in Todd Constable’s lab and the lead author of the study.

Fair pointed out that one of the most individualized brain regions is the frontoparietal cortex, which helps to filter incoming information. He has found the same result in his own work on fingerprinting connectivity. “It really seems important for making an individual who we are,” he said.

The ability to identify individuals even during tasks on different days would be important for clinical applications. Mental disorders are often classified by phenotype, or symptoms, that may represent a variety of underlying causes. “These types of technologies I think are going to help us personalize mental health better,” Fair told The Scientist. “We’ll have more information to say specifically what’s happening in your brain.”

Finn’s group was also able to associate a person’s connectome with his or her “fluid intelligence.” This trait is measured by asking people to solve a problem or find a pattern without using language or math skills or learned information. Finn told The Scientist that stronger connections between the prefrontal and parietal lobes, brain regions already known to be involved in higher order cognition, were most indicative of higher fluid intelligence scores. The results “suggest levels of integration of different brain systems are giving rise to superior cognitive ability,” she said.

“It’s not just this idiosyncratic fingerprint that they’re talking about that basically allows you to differentiate one individual from another,” Braver said of the study, “but it pushes the idea that [the connectivity signature is] functionally relevant, that those things may be related to things that we think are interesting individual differences, like intelligence.”

Finn cautioned that the intelligence correlation is more a proof of concept to link brain connectivity with behaviors, rather than something having real-life applications. “Hopefully, we could replace that with some variable, like a neuropsychiatric illness or [predicting] who’s going to respond best to some treatment.”

E.S. Finn et al., “Functional connectome fingerprinting: identifying individuals using patterns of brain connectivity,” Nature Neuroscience, doi:10.1038/nn.4135, 2015.





Brain Freeze [1513]

by System Administrator - Tuesday, October 13, 2015, 13:29

TIGHT SQUEEZE: Chemical fixation compacts synapses in a mouse brain (left), compared to freezing, which maintains the extracellular space (blue; right). GRAHAM KNOTT


By Kerry Grens

A common tissue fixation method distorts the true neuronal landscape.

The paper

N. Korogod et al., “Ultrastructural analysis of adult mouse neocortex comparing aldehyde perfusion with cryo fixation,” eLife, 4:e05793, 2015.

The fix

Soaking brain tissue with chemical fixatives has been the go-to method of preserving specimens for decades. Yet few neuroscientists take into account the physical distortion that these chemicals cause. And even among those who do pay attention, “we don’t really know in quantitative terms how much really changes,” says Graham Knott, a morphologist at the École Polytechnique Fédérale de Lausanne in Switzerland.


Comparing fresh to fixed tissue, Knott and his colleagues found that chemical fixation shrank the tissue by 30 percent. “It raises the question of, ‘What on earth is going on if it shrinks that much?’” says Knott. To find out, they turned to an alternative preservation approach, rapid freezing and low-temperature resin embedding, which was shown in the 1960s to better capture the natural state of the brain. Using a high-pressure version of this cryo-fixation technique, they observed neurons swimming in extracellular space and smaller astrocytes than are seen in chemically fixed samples.


NIH investigator Kevin Briggman says Knott’s technique offers a much more accurate snapshot of the brain. An added bonus is that the elbow room around neurons afforded by cryo fixation makes it easier for automated methods to count cells or analyze structures. The only problem, he adds, is that, in contrast to chemical fixation, “you can’t freeze a whole mouse brain.”

The compromise

Briggman and Knott don’t advocate doing away with fixatives. Rather, Knott says, scientists who use them should consider their effects when interpreting data. “We need to use models that pay very careful attention to how tissue has reacted to chemicals.”


Brain Gain [1473]

by System Administrator - Friday, October 2, 2015, 22:45



By Jef Akst

Young neurons in the adult human brain are likely critical to its function.

At a lab meeting of Fred “Rusty” Gage’s group at the Salk Institute for Biological Studies in the mid-1990s, the neuroscientist told his team that he wanted to determine whether new neurons are produced in the brains of adult humans. At the time, adult neurogenesis was well established in rodents, and there had been hints that primate brains also spawned new neurons later in life. But reports of neurogenesis in the adult human brain were sparse and had not been replicated. Moreover, the experiments had relied primarily on autoradiography, which revealed images of cell division but did not follow the fate of new cells, so researchers couldn’t be sure if they really became mature neurons.

Gage’s group, which included clinicians, was familiar with the use of bromodeoxyuridine (BrdU) to monitor the progression of certain cancers. BrdU is an artificial nucleoside that can stand in for thymidine (T) during DNA replication. As cells duplicate their genomes just before they divide, they incorporate BrdU into their DNA. To assess tumor growth, physicians inject the nucleoside substitute into a patient’s bloodstream, then biopsy the tumor and use an antibody to stain for BrdU. The number of BrdU-labeled cells relative to the total number of cells provides an estimate of how quickly the cancer is growing. “If that nucleotide is labeled in such a way that [we] can identify it, you can birthdate individual cells,” Gage says.
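The tumor-growth readout described here is a simple labeling index: the fraction of counted cells that stained BrdU-positive, i.e. that replicated their DNA during the labeling window. A minimal sketch with hypothetical counts:

```python
def labeling_index(brdu_positive, total_cells):
    """Fraction of counted cells that incorporated BrdU (divided during labeling)."""
    if total_cells <= 0:
        raise ValueError("total_cells must be positive")
    return brdu_positive / total_cells

# Hypothetical biopsy counts: 120 BrdU-stained cells out of 1,500 counted.
# A higher index means more cells were dividing, i.e. faster growth.
print(f"labeling index: {labeling_index(120, 1500):.1%}")
```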

Because BrdU goes everywhere in the body, Gage and his colleagues figured that in addition to labeling the patients’ tumors, the artificial base would also label the cells of the brain. If the researchers could get their hands on brain specimens from patients who’d been injected with BrdU, perhaps it would be possible to see new brain cells that had been generated in adults. With a second antibody, they could then screen for cell-type markers to determine if the new cells were mature neurons. “If you can . . . [use] a second antibody to identify the fate of a cell, then that’s pretty definitive,” Gage says.

As Gage’s fellows and postdocs left the lab and got involved in clinical trials involving BrdU injections, they began to keep an eye out for postmortem brain samples that Gage could examine. In 1996, one of them came through. Neurologist Peter Eriksson, who at the time was working at the Sahlgrenska University Hospital in Gothenburg, Sweden, began sending Gage samples from the brains of deceased patients. Every few months, a new sample arrived. And while waiting for the next delivery, Gage and his team were “getting fresh tissue from the coroner’s office to practice staining fresh tissue,” he says, “so that when we got these valuable brains we could see [what] they were doing.”

Soon enough, a clear picture emerged: the human hippocampus, a brain area critical to learning and memory and often the first region damaged in Alzheimer’s patients, showed evidence of adult neurogenesis. Gage’s collaborators in Sweden were getting the same results. Wanting to be absolutely positive, Gage even sent slides to other labs to analyze. In November 1998, the group published its findings, which were featured on the cover of Nature Medicine.1

“When it came out, it caught the fancy of the public as well as the scientific community,” Gage says. “It had a big impact, because it really confirmed [neurogenesis occurs] in humans.”

Fifteen years later, in 2013, the field got its second (and only other) documentation of new neurons being born in the adult human hippocampus—and this time learned that neurogenesis may continue for most of one’s life.2 Neuroscientist Jonas Frisén of the Karolinska Institute in Stockholm and his colleagues took advantage of the aboveground nuclear bomb tests carried out by US, UK, and Soviet forces during the Cold War. Atmospheric levels of 14C have been declining at a known rate since such testing was banned in 1963, and Frisén’s group was able to date the birth of neurons in the brains of deceased patients by measuring the amount of 14C in the cells’ DNA.
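The dating logic rests on a simple match: genomic DNA is synthesized only at cell division, so the 14C level fixed in a neuron's DNA is a timestamp of its birth, which can be looked up against the known atmospheric curve. A minimal sketch follows; the curve values below are illustrative stand-ins, not calibrated atmospheric records:

```python
# Illustrative post-bomb atmospheric 14C levels (Delta-14C, per mil) by year.
# These numbers are invented placeholders for the real calibrated curve.
ATMOSPHERIC_14C = {
    1963: 700.0, 1970: 550.0, 1980: 280.0,
    1990: 150.0, 2000: 90.0, 2010: 40.0,
}

def estimate_birth_year(dna_delta_14c: float) -> int:
    """Return the year whose atmospheric 14C level most closely matches
    the level measured in the cells' genomic DNA."""
    return min(ATMOSPHERIC_14C,
               key=lambda year: abs(ATMOSPHERIC_14C[year] - dna_delta_14c))

# A measured DNA Delta-14C of 155 per mil points to a birth year
# near 1990 on this toy curve.
print(estimate_birth_year(155.0))  # -> 1990
```

The real analysis models mixtures of cells born in different years rather than matching a single value, but the principle is the same: the declining bomb-pulse curve turns a 14C measurement into a birth date.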

“What we found was that there was surprisingly much neurogenesis in adult humans,” Frisén says—a level comparable to that of a middle-aged mouse, the species in which the vast majority of adult neurogenesis research is done. “There is hippocampal neurogenesis throughout life in humans.”

But many details remain unclear. How do newly generated neurons in adults influence brain function? Do disruptions to hippocampal neurogenesis play roles in cognitive dysfunction, mood disorders, or even psychosis? Are there ways to increase levels of neurogenesis in humans, and might doing so be therapeutic? Researchers are now seeking to answer these and other questions, while documenting the extent and function of adult neurogenesis in mammals.

Breaking the mold


RODENT NEUROGENESIS: In rodents, there are two populations of neural stem cells in the adult brain. The majority of new neurons are born in the subventricular zone along the lateral ventricle wall and migrate through the rostral migratory stream (RMS) to the olfactory bulb. About one-tenth as many new neurons are produced in the subgranular zone of the dentate gyrus (white) of the hippocampus.
See full infographic: WEB | PDF

In the early 1960s, MIT neurobiologist Joseph Altman used a hypodermic needle to induce lesions in rat brains, while simultaneously injecting tritiated thymidine, a radioactive form of the nucleoside commonly used for tracking DNA synthesis and cell proliferation. He found evidence of new brain cells that had been born at the time of injection, including some neurons and their neuroblast precursors.3

Researchers were immediately skeptical of the results. Long-standing theory held that neurons in the brain that had been damaged or lost could not be replaced; Altman was suggesting the opposite. “They were really good, solid indications, but it was such a strong dogma that neurons couldn’t be generated in the adult brain,” says Frisén. “It wasn’t really until the ’90s, when new techniques came along, [that researchers] showed, yes, indeed, new neurons are added in the rodent brain.”

Those new techniques included BrdU, as well as neuron-specific protein markers and confocal imaging, which together enabled researchers to identify the newly generated cells. Multiple studies subsequently confirmed that neurogenesis occurs in limited regions of the rodent brain, specifically in the olfactory bulb and the dentate gyrus region of the hippocampus. (See illustration.) Research also revealed that the rate of neurogenesis decreases with stress, depression, and anxiety, but increases with exercise and enrichment.


HUMAN NEUROGENESIS: Researchers have also demonstrated that neurogenesis occurs in the adult human brain, though the locations and degree of cell proliferation appear to differ somewhat from rodents.
See full infographic: WEB | PDF

“The field grew enormously at this point,” Gage says, and its focus began to shift from whether new neurons were being produced—they were—to whether those cells formed connections with existing networks to become functional—they do. Turns out, “these newly born cells have 5,000 synapses on their dendrites,” Gage says—well within the range of other neurons in the brain.

But would those rodent results hold up in primates? All signs pointed to yes. In March 1998, Princeton University’s Elizabeth Gould and colleagues found evidence of neurogenesis in the dentate gyrus of adult marmoset monkeys—and the researchers determined that the rate of cell proliferation was affected by stress, just as in rodents.4 Six months later, Gage’s group published its findings based on the clinical samples of human brain tissue. “It was a surprise to me, and I think to most people,” Frisén says. And the point was hammered home with the Frisén group’s analysis of 14C in human brain samples.

“The human evidence now unequivocally suggests that the dentate gyrus in humans undergoes turnover in our lifetime,” says Amar Sahay of Harvard University. “It really begs the question what the functions are of these adult-born neurons.”

Young and excitable

The first step in understanding the function of the new neurons in the adult brain was to characterize the cells themselves. In the late 1990s and early 2000s, researchers delved into the cell biology of neurogenesis, characterizing the populations of stem cells that give rise to the new neurons and the factors that dictate the differentiation of the cells. They also documented significant differences in the behavior of young and old neurons in the rodent brain. Most notably, young neurons are a lot more active than the cells of established hippocampal networks, which are largely inhibited.5,6

“For a period of about four or five weeks, while [the newborn neurons] are maturing, they’re hyperexcitable,” says Gage. “They’ll fire at anything, because they’re young, they’re uninhibited, and they’re integrating into the circuit.”

To determine the functional role of the new, hyperactive neurons, researchers began inhibiting or promoting adult neurogenesis in rodents by various means, then testing the animals’ performance in various cognitive tasks. What they found was fairly consistent: the young neurons seemed to play a role in processing new stimuli and in distinguishing them from prior experiences. For example, if a mouse is placed in a new cage and given time to roam, then subjected to a mild shock, it will freeze for about 40 seconds the next time it is placed in that same environment, in anticipation of a shock. It has no such reaction to a second novel environment. But in an enclosure that has some features in common with the first, fear-inducing cage, the mouse freezes for 20 seconds before seemingly surmising that this is not the cage where it received the initial shock. Knock out the mouse’s ability to produce new neurons, however, and it will freeze for the full 40 seconds. The brain is not able to easily distinguish between the enclosures.

This type of assessment is called pattern separation. While some researchers quibble over the term, which is borrowed from computational neuroscience, most who study hippocampal neurogenesis agree that this is a primary role of new neurons in the adult brain. “While probably five or six different labs have been doing this over the last four or five years, basically everybody’s come to the same conclusion,” Gage says.



The basic idea is that, because young neurons are hyperexcitable and are still establishing their connectivity, they are amenable to incorporating information about the environment. If a mouse is placed in a new cage when young neurons are still growing and making connections, they may link up with the networks that encode a memory of the environment. Just a few months ago, researchers in Germany and Argentina published a mouse study demonstrating how, during a critical period of cellular maturation, new neurons’ connections with the entorhinal cortex, the main interface between the hippocampus and the cortex, and with the medial septum change in response to an enriched environment.7



“The rate at which [new neurons] incorporate is dependent upon experience,” Gage says. “It’s amazing. It means that the new neurons are encoding things when they’re young and hyperexcitable that they can use as feature detectors when they’re mature. It’s like development is happening all the time in your brain.”

Adding support to the new neurons’ role in pattern separation, Sahay presented findings at the 2014 Society for Neuroscience conference that neurogenesis spurs circuit changes known as global remapping, in which overlap between the populations of neurons that encode two different inputs is minimized.8 “We have evidence now that enhancing neurogenesis does enhance global remapping in the dentate gyrus,” says Sahay. “It is important because it demonstrates that stimulating neurogenesis is sufficient to improve this very basic encoding mechanism that allows us to keep similar memories separate.”
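The overlap that global remapping minimizes can be quantified with a simple set measure over the populations of active cells. A toy sketch, with invented cell IDs (this is an illustration of the concept, not the analysis used in the study):

```python
def population_overlap(active_a: set, active_b: set) -> float:
    """Jaccard overlap between the sets of neurons active in two contexts.

    0.0 means fully remapped (no shared cells); 1.0 means the two
    contexts recruit identical populations.
    """
    if not active_a and not active_b:
        return 0.0
    return len(active_a & active_b) / len(active_a | active_b)

# Invented cell IDs recorded in two similar environments.
context_a = {1, 2, 3, 4, 5, 6}
context_b = {5, 6, 7, 8, 9, 10}
print(population_overlap(context_a, context_b))  # -> 0.2
```

On this picture, enhancing neurogenesis pushes the overlap between encodings of similar experiences toward zero, which is what keeps the corresponding memories separate.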


NEWBORN PHOTOS: Three different micrographs show new neurons (top and bottom, green; middle, white) generated in the adult mouse brain. Neurogenesis occurs in the dentate gyrus, a V-shaped structure within the hippocampus. Bushy dendritic processes extend into a relatively cell-free region called the molecular layer (black in bottom photo) and axons project to the CA3 region of the hippocampus (green “stem” in bottom photo). Astrocytes are stained pink (top). COURTESY OF FRED H. GAGE

Pattern separation is likely not the only role of new neurons in the adult hippocampus. Experiments that have suppressed neurogenesis in adult rats have revealed impairments in learning in a variety of other tasks. More broadly, “we think it has to do with the flexibility of learning,” says Gerd Kempermann of the Center for Regenerative Therapies at the Dresden University of Technology in Germany.

Last year, for example, neuroscientist Paul Frankland of the Hospital for Sick Children in Toronto and his colleagues found evidence that newly generated neurons play a role in forgetting, with increased neurogenesis resulting in greater forgetfulness among mice.9 “If you think about what you’ve done today, you can probably remember in a great deal of detail,” he says. “But if you go back a week or if you go back a month, unless something extraordinary happened, you probably won’t remember those everyday details. So there’s a constant sort of wiping of the slate.” New hippocampal neurons may serve as the “wiper,” he says, “cleaning out old information that, with time, becomes less relevant.”

Conversely, Frankland’s team found, suppressing neurogenesis seems to reinforce memories, making them difficult to unlearn. “We think that neurogenesis provides a way, a mechanism of living in the moment, if you like,” he says. “It clears out old memories and helps form new memories.”

Neurogenesis in the clinic

While studying the function of hippocampal neurogenesis in adult humans is logistically much more difficult than studying young neurons in mice, there is reason to believe that much of the rodent work may also apply to people—namely, that adult neurogenesis plays some role in learning and memory, says Kempermann. “Given that [the dentate gyrus] is so highly conserved and that the mechanisms of its function are so similar between the species—and given that neurogenesis is there in humans—I would predict that the general principle is the same.”

And if it’s true that hippocampal neurogenesis does contribute to aspects of learning involved in the contextualization of new information—an ability that is often impaired among people with neurodegenerative diseases—it’s natural to wonder whether promoting neurogenesis could affect the course of Alzheimer’s disease or other human brain disorders. Epidemiological studies have shown that people who lead an active life—known from animal models to increase neurogenesis—are at a reduced risk of developing dementia, and several studies have found reduced hippocampal neurogenesis in mouse models of Alzheimer’s. But researchers have yet to definitively prove whether neurogenesis, or lack thereof, plays a direct role in neurodegenerative disease progression. It may be that neurogenesis has “nothing to do with the pathology itself, but [with] the ability of our brain to cope with it,” says Kempermann.

Either way, the research suggests that “the identification of pro-neurogenic compounds would have a therapeutic impact on cognitive dysfunction, specifically, pattern separation alterations in aging and early stages of Alzheimer’s disease,” notes Harvard’s Sahay. “There’s a growing list of genes that encode secreted factors or other molecules that stimulate neurogenesis. Identifying compounds that harness these pathways—that’s the challenge.”

The birth of new neurons in the adult hippocampus may also influence the development and progression of mood disorders. Several studies have suggested that reduced neurogenesis may be involved in depression, for instance, and have revealed evidence that antidepressants act, in part, by promoting neurogenesis in the hippocampus. When Columbia University’s René Hen and colleagues short-circuited neurogenesis in mice, the animals no longer responded to the antidepressant fluoxetine.10 “It was a very big surprise,” Hen says. “The hippocampus has really been always thought of as critical for learning and memory, and it is, but we still don’t understand well the connection to mood.”

Adult neurogenesis has also been linked to post-traumatic stress disorder (PTSD). While it is perhaps less obvious how young neurons might influence the expression of fear, Sahay says it makes complete sense, given the emerging importance of neurogenesis in distinguishing among similar experiences. “In a way, the hippocampus acts as a gate,” he says, with connections to the amygdala, which is important for processing fear, and the hypothalamus, which triggers the production of stress hormones, among other brain regions. “It determines when [these] other parts of the brain should be brought online.” If new neurons are not being formed in the hippocampus, a person suffering from PTSD may be less able to distinguish a new experience from the traumatic one that is at the root of his disorder, Sahay and his colleagues proposed earlier this year.11 “We think neurogenesis affects the contextual processing, which then dictates the recruitment of stress and fear circuits.”

Of course, the big question is whether researchers might one day be able to harness neurogenesis in a therapeutic capacity. Some scientists, such as Hongjun Song of Johns Hopkins School of Medicine, say yes. “I think the field is moving toward [that],” he says. “[Neurogenesis] is not something de novo that we don’t have at all—that [would be] much harder. Here, we know it happens; we just need to enhance it.” 


  1. P.S. Eriksson et al., “Neurogenesis in the adult human hippocampus,” Nat Med, 4:1313-17, 1998.
  2. K.L. Spalding et al., “Dynamics of hippocampal neurogenesis in adult humans,” Cell, 153:1219-27, 2013.
  3. J. Altman, “Are new neurons formed in the brains of adult mammals?” Science, 135:1127-28, 1962.
  4. E. Gould et al., “Proliferation of granule cell precursors in the dentate gyrus of adult monkeys is diminished by stress,” PNAS, 95:3168-71, 1998.
  5. C. Schmidt-Hieber et al., “Enhanced synaptic plasticity in newly generated granule cells of the adult hippocampus,” Nature, 429:184-87, 2004.
  6. S. Ge et al., “A critical period for enhanced synaptic plasticity in newly generated neurons of the adult brain,” Neuron, 54:559-66, 2007.
  7. M. Bergami et al., “A critical period for experience-dependent remodeling of adult-born neuron connectivity,” Neuron, 85:710-17, 2015.
  8. K. McAvoy et al., “Rejuvenating the dentate gyrus with stage-specific expansion of adult-born neurons to enhance memory precision in adulthood and aging,” Soc Neurosci, Abstract DP09.08/DP8, 2014.
  9. K.G. Akers et al., “Hippocampal neurogenesis regulates forgetting during adulthood and infancy,” Science, 344:598-602, 2014.
  10. L. Santarelli et al., “Requirement of hippocampal neurogenesis for the behavioral effects of antidepressants,” Science, 301:805-09, 2003.
  11. A. Besnard, A. Sahay, “Adult hippocampal neurogenesis, fear generalization, and stress,” Neuropsychopharmacology, doi:10.1038/npp.2015.167, 2015.



Brain Genetics Paper Retracted [841]

by System Administrator - Friday, 5 September 2014, 19:51

Brain Genetics Paper Retracted

A study that identified genes linked to communication between different areas of the brain has been retracted by its authors because of statistical flaws. 

By Anna Azvolinsky

The authors of a June PNAS paper that purported to identify sets of genes associated with a specific brain function last week (August 29) retracted the work because of flaws in their statistical analyses. “We feel that the presented findings are not currently sufficiently robust to provide definitive support for the conclusions of our paper, and that an extensive reanalysis of the data is required,” the authors wrote in their retraction notice.

The now-retracted study identified a set of gene ontologies (GO) associated with a brain phenotype that has been previously shown to be disturbed in patients with schizophrenia. Andreas Meyer-Lindenberg, director of the Central Institute of Mental Health Mannheim, Germany, and his colleagues had healthy volunteers perform a working memory task known to require communication between the hippocampus and the prefrontal cortex while scanning their brains using functional magnetic resonance imaging (fMRI). The volunteers also underwent whole-genome genotyping. Combining the fMRI and genomic data, the researchers identified groups of genes that appeared associated with communication between the two brain regions, which can be disturbed in some people with schizophrenia. The authors used gene set enrichment analysis to pick out genes associated with this brain phenotype, identifying 23 that could be involved in the pathology of the brain disorder.

The Scientist first learned of possible problems with this analysis when the paper was under embargo prior to publication. At that time, The Scientist contacted Paul Pavlidis, a professor of psychiatry at the University of British Columbia who was not connected to the work, for comment on the paper. He pointed out a potential methodological flaw that could invalidate its conclusions. After considering the authors’ analyses, Pavlidis reached out to Meyer-Lindenberg’s team to discuss the statistical issues he perceived.

The original analysis had flagged a set of 11 genes lying close to one another in the genome that were all tagged by the same single nucleotide polymorphism (SNP), inflating the significance of the results. “The researchers found a variant near a genomic region that they say is correlated with the [working memory] task,” explained Pavlidis. “But instead of counting that variant once, that variant was counted 11 times.”
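The flaw Pavlidis identified amounts to crediting one unit of statistical evidence to many neighboring genes at once. A hedged sketch of the kind of collapsing step that avoids it; the gene names and SNP IDs are invented, and this is not the pipeline the authors used:

```python
from collections import defaultdict

def collapse_by_snp(gene_to_snp: dict) -> dict:
    """Group genes by their best-associated SNP so that each variant
    contributes one independent signal to a downstream enrichment test,
    rather than one signal per nearby gene."""
    snp_to_genes = defaultdict(list)
    for gene, snp in gene_to_snp.items():
        snp_to_genes[snp].append(gene)
    return dict(snp_to_genes)

# Invented mapping: three neighboring genes all tagged by the same variant.
mapping = {"GENE_A": "rs123", "GENE_B": "rs123",
           "GENE_C": "rs123", "GENE_D": "rs456"}
collapsed = collapse_by_snp(mapping)
print(len(collapsed))  # -> 2 independent signals, not 4
```

Counting the collapsed signals, rather than the raw gene list, is what keeps a single correlated variant from appearing as 11 separate hits.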

“When we re-analyzed the data, we saw that we had to retract because addressing the problem went beyond just an erratum,” Meyer-Lindenberg told The Scientist. “The analysis could not be used to make any conclusions with the required statistical confidence.”

Elizabeth Thomas, who studies the molecular mechanisms of neurological disorders at The Scripps Research Institute in La Jolla, California, and was not involved in the work, noted that the GO annotations used in the study were outdated. “GOs change every few months, and it’s unfortunate for researchers that rely on a certain set of annotations. It makes you wonder whether the papers published in the past five to 10 years are still relevant,” said Thomas. “This retraction raises the issue of how many papers may have falsely reported gene associations because of the constantly evolving changes in gene assemblies and boundaries. That’s really alarming to me.”

According to Meyer-Lindenberg, the researchers are re-evaluating their data using a different set of criteria and several updated sets of GO annotations.

In the meantime, however, Pavlidis lauded the researchers’ swift decision to retract. “What the authors did was the right thing,” he said.


Brain Imaging Can Predict the Success of Large Public Health Campaigns [1540]

by System Administrator - Thursday, 29 October 2015, 21:23

Brain Imaging Can Predict the Success of Large Public Health Campaigns

Annenberg School for Communication - University of Pennsylvania

Related People: Emily Falk, Ph.D.; Matt O'Donnell, Ph.D.

It’s a frustrating fact that most people would live longer if only they could make small changes: stop smoking, eat better, exercise more, practice safe sex. Health messaging is one important way to change behavior on a large scale, but while a successful campaign can improve millions of lives, a failed one can be an enormous waste of resources.

The red highlighted area depicts the brain area of interest in this study.

In the study, 50 smokers in Michigan viewed 40 anti-smoking images while fMRI measured activity in their medial prefrontal cortex (MPFC). These same 40 images were then used in an email campaign sent to 800,000 smokers by the New York State Smokers Quitline. The email to each smoker contained one of the images randomly assigned, along with the identical message: “Quit smoking. Start Living.” It also provided a link where smokers could get free help to quit.

Not all images were created equal: Among those who opened the email, click through rates varied from 10% for the least successful images to 26% for the most successful.


Examples of three of the images used as part of the New York State Quitline campaign.

But interestingly, the negative anti-smoking images which elicited the most powerful brain response in the MPFC of 50 smokers in Michigan were also the most successful at getting the hundreds of thousands of New York smokers to click for help in quitting. (And research has shown that visits to a quit-smoking site correlate with the likelihood that someone actually will quit.)

“By their nature, messages about the risks we take with our health — whether it be by smoking, eating poorly, or not exercising — cause people to become defensive,” says Falk. “If you can get around that defensiveness by helping people see why advice might be relevant or valuable to them, your messaging will have a more powerful effect.”

By combining the self-reported survey data — what smokers said they found effective — with the brain responses from the fMRI, the researchers found that they could more accurately predict which messages would be effective than by knowing either on its own.

Using the brain to predict the success or failure of advertising campaigns has long been a holy grail for marketers. Although some have made claims to proprietary methods for doing so, the science behind it has been opaque. This study by Falk and her colleagues is among the first to demonstrate specific brain patterns to predict the success of public health campaigns.

“If you ask people what they plan to do or how they feel about a message, you get one set of answers,” says Falk. “Often the brain gives a different set of answers, which may help make public health campaigns more successful. My hope is that moving forward, we might be able to use what we learned from this study and from other studies to design messages that are going to help people quit smoking and make them healthier and happier in the long run.”

Co-authors on the study include Matthew B. O’Donnell from the University of Pennsylvania; Steve Tompson, Richard Gonzalez, Sonya Dal Cin, Victor Strecher, and Lawrence An of the University of Michigan; and K.M. Cummings of the Medical University of South Carolina.

This study was funded by The Michigan Center of Excellence in Cancer Communication Research (NIH-P50 CA101451), and the National Institutes of Health New Innovator Award (NIH 1DP2DA03515601).

Media contact: Julie Sloane, Annenberg School for Communication, 215-746-1798



Brain Imaging Shows Why Kids with Autism Have Social Difficulties [1657]

by System Administrator - Thursday, 4 February 2016, 00:19

Brain Imaging Shows Why Kids with Autism Have Social Difficulties

Scientists believe that children with autism spectrum disorder (ASD) have difficulties in social interactions at least partly due to an inability to understand other people’s thoughts and feelings through a process called “theory of mind,” or ToM.

A new innovative brain imaging study has uncovered new evidence explaining why ToM deficiencies are present in ASD children. The researchers found disruptions in the brain’s circuitry involved in ToM at multiple levels compared to typical brain functioning. The findings provide valuable insight into an important neural network tied to the social symptoms in children with ASD. 

“Reduced brain activity in ToM-related brain regions and reduced connectivity among these regions in children with autism suggest how deficits in the neurobiological mechanisms can lead to difficulties in cognitive and behavioral functioning, such as theory of mind,” said Marcel Just, the D.O. Hebb University Professor of Psychology at Carnegie Mellon University.

“Weaker coordination and communication among core brain areas during social thinking tasks in autism provides evidence for how different brain areas in autism struggle to work together as a team.”

The researchers used an approach first developed by Fulvia Castelli and her colleagues in the U.K. that created animation videos showing two geometric shapes moving around the screen. The shapes, such as a large red triangle and a small blue triangle, moved in ways that could be perceived as an interaction between them, such as coaxing or dancing.

The team demonstrated that “seeing” the interactions was in the mind of the beholder, or to be more specific, in the ToM circuitry of the viewer’s brain. Without ToM, it just looked like geometric shapes moving around the screen.

To better understand the neural mechanisms involved with ToM, the scientists asked 13 high-functioning children with ASD between the ages of 10 and 16, as well as 13 similarly aged children without ASD, to watch these short animated films. The children were asked to identify the thoughts and feelings, or mental states, of those triangles while having their brains scanned by an fMRI scanner.

The ASD children showed significantly reduced activation compared to the control group children in the brain regions considered to be part of the ToM network, such as the medial frontal cortex and temporo-parietal junction. Furthermore, the synchronization between such pairs of regions was lower in the autism group. 

The findings support Just’s previous research, from 2004, which first reported this lower synchronization. In later studies, Just continued to show how this theory accounted for many brain imaging and behavioral findings during tasks that are heavily linked to the frontal cortex.

“One reason this finding is so interesting is that the ‘actors’ in the films have no faces, facial expressions or body posture on which to base a judgment of an emotion or attitude,” said Rajesh Kana, associate professor of psychology at the University of Alabama at Birmingham.

“The neurotypical children managed to identify a social interaction without social cues, such as interpreting the large triangle nudging the smaller one as a parent’s attempt to encourage a child, but the ASD children were unable to make the connection.”

Until now, most research focused on the connectivity among core brain regions in ASD has focused on adults, limiting knowledge about how the disorder affects younger people.

“By studying children, we were able to show that it is possible to characterize the altered brain circuitry earlier in development, which could lead to designing earlier effective intervention programs that could train children to infer the intentions and thoughts that underlie physical interactions between people,” Just said. “For example, children could be trained to distinguish between a helpful nudge and a hostile poke.”

The findings are published in the journal Molecular Autism.

