Neurociencia | Neuroscience




S


Sabotaging the Job Search [1502]

by System Administrator - Monday, 12 October 2015, 17:23
 

Signs that you may be sabotaging your own job search

by Andreina Villar-Woodworth

I work every day with people in crisis. For whatever reason, whether they have decided to leave the company they are in, have been let go, are looking for a change, are burned out, are fleeing their country, feel stuck in their current job, are anticipating the closure of their company or industry, or simply no longer want to keep doing the same work, the people I work with every day are searching for, wanting, and longing for something new in their lives. And it is a beautiful thing to help them put those changes into motion. Yet after many years of experience I have also learned to detect the signs that a person is not emotionally ready to take that step. That can be VERY dangerous, because it can trap the person in an endless cycle of sabotaging their own search.

Let me start with Mariana Rivas, one of the most intelligent teaching professionals I know: a brilliant middle-aged woman who has excelled in every job she has held. When I began working with Mariana, she knew what she did NOT want, but not what she wanted. She first called me at the beginning of the year, and we spoke for about an hour and a half. I gave her instructions for our next step and an appointment for the following week. I did not hear from her again until nine months later, when she wanted to resume the process in the middle of a crisis that had just erupted in her current job. We talked again. I gave her another appointment. Again she left me waiting. Do you think Mariana is terribly rude? Or is what she is doing a classic case of sabotaging her own career process? You decide.

Well, Mariana is a lovely person. Nothing about her suggests she is rude. What she has is sheer terror and panic at facing that change of career and of life. As the saying goes, "better the devil you know than the devil you don't." She knows she needs to change in order to grow, but she is very afraid to do it.

Are you wondering whether you do the same thing with your own career and your own job search? Take a look at this list and see:

1. You know you need to change, and you don't. You can apply Mariana's example to yourself. You need to change jobs, but any excuse will do to avoid taking that step. It terrifies you because you don't know how to do it. The good news is that nowadays there are plenty of job-search experts who can guide you!

2. Poor handling of priorities. Picking the kids up from school, shopping, errands, projects around the house, friends: EVERYTHING is more important than your career, and especially more important than your search.

3. You have been searching for a while with no results. Something is going on. These days there are opportunities for practically everyone, so if you are not getting calls you should review what you are doing and how to improve it. It could be your CV, your interviewing, or your search methods. There is plenty of information available.

4. You have been to many interviews and nobody has made you an offer. Careful! This can be a classic symptom of self-sabotage (although in some cases interview coaching can also solve it). The recommended step is to review with an expert coach what may be going on.

The bottom line is that if, on any of these points, you detect irrational fear, panic, or terror of change, you may be sabotaging yourself without realizing it. In that case, take a deep, honest look at yourself. I invite you to give yourself time, to look into all the wonderful possibilities you have today for finding a job and changing careers, and into how you can help yourself. Things are not the way they used to be (thankfully), and although the working world is more competitive, the options are greater and better for most people; sometimes it is good to reinvent yourself.

I wish you much luck in your transition, and I am at your service. Don't miss the next installment, where I will talk about strategies for counteracting self-sabotage!

If you have enjoyed this article you can also read:

Link: https://www.linkedin.com

 

 


Physical Health of the Patient with Depression [883]

by System Administrator - Friday, 19 September 2014, 16:49
 

 

DEPRESSION AND CHRONIC DISEASES

New guidelines on the physical health of patients with depression

Comorbidity between depression and physical illness is very common. A consensus of Spanish experts offers recommendations on the prevention, diagnosis, and management of the conditions associated with depression.

Although underdiagnosed in primary care, neuropsychiatric disorders account for 28% of disability-adjusted life years. Depression has been shown to have a major impact on the worsening of associated somatic disease, especially neurological, cardiovascular, and metabolic diseases and cancer. This neuropsychiatric condition also has a negative effect on treatment adherence, functional capacity, and the patient's quality of life, according to a consensus document prepared by a multidisciplinary Spanish scientific committee that provides guidelines on the prevention, diagnosis, and management of the medical illnesses related to depression.

Entitled "Consenso español de salud física del paciente con depresión" and endorsed by the Spanish societies of Psychiatry and of Primary Care Physicians, the study states that the prevalence of depression in patients with other conditions is higher than in the general population, with overall figures above 20%. According to these Spanish experts, this high rate can be explained by the effects of some treatments, and by the fact that the pathophysiology of certain diseases appears to play a role in the development of depression (as in some neuroendocrine or autoimmune disorders).

In patients with cardiovascular disease, the prevalence of depression is disproportionately high: according to these authors, it reaches 50% among patients admitted for coronary bypass surgery or for an acute coronary syndrome. The explanations proposed for this comorbidity revolve around three kinds of mechanism: biological (alterations of the hypothalamic-pituitary-adrenocortical axis), psychosocial, and lifestyle-related. Among endocrine disorders, the incidence of depression in patients with diabetes was found to be two to three times higher than in the general population. The co-occurrence of depression and chronic pain is also very common, reaching 56% in primary care and between 50% and 69% in specialized care. These investigators further note that people with depression show increased levels of certain inflammatory markers, such as C-reactive protein, interleukin-6, and TNFα. This immune dysregulation may explain the comorbidity between depression and some physical illnesses such as cardiac disorders or diabetes.

Antidepressant treatment and physical health

When choosing treatment for patients with related medical comorbidity, the document holds that two fundamental aspects must be considered: on the one hand, the side effects of the antidepressant in relation to the somatic condition; on the other, the possible interactions between the antidepressant and the drugs used to treat the physical illness. The authors also recommend psychotherapy, alone or in combination with pharmacological treatment, for patients with mild or moderate depression, especially the techniques that have shown the greatest efficacy: cognitive-behavioral strategies.

As the study's conclusion, the authors drafted a ten-point list of their most important recommendations on this clinical problem. One of the guidelines is that, in caring for patients with depression and comorbid medical conditions, coordination among the primary care physician, the psychiatrist, and the other specialists involved is essential. The document also stresses the need for public health authorities to recognize the importance of the association between depression and physical illness, in order to facilitate its early detection and appropriate care.

Access it now by clicking on "Consenso español de salud física del paciente con depresión", available for reading at RIMA's Core Journals Institute.

Link: https://www.rima.org


SAS Best Practices White Paper [772]

by System Administrator - Tuesday, 19 August 2014, 16:58
 

Anatomy of an Analytic Enterprise: a SAS Best Practices White Paper

Driven by the big data movement, many conversations about creating a data-driven organization begin by focusing on the acquisition and storage of data in all its forms. Data lakes are being created and data galore is flowing in. The issue immediately becomes:

What could and should be done with all that data?

 

Side note: Storage is cheap. That doesn't mean managing and maintaining the data comes free.

In the end, the organization with the most data does not win. It is the organization that does the most with its data that will ultimately prevail. And herein lies the crux of the issue.

Please read the attached whitepapers.


Scientists Connect Brain to a Basic Tablet—Paralyzed Patient Googles With Ease [1534]

by System Administrator - Wednesday, 28 October 2015, 14:39
 

Scientists Connect Brain to a Basic Tablet—Paralyzed Patient Googles With Ease

BY SHELLY FAN

For patient T6, 2014 was a happy year.

That was the year she learned to control a Nexus tablet with her brain waves, taking her quality of life from 1980s DOS to modern-era Android OS.

A brunette lady in her early 50s, patient T6 suffers from amyotrophic lateral sclerosis (also known as Lou Gehrig’s disease), which causes progressive motor neuron damage. Mostly paralyzed from the neck down, T6 retains her sharp wit, love for red lipstick and miraculous green thumb. What she didn’t have, until recently, was the ability to communicate with the outside world.

Brain-Machine Interfaces

Like T6, millions of people worldwide have severe paralysis from spinal cord injury, stroke or neurodegenerative diseases, which precludes their ability to speak, write or otherwise communicate their thoughts and intentions to their loved ones.

 

The field of brain-machine interfaces blossomed nearly two decades ago in an effort to develop assistive devices to help these “locked-in” people. And the results have been fantastic: eye- or head-tracking devices have allowed eye movement to act as an output system to control mouse cursors on computer screens. In some cases, the user could also perform the click function by staring intently at a single spot, known in the field as “dwell time.”

Yet despite a deluge of promising devices, eye-tracking remains imprecise and terribly tiring to the users’ eyes. Since most systems require custom hardware, this jacks up the price of admission, limiting current technology to a lucky select few.

“We really wanted to move these assisted technologies towards clinical feasibility,” said Dr. Paul Nuyujukian, a neuroengineer and physician from Stanford University, in a talk at the 2015 Society for Neuroscience annual conference that took place this week in Chicago.

That’s where the idea of neural prostheses came in, Nuyujukian said.

In contrast to eye-trackers, neural prostheses directly interface the brain with computers, in essence cutting out the middleman — the sensory organs that we normally use to interact with our environment.

Instead, a baby-aspirin-sized microarray chip is directly implanted into the brain, and neural signals associated with intent can be decoded by sophisticated algorithms in real time and used to control mouse cursors.

It’s a technology that’s leaps and bounds from eye-trackers, but still prohibitively expensive and hard to use.

Nuyujukian’s team, together with patient T6, set out to tackle this problem.

A Nexus to Nexus 9

Two years ago, patient T6 volunteered for the BrainGate clinical trials and had a 100-channel electrode array implanted into the left side of her brain in regions responsible for movement.

At the time, the Stanford subdivision was working on a prototype prosthetic device to help paralyzed patients type out words on a custom-designed keyboard by simply thinking about the words they want to spell.

 

The prototype worked like this: the implanted electrodes recorded her brain activity as she looked to a target letter on the screen, passed it on to the neuroprosthesis, which then interpreted the signals and translated them into continuous control of cursor movements and clicks.
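
The decoding step is the heart of that pipeline, although the article stops short of the math behind it. As a purely illustrative sketch, and not BrainGate's actual algorithm, the short Python fragment below shows the general shape of such a decoder: binned spike counts come in from the array, a previously calibrated linear readout turns them into a 2D cursor velocity plus a click score, and the cursor position is updated once per bin. The channel count, bin width, weights and threshold are all hypothetical stand-ins.

import numpy as np

# Illustrative linear decoder for an intracortical interface (a sketch, not BrainGate's code).
# Assumption: spike counts from a 100-channel array arrive in 50 ms bins, and a pre-fit
# weight matrix maps those counts to cursor velocity (vx, vy) plus a click score.

N_CHANNELS = 100      # electrodes in the implanted array
BIN_SECONDS = 0.05    # width of each spike-count bin

rng = np.random.default_rng(0)
W = rng.normal(scale=0.05, size=(3, N_CHANNELS))   # hypothetical decoder weights (vx, vy, click)
baseline = rng.poisson(5, size=N_CHANNELS)         # hypothetical per-channel baseline counts

def decode_bin(spike_counts, cursor_xy, click_threshold=1.0):
    """Turn one bin of spike counts into an updated cursor position and a click decision."""
    rates = (spike_counts - baseline) / BIN_SECONDS   # baseline-subtracted firing rates
    vx, vy, click_score = W @ rates                   # linear readout
    new_xy = cursor_xy + BIN_SECONDS * np.array([vx, vy])
    return new_xy, click_score > click_threshold

# Example: stream simulated bins through the decoder, one cursor update per bin.
cursor = np.zeros(2)
for _ in range(20):
    counts = rng.poisson(5, size=N_CHANNELS)
    cursor, clicked = decode_bin(counts, cursor)

Real clinical decoders are calibrated for each participant and use considerably more sophisticated, adaptive filters, but the input/output contract is the same one described above: neural activity in, continuous control signals out.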

In this way, T6 could type out her thoughts using the interface, in a way similar to an elderly technophobe reluctantly tapping out messages with a single inflexible finger.

The black-and-white setup was state-of-the-art in terms of response and accuracy. But the process was painfully slow, and even with extensive training, T6 often had to move her eyes to the delete button to correct her errors.

What the field needed was a flexible, customizable and affordable device that didn’t physically connect to a computer via electrodes, according to Nuyujukian. The team also wanted a user interface that didn’t look like it was designed in the 80s.

The team’s breakthrough moment came when they realized their point-and-click cursor system was similar to finger tapping on a touchscreen, something most of us do everyday.

 

“We were going to design our own touchscreen hardware, but then realized the best ones were already on the market,” laughed Nuyujukian, “so we went on Amazon instead and bought a Nexus 9 tablet.”

The team took their existing setup and reworked it so that patient T6’s brain waves could control where she tapped on the Nexus touchscreen. It was a surprisingly easy modification: the neuroprosthetic communicated with the tablet through existing Bluetooth protocols, and the system was up and running in less than a year.

“Basically the tablet recognized the prosthetic as a wireless Bluetooth mouse,” explained Nuyujukian. “We pointed her to a web browser app and told her to have fun.”

In a series of short movie clips, the team demonstrated patient T6 Googling questions about gardening, taking full advantage of the autocompletion feature to speed up her research. T6 had no trouble navigating through tiny links and worked the standard QWERTY keyboard efficiently.

“Think about it,” said Nuyujukian, obviously excited. “It’s not just a prettier user interface; she now has access to the entire Android app store.”

According to previous studies, the device can function at least two years without experiencing any hardware or software issues. The team is trying to make the implant even sturdier to extend its lifespan in the brain.

“We set out to utilize what’s already been perfected in terms of the hardware to make the experience more pleasant,” said Nuyujukian. “We’ve now showed that we can expand the scope of our system to a standard tablet.”

But the team isn’t satisfied. They are now working on ways to implement click-and-drag and multi-touch maneuvers. They also want to expand to other operating systems, enable the patients to use the device 24/7 without supervision, and expand their pilot program to more patients in all three of the BrainGate clinical sites.

“Our goal is to unlock the full user interface common to general-purpose computers and mobile devices,” said Nuyujukian. “This is a first step towards developing a fully-capable brain-controlled communication and computer interface for restoring function for people with paralysis.”

Image Credit: Shutterstock.com


Scientists learn how young brains form lifelong memories by studying worms' food choices [1683]

by System Administrator - Tuesday, 16 February 2016, 01:21
 

Scientists learn how young brains form lifelong memories by studying worms' food choices

by Rockefeller University

When young C. elegans worms taste poisonous food, they remember that experience for the rest of their life, neuroscientists have found. Their work is teasing apart the biological mechanisms that drive different types of learning.

A segment of a C. elegans worm with a RIM neuron shown in green. The scientists showed that this neuron produces a learning signal that makes the newly hatched worm able to remember an olfactory experience for the rest of its life.

Credit: Lulu and Anthony Wang Laboratory of Neural Circuits and Behavior at Rockefeller University/Cell

Members of neuroscientist Cori Bargmann's lab spend quite a bit of their time watching worms move around. These tiny creatures, Caenorhabditis elegans, feed on soil bacteria, and their very lives depend on their ability to distinguish toxic microbes from nutritious ones. In a recent study, Bargmann and her colleagues have shown that worms in their first larval stage can learn what harmful bacterial strains smell like, and form aversions to those smells that last into adulthood.

Many animals are capable of making vital, lifelong memories during a critical period soon after birth. The phenomenon, known as imprinting, allows newly hatched geese to bond with their moms, and makes it possible for salmon to return to their native stream after spawning. And while the learning processes of humans may be more complex and subtle, scientists have long known that our brain's ability to store a memory and maintain it long-term depends on when and how that memory was acquired.

"In the case of worms, we were fascinated to discover that their small and simple nervous system is capable of not only remembering things, but of forming long-term memories," says Bargmann, who is Torsten N. Wiesel Professor and head of the Lulu and Anthony Wang Laboratory of Neural Circuits and Behavior, as well as co-director of the new Kavli Neural Systems Institute at Rockefeller. "It invites the question of whether learning processes that happen during different life stages are biologically different."

In the study, she and Rockefeller graduate student Xin Jin let both young and adult worms learn to avoid food smells, and studied in detail the neural circuits that produced memories of the experience. Their findings, published in Cell, clarify which neurons, genes, and molecular pathways distinguish the two types of memory, providing new vistas into the neurobiology of learning.

Imprinted aversions last a lifetime

When adult C. elegans worms encounter pathogenic bacteria they avoid it by moving in the opposite direction, and they shun similar bacteria for about twenty-four hours. But their memory soon fades.

Young worms, on the other hand, form more lasting impressions. The researchers allowed newborn worms to hatch directly onto a lawn of pathogens, and left them there for their first twelve hours of life -- the first larval stage. (The bugs gave the worms intestinal infections, but didn't kill them.) Then, when the worms encountered the pathogens again as adults -- three days later -- they fled. Worms that hadn't been hatched onto poisonous bacteria found them just as attractive as harmless ones.

By silencing specific neurons in the worms and repeating the learning assays, Jin was able to determine each nerve cell's contribution to the memory process. The results of her experiments show that the neural circuits that mediate the two types of learning are similar, but not identical. Many neurons are needed for both imprinted and adult learning, but cells called AIB and RIM are uniquely important for the formation of the imprinted memory during the larval stage.

A similar picture emerged when the researchers compared the genes and signaling pathways that are activated when worms form imprinted versus short-term memories. The two processes rely on similar molecular components, but some genes were found to be specifically required for only one type of learning.

"These findings suggest that early imprinting isn't totally different from other learning--it's the same system enhanced with some special features," Bargmann says.

How memories are formed, stored, and retrieved

Several neurological processes are at play when we learn new things. For example, when a baby songbird learns a song from an adult bird, a memory of the tutor's performance must first form and be stored in his brain. Then, when it's time for the bird to debut with his own song, that memory must be retrieved to practice and then perform a vocal behavior.

Because most animals' brains are very complex, it has been difficult for scientists to study these elements of learning in detail. By using C. elegans, whose modest brain has only 302 neurons, the researchers were able to shed light on the neural circuits that drive the formation and retrieval of a worm's memory--and the two processes turned out to be neurologically distinct.

"We learned that when worms form an early memory of a food smell, they use one set of neurons to plant that memory," Bargmann says, "and later in life, when they encounter the same smell again, they use a different set of neurons to pull the memory out."

How memories are stored in the brain -- in what neurons they reside and what constitutes them at the molecular level -- remains elusive. But Bargmann says her lab's findings have laid the groundwork for future research into stored memory and other open questions.

"The most evocative thing about this work is that it reminds us that learning isn't some fancy innovation of a complex brain," she says. "It's a fundamental function that any nervous system can perform."

Story Source:

The above post is reprinted from materials provided by Rockefeller University. Note: Materials may be edited for content and length.

Journal Reference:

  1. Xin Jin, Navin Pokala, Cornelia I. Bargmann. Distinct Circuits for the Formation and Retrieval of an Imprinted Olfactory Memory. Cell, February 2016 DOI: 10.1016/j.cell.2016.01.007

Link: https://www.sciencedaily.com


Scientists show a link between intestinal bacteria and depression [1333]

by System Administrator - Friday, 7 August 2015, 17:30
 

Scientists show a link between intestinal bacteria and depression

by NeuroScientistNews

Immunostaining of mouse ileum. Credit: Vasanta Subramanian / Wellcome Images

Exploring the role of intestinal microbiota in the altered behavior that is a consequence of early life stress

Scientists from the Farncombe Family Digestive Health Research Institute at McMaster University have discovered that intestinal bacteria play an important role in inducing anxiety and depression.

The new study, published in Nature Communications, is the first to explore the role of intestinal microbiota in the altered behavior that is a consequence of early life stress.

"We have shown for the first time in an established mouse model of anxiety and depression that bacteria play a crucial role in inducing this abnormal behavior," said Premysl Bercik, senior author of the paper and an associate professor of medicine with McMaster's Michael G. DeGroote School of Medicine. "But it's not only bacteria, it's the altered bi-directional communication between the stressed host -- mice subjected to early life stress -- and its microbiota, that leads to anxiety and depression."

It has been known for some time that intestinal bacteria can affect behavior, but much of the previous research has used healthy, normal mice, said Bercik.

In this study, researchers subjected mice to early life stress with a procedure of maternal separation, meaning that from day three to 21, newborn mice were separated for three hours each day from their mothers and then put back with them.

First, Bercik and his team confirmed that conventional mice with complex microbiota, which had been maternally separated, displayed anxiety and depression-like behavior, with abnormal levels of the stress hormone corticosterone. These mice also showed gut dysfunction based on the release of a major neurotransmitter, acetylcholine.

Then, they repeated the same experiment in germ-free conditions and found that, in the absence of bacteria, mice that were maternally separated still had altered stress hormone levels and gut dysfunction, but they behaved similarly to the control mice, showing no signs of anxiety or depression.

Next, they found that when the maternally separated germ-free mice were colonized with bacteria from control mice, the bacterial composition and metabolic activity changed within several weeks, and the mice started exhibiting anxiety and depression.

"However, if we transfer the bacteria from stressed mice into non stressed germ-free mice, no abnormalities are observed. This suggests that in this model, both host and microbial factors are required for the development of anxiety and depression-like behavior. Neonatal stress leads to increased stress reactivity and gut dysfunction that changes the gut microbiota which, in turn, alters brain function," said Bercik.

He said that with this new research, "We are starting to explain the complex mechanisms of interaction and dynamics between the gut microbiota and its host. Our data show that relatively minor changes in microbiota profiles or its metabolic activity induced by neonatal stress can have profound effects on host behavior in adulthood."

Bercik said this is another step in understanding how microbiota can shape host behaviour, and that it may extend the original observations into the field of psychiatric disorders.

"It would be important to determine whether this also applies to humans. For instance, whether we can detect abnormal microbiota profiles or different microbial metabolic activity in patients with primary psychiatric disorders, like anxiety and depression," said Bercik.

Note: Material may have been edited for length and content. For further information, please contact the cited source.

McMaster University

Link: http://www.neuroscientistnews.com


Scientists Synthesize Bacteria with Smallest Genome Yet [1689]

by System Administrator - Saturday, 26 March 2016, 22:06
 

Scientists Synthesize Bacteria with Smallest Genome Yet

By Ewen Callaway, Nature magazine

"Minimal" cell raises the stakes in race to harness life’s building blocks

 

Researchers have designed and synthesized a minimal bacterial genome, containing only the genes necessary for life. This material relates to a paper that appeared in the March 25, 2016, issue of Science, published by AAAS. The paper, by C.A. Hutchison III at J. Craig Venter Institute in La Jolla, Calif., and colleagues was titled, "Design and synthesis of a minimal bacterial genome."
C. Bickel / Science (2016)

Genomics entrepreneur Craig Venter has created a synthetic cell that contains the smallest genome of any known, independent organism. Functioning with 473 genes, the cell is a milestone in his team’s 20-year quest to reduce life to its bare essentials and, by extension, to design life from scratch.

Venter, who has co-founded a company that seeks to harness synthetic cells for making industrial products, says that the feat heralds the creation of customized cells to make drugs, fuels and other products. But an explosion in powerful ‘gene-editing’ techniques, which enable relatively easy and selective tinkering with genomes, raises a niggling question: why go to the trouble of making new life when you can simply tweak what already exists?

J. Craig Venter, Ph.D.
Credit: J. Craig Venter Institute

Unlike the first synthetic cells made in 2010, in which Venter’s team at the J. Craig Venter Institute in La Jolla, California, copied an existing bacterial genome and transplanted it into another cell, the genome of the minimal cells is like nothing in nature. Venter says that the cell, which is described in a paper released on March 24 in Science, constitutes a brand new, artificial species.

“The idea of building whole genomes is one of the dreams and promises of synthetic biology,” says Paul Freemont, a synthetic biologist at Imperial College London, who is not involved in the work.

The design and synthesis of genomes from scratch remains a niche pursuit, and is technically demanding. By contrast, the use of genome editing is soaring—and its most famous tool, CRISPR–Cas9, has already gained traction in industry, agriculture and medicine, notes George Church, a genome scientist at Harvard Medical School in Boston, Massachusetts, who works with CRISPR. “With much less effort, CRISPR came around and suddenly there are 30,000 people practising CRISPR, if not more.”

Microbiologists were just starting to characterize the bacterial immune system that scientists would eventually co-opt and name CRISPR when Venter’s team began its effort to whittle life down to its bare essentials. In a 1995 Science paper, Venter’s team sequenced the genome of Mycoplasma genitalium, a sexually transmitted microbe with the smallest genome of any known free-living organism, and mapped its 470 genes. By inactivating genes one by one and testing to see whether the bacterium could still function, the group slimmed this list down to 375 genes that seemed essential.

One way to test this hypothesis is to make an organism that contains just those genes. So Venter, together with his close colleagues Clyde Hutchison and Hamilton Smith and their team, set out to build a minimal genome from scratch, by joining together chemically synthesized DNA segments. The effort required the development of new technologies, but by 2008, they had used this method to make what was essentially an exact copy of the M. genitalium genome that also included dozens of non-functional snippets of DNA ‘watermarks’.

But the sluggish growth of natural M. genitalium cells prompted them to switch to the more prolific Mycoplasma mycoides. This time, they not only synthesized its genome and watermarked it with their names and with famous quotes, but also implanted it into another bacterium that had been emptied of its own genome.

The resulting ‘JCVI-syn1.0’ cells were unveiled in 2010 and hailed—hyperbolically, many say—as the dawn of synthetic life. (The feat prompted US President Barack Obama to launch a bioethics review, and the Vatican to question Venter’s claim that he had created life.) However, the organism’s genome was built by copying an existing plan and not through design—and its bloated genome of more than 1 million DNA bases was anything but minimal.

In an attempt to complete its long-standing goal of designing a minimal genome, Venter’s team designed and synthesized a 483,000-base, 471-gene M. mycoides chromosome from which it had removed genes responsible for the production of nutrients that could be provided externally, and other genetic ‘flotsam’. But this did not produce a viable organism.

So, in a further move, the team developed a ‘design-build-and-test’ cycle. It broke the M. mycoides genome into eight DNA segments and mixed and matched these to see which combinations produced viable cells; lessons learned from each cycle informed which genes were included in the next design. This process highlighted DNA sequences that do not encode proteins but that are still needed because they direct the expression of essential genes, as well as pairs of genes that perform the same essential task—when such genes are deleted one at a time, both mistakenly seem to be dispensable.
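
Purely as an illustration of the logic of that cycle, and not the JCVI team's software or data, the toy search below assembles candidate designs from a pool of segments, "builds and tests" each one with a stand-in viability function, and reports the smallest design that passes. The segment names and the viability rule are invented for the example.

from itertools import combinations

# Toy "mix and match" search over genome segments (illustration only; the segment
# names and the viability rule are invented, not the actual experimental data).
SEGMENTS = list("ABCDEFGH")               # stand-ins for the eight DNA segments
REQUIRED = [{"A"}, {"C", "D"}, {"F"}]     # pretend needs: A, F, and at least one of C or D

def viable(design):
    """Stand-in for building a genome from `design` and testing whether cells grow."""
    return all(design & need for need in REQUIRED)

def smallest_viable(segments):
    """Try combinations from smallest to largest and return the first design that grows."""
    for size in range(1, len(segments) + 1):
        for combo in combinations(segments, size):
            if viable(set(combo)):
                return set(combo)

print(smallest_viable(SEGMENTS))   # e.g. {'A', 'C', 'F'}

The real project could not brute-force its way through designs like this, because every "test" meant synthesizing DNA and growing cells; that cost is exactly why the lessons from each round were fed back into the next design rather than enumerating every combination.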

Eventually, the team hit on the 531,000-base, 473-gene design that became known as JCVI-syn3.0 (syn2.0 was a less streamlined intermediary). Syn3.0 has a respectable doubling time of 3 hours, compared with, for instance, 1 hour for M. mycoides and 18 hours for M. genitalium.

“This old Richard Feynman quote, ‘what I cannot create, I do not understand’, this principle is now served,” says Martin Fussenegger, a synthetic biologist at the Swiss Federal Institute of Technology (ETH) in Zurich, Switzerland. “You can add in genes and see what happens.”

With nearly all of its nutrients supplied through growth media, syn3.0’s essential genes tend to be those involved in cellular chores such as making proteins, copying DNA and building cellular membranes. Astoundingly, Venter says that his team could not identify the function of 149 of the genes in syn3.0’s genome, many of which are found in other life forms, including humans. “We don’t know about a third of essential life, and we’re trying to sort that out now,” he says.

This has blown Fussenegger away. “We’ve sequenced everything on this planet, and we still don’t know 149 genes that are most essential for life!” he says. “This is the coolest thing I want to know.”

Syn3.0’s lasting impact on synthetic biology is an open question. “I think it’s kind of a George Mallory moment,” says Church, referring to the English mountaineer who died in 1924 trying to become the first person to climb Mount Everest. “‘Because it’s there’ was the excuse he gave for climbing Everest.”

Church says that genome-editing techniques will remain the go-to choice for most applications that require a small number of genetic alterations, whereas genome design will be useful for specialized applications, such as recoding an entire genome to incorporate new amino acids. Fussenegger thinks that genome editing will be the favoured approach for therapies, but that writing genomes from scratch will appeal to scientists interested in fundamental questions about how genomes evolve, for instance.

Even Venter acknowledges that syn3.0’s genome, although new, was designed by trial and error, rather than being based on a fundamental understanding of how to build a functioning genome. But he expects fast improvements, and thinks that genome synthesis from scratch will become the preferred approach for manipulating life. “If you want to make a few changes, CRISPRs are a great tool,” he says. “But if you’re really making something new and you’re trying to design life, CRISPRs aren’t going to get you there.”

This article is reproduced with permission and was first published on March 24, 2016.

Link: http://www.scientificamerican.com

 

 

 


Secrets to Long Haul Creativity [1617]

by System Administrator - Thursday, 24 December 2015, 19:14
 

Secrets to Long Haul Creativity

BY STEVEN KOTLER

About five years ago, I started thinking long and hard about a very specific type of creativity. Unlike most researchers, I was less interested in exploring the day-to-day puzzle of making something out of nothing and more about the equally baffling mystery of how to do this over a lifetime. Long Haul Creativity is how I’ve come to think of this topic.

Over the past few years, this topic has become a bit of an obsession. I’ve talked about it with everyone I know and have come to realize the answers I’ve gotten apply to far more folks than just writers. These days, creativity is a buzz word in just about every field. It seems like everyone’s hunting for more of this skill, but no one’s really talking about the ramifications of getting what we desire.

This is a critical point. Being creative over a career involves a whole subset of nearly invisible skills, a great many of which conflict with most people’s general ideas about what it means to be creative. What’s more, being creative is different from the business of being creative, and most people who learn how to be good at the first are often really terrible at the second. Finally, emotionally, creativity just takes a toll. Decade after decade, that toll adds up.

So here are eight of my favorite lessons on the hard fight of long-haul creativity. A few are my own. Most are things I learned from others. All have managed to keep me saner along the way.


One: Creativity De-Coded

 

The one thing neuroscientists know for sure about creativity is that it’s not one thing.* The brain is creative in dozens and dozens of different ways, which is why training people to be more creative can be so difficult. Yet, what we do know is that creativity is always recombinatory — it’s the product of novel information bumping into old ideas to produce something startlingly new.

What’s more, we also know that this recombinatory process always requires the interaction of three overlapping neural networks: attention, imagination and salience. Understanding how these networks work and how we can augment their effects gives long haul creatives some much needed leverage.

  1. Attention: This network governs executive attention or spotlight attention. It’s the go-to system for the hours-on-end laser-focus required by creativity. And this leads to an obvious intervention: anything that trains up attention, amplifies creativity. Almost any mindfulness practice will work or, if you prefer a more dynamic experience, the Flow Genome Project designed this Art of Flow video-meditation for those too twitchy to follow their breath.
  2. Imagination: The imagination network or, more formally, the default mode network (DMN), is all about mind-wandering. It’s what allows you to construct mental simulations of potential outcomes and test out creative possibilities. The trick here is that to activate the DMN you have to stop focusing on the problem you’ve been trying to solve. This means turning off the spotlight attention system. Research shows the best way to pull this off is via low-grade physical activity. I prefer gardening. Tim Ferriss (see below) likes long walks. But Lee Zlotoff, creator of the TV show MacGyver and (no surprise) an expert on creative problem solving, has tested dozens of different activities, and found that building models — airplanes, dinosaurs, whatever — consistently produces the best results.
  3. Salience: This network monitors incoming information and tags it as important or irrelevant. The more salient info the brain detects, the more raw material it has to be creative. The big issue here is that familiarity breeds contempt — meaning, when we are locked into our normal routine this network usually runs on autopilot. It notices what it always notices. The secret to getting its attention is risk and novelty. New experiences and new ideas. Ceaseless adventure and constant reading are key. For the former, see this article I wrote on risk and creativity. For the latter, books are always better than magazines, newspapers, blogs etc. — I explain why in this piece for Forbes.

The best book on all the different neuronal systems involved in creativity is the recently released How Creativity Happens In the Brain, by neuroscientist and pioneering flow researcher Arne Dietrich. But be warned, this is not light summer fare. Dietrich is funny as hell, but the book is dense and — because it’s published by an academic publisher — expensive.


Two: Know The Better Question.

A little while back author and investor Tim Ferriss walked me through the four things he does on a regular basis to support long haul creativity. His whole list is really good, so we’ll start there:

  1. Daily Exercise: at least an hour, needed to lower anxiety levels and clear the head. Interestingly, the research shows that weight training is better than aerobic training for quieting the inner critic.
  2. Keep a Maker Schedule: Carve out dedicated periods for key tasks that require creativity. If complex problem-solving or analysis is required, Ferriss recommends at least four hour blocks. And this also means no distractions — turn off email, phone, messages, skype, twitter, facebook and all the rest.
  3. Long Walks: Without music or podcasts or distraction, purposefully letting the mind wander. This switches off spotlight attention and switches on the default mode network — aka, the imagination network.
  4. Surround yourself with driven people who are good at spotting your assumptions. Ferriss explains: “The people who are the very best at this are the ones who hear my question and respond with: ‘You’re asking the wrong question. The better question is….’”

This last point is really important. While feedback can often be a hindrance to in-the-moment creativity, it’s essential for the long haul. But choice in feedback giver is critical.

This becomes doubly important the more successful you get. If you make a name for yourself in creativity people tend to trust your creative ideas a little more than they should and too frequently give you the benefit of the doubt. This is no bueno. To make sure he’s getting the feedback he needs, Ferriss hunts for folks who help him reframe his question, rather than just play devil’s advocate. This is spot on. People who play devil’s advocate often do so out of reflex — this means they tend to lack the technical sophistication to really help and often derail creativity through generalization. Reframers, meanwhile, take the idea farther faster. By providing a better question, they’re providing a new launch pad. This creates momentum. And for long haul creativity, nothing is more fundamental than momentum.


Three: Momentum Matters Most

Speaking of momentum, there is something deeply exhausting about the year-in and year-out requirements of imagination. Every morning, the writer faces a blank page, the painter an empty canvas, the innovator a dozen directions to go at once. The brilliant tidbit of advice that has helped me solve this slog came from Nobel Laureate Gabriel Garcia Marquez.

 

Gabriel Garcia Marquez

Marquez said that the key is to quit working at the point where you’re most excited. In other words, once Marquez really starts to cook, he shuts down the stove. This seems counter-intuitive. Creativity is an emergent property. Quitting when most excited — when ideas are really emerging — seems like the exact opposite of what you should do.

Yet Marquez is exactly right. Creativity isn’t a single battle; it’s an ongoing war. By quitting when you’re most excited, you’re carrying momentum into the next day’s work session. Momentum is the key. When you realize that you left off someplace both exciting and familiar — someplace where you know the idea that comes next — you dive right back in, no time wasted, no time to let fear creep back into the equation, and far less time to get up to speed.


Four: A Few Thoughts on Sobbing, Shouting, and Punching Hard Objects

I’ve written nine books. Two are in drawers. Seven are in stores. All share one thing in common: at some point during their writing, I lost my mind.

Without question, at least once a book, I end up on the ground, sobbing, shouting, and punching the floor. For a long time, I was convinced I was the only one who behaved this way. But about five years ago, I heard author David Foster Wallace tell a story about the difficulty of creativity. “It never fails,” he said, “at least once a book, I end up on the ground, sobbing, screaming and punching the floor.”

 

The obvious point here is yes, creativity is insanely frustrating for everybody. The core question for Long Haul Creativity is what to do about it? Turns out, researchers have discovered that frustration is actually a fundamental step in the creative process. From a technical perspective, this seems to have something to do with the limits of working memory and the requirements of creativity’s incubation period, but no one is exactly certain.

From a practical perspective, this means reversing our traditional relationship with frustration. Since this emotion is a basic step in the creative process, we need to stop feeling its arrival as disaster. For creatives, frustration is actually a sign of progress, a sign of movement in the right direction, a sign that the much needed breakthrough is ever closer to showing up.


Enjoying this article? It’s from Far Frontiers, my monthly email newsletter. If you want more, sign up for a dose of science, technology, creativity and fun every month. Thanks!


Five: Sir Ken Robinson Weighs In On Frustration

I just returned from presenting at the World Business Forum in Milan, where I spent some time with creativity expert and all-around great guy Sir Ken Robinson. Sir Ken pointed out that long haul creativity requires a low-level, near-constant sense of frustration — which is different from the just-discussed moment-of-madness version of frustration.

 

George Lucas.

Moment-of-madness frustration makes you punch the ground. Ken’s version is about motivation. It’s a constant, itchy dissatisfaction, a deep sense of what-if, and can-we-make-it-better, and the like.

To illustrate this, he told me a story about George Lucas. Robinson, apparently, popped the question: “Hey George,” he said, “why do you keep remaking all those Star Wars movies?” Lucas had a great answer: “In this particular universe, I’m God. And God isn’t satisfied.”


Six: Everybody’s Got A Job To Do

There’s this mistaken assumption that creativity is a solitary pursuit. This may be true, but the business of creativity is always collaborative. Every journalist has to brave a gauntlet of editors, copy-editors, managing editors ad infinitum. Movies and books and plays and poems are more of the same. Startup entrepreneurs always have investors — etc.

And this brings me to an important point: everybody’s got a job to do. And everybody wants to keep that job. In writing, this means that even if I turn in something perfect, my editors are still being paid to edit — so they will. This is why, I discovered, every time I turn in a piece of finished work I intentionally include a few horrible lines. It gives my editors something to do. It lets them feel useful. It keeps their grubby little hands away from my damn perfect sentences.


Seven: Creativity Is A By-Product

Contrary to popular opinion, creativity is almost always the by-product of passionate hard work and not the other way around. Olympian Gretchen Bleiler — one of the more creative snowboarders in history — puts it this way: “You don’t wake up and say: ‘Today I’m going to be more creative.’ You do the things you love to do and try to get at their essence and allow things to emerge.”


Eight: Listen To Neil Gaiman

Pretty much everything I’ve learned about long haul creativity author Neil Gaiman says in this speech, only he says it so much better than I could.

https://vimeo.com/42372767

Image Credit: Shutterstock.com; Wikimedia Commons

Steven Kotler 

  • Steven Kotler is an author, journalist and Director of Research for the Flow Genome Project, an organization dedicated to decoding the science of ultimate human performance. His books include “Abundance,” “A Small Furry Prayer,” “West of Jesus,” and “The Angle Quickest For Flight.” His articles have appeared in over 60 publications, including The New York Times Magazine, Atlantic Monthly, Wired, Forbes, GQ, National Geographic, Popular Science, and Discover. He has a deep interest in the intersection of science, technology, and culture, with specific focus on the extreme edges of the discussion—both larger philosophical implications and completely personal applications.


Link: http://singularityhub.com


Self-Organizing Systems [766]

by System Administrator - Monday, 18 August 2014, 23:54
 

THOUSAND-ROBOT SWARM HINTS AT FUTURE CAR, DRONE, EVEN NANOBOT COLLECTIVES

 

Written By: Arlington Hewes

When you think nanorobot, you don’t think just one. Or ten. You think millions or billions. Huge swarms of nanobots may work in concert with each other to accomplish tasks on tiny scales, perhaps in the human body, or more radically, to form larger robots, each nanobot functioning like a mechanical cell or 3D pixel.

Although we don’t have practical nanobots yet, we can work on the software that may coordinate them—and in fact, that’s exactly what a Harvard group has been up to.

Working out of Harvard’s School of Engineering and Applied Sciences (SEAS) and the Wyss Institute, researchers Radhika Nagpal, Michael Rubenstein, and Alejandro Cornejo first unveiled their quarter-sized Kilobots in 2011 when 25 of them were shown capable of performing synchronized actions.

The name Kilobot refers to the goal of building collectives of a thousand coordinated robots. That initial squad of 25 grew to 100 last year and now, most recently, to 1024. In a new video, the robots autonomously assume shapes with no more guidance than the original input from the researchers (e.g., form the letter ‘K’).

Each $14 robot is exceedingly simple, just two vibrating motors on four spindly legs.

But we know from a flock of swallows or an ant colony that complex patterns can arise from simple individual behaviors. In this case, four Kilobots mark the center of a grid. Then, by hopping along the edge of the group and noting the relative distance to the center and each other with infrared transmitters and receivers, the bots shuffle into place.
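
The key trick in that description, turning nothing but robot-to-robot distance readings into positions on a shared grid, can be sketched in a few lines. The code below is an illustration inspired by the article, not the Kilobots' actual firmware: four seed robots fix the coordinate frame, another robot estimates its own position by least-squares trilateration from noisy distance measurements, and then checks whether it sits inside a hypothetical target shape.

import numpy as np

# Sketch of distributed localization for swarm shape formation (illustration only).
# Four "seed" robots at known positions define the coordinate frame; any other robot
# estimates where it is purely from noisy distances to already-localized neighbors.

rng = np.random.default_rng(1)
SEEDS = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # known seed positions

def measure_distances(true_pos, anchors, noise=0.02):
    """Stand-in for infrared distance sensing to nearby, already-localized robots."""
    return np.linalg.norm(anchors - true_pos, axis=1) + rng.normal(0, noise, len(anchors))

def localize(anchors, distances):
    """Trilateration: linearize the circle equations and solve by least squares."""
    x0, d0 = anchors[0], distances[0]
    A = 2 * (anchors[1:] - x0)
    b = (d0**2 - distances[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(x0**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

def in_target_shape(pos):
    """Hypothetical target: an L-shaped region the swarm is supposed to fill."""
    x, y = pos
    return (0 <= x <= 1 and 0 <= y <= 0.3) or (0 <= x <= 0.3 and 0 <= y <= 1)

# One robot at an unknown true position works out where it is, then checks whether
# it has reached the target shape and should stop edge-following.
true_pos = np.array([0.2, 0.7])
estimate = localize(SEEDS, measure_distances(true_pos, SEEDS))
print(estimate, in_target_shape(estimate))

In the real swarm each robot can only communicate with nearby neighbors, so localization spreads outward from the seeds and is interleaved with the edge-following motion described above.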

 

From these basic abilities they can form shapes—letters, a wrench, a starfish.

“Biological collectives involve enormous numbers of cooperating entities—whether you think of cells or insects or animals—that together accomplish a single task that is a magnitude beyond the scale of any individual,” said Michael Rubenstein, lead author of the group’s recent paper in the journal Science and research associate at Harvard SEAS and the Wyss Institute.

Algorithms can be written and tested in computer models, but having real world robots to test them on is key. In practice the robots need to self-correct for problems like traffic jams and broken or wayward bots. Cost is also often a limiting factor. According to the Harvard Gazette, only a few robot swarms have so far surpassed 100 members.

In the future, we may see more and more machines behaving as collectives in groups that far exceed 1,000. But we won’t have to wait around for robots built on the nanoscale. Human-scale robots will benefit from coordination too—examples may include disaster response, environmental cleanup, drone delivery systems, or self-driving cars.

“Increasingly, we’re going to see large numbers of robots working together, whether it’s hundreds of robots cooperating to achieve environmental cleanup or a quick disaster response, or millions of self-driving cars on our highways,” Nagpal said.

Learn more about the research at the Harvard Gazette, “The 1,000-robot swarm.”

Image Credit: Harvard University/YouTube



SENC - Are We Neuroscientists or Neuro-Amateurs? [616]

by System Administrator - Friday, 1 August 2014, 20:40
 

 

Are we neuroscientists or neuro-amateurs?

by Oscar Herreras (Instituto Cajal – CSIC)

They say that what distinguishes us animals from plants is that we have a nervous system. I say "they say" because most of us could well have forgotten it; the last academic setting in which that discovery was confirmed to us may have been primary school, or perhaps secondary school. I would like to think this opening is just an exaggerated bit of irony to catch your attention, but it may be closer to reality than it seems. Let us see.

A few weeks ago we attended the SENC Congress, where, once again, a good part of the country's neuroscientists had the chance to get together. Although nearly all of us know one another, when we meet we are quickly reminded of how heterogeneous our academic origins are. It makes for great after-dinner conversation. And without doubt, the varied academic background accumulated by the neuroscience community is one of its strengths, since it greatly broadens the perspective we bring to the daily experiment. But it also lets us glimpse some of the global shortcomings of Spanish neuroscience and, in my opinion, the most serious and far-reaching one for the future... which is already the present, and we are already suffering it. Our neuroscientists come from academic disciplines as varied as the life sciences, the various biotechnologies, psychology, medicine, physics, chemistry, engineering, veterinary medicine, information sciences, and others, which leads me to ask what academic training in neuroscience each of us has actually received. And that is where I begin to tremble. Not because that training is heterogeneous, not because it has different emphases or depths, but because a good many of us are amateur neuroscientists. Yes, amateurs, because we never received even a minimally acceptable academic education in the very thing that makes us different from a geranium. Does it not matter? How does it show? It shows in details that leave a mark and set off alarms, like the ones left on me when, in a doctoral course with students from every geographic and academic origin, I decided to break the ice by asking the young audience what the nervous system processes, and after an agonizing, arctic 30 seconds interrupted only by my encouragement, the only timid, almost ashamed answer I got was: "...signals?" "Smoke signals," I thought to myself, wanting to slump into my chair.

For those who still think this is an exaggeration taken out of context: I have tried to compare my own sorry academic training in neuroscience (I will not reveal my university so as not to embarrass anyone, if indeed there is anyone capable of embarrassment) with that of a number of you within reach of my email address book, and if my little survey were representative, the picture would be simply dismal. I received replies from 27 group leaders, of all ages and spread across the whole country, and I asked them when, where, and how they studied neuroscience in their degree programs, and whether the academic landscape had improved since. The initial conclusion is that, with a few honorable exceptions (for example, some medical schools) and pending a broader field study, our universities do NOT teach neuroscience. Most of us had only light, almost anecdotal contact with content not much deeper than what we could get in secondary school. We picked up fragments of knowledge as parts of more generic courses, an elective here, an old professor there... And worst of all, in the last 30 years it has not improved one bit. The few places where the Bologna-style fragmentation gave rise to some neuro-courses are offset by those where such courses were taught by a professor who was a fan of neuroscience, who rarely if ever practiced it experimentally, and who has already retired. Yes, in some places the little there was has disappeared. Where do our young neuroscientists come from? And what do they know? Friends, it is a miracle we are where we are. I would say they emerge from their own enthusiasm, not to say that to a large extent they are academic accidents.

I believe that if our SENC wants to sponsor quality courses and master's programs in neuroscience, it can also take an interest in foundational neuroscience, strive to make its voice heard by the academic authorities, lobby wherever necessary, and push for the knowledge that makes us animals to be taught in every institution where the "stuff" of animals is studied. I hardly need point out that geraniums have cells too; that hydrangeas have metabolic pathways, cytokines, and calcium waves; that even moss has development and cell differentiation; that onions have genes; that chard has receptors; and that the fibrous breakfast cereals have glutamate. I could argue at greater depth and analyze how commercialization and self-satisfied top-down management have fostered a chaotic academic fragmentation and generated a polarization of neuro-topics between the cellular/molecular level and behavior which, 100 years after Cajal and Lorente, seem to be allying once more to push the nervous system back into the medieval corner that treated it as the animals' black box: too complicated to understand, too complicated to study. Too complicated to teach?

We need our young people, in each and every one of those university degree programs, to encounter a well-structured neuroscience. Only then will they be able to answer, without hesitation, the question of what the nervous system processes. Only then will they be able to spend their doctoral years learning how to do science, rather than cramming the minimal theoretical foundations needed to avoid making fools of themselves. Only then will they be able to explain to society why what we do matters.

Link: http://blog.senc.es/somos-neurocientificos-o-neuroaficionados-2/

