Helping Children and Adolescents Cope with Violence and Disasters: What Parents Can Do
Back-up brains: The era of digital immortality
Behavioral analytics vs. the rogue insider
The promise of User Behavioral Analytics is that it can go beyond simply detecting insider threats to predicting them. Some experts say that creates a significant privacy problem.
The recent arrest by the FBI of a former employee of JP Morgan Chase for allegedly trying to sell bank account data, including PINs, ended well for the bank.
According to the FBI, the former employee, Peter Persaud, was caught in a sting operation when he attempted to sell the data to informants and federal agents.
But such things don’t always end so well for the intended victims. The arrest was yet another example of the so-called “rogue insider” threat to organizations.
And such incidents are providing increasing incentives to use technology to counter it.
The threat of employees going rogue – wittingly or not – is significant enough that some organizations are turning to behavior analytics that, according to its advocates, are able not only to detect insider security threats as they happen, but even predict them.
Such protection would likely be welcomed by most organizations, but it comes at an obvious cost: worker privacy. Predicting security threats calls up images of “Minority Report,” the 2002 movie starring Tom Cruise, in which police arrested people before they committed crimes.
In that sci-fi world, it was “precogs” – psychics – who predicted the impending crimes. The IT version is User Behavior Analytics (UBA).
According to Gartner, “UBA is transforming security and fraud management practices because it makes it much easier for enterprises to gain visibility into user behavior patterns to find offending actors and intruders.”
Saryu Nayyar, CEO of Gurucul Solutions, in a recent statement, said her firm’s technology, “continuously monitors hundreds of (employee behavior) attributes to detect and rank the risk associated with anomalous behaviors.
“It identifies and scores anomalous activity across users, accounts, applications and devices to predict risks associated with insider threats.”
This should not be a surprise. Data analytics are being applied to just about every challenge in the workplace, from marketing to efficiency. So it was inevitable that they would be used to counter what has always been the weakest link in the security chain: the human.
Americans have also been told for years that personal privacy is essentially dead. Still, some of them may not appreciate just how dead it is, or soon will be, in the workplace.
But Nayyar and others note that there should be no expectation of privacy in the workplace when it comes to corporate data.
“This technology is simply monitoring activity within a company’s IT systems,” she said. “It does not read emails or personal communications.”
She added that monitoring of employee behaviors by IT has been going on for a long time. “This is nothing new,” she said. “What’s different today is the use of big data analytics, machine learning algorithms and risk scoring being applied to these logs.”
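The risk-scoring idea Nayyar describes — statistical analysis applied to existing activity logs — can be illustrated with a toy sketch. This is not Gurucul's method; the features, the z-score statistic, and the threshold are all illustrative assumptions:

```python
from math import sqrt

def risk_scores(events, threshold=3.0):
    """Score each user's latest activity against that user's own history.

    `events` maps a user to a list of numeric feature values drawn from
    activity logs (e.g. logins per hour, megabytes downloaded per session).
    The last value is scored as a z-score against the earlier baseline;
    scores above `threshold` are flagged as anomalous.
    """
    flagged = {}
    for user, values in events.items():
        baseline, latest = values[:-1], values[-1]
        mean = sum(baseline) / len(baseline)
        var = sum((v - mean) ** 2 for v in baseline) / len(baseline)
        std = sqrt(var) or 1.0  # avoid divide-by-zero on a flat baseline
        score = abs(latest - mean) / std
        if score > threshold:
            flagged[user] = round(score, 1)
    return flagged

# A user whose download volume suddenly jumps far above baseline is flagged;
# a user within normal variation is not.
logs = {
    "alice": [10, 12, 11, 9, 10, 11, 250],   # anomalous spike
    "bob":   [40, 42, 38, 41, 39, 40, 43],   # normal variation
}
print(risk_scores(logs))
```

A real deployment would track many attributes per user and learn thresholds from data, but the principle — each user judged against their own behavioral baseline — is the same.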
Michael Overly, technology partner at Foley & Lardner LLP, said companies should notify their employees that, “business systems should not be used for personal or private communications and other activities, and that the systems and data can and likely will be reviewed, including through automated means.”
But he agreed with Nayyar that privacy is necessarily limited in the workplace. “Employees must understand that if they want privacy with regard to their online activities, they need to use a means other than their employer’s computers, like a smartphone or a home computer,” he said.
That is also the view of Troy Moreland, chief technology officer at Identity Automation. “In general, if employees are using employer-provided equipment, they have no right to privacy as long as it’s clearly expressed,” he said.
But Joseph Loomis, founder and CEO of CyberSponse, said such policies, if they are too heavy-handed, can cause morale problems. “I believe it’s justified,” he said, “it’s just that there are various opinions on what type of privacy someone is entitled to or not.”
He said it would likely take significant “training, education and explaining” to eliminate the feeling of a “Big Brother” atmosphere in the workplace.
Gabriel Gumbs, vice president of product strategy at Identity Finder, said he believes the potential for morale problems is real. “At the core of UBA is an unspoken distrust of everyone, not just the rogue employees,” he said.
Matthew Prewitt, partner at Schiff Hardin and chairman of its cybersecurity and data privacy practice, said one problem with predicting misconduct is that it can become self-fulfilling. “An employee who is viewed with mistrust and suspicion is more likely to become a rogue employee,” he said.
He agrees that there is a limited expectation of privacy in the workplace, especially on the corporate network. But he said a “creative advocate” for an employee could argue that, “UBA is so different from other types of monitoring that some sort of express reference to UBA needs to be provided in the notice.”
Loomis added that in states not governed by “right-to-work” laws, UBA, “will cause legal issues if one terminates without cause other than predictive intelligence.”
The much larger problem, these experts say, comes from unintentional rogues – those with too many access privileges, who use “shadow” IT and/or who are simply lazy or careless.
“In our experience over-privileged scenarios account for approximately 65% of insider threat incidents, shadow IT 20% and carelessness 15%,” Nayyar said.
Moreland has a list of labels for such employees, including “access hoarders” who “gobble up as much access as they possibly can and refuse to relinquish any of it, even when it's no longer needed.”
Others, who he calls “innovators,” are well intentioned – they are trying to be more productive – but one of the ways they do so is by circumventing IT policies.
Gumbs noted that the Verizon Data Breach Investigations Report found that, “privilege abuse is the most damaging of insider threats.”
But he added that not all abuse of access privileges is innocent, and does not necessarily mean an employee is over-privileged. “In the majority of cases, users had the proper level of privilege for their roles, they simply abused those privileges for personal or financial gain,” he said. In those cases, he and other experts say identity and access management can reduce the security risks significantly.
“Over-privilege is a substantial concern,” Overly said. “In general, the majority of users in businesses today are over-privileged. The concept of least privilege is seldom implemented properly and even more seldom addressed as personnel duties change and evolve over time.”
Dennis Devlin, cofounder, CISO and senior vice president of privacy practice at SAVANTURE, said he sees the same thing. “In my experience most individuals who have been with an organization for a long time are over-privileged,” he said. “Access privileges are accretive and tend to grow over time. The law of least privileges exists not just to prevent malicious access, but to also to prevent accidental or inadvertent disclosure.”
He said better access management could reduce the need for intrusive monitoring. “Appropriate privileges keep individuals in their respective ‘swim lanes,’ reduce the need for excessive monitoring and make SIEM analysis much more effective,” he said.
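The access review Devlin and Overly describe can be sketched as a simple stale-privilege audit: flag any grant that has not been exercised within a review window. Everything here — the data shapes, the 90-day window, the names — is hypothetical:

```python
from datetime import date, timedelta

def stale_privileges(grants, usage_log, today, max_idle_days=90):
    """Return (user, privilege) pairs granted but unused for `max_idle_days`.

    `grants` maps each user to the set of privileges they hold;
    `usage_log` maps (user, privilege) to the date it was last exercised.
    Grants with no recorded use, or none within the window, are
    candidates for revocation under a least-privilege review.
    """
    cutoff = today - timedelta(days=max_idle_days)
    candidates = []
    for user, privs in grants.items():
        for priv in sorted(privs):
            last_used = usage_log.get((user, priv))
            if last_used is None or last_used < cutoff:
                candidates.append((user, priv))
    return candidates

grants = {"carol": {"crm_read", "payroll_admin"}, "dave": {"crm_read"}}
usage = {
    ("carol", "crm_read"): date(2015, 11, 1),
    ("dave", "crm_read"): date(2015, 10, 20),
    # carol's payroll_admin grant has never been used
}
print(stale_privileges(grants, usage, today=date(2015, 11, 15)))
# → [('carol', 'payroll_admin')]
```

Run periodically, such a report counters the “accretive” growth of privileges Devlin describes, since unused grants surface for review instead of accumulating silently.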
Beyond the legal and morale questions, however, the jury is still out on how well UBA works.
Overly said in his experience, “it has a long way to go with regard to accuracy. All too often, the volume of false alarms causes the results to be disregarded when an actual threat is identified.”
Nayyar said it does work, through analysis of unusual or “anomalous” behaviors in things like geolocation, elevated permissions, connecting to an unknown IP or installing unknown software for backdoor access to sensitive data (see sidebar).
She provided an example of flagging rogue behavior: a software engineer who had resigned from a company and was serving out his final month exhibited behavior never seen before.
While on vacation, the employee, “logged in from a previously unseen IP address, accessed source code repository and downloaded sensitive files from a project he wasn't assigned to,” she said.
“Two days later, the engineer accessed multiple servers and moved the downloaded files to a NFS (Network File System) location, which he made mountable and attempted to sync the files to prohibited consumer cloud storage service.”
She said the user was flagged as soon as he created the NFS mount point, “based on predictive modeling, and his VPN connection was terminated.”
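A crude, rule-based version of the session scoring in that example might look like the following. The event names, weights, and threshold are all invented for illustration; a real UBA product would learn risk weights from data rather than hard-code them:

```python
# Hypothetical per-event risk weights; a real product would learn these.
RULES = {
    "unseen_ip_login": 30,
    "unassigned_repo_access": 25,
    "bulk_download": 20,
    "new_nfs_mount": 35,
    "cloud_sync_attempt": 40,
}
TERMINATE_AT = 100

def watch_session(events):
    """Accumulate rule scores over a session's event stream.

    Returns the event at which cumulative risk crossed the termination
    threshold (the point to cut the VPN connection), or None if the
    session stayed below it.
    """
    score = 0
    for event in events:
        score += RULES.get(event, 0)
        if score >= TERMINATE_AT:
            return event
    return None

session = ["unseen_ip_login", "unassigned_repo_access",
           "bulk_download", "new_nfs_mount", "cloud_sync_attempt"]
print(watch_session(session))  # crosses the threshold at "new_nfs_mount"
```

Note that the session is cut at the NFS mount, before the cloud-sync attempt — mirroring the narrative above, where the user was flagged as soon as the mount point was created.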
But as effective as that sounds, even advocates of UBA warn that, like any security tool, it is a “layer” of protection, not a guarantee.
“Perfection cannot be achieved,” Overly said. “If an insider is intent on causing harm to the business, it may be impossible to prevent it.”
What does UBA track?
According to Saryu Nayyar, CEO of Gurucul Solutions, User Behavioral Analytics can detect behavioral anomalies by monitoring activities including geolocation, elevated permissions, connections to unknown IP addresses, and the installation of unknown software.
Biological Compass 
By Bob Grant
A protein complex discovered in Drosophila may be capable of sensing magnetism and serves as a clue to how some animal species navigate using the Earth’s magnetic field.
A variety of different animal species possess remarkable navigational abilities, using the Earth’s magnetic field to migrate thousands of miles every year or find their way home with minimal or no visual cues. But the biological mechanisms that underlie this magnetic sense have long been shrouded in mystery. Researchers in China may have found a tantalizing clue to the navigational phenomenon buried deep in the fruit fly genome. The team, led by biophysicist Can Xie of Peking University, discovered a polymer-like protein, dubbed MagR, and determined that it forms a complex with a photosensitive protein called Cry. The MagR/Cry protein complex, the researchers found, has a permanent magnetic moment, which means that it spontaneously aligns in the direction of external magnetic fields. The results were published today (November 16) in Nature Materials.
“This is the only known protein complex that has a permanent magnetic moment,” said Peter Hore, a physical chemist at the University of Oxford, U.K., who was not involved in the research. “It’s a remarkable discovery.”
Xie and his colleagues called upon long-standing biochemical models that sought to explain animals’ magnetic sense to initiate the search for a physical magnetoreceptor. One of these involves molecules that incorporate oxides of iron in their structure and another involves Cry, which is known to produce radical pairs in some magnetic fields. “However, this only proved that Cry plays a critical role in the magnetoreceptive biological pathways, not necessarily that it is the receptor,” Xie wrote in an email to The Scientist. “We believe there [are] such universal magnetosensing protein receptors in an organism, and we set out to find this missing link.”
The researchers performed whole-genome screens of Drosophila DNA to search for a protein that might partner with Cry and serve as that magnetoreceptor. “We predicted the existence of a multimeric magnetosensing complex with the attributes of both Cry- and iron-based systems,” Xie wrote. “Amazingly, later on, our genome-wide screening and experiments showed this is real.”
Xie and his colleagues were “brave enough to go through the whole genome-wide search to hunt for this protein,” said James Chou, a biophysicist at Harvard Medical School. “Sometimes it works, sometimes you don’t get anything. Luckily, this time he got something big out of it.”
In 2012, after identifying MagR in Drosophila, Xie and his colleagues screened the genomes of several other animal species, finding genes for both Cry and MagR in virtually all of them, including in butterflies, pigeons, robins, rats, mole rats, sharks, turtles, and humans. “This protein is evolutionarily conserved across different classes of animals (from butterflies to pigeons, rats, and humans),” Xie wrote.
Determining that MagR and Cry were highly expressed and colocalized in the retinas of pigeons, Xie’s team focused on that species to conduct further experiments to ferret out the structure and behavior of the protein complex. Using biochemical co-purification, electron microscopy, and cellular experiments in the presence of a magnetic field, the researchers constructed a rod-shaped model of the MagR/Cry complex, and suggested a potential mechanism for how the complex might work in situ to sense magnetism. “It is quite convincing that this complex may be the magnetoreceptor, at least for the organism they have fished it out from,” Chou said. “I think it’s a great step forward to open this whole mystery.”
Cry likely regulates the magnetic moment of the rod-shaped complex, while the iron-sulfur clusters in the MagR protein are probably what give rise to the permanent magnetic polarity of the structure. “The nanoscale biocompass has the tendency to align itself along geomagnetic field lines, and to obtain navigation cues from a geomagnetic field,” Xie wrote. “We propose that any disturbance of this alignment may be captured by connected cellular machinery such as the cytoskeleton or ion channels, which would channel information to the downstream neural system, forming the animal’s magnetic sense (or magnetic ‘vision’).”
Hore was cautious about saying that the newly modeled complex is absolutely responsible for magnetoreception in animals. “I don’t think I would say that it’s game-changing, but it is very interesting and will prompt a lot of experimental and theoretical work,” he said. “It may be very relevant to magnetoreception, it’s just too soon to know.”
“It may not be very accurate because this is really just a model,” Chou agreed, “but I think it’s a good effort, and it will stimulate follow-up work on the structure.”
Of course, there may well be additional biological components that play into giving animals a magnetic sense. Pigeons, for example, sense the inclination of the Earth’s magnetic field rather than the absolute direction of the field, Hore points out. The MagR/Cry complex, as described in the paper, would be capable of detecting the absolute direction or intensity of a field, not the inclination.
But beyond the clues into how animals sense the Earth’s magnetic field for navigational purposes, the discovery may yield new biochemical tools that could be used by other researchers. Chief among these applications is the potential to use the MagR/Cry complex along with controlled magnetic fields to control the behavior of cells or whole organisms. Such a development would be “sort of the magnetic version of optogenetics,” Chou said.
Xie agreed. “It may give rise to magnetogenetics,” he wrote.
The study also generates multiple questions about the biological components surrounding the protein complex and how they contribute to magnetic sensation. “This is just the tip of the iceberg,” Chou said. “This opens up a lot of future projects to unveil how this polarity, or alignment with the Earth’s magnetic field, can transmit signal, whether it’s a neural signal or one that regulates transcription.”
Xie said he thinks that while the “biocompass model” he and his colleagues proposed may serve as a universal mechanism for animal magnetoreception, there may be more magnetoreceptors to be discovered. Additionally, evolution enhanced the magnetic sense in some, especially migratory, species, which could have led to numerous variations on the theme. This may even extend to humans, he added. “I have a friend who has really good sense of directions, and he keep telling people that he can always know where is south, where is north, even to a new place he has never been,” Xie wrote in an email. “According to him, he felt there is a compass in his brain. I was laughed, but now I guess I understand what he meant. . . . However, human’s sense of direction is very complicated. Magnetoreception may play some roles.”
S. Qin et al., “A magnetic protein biocompass,” Nature Materials, doi:10.1038/nmat4484, 2015.
Biomarkers, apps help in suicide prevention
Biomedical ecosystem focused on innovative knowledge encoded with SNOMED CT
Presenter: Prog. Gustavo Tejera, SUEIIDISS and KW Foundation
October 25th-30th, 2015 - Radisson Montevideo, Uruguay
Doctors, project managers, content department managers, architects of the Digital Society, MBAs, engineers, analysts of BIG DATA and IoT, and interoperability and automation specialists.
Transmitting the key concepts needed to create and share reusable content encoded with SNOMED CT. Such content can improve application logic and training at the point of care without requiring its creators to know how to program. It is the first step toward a social network 3.0 based on SNOMED CT.
SNOMED CT is a source of incremental knowledge to articulate episodes, processes, tasks, forms, descriptors, indicators, rules and agents in all layers of the electronic health record.
How can reusable components built with SNOMED CT knowledge improve application logic, and how can they be exchanged? Web 3.0 is ready to start, but there are difficulties related to the training of professionals, the value of knowledge, and market rules.
In this study we present the experience of the KW Foundation in the development and implementation of HealthStudio (open source) at the National Cancer Institute, Sanatorium John Paul II, Medical Federation of the Interior, Uruguayan Medical Sanatorium, College of Nursing and Maryland University.
In a BIG DATA corpus of more than 20 million log events, we discovered how to involve users in building processes and content that increase contextual cognition in the care area as well as in logistical and administrative tasks.
By building reusable SNOMED CT knowledge, the healthcare community gains a great facilitator for the essential alignment ("light"), connection ("camera") and interoperability ("action").
We believe that an innovative, cognitive, community-driven and incremental ecosystem can be built on the basis of SNOMED CT, ready for the generation and analysis of BIG DATA and the Internet of Things. But above all, it is essential to ensure that these tools allow the inclusion of all levels in the democratic construction of eHealth and the Digital Society.
The Bionic Eye
Using the latest technologies, researchers are constructing novel prosthetic devices to restore vision in the blind.
In 1755, French physician and scientist Charles Leroy discharged the static electricity from a Leyden jar—a precursor of modern-day capacitors—into a blind patient’s body using two wires, one tightened around the head just above the eyes and the other around the leg. The patient, who had been blind for three months as a result of a high fever, described the experience as a flame passing downward in front of his eyes. This was the first time an electrical device—serving as a rudimentary prosthesis—successfully restored even a flicker of visual perception.
More than 250 years later, blindness is still one of the most debilitating sensory impairments, affecting close to 40 million people worldwide. Many of these patients can be efficiently treated with surgery or medication, but some pathologies cannot be corrected with existing treatments. In particular, when light-receiving photoreceptor cells degenerate, as is the case in retinitis pigmentosa, or when the optic nerve is damaged as a result of glaucoma or head trauma, no surgery or medicine can restore the lost vision. In such cases, a visual prosthesis may be the only option. Similar to cochlear implants, which stimulate auditory nerve fibers downstream of damaged sensory hair cells to restore hearing, visual prostheses aim to provide patients with visual information by stimulating neurons in the retina, in the optic nerve, or in the brain’s visual areas.
In a healthy retina, photoreceptor cells—the rods and cones—convert light into electrical and chemical signals that propagate through the network of retinal neurons down to the ganglion cells, whose axons form the optic nerve and transmit the visual signal to the brain. (See illustration.) Prosthetic devices work at different levels downstream from the initial reception and biochemical conversion of incoming light photons by the pigments of photoreceptor rods and cones at the back of the retina. Implants can stimulate the bipolar cells directly downstream of the photoreceptors, for example, or the ganglion cells that form the optic nerve. Alternatively, for pathologies such as glaucoma or head trauma that compromise the optic nerve’s ability to link the retina to the visual centers of the brain, prostheses have been designed to stimulate the visual system at the level of the brain itself. (See illustration.)
While brain prostheses have yet to be tested in people, clinical results with retinal prostheses are demonstrating that the implants can enable blind patients to locate and recognize objects, orient themselves in an unfamiliar environment, and even perform some reading tasks. But the field is young, and major improvements are still necessary to enable highly functional restoration of sight.
Henri Lorach is currently a visiting researcher at Stanford University, where he focuses on prosthetic vision and retinal signal processing.
Substitutes for Lost Photoreceptors
by Daniel Palanker
In the subretinal approach to visual prosthetics, electrodes are placed between the retinal pigment epithelium (RPE) and the retina. (See illustration.) There, they stimulate the nonspiking inner retinal neurons—bipolar, horizontal, and amacrine cells—which then transmit neural signals down the retinal network to the retinal ganglion cells (RGCs) that propagate to the brain via the optic nerve. Stimulating the retinal network helps preserve some aspects of the retina’s natural signal processing, such as the “flicker fusion” that allows us to see video as a smooth motion, even though it is composed of frames with static images; adaptation to constant stimulation; and the nonlinear integration of signals as they flow through the retinal network, a key aspect of high spatial resolution. Electrical pulses lasting several milliseconds provide selective stimulation of the inner retinal neurons and avoid direct activation of the ganglion cells and their axons, which would otherwise considerably limit patients’ ability to interpret the spatial layout of a visual scene.
In the subretinal approach to visual prosthetics, electrodes are placed between the retinal pigment epithelium and the retina, where they stimulate the nonspiking inner retinal neurons.
The Boston Retinal Implant Project, a multidisciplinary team of scientists, engineers, and clinicians at research institutions across the U.S., is developing a retinal prosthesis that transmits information from a camera mounted on eyeglasses to a receiving antenna implanted under the skin around the eye using radiofrequency telemetry—technology similar to radio broadcast. The decoded signal is then delivered to an implanted subretinal electrode array via a cable that penetrates into the eye. The information delivered to the retina by this device is not related to direction of gaze, so to survey a scene a patient must move his head, instead of just his eyes.
The Alpha IMS subretinal implant, developed by Retina Implant AG in Reutlingen, Germany, rectifies this problem by including a subretinal camera, which converts light in each pixel into electrical currents. This device has been successfully tested in patients with advanced retinitis pigmentosa and was recently approved for experimental clinical use in Europe. Visual acuity with this system is rather limited: most patients test no better than 20/1000, except for one patient who reached 20/550.1 The Alpha IMS system also needs a bulky implanted power supply with cables that cross the sclera and requires complex surgery, with associated risk of complications.
To overcome these challenges, my colleagues and I have developed a wireless photovoltaic subretinal prosthesis, powered by pulsed light. Our system includes a pocket computer that processes the images captured by a miniature video camera mounted on video goggles, which project these images into the eye and onto a subretinally implanted photodiode array. Photodiodes in each pixel convert this light into pulsed current to stimulate the nearby inner retinal neurons. This method for delivering the visual information is completely wireless, and it preserves the natural link between ocular movement and image perception.
Our system uses invisible near-infrared (NIR, 880–915 nm) wavelengths to avoid the perception of bright light by the remaining functional photoreceptors. It has been shown to safely elicit and modulate retinal responses in normally sighted rats and in animals blinded by retinal degeneration.2 Arrays with 70 micrometer pixels restored visual acuity in blind rats to half the natural level, corresponding to 20/250 acuity in humans. Based on stimulation thresholds observed in these studies, we anticipate that pixel size could be reduced by a factor of two, improving visual acuity even further. Ease of implantation and tiling of these wireless arrays to cover a wide visual field, combined with their high resolution, opens the door to highly functional restoration of sight. We are commercially developing this system in collaboration with the French company Pixium Vision, and clinical trials are slated to commence in 2016.
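The anticipated gain from halving pixel size can be sanity-checked with back-of-the-envelope arithmetic, under the simplifying assumption (ours, not the authors' claim) that achievable acuity scales linearly with pixel pitch:

```python
def projected_acuity(pixel_um, ref_pixel_um=70.0, ref_denominator=250.0):
    """Projected Snellen denominator for a given pixel pitch.

    Anchored to the reported result (70 um pixels ~ 20/250) and assuming
    acuity scales linearly with pixel pitch, i.e. the implant's sampling
    limit is the bottleneck. A smaller denominator means sharper vision.
    """
    return ref_denominator * pixel_um / ref_pixel_um

# Halving the pixel pitch projects roughly a doubling of acuity.
print(f"35 um pixels -> 20/{projected_acuity(35):.0f}")  # 20/125
```

This is a sampling-limit estimate only; in practice, stimulation thresholds and current spread between neighboring electrodes would also have to cooperate at the smaller pitch.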
Fabio Benfenati of the Italian Institute of Technology in Genoa and Guglielmo Lanzani at the institute’s Center for Nanoscience and Technology in Milan are also pursuing the subretinal approach to visual prostheses, developing a device based on organic polymers that could simplify implant fabrication.3 So far, subretinal light-sensitive implants appear to be a promising approach to restoring sight to the blind.
Behind the Eye
by Lauren Ayton and David Nayagam
Subretinal prostheses implanted between the retina and the RPE, along with epiretinal implants that sit on the surface of the retina (see below), have shown good results in restoring some visual perception to patients with profound vision loss. However, such devices require technically challenging surgeries, and the site of implantation limits the potential size of these devices. Epiretinal and subretinal prostheses also face challenges with stability and the occurrence of adverse intraocular events, such as infection or retinal detachment. Due to these issues, researchers have been investigating a less invasive and more stable implant location: between the vascular choroid and the outer sclera. (See illustration.)
Like subretinal prostheses, suprachoroidal implants utilize the bipolar cells and the retinal network down to the ganglion cells, which process the visual information before relaying it to the brain. But devices implanted in this suprachoroidal location can be larger than those implanted directly above or below the retina, allowing them to cover a wider visual field, ideal for navigation purposes. In addition, suprachoroidal electrode arrays do not breach the retina, making for a simpler surgical procedure that should reduce the chance of adverse events and can even permit the device to be removed or replaced with minimal damage to the surrounding tissues.
Early engineering work on suprachoroidal device design began in the 1990s with research performed independently at Osaka University in Japan1 and the Nano Bioelectronics and Systems Research Center of Seoul National University in South Korea.2 Both these groups have shown proof of concept in bench testing and preclinical work, and the Japanese group has gone on to human clinical trials with promising results.3 Subsequently, a South Korean collaboration with the University of New South Wales in Australia continued suprachoroidal device development.
More recently, our groups, the Bionics Institute and the Centre for Eye Research Australia, working as part of the Bionic Vision Australia (BVA) partnership, ran a series of preclinical studies between 2009 and 2012.4 These studies demonstrated the safety and efficacy of a prototype suprachoroidal implant, made up of a silicone carrier with 33 platinum disc-shaped electrodes that can be activated in various combinations to elicit the perception of rudimentary patterns, much like pixels on a screen. Two years ago, BVA commenced a pilot trial, in which researchers implanted the prototype in the suprachoroidal space of three end-stage retinitis pigmentosa patients who were barely able to perceive light. The electrode array was joined to a titanium connector affixed to the skull behind the ear, permitting neurostimulation and electrode monitoring without the need for any implanted electronics.5 In all three patients, the device proved stable and effective, providing enough visual perception to better localize light, recognize basic shapes, orient in a room, and walk through mobility mazes with reduced collisions.6 Preparation is underway for future clinical trials, which will provide subjects with a fully implantable device with twice the number of electrodes.
Suprachoroidal prostheses can be larger than those implanted directly above or below the retina, allowing them to cover a wider visual field, ideal for navigation purposes.
Meanwhile, the Osaka University group, working with the Japanese company NIDEK, has been developing an intrascleral prosthetic device, which, unlike the Korean and BVA devices, is implanted in between the layers of the sclera rather than in the suprachoroidal space. In a clinical trial of this device, often referred to as suprachoroidal-transretinal stimulation (STS), two patients with advanced retinitis pigmentosa showed improvement in spatial resolution and visual acuity over a four-week period following implantation.3
Future work will be required to fully investigate the difference in visual perception provided by devices implanted in the various locations in the eye, but the initial signs are promising that suprachoroidal stimulation is a safe and viable clinical option for patients with certain degenerative retinal diseases.
Shortcutting the Retina
By Mark Humayun, James Weiland, and Steven Walston
Bypassing upstream retinal processing, researchers have developed so-called epiretinal devices that are placed on the anterior surface of the retina, where they stimulate the ganglion cells that are the output neurons of the eye. This strategy targets the last cell layer of the retinal network, so it works regardless of the state of the upstream neurons. (See illustration.)
In 2011, Second Sight obtained approval from the European Union to market its epiretinal device, the Argus II Visual Prosthesis System, which allowed clinical trial subjects who had been blind for several years to recover some visual perception such as basic shape recognition and, occasionally, reading ability. The following year, the FDA approved the device, which uses a glasses-mounted camera to capture visual scenes and wirelessly transmits this information as electrical stimulation patterns to a 6 x 10 microelectrode array. The array is surgically placed in the macular region, responsible in a healthy retina for high-acuity vision, and covers an area of approximately 20° of visual space.
A clinical trial showed that the 30 patients who received the device were able to locate a high-contrast square on a computer monitor more accurately, and when asked to track a moving high-contrast bar, roughly half were able to discriminate the direction of the bar’s movement better than without the system.1 The increased visual acuity has also enabled patients to read large letters, albeit slowly, and has improved their mobility.2 With the availability of the Argus II, patients with severe retinitis pigmentosa have the first treatment that can actually improve vision. To date, the system has been commercially implanted in more than 50 patients.
Several other epiretinal prostheses have shown promise, though none have received regulatory approval. Between 2003 and 2007, Intelligent Medical Implants tested a temporarily implanted, 49-electrode prototype device in eight patients, who reported seeing spots of light when electrodes were activated. Most of these prototype devices were only implanted for a few months, however, and with no integrated camera, patients could not activate the device outside the clinic, limiting the evaluation of the prosthesis’s efficacy. This group has reformed as Pixium Vision, the company currently collaborating with Daniel Palanker’s group at Stanford to develop a subretinal device, and has now developed a permanent epiretinal implant that is in clinical trials. The group is also planning trials of a 150-electrode device that it hopes will further improve visual resolution.
Future developments in this area will aim to improve the spatial resolution of the stimulated vision; increase the field of view that can be perceived; and increase the number of electrodes. Smaller electrodes would activate fewer retinal ganglion cells, which would result in higher resolution. These strategies will be rigorously tested, and, if successful, may enable retinal prostheses that provide an even better view of the world.
Into the Brain
By Collette Mann, Arthur Lowery, and Jeffrey V. Rosenfeld
In addition to the neurons of the eye, researchers have also targeted the brain to stimulate artificial vision in humans. Early experiments on epileptic patients with persistent seizures, conducted by the German neurologists and neurosurgeons Otfrid Förster in 1929 and Fedor Krause and Heinrich Schum in 1931, showed that electrical stimulation of an occipital pole, the most posterior part of each brain hemisphere, produced sensations of light flashes, termed phosphenes. By the mid-1950s, Americans John C. Button, an osteopath and later MD, and Tracy Putnam, then Chief of Neurosurgery at Cedars-Sinai Hospital in Los Angeles, had implanted stainless steel wires connected to a simple stimulator into the cortices of four people who were blind, and the patients subsequently reported seeing flashes of light.
The first functional cortical visual prosthesis was produced in England in 1968, when Giles Brindley, a physiologist, and Walpole Lewin, a neurosurgeon, both at Cambridge University, implanted 80 surface electrodes embedded in a silicone cap in the right occipital cortex of a patient. Each electrode connected to one of 80 corresponding extracranial radio receivers and generated a simple, distinctly located phosphene shape; the patient could point with her hand to each phosphene’s location in her visual field. When more than one electrode was stimulated at a time, simple patterns emerged.
The subsequent aim of the late William H. Dobelle was to provide patients with visual images comprising discrete sets of phosphenes—in other words, artificial vision. Dobelle had begun studying electrical stimulation of the visual cortex in the late 1960s with sighted patients undergoing surgery to remove occipital lobe tumors. He subsequently implanted surface-electrode arrays, first temporarily, then permanently, in the visual cortices of several blind volunteers. However, it was not until the early 2000s that the technology became available to connect a miniature portable camera and computer to the electrodes for practical conversion of real-world sights into electrical signals. With the resultant cortical stimulation, a patient was able to recognize large-print letters and the outline of images.
To elicit phosphenes, however, the surface electrodes used in these early cortical prostheses required large electrical currents (~3 mA–12 mA), which risked triggering epileptic seizures or debilitating migraines. The devices also required external cables that penetrated the skull, risking infection. Today, with the use of wireless technology, a number of groups are aiming to improve cortical vision prostheses, hoping to provide benefit to millions of people with currently incurable blindness.
One promising device from our group is the Gennaris bionic-vision system, which comprises a digital camera mounted on a glasses frame. Images are fed into a small computerized vision processor that converts the picture into waveform patterns, which are then transmitted wirelessly to small electronic tiles implanted in the visual cortex at the back of the brain. Each tile houses 43 penetrating electrodes, and each electrode may generate a phosphene. The patterns of phosphenes will create 2-D outlines of relevant shapes in the central visual field. The device is in the preclinical stage, with the first human trials planned for next year, when we hope to implant four to six tiles per patient to stimulate patterns of several hundred phosphenes that patients can use to navigate the environment, identify objects in front of them, detect movement, and possibly read large print.
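The tile layout described above suggests a simple addressing scheme: each phosphene position in the processed outline maps to one electrode on one implanted tile. The sketch below illustrates that bookkeeping under stated assumptions; only the figure of 43 electrodes per tile comes from the article, and the linear phosphene-to-electrode assignment is a hypothetical simplification, not the Gennaris system's actual mapping.

```python
# Toy sketch: assign "on" phosphenes from a binary outline to electrodes
# on implanted cortical tiles. The 43-electrodes-per-tile figure is from
# the article; the linear assignment is an illustrative assumption.

ELECTRODES_PER_TILE = 43

def outline_to_tiles(outline, n_tiles=4):
    """Map a flat list of binary phosphene values (one per grid position)
    onto per-tile lists of electrode indices to activate."""
    capacity = n_tiles * ELECTRODES_PER_TILE
    active = [i for i, v in enumerate(outline[:capacity]) if v]
    tiles = {t: [] for t in range(n_tiles)}
    for idx in active:
        tiles[idx // ELECTRODES_PER_TILE].append(idx % ELECTRODES_PER_TILE)
    return tiles

# Example: phosphenes 0, 1, and 43 are on, so tile 0 fires electrodes
# 0 and 1 while tile 1 fires electrode 0
outline = [0] * (4 * ELECTRODES_PER_TILE)
outline[0] = outline[1] = outline[43] = 1
tiles = outline_to_tiles(outline)
```

With four to six tiles, this scheme yields the "several hundred phosphenes" the article mentions (4 x 43 = 172 up to 6 x 43 = 258 addressable points).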
Other groups currently developing cortical visual prostheses include the Illinois Institute of Technology, the University of Utah, the École Polytechnique de Montréal in Canada, and Miguel Hernández University in Spain. All of these devices follow the same principle of inducing phosphenes that the patient can perceive. Many technical challenges must be overcome before such devices can be brought to the clinic, however, including the need to improve implantation techniques. Beyond patient safety, accuracy and repeatability of device insertion are critical for the best results.
Development of bionic vision devices is accelerating rapidly due to collaborative efforts using the latest silicon chip and electrode design, computer vision processing algorithms, and wireless technologies. We are optimistic that a range of practical, safe, and effective bionic vision devices will be available over the next decade and that blind individuals will have the ability to “see” their world once again.
Bionic Implants 
Meet Bionic Amputee, Nigel Ackland
In 2006, Nigel Ackland had an accident. Working as a metal smelter at the time, his right hand was crushed in an industrial mixer. The hand was so severely damaged that six months later he was forced to have it amputated.
Speaking at Exponential Medicine, Ackland presented himself as an ordinary guy facing an extraordinary challenge—a distinction he shares with millions of fellow amputees. How do you put your life back together? How do you learn to live with your disability?
A year after Ackland lost his hand, he said, he was still beset by fits of sudden, raging anger. It was hard enough to adapt physically, but he noticed changes in how people treated him, too. Strangers would avoid him or stare with some mixture of pity, fear, or disgust. The guy he saw in the mirror was a physical and mental wreck.
“Psychologically, I was in a very dark place.”
Then something happened that changed Ackland’s life. He got a call from RSL Steeper, maker of the bebionic robotic hand. The company asked if he’d be interested in becoming the first amputee to test out their product.
Bebionic is a prosthetic hand that myoelectrically senses muscle twitches in an amputee’s stump. Depending on which muscles users twitch, the hand can perform a variety of functional or communicative grip patterns and hand positions.
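The control principle described above—muscle twitches selecting grip patterns—can be sketched as a simple threshold rule over the amplitudes of two EMG channels. Everything in this sketch is an assumption for illustration: the channel names, thresholds, and grip list are hypothetical, not bebionic's actual control scheme.

```python
# Simplified sketch of myoelectric grip selection: compare the rectified,
# normalized amplitude (0-1) of two EMG channels against a threshold.
# Channel roles, thresholds, and grips are illustrative assumptions.

GRIPS = ["relaxed", "power_grip", "precision_pinch", "point"]

def select_grip(flexor_emg, extensor_emg, threshold=0.3):
    """Pick a grip pattern from flexor and extensor EMG amplitudes."""
    flex_on = flexor_emg > threshold
    ext_on = extensor_emg > threshold
    if flex_on and ext_on:
        return "point"            # co-contraction of both muscle groups
    if flex_on:
        return "power_grip"       # flexor twitch alone
    if ext_on:
        return "precision_pinch"  # extensor twitch alone
    return "relaxed"              # no activity above threshold

grip = select_grip(0.6, 0.1)  # strong flexor signal selects the power grip
```

The point of the sketch is why "practice and experience" matter, as the next paragraph notes: the user must learn to produce distinct, repeatable twitch patterns that the controller can tell apart.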
With practice and experience, Ackland is now a pro. But it isn’t just about what the technology itself enables him to do. His life has changed for the better in pretty much every respect.
“People still stop and stare,” he says, “but it’s not out of fear or pity anymore.”
He says people are more ready to accept him as he is, and of course, are full of curiosity about his bionic hand. They want to shake hands, see what he can do with it.
Ackland’s story offers a critical glimpse behind the scenes. The human element is all too often lost in the flush of excitement, in the feverish development of technology. But the point, ultimately, is to help people regain a sense of wholeness.
“I’m just an ordinary guy fortunate enough to wear an extraordinary piece of technology,” he says.
Ackland told us that every thirty seconds, someone becomes an amputee. Since he began testing the hand, another twenty people have joined him. And no doubt the technology can’t come fast enough for everyone else who could benefit.
Encouragingly, the technology continues to develop. We’ve covered a number of prosthetic limbs—arms and legs—that intelligently respond to a user’s thoughts and intentions to move. One team, out of Case Western, is even working on a prosthetic that can provide the user with a rudimentary sense of touch.
In the coming years, we hope to see continued improvement and wider availability. Ackland is, undoubtedly, a powerful model of just how much folks stand to benefit.
“I do believe life changing doesn’t have to be life ending,” he told us.
Bipolar Disorder