Computer model fosters potential improvements to ‘bionic eye’ technology

Millions of people face the loss of their eyesight to degenerative eye diseases. The genetic disorder retinitis pigmentosa alone affects 1 in 4,000 people worldwide.

Today, there is technology available to offer partial sight to people with that condition. The Argus II, the world’s first retinal prosthesis, reproduces some functions of a part of the eye essential to vision, allowing users to perceive movement and shapes.

Image credit: Dan Foy via Flickr, CC BY 2.0

While the field of retinal prostheses is still in its infancy, for hundreds of users around the globe, the “bionic eye” enriches the way they interact with the world on a daily basis. For instance, seeing outlines of objects enables them to move around unfamiliar environments with increased safety.


That is just the start. Researchers are pursuing improvements to the technology, with an ambitious objective in mind.

“Our goal now is to develop systems that truly mimic the complexity of the retina,” said Gianluca Lazzi, PhD, MBA, a Provost Professor of Ophthalmology and Electrical Engineering at the Keck School of Medicine of USC and the USC Viterbi School of Engineering.

He and his USC colleagues have made progress toward that goal with a pair of recent studies that use an advanced computer model of what happens in the retina. Their experimentally validated model reproduces the shapes and positions of millions of nerve cells in the eye, as well as their physical and networking properties.
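The full model operates at a scale far beyond anything that fits here, but a toy sketch can convey the basic idea of representing each cell by its position and size and wiring nearby cells into a network. In the Python sketch below, the cell count, spacing, and distance-based connection rule are illustrative assumptions, not parameters of the USC model.

```python
import math
import random

# Toy retinal "mosaic": every model cell gets a position and a soma size,
# and nearby cells are wired together. Cell count, spacing, and the
# distance-based connection rule are illustrative assumptions only.
random.seed(0)

GRID = 20        # 20 x 20 = 400 toy ganglion cells (a real retina has on the order of a million)
SPACING = 30.0   # micrometres between neighbouring cell bodies (illustrative)

cells = []
for row in range(GRID):
    for col in range(GRID):
        cells.append({
            "id": row * GRID + col,
            # jitter the lattice slightly so the mosaic is not perfectly regular
            "x": col * SPACING + random.uniform(-5.0, 5.0),
            "y": row * SPACING + random.uniform(-5.0, 5.0),
            "soma_radius": random.uniform(7.0, 12.0),  # micrometres
        })

# "Networking properties" in miniature: link any two cells closer than 60 um.
edges = []
for i, a in enumerate(cells):
    for b in cells[i + 1:]:
        if math.hypot(a["x"] - b["x"], a["y"] - b["y"]) < 60.0:
            edges.append((a["id"], b["id"]))

print(f"{len(cells)} model cells, {len(edges)} local connections")
```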

“Things that we couldn’t even see before, we can now model,” said Lazzi, who is also the Fred H. Cole Professor in Engineering and director of the USC Institute for Technology and Medical Systems. “We can mimic the behavior of the neural systems, so we can truly understand why the neural system does what it does.”

Focusing on models of nerve cells that transmit visual information from the eye to the brain, the researchers identified ways to potentially increase clarity and grant color vision to future retinal prosthetic devices.

The eye, bionic and otherwise

To understand how the computer model could improve the bionic eye, it helps to know a little about how vision happens and how the prosthesis works.

When light enters the healthy eye, the lens focuses it onto the retina, at the back of the eye. Cells called photoreceptors translate the light into electrical impulses that are processed by other cells in the retina. After processing, the signals are passed along to ganglion cells, which deliver information from the retina to the brain through long tails, called axons, that are bundled together to make up the optic nerve.
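In software terms, the healthy pathway resembles a short processing pipeline: photoreceptors transduce light, intermediate cells shape the signal, and ganglion cells convert it into firing rates sent down the optic nerve. The Python cartoon below only illustrates that flow; its gains and numbers are made up and do not represent a physiological model.

```python
def photoreceptor(light_intensity: float) -> float:
    """Transduce light (0 to 1) into a graded electrical signal (toy model)."""
    return max(0.0, min(1.0, light_intensity)) * 0.8

def inner_retina(signal: float) -> float:
    """Stand-in for the processing done by bipolar and other retinal cells."""
    return signal ** 0.5  # simple non-linearity as a placeholder

def ganglion_cell(processed_signal: float) -> float:
    """Convert the processed signal into a firing rate carried by the axon."""
    return 100.0 * processed_signal  # spikes per second, illustrative scale

for light in (0.0, 0.25, 1.0):
    rate = ganglion_cell(inner_retina(photoreceptor(light)))
    print(f"light={light:.2f} -> optic-nerve firing rate ~{rate:.0f} Hz")
```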

Photoreceptors and processing cells die off in degenerative eye diseases. Retinal ganglion cells typically remain functional longer; the Argus II delivers signals directly to those cells.

“In these unfortunate conditions, there is no longer a good set of inputs to the ganglion cell,” Lazzi said. “As engineers, we ask how we can provide that electrical input.”

A patient receives a tiny eye implant with an array of electrodes. Those electrodes are remotely activated when a signal is transmitted from a pair of special glasses that have a camera on them. The patterns of light detected by the camera determine which retinal ganglion cells are activated by the electrodes, sending a signal to the brain that results in the perception of a black-and-white image comprising 60 dots.
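Because the percept is described as 60 dots, the camera-to-electrode step can be pictured as downsampling each video frame onto a grid of 60 electrodes. The sketch below assumes a 6 x 10 grid, simple block averaging, and an arbitrary brightness threshold; the device's actual image processing is more sophisticated than this.

```python
# Toy camera-to-electrode mapping: average a grayscale frame onto a 6 x 10
# grid (60 "electrodes") and switch each electrode on or off. The grid shape,
# block averaging, and threshold are assumptions made for illustration.
ROWS, COLS = 6, 10

def frame_to_electrodes(frame, threshold=0.5):
    """Return a 6 x 10 grid of booleans: which electrodes to activate."""
    height, width = len(frame), len(frame[0])
    block_h, block_w = height // ROWS, width // COLS
    grid = []
    for r in range(ROWS):
        row = []
        for c in range(COLS):
            block = [frame[y][x]
                     for y in range(r * block_h, (r + 1) * block_h)
                     for x in range(c * block_w, (c + 1) * block_w)]
            row.append(sum(block) / len(block) > threshold)
        grid.append(row)
    return grid

# Synthetic 60 x 100 frame: a bright vertical bar on the left, dim elsewhere.
frame = [[1.0 if x < 20 else 0.1 for x in range(100)] for y in range(60)]
for electrode_row in frame_to_electrodes(frame):
    print("".join("o" if on else "." for on in electrode_row))
```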

Computer model courts new advances

Under certain conditions, an electrode in the implant will incidentally stimulate the axons of cells neighboring its target. For the user of the bionic eye, this off-target stimulation of axons results in the perception of an elongated shape instead of a dot. In a study published in IEEE Transactions on Neural Systems and Rehabilitation Engineering, Lazzi and his colleagues deployed the computer model to address this issue.

“You want to activate this cell, but not the neighboring axon,” Lazzi said. “So we tried to design an electrical stimulation waveform that more precisely targets the cell.”

The researchers used models of two subtypes of retinal ganglion cells, both at the single-cell level and in large networks. They identified a pattern of short pulses that preferentially activates cell bodies, with less off-target activation of axons.
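The published result is a specific stimulation pattern; the sketch below only shows the general shape of such a waveform: a train of brief, charge-balanced (biphasic) pulses rather than a single long one. The pulse width, amplitude, rate, and duration used here are placeholders, not the values reported in the study.

```python
# Sketch of a charge-balanced train of short biphasic pulses. Pulse width,
# amplitude, rate, and duration are placeholders, not the study's values.
def biphasic_pulse_train(rate_hz=120.0, pulse_us=100.0, amplitude_ua=30.0,
                         duration_ms=50.0, dt_us=25.0):
    """Return (time_ms, current_uA) samples describing the stimulation waveform."""
    samples = []
    period_us = 1_000_000.0 / rate_hz
    t_us = 0.0
    while t_us < duration_ms * 1000.0:
        phase = t_us % period_us
        if phase < pulse_us:                    # brief cathodic phase
            current = -amplitude_ua
        elif phase < 2.0 * pulse_us:            # matching anodic (charge-recovery) phase
            current = amplitude_ua
        else:                                   # quiet interval until the next pulse
            current = 0.0
        samples.append((t_us / 1000.0, current))
        t_us += dt_us
    return samples

train = biphasic_pulse_train()
active = sum(1 for _, current in train if current != 0.0)
print(f"{len(train)} samples, {active} of them inside a pulse")
```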

Another recent study in the journal Scientific Reports applied the same computer modeling system to the same two cell subtypes to investigate how to encode color.

This research builds upon earlier investigations showing that people using the Argus II perceive variations in color with changes in the frequency of the electrical signal — the number of times the signal repeats over a given duration. Using the model, Lazzi and his colleagues developed a strategy for adjusting the signal’s frequency to create the perception of the color blue.
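In the spirit of that strategy, a hypothetical encoder might select a stimulation frequency for each intended hue and pass it along with the usual amplitude setting. The mapping and numbers below are invented for illustration; the actual frequencies come from the published experiments and from per-patient calibration.

```python
# Hypothetical hue-to-frequency encoder in the spirit of the study. The
# frequencies below are invented for illustration; the real mapping comes
# from the published experiments and per-patient calibration.
COLOR_TO_FREQUENCY_HZ = {
    "white": 20.0,
    "blue": 60.0,
}

def encode_electrode(color: str, amplitude_ua: float = 25.0) -> dict:
    """Return toy stimulation parameters for one electrode."""
    return {
        "frequency_hz": COLOR_TO_FREQUENCY_HZ[color],
        "amplitude_ua": amplitude_ua,
    }

print(encode_electrode("blue"))  # e.g. {'frequency_hz': 60.0, 'amplitude_ua': 25.0}
```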

Beyond the possibility of adding color vision to the bionic eye, hue encoding could be combined with artificial intelligence in future versions of the system, so that particularly important elements in a person’s surroundings, such as faces or doorways, stand out.

“There’s a long road, but we’re walking in the right direction,” Lazzi said. “We can gift these prosthetics with intelligence, and with knowledge comes power.”

Source: USC