VICUG-L Archives

Visually Impaired Computer Users' Group List

VICUG-L@LISTSERV.ICORS.ORG

Subject:
From: "Kennedy, Bud" <[log in to unmask]>
Reply-To: Kennedy, Bud
Date: Fri, 31 Aug 2001 14:04:04 -0400
Content-Type: text/plain
Parts/Attachments: text/plain (501 lines)

Discover Magazine

DISCOVER Vol. 22 No. 8 (August 2001)

Artificial Sight
Just because we don't understand how the brain interprets the messages it gets from the eye doesn't mean we can't help the blind see again
By Gregory Cerio

Photography by James Smolka

Glasses like these, developed by Wentai Liu and Chris DeMarco at North Carolina State University in Raleigh in collaboration with Johns Hopkins, may one day, along with a retinal implant, help the blind see. Harry Woehrle, a research subject at Hopkins, models the glasses: The tiny camera on the frame transmits an analog signal that is digitized and sent on its way - with luck - to the brain.

I tried an experiment not long ago, an experiment that involved eyesight. The goal was to experience what it's like to be on the cutting edge of vision technology. It was a test that, fortunately or unfortunately, I am well qualified to perform. You see, back in the 1960s, when I was 4 years old, I had a terrible accident. My sister Camille and I had gotten hold of two of those old, long-necked bottles of Pepsi, capped and full of soda. Morons that we were, we began playing The Three Musketeers, fencing with the glass bottles, clacking them together like swords. A shard flew into my right eye; Camille's legs were torn up a bit (our poor parents . . .). Surgery saved my eye, but the sight I have has always been extremely poor. I can just about make out the largest letter on the Snellen visual acuity chart.

Luckily my left eye is fine, but I wanted to find out how well I could get around with my right. I put cotton and tape over my good eye and took a walk. The room was brightly lit. I could make out doorways and see furniture as vague shapes, enough to distinguish a chair from a desk. I made my way outside to the newsstand and bought Wint O Green LifeSavers without tripping or falling. I couldn't watch TV. I certainly couldn't read. I couldn't really recognize faces. But I could see a friend hold her arms wide to give me a hug.

It wasn't much. But even the vision in my bad eye would mean the world to people like Harry Woehrle, who was blinded by retinitis pigmentosa, a hereditary disease that destroys the photoreceptor cells of the eye. He began to lose his sight as a young man. Now he can barely remember his children's faces. Recently remarried, he has never seen his wife, Carol.

Today Woehrle has hope that he might be able to see his loved ones again. He is a test subject for the Intraocular Retinal Prosthesis Group of the Wilmer Eye Institute at Johns Hopkins University, one of the leading programs in artificial vision research - a field that aims to use chip-driven microelectrodes to stimulate dormant neural tissues in the visual pathways of the blind. During the next year, Harry may be among the first to take an eye-chip shakedown cruise.

Hopkins researchers intend to implant pea-sized chip arrays into the eyes of a small group of blind volunteers like Woehrle as part of a yearlong, FDA-approved safety and feasibility trial. The array consists of a signal processor and microelectrodes that will excite neurons in the retina in a pattern that corresponds to the view of the world as captured by a camera mounted on a pair of glasses.

No one expects miracles. Giving patients the sort of eyesight I experience in my torn-up eye would be considered a thundering success. "If we can eventually help some blind people just to see a little bit, enough to get around unaided, that will be very exciting," says eye surgeon Mark Humayun, director of the Hopkins project. If retinal-chip implants work, they will aid only a fraction of the blind. (They will not help those born blind or those without a functioning optic nerve, and so other researchers are attempting to pipe patterned electronic stimuli directly into the brain's visual cortex, the place where sight is actually formed - see "Straight to the Brain.")

The eye is a supremely refined, highly organized instrument that acts, in effect, as a digital image processor. After light of different frequencies enters through the lens and cornea, it strikes the retina, the image-capturing membrane at the back of the eye. Less than 0.04 inches thick, the retina is ever so dense, with 10 layers of tissue containing more than 1 million neural cells and upwards of 150 million photoreceptor cells - the rods and cones. Photons of light prompt the rods and cones to release bursts of electrochemical charges. These charges set off a signal-processing chain, which digitizes the light into neural messages that travel through the optic nerve to the visual cortex. Any breakdown along that route can end the transmission. "Human beings have as much sensory processing circuitry devoted to sight as a bat has for hearing," notes James Weiland, a biomedical engineer who is studying the interface between the electronics and the retina for the Hopkins team. "Replacing even a piece of that circuitry is an awesome task."

This array of microelectrodes was implanted in a human eye at Johns Hopkins last year. When the array was charged in an E-shaped pattern, the patient successfully saw the letter E. Photograph courtesy of the Intraocular Retinal Prosthesis Group 2001/The Wilmer Eye Institute at Johns Hopkins University.

The Hopkins group and an equally prominent team at Harvard University and the Massachusetts Institute of Technology have both elected to go with an "epiretinal" chip that will rest against the inner wall of the eye. Success is far from assured, but faith in the idea is based partly on the accomplishments of the cochlear implant, a device that has helped many deaf people hear again. The cochlear implant is a bit baffling: Scientists do not fully understand how the brain learns to recognize speech as well as it does with the limited information the implant provides. The cause of most deafness is the loss of "hair cells" - antennalike cells that line the cochlea, a snail-shaped section of the inner ear. In healthy people, the hair cells pick up sound vibrations and translate them into electrochemical signals that are sent to the auditory nerve. The cochlear implant takes sound passed through a microphone and a sound processor and sends impulses to electrodes in the cochlea, which passes a signal to the auditory nerve. The device has restored a degree of hearing for 25,000 people.

Vision researchers are counting on the incredible plasticity demonstrated by the brain in response to the cochlear implant. William Heetderks, head of the neural prosthesis program at the National Institutes of Health, says, "This implant has gotten a lot of people wondering how the auditory system works. Given how little information is going into the brain, it's amazing the implant works as well as it does." If the brain is so resilient, he adds, "something similar may happen with the visual prosthesis."

The operation of the retinal implant systems being designed by the Harvard/MIT and Hopkins teams is similar to that of the cochlear implant: Data is taken up, encoded, then transmitted as patterned stimuli. Here's how the nearly identical epiretinal implants will work: A tiny charge-coupled device (CCD) camera, mounted on an eyeglass frame, captures and digitizes images of the outside world. The digital signal is sent to a belt pack that supplies power and transmits the data to the retinal chip by means of radio waves. The inch-long chip, which curves along the inner wall of the retina, contains a signal processor and as many as 100 disk-shaped platinum electrodes, each about the size of the tip of a human eyelash. The decoded signal from the CCD controls the firing pattern of the electrodes, which stimulate healthy neural cells that lie beneath the retina's inner surface.
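
For readers who think in code, that chain can be summarized in a short sketch. It is only an illustration: the 10-by-10 electrode grid (standing in for "as many as 100 electrodes"), the grayscale input, and the brightness-to-stimulation mapping are assumptions, not details of the Hopkins or Harvard/MIT designs.

    import numpy as np

    def encode_frame_for_implant(frame, grid=(10, 10), max_charge=1.0):
        """Toy model of the eyeglass-camera-to-retinal-chip path.

        frame:  2-D numpy array of grayscale pixel values (0-255) from the CCD camera.
        grid:   assumed electrode layout; the article says only "as many as 100 electrodes".
        Returns a grid-shaped array of stimulation levels (0..max_charge), standing in
        for the firing pattern the chip's signal processor would drive.
        """
        rows, cols = grid
        h, w = frame.shape
        pattern = np.zeros(grid)
        # Average the camera pixels that fall over each electrode's patch of the image.
        for r in range(rows):
            for c in range(cols):
                patch = frame[r * h // rows:(r + 1) * h // rows,
                              c * w // cols:(c + 1) * w // cols]
                pattern[r, c] = patch.mean() / 255.0 * max_charge
        return pattern

    # Example: a synthetic 120x120 frame with a bright vertical bar (a "doorway").
    frame = np.zeros((120, 120))
    frame[:, 50:70] = 255
    print(encode_frame_for_implant(frame).round(2))

In the actual system, of course, such a pattern would be encoded and radioed from the belt pack to the implanted chip rather than printed to a screen.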

While it seems like a straightforward system, the approach is fraught with challenges - and much work needs to be done before a fully functioning chip that works inside the eye is available. First, no one knows if the retina will tolerate a foreign device for a period of years. The eye is delicate and has difficulty fighting infection. Ideally, the epiretinal chip will be a permanent installation, but the Hopkins team has never left a chip inside a human eye for longer than 45 minutes. The Harvard/MIT group has kept an array inside an eye for a few months. This is going to be one of those "there's only one way to find out" scenarios. Hopkins researchers are confident the eye can live with the chip; they are more concerned about the microelectronics soaking in the equivalent of a tub of salt water - the vitreous humor, the watery gel that gives an eyeball its turgidity. "Imagine throwing a television set into the ocean," says Robert Greenberg, a former member of the Hopkins team. This is just half the problem, possibly the simpler half. Weiland believes "the human body will protect itself. What we need to do is protect the chip from the body." To solve that problem, the team has devised a hermetic seal for the chip made of titanium and ceramic that is impervious even to helium atoms, which are smaller than water molecules.

The fineness of the retinal membrane, especially when coupled with the eye's rapid movements, poses another challenge. "The notion of putting a computer chip, this slab of silicon, on the retina is problematic," says Joseph Rizzo, codirector with John Wyatt of the Harvard/MIT project. "The retina is the most delicate part of the eye, and you need a delicate way of communicating with it. Putting this brick on a surface that's like wet tissue paper, then shaking the wet tissue paper back and forth - it's not going to be good." Ideally, says Rizzo, what's needed is a mechanism that can hold the implant stable while suspending the device just above the retina. His group has experimented with a ring-shaped platform tucked behind the iris. The platform supports the implant's signal processor, while the microelectrode array is gently draped down to the retina on a ribbon of silicone-coated wires and held in place by a bonding agent. The Hopkins researchers intend to use tiny metal tacks to keep their implant in place.

The nature of the contact point between the retina and the stimulating electrodes raises tough issues that are as much a matter of physics as biology. The optic neurons that researchers are trying to stimulate are 50 to 100 micrometers beneath the retinal surface - only the width of a couple of hairs, but a huge distance in cellular terms. An electrical charge strong enough to stimulate these neurons sufficiently may generate so much heat that it burns retinal tissue. A less powerful, safer charge, however, may not stimulate neurons at all. Researchers have also struggled with questions regarding the proper frequency and kind of electrical current to use. Because the retinal tissue will build up a charge, they plan to use an alternating current so that the negative phase will cancel the positive phase of the charge before electricity can accumulate in the eye.
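
A charge-balanced, biphasic pulse is one common way to express that cancellation idea, and a toy version is easy to write down. The amplitude, phase width, and pulse shape below are placeholders chosen for illustration; the article does not give the actual stimulation parameters.

    import numpy as np

    def biphasic_pulse(amplitude_ua=100.0, phase_us=500.0, steps=100):
        """Build one charge-balanced biphasic current pulse (toy values).

        A positive (anodic) phase is followed by an equal and opposite negative
        (cathodic) phase, so the net charge delivered to the tissue is zero.
        """
        dt = phase_us / steps                      # microseconds per step
        positive = np.full(steps, amplitude_ua)    # +100 uA for 500 us
        negative = np.full(steps, -amplitude_ua)   # -100 uA for 500 us
        current = np.concatenate([positive, negative])
        net_charge_nc = current.sum() * dt * 1e-3  # uA*us (picocoulombs) -> nanocoulombs
        return current, net_charge_nc

    current, net_charge = biphasic_pulse()
    print(f"net charge per pulse: {net_charge:.6f} nC")   # ~0: nothing accumulates in the eye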

Harry Woehrle, with his wife, Carol, hopes he will receive a retinal implant. "I have no trepidation, even though no one knows what's going to happen until the thing is in there."

Finally, there is the matter of the size of the electrodes. As scientists try to create detailed vision, they are faced with a catch-22. Say each electrode is meant to create a pixel, as on a TV screen. Small electrodes will deliver a very localized stimulation to nerve cells, presumably resulting in more pixels and a sharper picture. But because the charge coming out of a smaller electrode is more concentrated, the charge is more likely to burn the retina. A larger electrode delivers a safer, more diffused charge but would create a fatter pixel and a less distinct image. After years spent working with human and animal subjects, Hopkins researchers have settled on electrodes 200 to 400 micrometers in size - tiny in real terms, but still 10 to 20 times the size of human neural cells. For now team members believe they have found a happy medium - the right charge level, the right frequency, and an electrode that can deliver a safe charge and a useful stimulus. Other artificial-vision researchers are not satisfied. "These retinas are very degenerated, and in order to get them to be responsive you have to stimulate them more strongly than a normal retina," says Rizzo. "In our experiments, that amount of charge can be unsafe. I think that the way this issue will resolve itself isn't known yet."
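
The trade-off comes down to charge density: for a disk electrode, density is charge divided by disk area, so the same pulse concentrated on a smaller disk lands much harder on the tissue. The 50-nanocoulomb pulse below is a purely illustrative number; only the 200-to-400-micrometer electrode sizes come from the researchers.

    import math

    def charge_density_uc_per_cm2(charge_nc, diameter_um):
        """Charge density for a disk electrode: charge / area.

        charge_nc:   injected charge per pulse, in nanocoulombs (illustrative value).
        diameter_um: electrode diameter in micrometers.
        Returns density in microcoulombs per square centimeter, the unit usually
        quoted for stimulation safety limits.
        """
        radius_cm = diameter_um * 1e-4 / 2          # 1 micrometer = 1e-4 cm
        area_cm2 = math.pi * radius_cm ** 2
        return (charge_nc * 1e-3) / area_cm2        # nC -> uC

    for d in (400, 200, 100):                       # same hypothetical 50 nC pulse
        print(d, "um electrode:", round(charge_density_uc_per_cm2(50, d), 1), "uC/cm^2")
    # Halving the diameter quarters the area, so the density quadruples,
    # which is exactly why smaller "pixels" are harder to keep safe.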

Even if researchers meet these challenges, a larger question remains: Will the brain be able to figure out what's going on? It would help if we understood what goes on in the mind of a healthy, seeing person. But we don't. "Nobody understands why or how perception exists. It's the question that has beset neuroscience," says Richard Normann, head of the cortical implant project at the University of Utah (see "Straight to the Brain"). "Why is a stop sign seen as red? Why is grass green? Nobody knows." Test subjects at Hopkins have identified a box shape. Patients in the Harvard/MIT group, blind for many years, have seen spots of light.

This is unknown scientific territory. Technology already exists that can tell the body to modify its behavior: pacemakers that jolt the heart into pumping rhythmically and electrical stimulators that allow quadriplegics to grasp, but these devices merely provoke muscular contractions. The cochlear implant basically buys the brain ingredients and then lets it cook the dinner. But the goal of artificial vision is to tell the brain something concrete and specific: We are firing electrodes in a pattern representing a doorway - see it. For now it's as if, in trying to communicate with the brain, scientists were writing a note to aliens from another planet. "We don't know the language," says Rizzo. "It's sort of like having the letters but not knowing how to combine them into words. And we don't even know all the letters. In this work, we know that the frequency and strength of the signal matters and all that, but there's no doubt that there are crucial variables about which we have no information or knowledge yet."

Humayun at Hopkins is willing to let the answers work themselves out once implants are inside people. He puts the timetable for a working, marketable retinal prosthesis at three to five years. Rizzo says that "if a safe implant with a reasonably high chance of success can be built at all," it is likely to take five to 10 years. Rizzo's team is not planning to run a trial anytime soon. "Being first would be nice, but it's not the highest priority," says Rizzo. "To move ahead with implantations, researchers should have very high confidence that the device can be left in safely for a long time and a reasonable level of confidence that the device would provide useful information to us and benefit to the patient. Right now that's a tall order."

For his own part, Humayun says: "I hope that, as scientists, we have enough integrity and love for our patients not to do anything hastily and to put only the best device possible in patients. As long as we work ethically and exercise care, I think we need to work faster so that millions of blind people, we hope, will be able to see sooner."

One person who agrees is Harry Woehrle. He has another important reason for wanting to go ahead with the trial. "I have nine grandchildren," he says, "and retinitis pigmentosa is an inherited disease. None of them has shown any sign of a problem, thank goodness. But if I can do something that might benefit them or kids in other generations, I'm all for it."

A Taste of Sight

Instead of trying to replicate the intricate workings of the eye, University of Wisconsin researchers have found a shortcut for transmitting crude pictures to the brain. The tongue human-machine interface, developed by Paul Bach-y-Rita and Kurt Kaczmarek, is a small patch made of tiny disks of gold attached to a flexible ribbon cable containing 144 electrodes. The patch can be connected to a camera and transmitter and activated in patterns to draw a rough sketch on a person's tongue.

The patch could be placed anywhere on the body, but skin isn't a great conductor of electrical signals, so the team picked the tongue as the ideal interface. Packed with nerves and constantly bathed in highly conductive saliva, it requires only 3 percent of the voltage needed to create the equivalent sensation on a fingertip.
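
In computational terms, "drawing a rough sketch on the tongue" means reducing a camera frame to 144 on/off decisions. The sketch below assumes the electrodes sit in a 12-by-12 grid and uses a simple brightness threshold; neither detail comes from the Wisconsin team, which is described only as using a 144-electrode ribbon cable.

    import numpy as np

    def tongue_pattern(frame, grid=(12, 12), threshold=128):
        """Sample a grayscale frame down to an on/off pattern for 144 electrodes."""
        rows, cols = grid
        h, w = frame.shape
        # Pick one sample pixel per electrode (crude nearest-neighbor sampling).
        samples = frame[h // (2 * rows)::h // rows, w // (2 * cols)::w // cols]
        return samples[:rows, :cols] > threshold

    frame = np.zeros((120, 120), dtype=np.uint8)
    frame[0:20, :] = 255      # top bar of a bright "T"
    frame[:, 50:70] = 255     # vertical stroke
    for row in tongue_pattern(frame):
        print("".join("#" if on else "." for on in row))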

Those who have tried the patch describe the feeling as a mild tingling, vibrating, or tickling. So far they have used the patterned pulses to navigate mazes or decipher simple graphics and found that their brains quickly adapt and start to "see" the scene. Bach-y-Rita points out that "the brain is very malleable," and because it is used to getting information as pulses along a nerve, "it doesn't matter whether those pulses are coming from the eye or the big toe, once the brain has been trained to process them visually."

The current prototype looks like a broad, electrode-studded tongue depressor; within five years Bach-y-Rita plans to build a smaller model, which would be discreetly concealed in a retainerlike frame. The resulting images could provide vision equivalent to about 20/830. "I don't think anyone's ever going to be able to sit down and watch TV with this thing," he says, "but in terms of recognizing shapes and basic navigation, it's more than adequate." - Jocelyn Selim and Christine Soares

Who's Got Good Eyes?

If you had the eyesight of an eagle, you could read this article from a football field away. (Downside: Your eyes would be the size of tennis balls.) If you had the eyesight of a dragonfly, you could read this magazine if it was held behind your head. (Downside: eyes the size of basketballs.) If you had the eyesight of a rhesus monkey, you could read this page if it was less than an inch in front of your eyes. (Downside: You'd be a rhesus monkey.) In the context of all creatures, we have eyes that are, well, not bad. "On a scale of one to 10, we rate about a seven," says Phillip Pickett, a veterinary ophthalmologist at Virginia Tech. "Raptors rate a 10. Rats are about a one. They're good at detecting motion, but that's about it." As Pickett points out, when it comes to sight, "best" can be defined several ways. One measure is distance. Hawks and eagles can spot a mouse in a field from hundreds of feet in the air. Then there's color. Human beings see three colors - red, green, and blue. Pigeons see violet, blue, blue-green, and yellow; bees perceive ultraviolet light, enabling them to discern the UV color patterns flowers make when producing nectar. These evolutionary adaptations allow animals to excel at a particular task. Humans evolved with senses in balance, so we aren't reliant on any one in particular. People who can't see have lives as full and rich as anyone else's. Indeed, it's arguable that our development has been limited by our eyesight. "Think about how early philosophy and cosmology were determined by what we could see - flat-earth theory, geocentrism, and the like," says Michael Robinson, former director of the National Zoo. "It wasn't until we extended our visual capabilities with telescopes and such that we realized our true place in the universe." - G.C.

Straight to the Brain

"We don't see with our eyes, we see with our brains" is a favorite maxim of
vision researchers - so jacking directly into the visual cortex of the brain would seem to be the most straightforward way to send it images. However, the brain is far more complex than the eye. Neuroscientists are still trying to figure out how the visual cortex translates a code of electrical pulses from the eyes into the 3-D color moving pictures we perceive as sight. Figuring out how to simulate that effect remains a still taller order.

As early as 1929, brain researchers knew that touching an electrode to the visual cortex of a conscious test subject produced the perception of a spot of light, dubbed a phosphene. Starting in the early 1970s, National Institutes of Health researchers worked toward a visual cortex prosthesis, culminating with a human experiment in 1995. Thirty-eight electrodes were implanted in the brain of a 42-year-old blind woman, and the NIH team tried to activate them. Results were mixed. The study demonstrated that phosphene percepts could be elicited even after 22 years of blindness, and that simple shapes could be constructed from the phosphenes. Yet the brightness and duration of the phosphenes the woman saw didn't correspond predictably to the stimulation. By the second month of testing, half the slender electrodes had broken. NIH pulled the plug on further human experimentation, concluding that visual cortex work "wasn't ready for prime time in people," says Audrey Penn, acting deputy director of the National Institute of Neurological Disorders and Stroke.

Today, Richard Normann at the University of Utah believes he is close to solving potential hardware problems for a visual cortex prosthesis with his Utah Electrode Array. The UEA is a single unit, about 0.16 inch square, with 100 silicon electrodes, each one-third the width of a human hair. Once the UEA is inserted, each electrode nestles between many neurons so that the implant floats with the brain's natural movement inside the skull, reducing the risk of electrode breakage or tissue damage. Because the electrode tips are in direct contact with neurons, far less power is needed to produce phosphenes than an eye chip would require to send a useful signal across retinal tissue. Eventually, Normann thinks, a 625-electrode version of the UEA could produce something on the order of a 625-pixel view of the world - enough perhaps to read text and probably adequate for navigating everyday terrain. - Christine Soares

RELATED WEB SITES:

To learn about the research being conducted by the Intraocular Retinal
Prosthesis Group at Johns Hopkins, see www.irp.jhu.edu.

MIT's Retinal Implant Project home page can be found at
rleweb.mit.edu/retina.

Find more about the tongue sensor, as well as a photo, at www.engr.wisc.edu/news/headlines/2001/Mar26.html.

Richard Normann's home page is www.bioen.utah.edu/faculty/RAN, and the Web
page of the Center for Neural Interfaces can be found at
www.bioen.utah.edu/cni.

© Copyright 2001 The Walt Disney Company.

VICUG-L is the Visually Impaired Computer User Group List. To join or leave the list, send a message to [log in to unmask]. In the body of the message, simply type "subscribe vicug-l" or "unsubscribe vicug-l" without the quotations.

VICUG-L is archived on the World Wide Web at http://maelstrom.stjohns.edu/archives/vicug-l.html

