VICUG-L Archives

Visually Impaired Computer Users' Group List

VICUG-L@LISTSERV.ICORS.ORG

From: Kelly Pierce <[log in to unmask]>
Date: Thu, 27 May 1999 06:15:25 -0500

From the web page:

http://www.techweek.com/articles/5-17-99/access.htm

TechWeek

Equal Access

Some dedicated people are helping the disabled 
participate in the computer revolution

by David Becker



As founder of the Sun Microsystems accessibility team, Earl Johnson helped
make the Java platform more adaptable for disabled users. 

Of all the groups that stood to benefit from the computer revolution, few
had more to gain than the disabled. For people so often cut off from
information by physical barriers, networked PCs promised vast new
opportunities for employment, independence and creative expression.

The reality has turned out to be a little more limited. Computers may be
easier to access than libraries or Braille books, but numerous aspects of
hardware and software design still make their use awkward at best for blind,
mobility-impaired and other disabled users.

And because the disabled comprise a relatively small potential market, they
have been low on the priority list for many engineers and developers.

Progress is being made, however, thanks to a combination of technological
innovation, government pressure and do-it-yourself inventiveness.

"I think we’re in the same place racial integration was 10 years after the
Civil Rights Act," says James Fruchterman, president of Arkenstone, a
Mountain View nonprofit that develops and sells computer products for the
blind and other disabled users. "People understand there’s an obligation to
be inclusive of the disabled, but they’re not always clear what that means."

Advocates for the disabled point to recent advances such as the World Wide
Web Consortium’s newly adopted accessibility guidelines, a set of simple
standards to make Web pages more useful to blind users who rely on devices
that pronounce text on a computer screen. And the federal government
continues to push the Americans With Disabilities Act into cyberspace,
requiring government agencies to improve disabled access and inspiring
private employers to follow suit.
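
One of the simplest of those standards is supplying a text alternative for
every image, so that a speech device has something meaningful to pronounce.
As a rough illustration only (a hypothetical Java sketch using a crude
regular-expression heuristic, not an official W3C tool or a real HTML
parser), a page author could flag images that lack alt text like this:

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class AltTextCheck {
        // Crude heuristic: find <img ...> tags and report any without an alt attribute.
        private static final Pattern IMG_TAG =
                Pattern.compile("<img\\b[^>]*>", Pattern.CASE_INSENSITIVE);
        private static final Pattern ALT_ATTR =
                Pattern.compile("\\balt\\s*=", Pattern.CASE_INSENSITIVE);

        public static void main(String[] args) {
            String html = "<p>Weather: <img src=\"map.gif\"></p>"
                        + "<p><img src=\"logo.gif\" alt=\"Company logo\"></p>";

            Matcher m = IMG_TAG.matcher(html);
            while (m.find()) {
                String tag = m.group();
                if (!ALT_ATTR.matcher(tag).find()) {
                    System.out.println("Missing alt text: " + tag);
                }
            }
        }
    }

Alternative text for images is only one item in the guidelines, but it
captures the spirit: anything presented visually needs an equivalent the
speech device can work with.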

"The federal government is really leading the way," says Earl Johnson,
founder of the accessibility team at Sun Microsystems. "The customers who
are really important are the big customers who write big checks, and a lot
of those are in the realm of government and education. They are more and
more stipulating accessibility 

guidelines, a set of simple standards to make Web pages more useful to
blind users who rely on devices that pronounce text on a computer screen.
And the federal government continues to push the Americans With
Disabilities Act into cyberspace, requiring government agencies to improve
disabled access and inspiring private employers to follow suit.

"The federal government is really leading the way," says Earl Johnson,
founder of the accessibility team at Sun Microsystems. "The customers who
are really important are the big customers who write big checks, and a lot
of those are in the realm of government and education. They are more and
more stipulating accessibility in their contracts."

The payoff is considerable, Johnson notes, given historically high
unemployment among the disabled.

"Computers basically level the employment playing field," he says. "If
they’re designed with accessibility in mind, it removes the physical aspect
of the job. You’re sitting and pressing keys. It makes it possible to
mainstream people with disabilities."

Here’s a look at a few Silicon Valley individuals and companies who are
helping make that happen:

Auditory user interface

The graphical user interface is one of the foundations of the modern PC
revolution, allowing a wide range of users to access information without
learning program commands or other special skills.

The GUI doesn’t do a thing, however, for users who can’t see all the
colorful little icons.

The typical solution for blind users has been to outfit a machine running
Windows or the Mac OS with a screen reader, a hardware-software package
that reads and pronounces every item on the computer screen.

It’s a basic but far-from-elegant approach, given that much of a computer’s
visual output relies on graphic conventions that don’t translate well into
speech. Consider the way a screen reader presents a calendar: "April, Nine,
Nine, Su, Mo, Tu …"

Enter Emacspeak, an open-source program that not only speaks information
but interprets it, presenting it in a clear and logical format for users
who rely on sound.

"It is for the speech user what a Windows 95 desktop is for the sighted
user," says Mountain View programmer T.V. Raman, developer of the auditory
user interface. "It gives you a pleasant auditory interface for all your
computing tasks."

Raman, who works in the Advanced Technology Group at Adobe Systems, began
working on the program five years ago, after wrapping up doctoral work at
Cornell University on auditory computer interfaces. Blind since childhood,
he found that the more he worked with computers, the more awkward and
limiting conventional accessibility solutions seemed.




"Everything that happens on the screen with a standard interface is
designed to make you visually efficient," he says. "For the blind user, the
screen reader is always there on the side trying to read the tea leaves."

Raman decided it was time to build a solution from the ground up. Besides
reading words, his system offers a wealth of auditory cues, switching
voices to distinguish between various applications and presenting quick,
distinctive sound bites to mark commands such as "cut" and "paste."
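
To make the idea concrete (a purely hypothetical sketch, written in Java
for illustration; Emacspeak itself is an extension built in Emacs Lisp, and
the voices and cues below are invented), an auditory interface associates
each application with a distinct voice and each command with a short,
recognizable sound cue, rather than reading raw screen text:

    import java.util.HashMap;
    import java.util.Map;

    public class AuditoryCues {
        // Invented mappings: applications get distinct voices,
        // commands get short, recognizable sound cues ("earcons").
        private static final Map<String, String> APP_VOICES = new HashMap<>();
        private static final Map<String, String> COMMAND_CUES = new HashMap<>();

        static {
            APP_VOICES.put("mail", "low, measured voice");
            APP_VOICES.put("editor", "brighter, faster voice");
            COMMAND_CUES.put("cut", "short falling tone");
            COMMAND_CUES.put("paste", "short rising tone");
        }

        // Announce an event: play the command's cue, then speak the text
        // in the owning application's voice (simulated here with println).
        static void announce(String app, String command, String text) {
            String cue = COMMAND_CUES.getOrDefault(command, "soft click");
            String voice = APP_VOICES.getOrDefault(app, "default voice");
            System.out.println("[" + cue + "] (" + voice + ") " + text);
        }

        public static void main(String[] args) {
            announce("editor", "cut", "Removed 3 lines");
            announce("mail", "paste", "Text inserted into reply");
        }
    }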

"You have to think these things through," Raman says. "What does it mean to
tell an e-mail application to talk to you? The screen reader doesn’t
realize it’s an e-mail message, doesn’t distinguish it from any other text."

Raman used Emacs, an open-source system that predates Linux, because of the
ease of access it offers.

"The way Emacs is architectured, it’s very easy to extend it," he says.
"It’s a large open-source system, it’s flexible, it’s powerful."

The initial version of Emacspeak focused on basic tasks such as text
editing, but subsequent additions have turned the program into a
full-featured computing environment—it’s all Raman uses in his demanding
work at Adobe. One of the biggest challenges in extending the audio desktop
was coming up with an auditory browser that would make sense of the visual
clutter of the Web.

"One of the things I think Emacspeak excels at is getting information from
the Web and maneuvering around the various buttons and banners," he says.
"A Web page like Yahoo is just packed with stuff, and a screen reader reads
every piece of text, every link and button, every ad. It becomes so painful
to get what you want that you either need enormous patience or you don’t
use the damn thing. It’s like being dropped into a stranger’s living room
and it’s completely cluttered.

"Things that are a nuisance to you visually are an absolute showstopper for
someone who’s blind."

In true open-source fashion, Raman doesn’t know how many users rely on
Emacspeak. The program, which Raman updates twice a year, can be downloaded
from Raman’s home page. It’s also included on most Linux installation CD-ROMs.

"For the first year or so, I knew every user by name, but now the best I
can tell you is that there are 400 to 500 people on the Emacspeak mailing
list," he says.

Raman notes that while Emacspeak is purely a tool for the blind now, it can
also be considered a prototype for general-purpose computing in the future,
as ubiquitous computing and speech activation take hold. 

"The blind user using a screen reader is a bad model for developers working
on speech for those who can see," he says, noting the push toward
automotive computing. "With a good auditory interface, you could use the
Web or check your e-mail while driving.

"Ultimately it’ll be more than the blind user who benefits from having a
well-designed auditory interface."

Java lends a hand

While tremendous progress has been made in the past few years in making
computers more accessible to the disabled, most of the improvements are
still add-ons. You add a screen reader, a specialized pointing device or a
piece of speech-synthesis software to a PC, load in the required drivers
and hope it gets along with whatever applications you need to use.

Earl Johnson thought it would be better to build support for such devices
into the software platform. Then developers could easily include
accessibility features in their applications. That’s the idea behind the
new Java Accessibility Application Programming Interface (API).

"It’s a contract between assistive technologies and the Java user interface
components," says Johnson, founder of the accessibility team at Java
originator Sun Microsystems. "The accessibility support is built directly
into the user interface, so the developer doesn’t need to do a lot of
special work to make their application accessible."

The API supports a number of the most common assistive technologies—screen
magnifiers, voice-recognition software, speech synthesis software, screen
readers. Johnson says building support for such systems into the Java
platform offers a number of advantages, most notably making it easy for
programmers to include the disabled when they design an application.

"If you have to add components to add a capability, it becomes a pain in
the butt for developers and they’re more likely to skip it," he says. "But
if you can build in these accessibility features and make it easy,
something developers get for free when they use a component, there’s a much
greater likelihood they’ll pay attention to it."
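
For a sense of what "built directly into the user interface" means (a
minimal sketch using the javax.accessibility and Swing classes; the names
and descriptions here are made up for the example), every Swing component
already carries an AccessibleContext that the developer can enrich and an
assistive technology can query:

    import javax.accessibility.AccessibleContext;
    import javax.swing.JButton;

    public class AccessibleButtonDemo {
        public static void main(String[] args) {
            // An ordinary Swing component is already an Accessible object.
            JButton save = new JButton("Save");

            // The developer adds richer information for assistive technologies.
            AccessibleContext ctx = save.getAccessibleContext();
            ctx.setAccessibleName("Save document");
            ctx.setAccessibleDescription("Saves the current document to disk");

            // A screen reader or other assistive technology reads the same contract.
            System.out.println("Name: " + ctx.getAccessibleName());
            System.out.println("Description: " + ctx.getAccessibleDescription());
            System.out.println("Role: " + ctx.getAccessibleRole());
        }
    }

The point is the one Johnson makes: the developer writes no special-case
code for the screen reader; the same AccessibleContext is simply there for
whoever asks.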

The API also makes support for such features more stable. You don’t have to
worry about losing the use of a voice recognition program when an operating
system is updated, as happened to Johnson, a quadriplegic, several years ago.

The API approach also improves voice recognition, allowing users to set up
word "clusters" so the program doesn’t have to consult its entire
dictionary to decipher a word. And it expands screen-reader technology by
better recognizing graphical presentations and looking beyond the screen.

"Because of the accessibility API, you can keep track of information
throughout the desktop, even if it isn’t in the area of focus," says
Johnson. "You won’t miss out on a message that the system is going down in
10 minutes because you were using your word processor."

Combined with earlier Java innovations such as Project Swing, which allows
for easy switching to keyboard navigation for those who can’t use a mouse,
the API has helped Java emerge as a full-featured platform for disabled users.

"The thing that’s really exciting is that we’re releasing assistive
technology so closely on the heels of when Java was originally released,"
says Johnson. "For example, the first version of Windows came in 1988 and
it wasn’t until 1995 that a screen reader came out for the platform. It
took us just three years since Java was introduced to release an assistive
technology that works with the platform."

Custom-made OCR

James Fruchterman knew he had something that could be a tremendous help to
the blind when he helped develop one of the first optical character
recognition (OCR) systems for scanners in the 1980s. The directors of his
company, Calera Recognition Systems, agreed the blind might indeed benefit
from OCR but decided they were too small a market to be a good business
target.

"That’s when I basically decided I needed to go off and start a nonprofit
to work on this project and not to have to worry about profitability," says
Fruchterman, founder of Arkenstone, an organization that develops computer
products for the blind and other disabled users.

The company’s first product combined Fruchterman’s OCR system (licensed
from Caere, the firm that bought Calera) with a scanner and a speech
synthesizer to create an all-purpose reading system for the blind. Place
any printed text on a scanner connected to a PC running Open Book, and the
software interprets, stores and pronounces the words.

"My idea is that the easiest solution that works is the one you should
pick," Fruchterman says. "I wanted something that would allow any
Pentium-class system with a scanner to become a reader."

The Open Talk software subsequently gave birth to WYNN (What You Need Now),
a reading system specially designed for the needs of dyslexic and other
learning-disabled users. Open Talk is also the basis of VERA (Very Easy
Reading Appliance), a stripped-down version of the system that puts
everything in a wooden case with just a few controls to manipulate.

"It’s really for the older user who runs away when they hear ‘computer,’ "
says Fruchterman. "It’s for people who don’t want complex technology but
want reading capability."

Fruchterman says response to the Open Talk series has been gratifying.

"A lot of blind people don’t want to be dependent on someone else," he
says. "You can get books in accessible formats, but there’s a lot of other
necessary reading—mail, newspapers—that isn’t accessible. Most blind people
who don’t have assistive technology have to rely on a spouse or attendant
to read to them.

"One person told me there were two things in life he really wanted to be
able to do—read a book and drive a car. We’d covered the reading part, so
he wanted to know when we were going to have something to let him drive."

While it won’t put blind people behind the wheel, Arkenstone’s latest
product takes a big step toward making the blind more independently mobile.
Atlas Speaks adds a speech interface to digital mapping firm Etak’s huge
database of U.S. street maps. Blind users can type in two addresses and get
precise walking directions, which can then be saved on paper using a
Braille printer.

"By hitting the keys and listening, you get the same information as a
sighted person using a street map," says Fruchterman. "That means getting
to know your own neighborhood, finding intersections, determining the best
walking route to a destination."

For blind users already adept at using canes or guide dogs to get around,
Atlas Speaks allows them to plan a precise and safe route to their
destination. 

"We’ve taken this to blind conventions and people are amazed," Fruchterman
says. "They type in their address and they find things in the neighborhood
they never knew were there."

The next generation of the device will combine the Atlas Speaks software
with sophisticated GPS receivers, which will tell the blind pedestrian
exactly where he is and how to get to his destination. One of Arkenstone’s
sales executives already uses a prototype of the system, dubbed Strider, to
get around strange cities.

But Fruchterman says a number of factors, ranging from the deliberately
fuzzy GPS signals supplied by government satellites to the size and battery
life of the average laptop computer required to run such a system, stand in
the way of putting a final product on the market.

"We’ve stopped making projections for when we’ll have Strider ready for
release, but it’ll happen," he says. 

While Arkenstone is helping the disabled achieve new levels of
independence, it’s also bringing a new business sense to assistive
technology by overcoming the profit limitations that prevent many
corporations from pursuing disabled applications. Fruchterman notes that
Open Book utilizes technology licensed from 15 different for-profit firms,
all of whom were happy to provide their products at discounted rates.

"We’re get incredibly favorable licensing deals and other kinds of help out
of proportion to our size," he says. "Every engineer likes to think he’s
making the world a better place. When they see how we’re going to use their
work, it taps into that spirit."

And it also makes good business sense because assistive devices are often
precursors to mass-market items. Just consider how speech recognition
technology has entered the mainstream.

"Everything we’re doing with assistive technology is going to be a
mass-market item once it reaches the right price point," Fruchterman says.
"We won’t need to exist at some point."


Staff writer David Becker can be reached at [log in to unmask]


VICUG-L is the Visually Impaired Computer User Group List.
To join or leave the list, send a message to
[log in to unmask]  In the body of the message, simply type
"subscribe vicug-l" or "unsubscribe vicug-l" without the quotations.
 VICUG-L is archived on the World Wide Web at
http://maelstrom.stjohns.edu/archives/vicug-l.html

