VICUG-L Archives

Visually Impaired Computer Users' Group List

VICUG-L@LISTSERV.ICORS.ORG

Subject:
From: Kelly Ford <[log in to unmask]>
Reply To: Kelly Ford <[log in to unmask]>
Date: Wed, 1 Sep 1999 12:10:35 -0700
Content-Type: text/plain
Parts/Attachments: text/plain (166 lines)

Hi All,

It has been a while since I looked into this testing.  Does anyone know
what's happening with access technology and this testing software, or in
general how the tests are given to people who are blind these days,
especially if this article is correct that the paper format is being phased
out entirely?  It was my impression that, with proper notice, one could get
an alternative format of the paper version.

From the web page:

http://www.wired.com/news/print_version/email/explode-infobeat/culture/story/21531.html?wnpg=all

updated 11:15 a.m. 1.Sep.99.PDT

An E-Hurdle to Grad School
by Kendra Mayfield

3:00 a.m.  1.Sep.99.PDT

When Amy Cuddy signed up to take the computerized Graduate Record Exam last
year, she felt confident she would do well.
After sailing through the test, she was shocked by her scores: a 300 on the
analytical section and a 550 on the quantitative. She realized there had to
be an error.

"You would have to try pretty hard to get that low of a score," said Cuddy.
"I thought, it's over. I'm not going to graduate school anywhere."

A month later, Cuddy took the exam again, this time with a pencil and
paper. Her analytical score jumped 390 points, bumping her from the 3rd to
the 84th percentile. Cuddy filed a complaint with the Educational Testing
Service, requesting that it remove her computerized score and reimburse her
for the fees.

Two weeks ago, Cuddy won her settlement. ETS agreed to cancel the earlier
scores and refunded approximately US$350.

But graduate school applicants no longer have the option to take the
paper-and-pencil version of the GRE. Like Cuddy, many are daunted by the
weight computerized test scores carry in determining their academic future.

In April, the GRE program began phasing out pencil-and-paper exams in favor
of the Computer Adaptive Test, or CAT.

The computerized test has some benefits, to be sure. No more paper
registration or waiting weeks for forms to arrive. Instant scoring means
instant gratification, allowing students to report their scores immediately
to their chosen schools.

And test dates are flexible: The computerized GRE is now offered on 150
dates per year, rather than five or six.

But the change in format has some students wishing they had pencils to chew
on and paper to scratch.

On the computerized exam, students don't have the option to skip or
re-visit questions. And spending too much time on a question can trip up a
meticulous test-taker.

Unlike the pencil-and-paper format, where all test-takers receive the same
set of questions, the computerized test presents each person with a
different set of individually tailored questions. If you get a question
right, the computer shows you a more difficult question. If you get it
wrong, the computer gives you an easier one.
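
For the curious, here is a rough, hypothetical Python sketch of the kind of
select-harder/select-easier loop described above. The question bank, the
1-to-10 difficulty scale, the scoring rule, and the simulated test-taker are
all illustrative assumptions, not ETS's actual item-selection method.

import random

def pick_question(bank, target):
    """Pick the unused question whose difficulty is closest to the target."""
    unused = [q for q in bank if not q["used"]]
    return min(unused, key=lambda q: abs(q["difficulty"] - target))

def run_adaptive_section(bank, ability, num_questions=10, start=5):
    """One adaptive section: a right answer raises the difficulty of the
    next question, a wrong answer lowers it, as the article describes."""
    difficulty, score = start, 0
    for _ in range(num_questions):
        q = pick_question(bank, difficulty)
        q["used"] = True
        # Simulated test-taker: the further the question difficulty sits
        # above their ability, the less likely a correct answer becomes.
        correct = random.random() < 1 / (1 + 2 ** (q["difficulty"] - ability))
        if correct:
            score += q["difficulty"]              # harder items count for more
            difficulty = min(10, difficulty + 1)  # serve a harder question next
        else:
            difficulty = max(1, difficulty - 1)   # serve an easier question next
    return score

if __name__ == "__main__":
    # Illustrative bank of 50 questions, five at each difficulty from 1 to 10.
    bank = [{"difficulty": d, "used": False}
            for d in range(1, 11) for _ in range(5)]
    print("simulated section score:", run_adaptive_section(bank, ability=6))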

FairTest public education director Robert Schaeffer said that by favoring
those more inclined to guess, the computerized format "favors brash, white
males" over women and minorities.

"The computerized format adds to the existing bias of the paper-and-pencil
version of the exam," said Schaeffer. "This penalizes the slow, careful,
methodical student in favor of the fast and superficial."

Test-takers who shift strategies to adapt to the computerized format may
find they have not performed as well as they thought while taking the test.

"People try to gauge the difficulty level as they go along," said
instructor Megan Katin of the The Princeton Review, which offers CAT test
preparation classes and resources.

The higher cost of the computerized exam may be a barrier to some students.
Test fees have increased 75 percent, from $55 for the paper-based GRE to
$96 for the CAT.

Then there is the psychological stress.

"People tend to be a little more panicked about taking the test on the
computer than they ever were with the pencil-and-paper format," said
Cathryn Still, managing director of graduate programs for the Princeton
Review.

For computer-phobic test-takers, the idea of adapting to a computer or
staring at a blank screen while waiting for test scores increases the
anxiety. Although the computerized test requires only basic processing
skills, those who are unfamiliar with computers may be disadvantaged.

"The computerized format does not create a fair testing environment," said
Still, adding, "The architecture of the test is definitely going to affect
performance."

ETS has not released much original practice material for the computerized
tests.

Not that it matters. Still said the practice computerized tests aren't
sufficient preparation for the actual CAT. Test-takers can't check their
answers on the computerized practice test as they could on the old paper
versions, for one thing.

And the official CAT doesn't let test-takers view individual results, either.

"The biggest problem is that the test is not disclosed," said Cuddy, who
was notified of her final score but had no way to check her answers.

Both ETS and Princeton Review say the switch to computerization has not
resulted in a dramatic change of scores across the applicant pool. In
general, computerized scores tend to correlate with those of the written
version.

"The questions and subjects are the same, the only difference is the way
answers are presented," said ETS spokesman Kevin Gonzalez. ETS studies have
found "the mode of testing either helped people or didn't affect them at
all." And, in some cases, students preferred to take the computer-based
version.

Test advocacy groups argue that test-takers should still be able to choose.

"These are clearly not the same tests and are not measuring the same
constructs," said Cuddy. "Until the glitches are worked out, you should
still have a choice."

Cuddy's case is telling for students who may worry about the possibility of
computer error threatening their scores.

During the initial transition to computerized testing, many test centers
experienced technical problems, from computer screen blackouts to frozen
mouses.

Since 1997, the Graduate Management Admission Test has been given only in
computer format, and in December 1998 the entire national computerized GMAT
system crashed. "Black screen death" is enough to increase test anxiety,
even if scores are correctly reported, FairTest's Schaeffer said.

The College Board and the Law School Admission Council are both exploring
the possibility of computerizing the Scholastic Assessment Test and the Law
School Admission Test. The College Board has launched a pilot program to
offer a computerized SAT in schools this spring.

Unlike ETS, which has a monopoly on graduate-level standardized testing,
the College Board cannot yet afford the costs of switching to a
computerized format because students might opt to take the alternative ACT
Assessment test.

Widespread computerization of the SAT is probably years away. The College
Board wants to devise a testing network that will minimize costs, ensure
comparable scores, and be able to handle the overwhelming number of college
applicants.


VICUG-L is the Visually Impaired Computer User Group List.
To join or leave the list, send a message to
[log in to unmask]  In the body of the message, simply type
"subscribe vicug-l" or "unsubscribe vicug-l" without the quotations.
 VICUG-L is archived on the World Wide Web at
http://maelstrom.stjohns.edu/archives/vicug-l.html

