VICUG-L Archives

Visually Impaired Computer Users' Group List

VICUG-L@LISTSERV.ICORS.ORG

Subject:
From:
Reply To:
Date:
Fri, 5 Mar 2010 21:20:10 -0500
'Skinput' turns body into touchscreen interface
Tapping on arm allows users to scroll through menus and select options
By Dan Hope
TechNewsDaily
updated 11:44 a.m. ET, Thursday, March 4, 2010

Touchscreens may be popular both in science fiction and real life as the
symbol of next-gen technology, but an innovation called Skinput suggests the
true interface of the future might be us.

Microsoft and Carnegie Mellon University unveiled Skinput recently, showing
how it can turn your own body into a touchscreen interface.

Skinput uses a series of sensors to track where a user taps on their arm.
Previous attempts at projected interfaces relied on motion tracking to
determine where a person tapped.

Skinput uses a different and novel technique: It "listens" to the vibrations
in your body.

Tapping on different parts of your arm creates different kinds of vibrations
depending on the amount and shape of bones, tendons and muscle in that
specific area. Skinput sensors can track those vibrations using an armband
and discern where the user tapped.
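In software terms, this comes down to a classification problem: each tap location produces a distinctive vibration signature, and a trained model maps new signatures back to locations. The following is a minimal sketch of that idea, not the researchers' actual system; the feature vectors, location names, and the nearest-centroid classifier (standing in for their real machine-learning model) are all illustrative assumptions.

```python
import math

# Illustrative vibration "signatures": made-up feature vectors, e.g.
# energy measured in a few frequency bands after a tap. In the real
# system these would come from the armband's sensors.
TRAINING = {
    "forearm":   [[0.9, 0.2, 0.1], [0.8, 0.3, 0.1]],
    "wrist":     [[0.2, 0.9, 0.3], [0.3, 0.8, 0.2]],
    "fingertip": [[0.1, 0.2, 0.9], [0.1, 0.3, 0.8]],
}

def centroid(vectors):
    """Average the training vectors for one location."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

CENTROIDS = {loc: centroid(vs) for loc, vs in TRAINING.items()}

def classify_tap(features):
    """Return the location whose centroid is nearest to the new signature."""
    return min(CENTROIDS, key=lambda loc: math.dist(CENTROIDS[loc], features))

print(classify_tap([0.85, 0.25, 0.1]))  # prints "forearm"
```

The "retraining" Harrison mentions later corresponds to refreshing the training signatures as the armband shifts or conditions change.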

"Accuracy is already good, in the high 90s percent accuracy for finger
input," said project team member Chris Harrison, from Carnegie Mellon's
Human-Computer Interaction Institute.

"The arm band is a crude prototype," Harrison said. "The next generation
could be made considerably smaller - likely easily fitting into a
wristwatch."

From there it's fairly simple to associate those tappable areas with
different commands in an interface, just as different keystrokes and mouse
clicks perform different functions on a computer.

When coupled with a small projector, Skinput can simulate a menu interface
like the ones used in other kinds of electronics. Tapping on different areas
of the arm and hand allows users to scroll through menus and select options.

Skinput could also be used without a visual interface. For instance, with an
MP3 player one doesn't need a visual menu to stop, pause, play, advance to
the next track or change the volume. Different areas on the arm and fingers
simulate common commands for these tasks, and a user could tap them without
even needing to look.
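Once a tap has been localized, eyes-free control like this amounts to a simple lookup from body location to player command. A hedged sketch, with hypothetical location and command names (none of these bindings come from the actual project):

```python
# Illustrative binding of tap locations to MP3-player commands.
COMMANDS = {
    "thumb":         "play_pause",
    "index_finger":  "next_track",
    "middle_finger": "previous_track",
    "upper_forearm": "volume_up",
    "lower_forearm": "volume_down",
}

def handle_tap(location):
    """Dispatch the command bound to a tap location; ignore unbound taps."""
    command = COMMANDS.get(location)
    return command if command is not None else "ignored"

print(handle_tap("thumb"))  # prints "play_pause"
print(handle_tap("elbow"))  # prints "ignored" -- no command bound there
```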

Skinput is the product of a collaboration between Carnegie Mellon's Harrison
and Desny Tan and Dan Morris of Microsoft Research. For now, Skinput is only
a proof-of-concept for alternate ways to interface with electronics, but the
team isn't ruling out that it could become a commercial product someday.

Harrison also pointed out that the next generation of miniature projectors
will be small enough to fit in a wristwatch, making Skinput a complete and
portable system that could be hooked up to any compatible electronics no
matter where the user goes.

Besides being bulky, the prototype has a few other kinks that need to be
worked out. For instance, over time the accuracy of interpreting where the
user taps can degrade.

"We (the researchers) have worn it for extended periods of time," Harrison
told TechNewsDaily. "But it does occasionally need to be retrained. As we
collect more data, and make the machine learning classifiers more robust,
this problem will hopefully reduce."

Skinput and similar sensor devices developed by the team could have
applications beyond simple menu screens. Tan recently demoed a Skinput-like
interface that allowed him to play Guitar Hero, a popular music game,
without the requisite plastic guitar controller. The results were still a
little crude, but impressive because they proved the viability of game
controllers that don't require physical controls.

This is especially relevant given the Project Natal technology Microsoft is
developing for the gaming industry, which has been gathering a lot of
attention. Despite working in vastly different ways, both systems focus on
letting users play games with their own bodies, without the need for
accessories and game controllers.


    VICUG-L is the Visually Impaired Computer User Group List.
Archived on the World Wide Web at
    http://listserv.icors.org/archives/vicug-l.html
    Signoff: [log in to unmask]
    Subscribe: [log in to unmask]
