The Sign Language Interfacing System: a working platform in which users can create, edit, save, and retrieve (1) 3D virtual human gestures, (2) sign language "words", and (3) virtual signing sequences.
Current stage:
Common body postures (including hand configurations) have been
built and saved in the databases.
The body postures in the databases can be retrieved and edited to construct new body postures (and hand configurations). The newly constructed postures can also be saved in the databases.
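The summary does not describe how postures are actually stored. As a minimal sketch of the idea, assuming a posture is a named set of joint rotations kept in an SQLite table (the table name, function names, and joint names below are illustrative assumptions, not the system's real schema):

```python
import json
import sqlite3

# Hypothetical schema: one row per posture, joint rotations stored as JSON.
def open_posture_db(path="postures.db"):
    db = sqlite3.connect(path)
    db.execute("CREATE TABLE IF NOT EXISTS postures (name TEXT PRIMARY KEY, joints TEXT)")
    return db

def save_posture(db, name, joint_angles):
    """Insert or update a posture; joint_angles maps joint name -> [x, y, z] rotation in degrees."""
    db.execute("INSERT OR REPLACE INTO postures VALUES (?, ?)", (name, json.dumps(joint_angles)))
    db.commit()

def load_posture(db, name):
    """Retrieve a saved posture so it can be edited into a new one."""
    row = db.execute("SELECT joints FROM postures WHERE name = ?", (name,)).fetchone()
    return json.loads(row[0]) if row else None

db = open_posture_db()
save_posture(db, "FLAT_HAND", {"r_wrist": [0, 0, 0], "r_index_mcp": [5, 0, 0]})
edited = load_posture(db, "FLAT_HAND")
edited["r_index_mcp"] = [45, 0, 0]        # edit a retrieved posture...
save_posture(db, "BENT_INDEX", edited)    # ...and save it as a new posture
```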
Basic facial expressions are available for the creation of sign
language Nonmanual Signals (NMS).
A virtual signing sequence can be built from a set of key postures retrieved from the database; intermediate postures are automatically interpolated to produce a smooth signing animation. The signing sequence can be edited so that the resulting sign is grammatically correct in the target sign language.
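The interpolation method itself is not specified here. As a rough sketch of the idea, assuming simple linear interpolation of joint angles between consecutive key postures (a production system would more likely interpolate joint rotations as quaternions), the intermediate frames could be generated like this; `interpolate_postures` and the joint names are illustrative only:

```python
def interpolate_postures(key_postures, frames_between=10):
    """Insert intermediate postures between consecutive key postures by
    linearly interpolating each joint's [x, y, z] rotation angles."""
    animation = []
    for start, end in zip(key_postures, key_postures[1:]):
        for step in range(frames_between):
            t = step / frames_between
            animation.append({
                joint: [(1 - t) * a + t * b for a, b in zip(start[joint], end[joint])]
                for joint in start
            })
    animation.append(key_postures[-1])  # finish exactly on the last key posture
    return animation

# Two key postures (e.g. retrieved from the database) expand into a smooth sequence.
frames = interpolate_postures([
    {"r_elbow": [0, 0, 0],  "r_wrist": [0, 0, 0]},
    {"r_elbow": [90, 0, 0], "r_wrist": [0, 30, 0]},
])
```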
Virtual signing sequences can be saved to and retrieved from the databases.
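Continuing the same assumptions, a sign ("word") could be stored simply as an ordered list of key-posture names and rebuilt into an animation on retrieval. This sketch reuses the hypothetical `load_posture` and `interpolate_postures` helpers from the sketches above and is not the system's actual design:

```python
def save_sign(db, gloss, key_posture_names):
    """Store a sign ("word") as an ordered list of key-posture names (hypothetical schema)."""
    db.execute("CREATE TABLE IF NOT EXISTS signs (gloss TEXT PRIMARY KEY, keys TEXT)")
    db.execute("INSERT OR REPLACE INTO signs VALUES (?, ?)", (gloss, json.dumps(key_posture_names)))
    db.commit()

def play_sign(db, gloss):
    """Retrieve a sign and rebuild its animation from the stored key postures."""
    row = db.execute("SELECT keys FROM signs WHERE gloss = ?", (gloss,)).fetchone()
    key_postures = [load_posture(db, name) for name in json.loads(row[0])]
    return interpolate_postures(key_postures)

save_sign(db, "HELLO", ["FLAT_HAND", "BENT_INDEX"])
frames = play_sign(db, "HELLO")
```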
All operations are performed through graphical user interfaces (GUIs).
English and American Sign Language (ASL) are used for testing
the system.
Next stage:
Embed a sign language writing system into the system.
Invite sign language professionals to use the
system.
... ...
The ultimate goal:
To develop a Sign Language "Editor" that does for sign languages what Microsoft Word does for spoken languages.
Screenshot 1:
The upper part is the display area (showing the two hands and the whole body, as either static postures or animation sessions).
The middle part is for selecting common hand configurations.
The lower part is for operations on the virtual body (in this screenshot, the adjustment of body joint movements).
Screenshot 2: for creating, editing, saving, retrieving, and recording virtual signing sequences.
Screenshot 3: how to
retrieve sign language words, phrases, and sentences from the databases
(the display window shows the virtual signing animation process).