Brief Description of

Virtual Hand: A Human-Hand Simulation System for Human-Computer Interaction in Virtual Environments

 

In human-computer interaction (HCI), traditional control and navigation devices are often unsuitable for natural communication between the operator and the computer. The human hand is widely considered a promising vision-based HCI modality [1, 2, 3]. An initial step toward using this modality is to create a human-hand simulation and training system.

 

Our proposed system, Virtual Hand, will create an anatomically based hand model and implement hand calibration and constraints, drawing on mechanical and kinematic studies of the human hand. Virtual Hand will also support two simulation applications: generating ground-truth hand data for vision-based HCI experiments, and serving as an environment for testing (inverse) kinematics on a high-DOF (robotic) hand. The following paragraphs briefly describe related work and our goals for Virtual Hand.
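For concreteness, the sketch below shows one possible way the main pieces of such a testbed could be organized as Python interfaces. Every name in it (Calibration, HandModel, VirtualHand, and their methods and fields) is a hypothetical illustration of the structure described above, not the actual design of the proposed system.

```python
from dataclasses import dataclass, field
from typing import Dict, Tuple

# Hypothetical top-level interfaces for a hand-simulation testbed.
# Names, fields, and default values are illustrative assumptions,
# not the Virtual Hand design.

@dataclass
class Calibration:
    """Per-user measurements used to scale the generic hand model."""
    palm_width_cm: float = 8.5
    palm_thickness_cm: float = 3.0
    finger_lengths_cm: Dict[str, float] = field(
        default_factory=lambda: {"thumb": 12.0, "index": 9.0, "middle": 10.0,
                                 "ring": 9.5, "little": 7.5})

@dataclass
class HandModel:
    """Anatomically based articulated model: ~27 DOFs plus a skin mesh."""
    calibration: Calibration
    joint_angles: Dict[str, float] = field(default_factory=dict)  # DOF name -> radians

    def apply_constraints(self) -> None:
        """Clamp angles to static ranges and enforce dynamic couplings."""
        ...  # see the constraint sketch later in this document

class VirtualHand:
    """Testbed combining the model with the two planned applications."""
    def __init__(self, calibration: Calibration):
        self.model = HandModel(calibration)

    def grasp(self, target: Tuple[float, float, float]) -> Dict[str, float]:
        """Application 1: solve (inverse) kinematics for a pinch/grasp pose."""
        raise NotImplementedError

    def export_ground_truth(self, path: str) -> None:
        """Application 2: record joint angles and rendered views as ground truth."""
        raise NotImplementedError
```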

 

Prior work has addressed the mechanics and motion of the human hand [4, 5, 6], hand modeling in computer vision [7, 8, 9], hand modeling in graphics, where the hand is usually a small part of a full virtual human [10, 11, 12, 13, 14], and the study and implementation of hand constraints [15, 16, 17, 18]. Current models, however, have shortcomings: they render poorly, are too slow, or neglect or oversimplify finger movements. Most incorporate only the static hand constraints and very few of the dynamic ones, and the function of the thumb is nearly ignored.

 

We propose Virtual Hand to address these problems and combine these considerations into a real-time hand-simulation testbed. Some groundwork has already been done [19], including a life-size (dynamic) hand model with limited deformation, an implementation of static hand constraints, and a prototype GUI. The proposed system will build upon this limited prototype with the following enhancements: better hand modeling, hand calibration, additional constraint implementations, two simulation applications, and a more usable interface.

 

  • Hand modeling. The human hand is highly articulated and can roughly be treated as an actuator with 27 degrees of freedom (DOFs), producing complex movements. It is also organic: different motions create different surface deformations. These factors, together with hand constraints, are the central challenges of hand modeling.
  • Hand model calibration. To simulate hand movement properly, the model must be calibrated to the physical features of the user, such as the thickness and width of the palm and the lengths and circumferences of the fingers.
  • Hand constraint implementation. The human hand cannot make arbitrary gestures, so a hand-simulation system must restrict itself to natural hand movement. Two types of constraints need to be implemented: (1) static constraints imposed by hand anatomy (i.e., joint motion ranges), and (2) dynamic constraints (i.e., intra-finger and inter-finger dependencies); a small code sketch follows this list.
  • Simulation applications. Two basic applications will be implemented and combined into Virtual Hand: (1) pinch and grasp functions of the hand, to explore inverse kinematics of a high-DOF (robotic) hand, and (2) hand ground-truth data, to build a hand-gesture database and, from it, create natural hand gestures and animation.
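To make calibration, constraints, and the inverse-kinematics side of the applications concrete, the minimal sketch below treats one finger as a planar three-joint chain (MCP, PIP, and DIP flexion only). The joint ranges and segment lengths are illustrative assumptions; the coupling theta_DIP = (2/3) theta_PIP is the dynamic intra-finger constraint commonly cited in the hand-constraint literature (e.g., [18]); and the toy coordinate-descent search merely stands in for whatever inverse-kinematics method Virtual Hand will actually use.

```python
import math

# Minimal sketch: one finger as a planar kinematic chain (flexion only),
# with calibrated segment lengths, static joint-limit constraints, and the
# commonly cited dynamic coupling theta_DIP = (2/3) * theta_PIP.
# All numeric ranges and lengths are illustrative assumptions.

JOINT_LIMITS = {              # static constraints: allowed flexion (radians)
    "MCP": (0.0, math.radians(90)),
    "PIP": (0.0, math.radians(110)),
    "DIP": (0.0, math.radians(90)),
}

def clamp(theta, lo, hi):
    return max(lo, min(hi, theta))

def apply_constraints(angles):
    """Enforce static limits, then the intra-finger dynamic coupling."""
    out = {j: clamp(t, *JOINT_LIMITS[j]) for j, t in angles.items()}
    out["DIP"] = clamp(2.0 / 3.0 * out["PIP"], *JOINT_LIMITS["DIP"])
    return out

def fingertip(angles, lengths):
    """Forward kinematics of the planar chain: returns the fingertip (x, y)."""
    x = y = total = 0.0
    for joint, seg in zip(("MCP", "PIP", "DIP"), ("proximal", "middle", "distal")):
        total += angles[joint]
        x += lengths[seg] * math.cos(total)
        y += lengths[seg] * math.sin(total)
    return x, y

def solve_pose(target, lengths, iterations=200, step=0.02):
    """Toy coordinate-descent IK: nudge MCP and PIP toward the target;
    DIP follows automatically through the dynamic constraint."""
    angles = apply_constraints({"MCP": 0.1, "PIP": 0.1, "DIP": 0.0})
    for _ in range(iterations):
        for joint in ("MCP", "PIP"):
            best_err = math.dist(target, fingertip(angles, lengths))
            for delta in (step, -step):
                trial = apply_constraints({**angles, joint: angles[joint] + delta})
                err = math.dist(target, fingertip(trial, lengths))
                if err < best_err:
                    angles, best_err = trial, err
    return angles

if __name__ == "__main__":
    # "Calibration": measured segment lengths (cm) for one user's index finger.
    lengths = {"proximal": 4.5, "middle": 2.5, "distal": 2.0}
    pose = solve_pose(target=(6.0, 3.0), lengths=lengths)
    print({j: round(math.degrees(t), 1) for j, t in pose.items()},
          fingertip(pose, lengths))
```

In the full system the same pattern would extend to abduction/adduction DOFs, the thumb's additional freedom, and inter-finger couplings, and calibration would cover palm dimensions as well as finger segment lengths.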

References:

[1] David Joel Sturman. Whole-Hand Input. PhD thesis, Massachusetts Institute of Technology, February 1992.

[2] Ying Wu and Thomas S. Huang. Hand modeling, analysis, and recognition for vision-based human computer interaction. IEEE Signal Processing Magazine, 18(3):51–60, May 2001.

[3] Joseph J. LaViola Jr. Whole-Hand and Speech Input in Virtual Environments. PhD dissertation, Brown University.

[4] Paul W. Brand. Clinical Mechanics of the Hand. The C. V. Mosby Company, 1985.

[5] Edmund Y. S. Chao et al. Biomechanics of the Hand: A Basic Research Study. World Scientific Publishing Co. Pte. Ltd., Singapore, 1989.

[6] American Academy of Orthopaedic Surgeons. Joint Motion: Method of Measuring and Recording. Churchill Livingstone, New York, 1988.

[7] Horace H. S. Ip et al. Animation of hand motion from target posture images using an anatomy-based hierarchical model. Computers & Graphics, 25:121–133, 2001.

[8] James J. Kuch and Thomas S. Huang. Vision-based hand modeling and tracking for virtual teleconferencing and telecollaboration. In Proceedings of the Fifth International Conference on Computer Vision, pages 666–671, June 1995.

[9] John McDonald et al. An improved articulated model of the human hand. The Visual Computer, 17(3):158–166, May 2001.

[10] Amaury Aubel and Daniel Thalmann. Realistic deformation of human body shapes. Available online [cited June 2003]: http://vrlab.epfl.ch/Publications/publications_index.html.

[11] Norman Badler et al. Representing and parameterizing agent behaviors. In Proc. Computer Animation, IEEE Computer Society, pages 133–143, Geneva, Switzerland, 2002.

[12] Paul G. Kry et al. EigenSkin: real time large deformation character skinning in hardware. In ACM SIGGRAPH Symposium on Computer Animation, 21(3):153–160, 2002.

[13] Brett Allen et al. Articulated body deformation from range scan data. ACM Transactions on Graphics, 21(3):612–619, July 2002.

[14] Alex Mohr and Michael Gleicher. Building efficient, accurate character skins from examples. In ACM SIGGRAPH, 2003.

[15] Yoshihiro Yasumuro et al. Three-dimensional modeling of the human hand with motion constraints. Image and Vision Computing, 17:149–156, 1999.

[16] Chin-Seng Chua et al. Model-based 3D hand posture estimation from a single 2D image. Image and Vision Computing, 20(3):191–201, March 2002.

[17] Jintae Lee and Tosiyasu L. Kunii. Model-based analysis of hand posture. IEEE Computer Graphics and Applications, pages 77–86, September 1995.

[18] John Lin et al. Modeling the constraints of human hand motion. In Proc. 5th Annual Federated Laboratory Symposium, pages 105–110, Maryland, April 2001.

[19] Development of a Nationally Competitive Program in Computer Vision Technologies for Effective Human-Computer Interaction in Virtual Environments. http://www.cs.unr.edu/~aerol/projecthome/, http://www.cs.unr.edu/~b_yi/vHand/