Finger Tracking In Human And Computer
Interaction

Abstract--Human Computer Interaction (HCI) is a field in which the developer makes a system user friendly, so that the user can interact with the system without using the computer's conventional peripheral devices. Research on HCI has long been restricted to techniques based on the use of a monitor, keyboard, and mouse. Recently these paradigms have changed: researchers have established mechanisms to interact with the computer system using computer vision, and this interaction is better than interaction using the normal keyboard and mouse. Techniques such as vision, sound, and speech recognition allow better interaction between machine and human. The use of hand gestures provides an attractive alternative to interface devices. Finger tracking is the use of the bare hand to operate a computer in order to make human-computer interaction faster, better, and easier. Bare hand communication means that no device or wires are connected to the user; he/she controls the computer directly with the movements of his/her hand. There are various approaches used for Human Computer Interaction. This paper presents the most innovative mechanisms of finger tracking used to interact with the computer system using computer vision [10].


Keywords--Finger Tracking, Human Computer Interface, HCI,
Computer vision based recognition, Bare hand communication,
Hand gestures, Non conventional interaction


I. INTRODUCTION

Nowadays computers have become a necessity of everyday life. HCI involves planning and designing the interaction between users and computers. An important advantage of computer vision is the freedom it offers: the user can interact with the computer system without wires or attached devices. Vision based finger tracking is an active area of research in Human Computer Interaction (HCI), as direct use of hands and fingers is the natural means for humans to communicate with each other [7].
Today there are many different kinds of devices available for human-computer interaction. Some examples are the keyboard, mouse, track-ball, track-pad, joystick, and electronic pens. More examples include cyber-gloves, 3D mice (e.g. Labtec's Spaceball) and magnetic tracking devices (e.g. Polhemus' Isotrack). But despite the variety of these new devices, human-computer interaction still differs in many ways from human-to-human interaction. Natural interaction between humans does not need devices, because humans have the ability to sense their environment with their eyes and ears. Similarly, computers should be able to imitate those abilities of humans with cameras and microphones.

Researchers have developed various techniques to track the movements of the hand and fingers through a web cam in order to establish an interaction mechanism between the user and the computer. The method of tracking the movement of fingers in front of a web cam is called finger tracking. A coloured marker, motion detection, and a camera can be used to control the mouse movement and implement finger tracking.
Finger pointing systems mainly aim to replace pointing and clicking devices such as the mouse with the bare hand. These applications require robust localization of the fingertip and recognition of a limited number of hand postures for clicking commands. Finger tracking systems are therefore mainly considered a specialized kind of hand posture / gesture recognition system. For proper gesture recognition, the processes that can be performed on each frame of video include segmentation, background subtraction, noise removal, and thresholding [10].
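As a rough illustration of this per-frame pipeline (not taken from any of the surveyed systems), the following sketch assumes OpenCV in Python and a webcam at index 0, and applies background subtraction, noise removal, and thresholding to each frame; all parameter values are placeholders that would need tuning:

    import cv2

    # Minimal per-frame pipeline sketch: background subtraction, noise removal, thresholding.
    cap = cv2.VideoCapture(0)
    backsub = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        fgmask = backsub.apply(frame)                                 # background subtraction
        fgmask = cv2.morphologyEx(fgmask, cv2.MORPH_OPEN, kernel)     # noise removal
        _, binary = cv2.threshold(fgmask, 127, 255, cv2.THRESH_BINARY)  # thresholding
        cv2.imshow("segmented hand", binary)
        if cv2.waitKey(1) & 0xFF == 27:                               # Esc to quit
            break

    cap.release()
    cv2.destroyAllWindows()

The binary mask produced this way is the usual starting point for the fingertip localization and posture recognition steps discussed below.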

II. COMPARISON OF VARIOUS APPROACHES USED FOR HCI

There are many approaches used for human computer
interaction through finger tracking and computer vision. They
are as follows:
A. HCI Without Using an Interface
In this approach no additional material is used to interact with the computer [13].
1) Plain finger tracking with a single web cam:
This approach uses the bare hand and tracks the movement of the fingers in front of the web camera based on human skin colour. It is low cost, but it needs controlled lighting and background conditions. When the lighting or background changes, the result may also change; the accuracy is high only when the background and lighting conditions are constant [14].
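A minimal sketch of such skin-colour based segmentation is given below (not from the paper; it assumes OpenCV in Python, and the HSV range is only a rough, commonly used approximation that would have to be tuned to the lighting and skin tones at hand):

    import cv2
    import numpy as np

    def skin_mask(frame_bgr):
        """Return a binary mask of skin-coloured pixels (rough HSV range, illustrative)."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        lower = np.array([0, 40, 60], dtype=np.uint8)     # assumed lower HSV bound
        upper = np.array([25, 255, 255], dtype=np.uint8)  # assumed upper HSV bound
        mask = cv2.inRange(hsv, lower, upper)
        mask = cv2.medianBlur(mask, 5)                    # remove speckle noise
        return mask

The sensitivity of this mask to lighting and background changes is exactly why the accuracy of the bare-hand approach drops outside controlled conditions.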

B. HCI Using an Interface
In this approach some materials, such as data gloves and markers, are used to interact with the computer [18].
1) Pasting a Marker on the Finger:
We paste a colour marker on the finger and then track the movement of the finger. In this approach, we have to find the particular colour of the marker in each frame; to find the colour, we perform operations such as thresholding. This approach is better than plain finger tracking because it takes less time. If the background colour is the same as the marker colour, the finger movement cannot be detected, so a static background should be used to detect the finger movement easily and quickly [18].
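For illustration only, a hypothetical marker detector along these lines could threshold each frame on the pre-defined marker colour and take the centroid of the largest matching blob as the fingertip position (OpenCV in Python assumed; the colour bounds are placeholders supplied by the caller):

    import cv2

    def track_marker(frame_bgr, lower_hsv, upper_hsv):
        """Return the (x, y) centre of the largest region matching the marker colour, or None."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, lower_hsv, upper_hsv)       # thresholding on the marker colour
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        blob = max(contours, key=cv2.contourArea)           # largest matching region
        m = cv2.moments(blob)
        if m["m00"] == 0:
            return None
        return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])  # centroid of the marker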
2) Using Gloves with Markers and a Simple Web Cam:
In this approach, simple cloth gloves with specific colour markers pasted on the fingers are used. The markers uniquely distinguish the fingers based on the colour of each marker. Here, too, a static background increases the accuracy. Because the marker colours are pre-defined for each frame, the processing time is reduced and the result is more precise. This approach is also used for cursor movement, virtual mouse applications, etc. [18].
3) Using Gloves with Retro-Reflective Markers and an Infrared Web Cam:
In this approach, gloves with retro-reflective markers are used together with an infrared web cam. The infrared web cam avoids lighting effects, so this approach can be used under different lighting conditions. The infrared web cam can easily detect the retro-reflective markers; as a result, the number of frames that can be processed per second increases. This approach is used in virtual reality [18].
4) Using Special Hardware:
To track the movement of the fingers, this approach uses special hardware such as charge-coupled device cameras, projectors and gloves. It is used in 3D virtual environments. Because the approach relies on specialized hardware, it is costly and less commonly used for designing applications [18].
III. APPLICATIONS
Finger tracking via computer vision is mainly used to design new applications which are free from conventional interaction devices such as the keyboard and mouse. The applications include the following:


Robot control.
Finger painting.
Piano Application.
Free Air Finger Tracking.
Remote Control.
Interacting with a virtual environment.
Augmented Reality.
Finger Mouse.
Free Hand Present.
Brain Storm.

A. Robot Control
The robot control application is based on gesture recognition: it counts the number of fingers shown in front of the camera. The robot follows the commands given through the fingers and can move forward, backward, left or right according to those commands [1], [17].

Fig. 1. Robot control
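As an illustration only (the paper does not give an implementation), the finger count produced by such a recognizer could be mapped to motion commands roughly as follows; the command names and the count-to-command assignment are made up for this sketch:

    def finger_count_to_command(count):
        """Map a detected finger count to a robot motion command (illustrative mapping)."""
        commands = {
            1: "FORWARD",
            2: "BACKWARD",
            3: "LEFT",
            4: "RIGHT",
        }
        return commands.get(count, "STOP")   # any other count stops the robot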
B. Finger Painting
In finger painting, we track the location of the finger and use that location to create a painting. A digital drawing is generated according to the movements of the fingers in front of a webcam. This is useful for painters, who can create their paintings directly in digital form [2].
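A minimal sketch of the painting step, assuming the fingertip position for each frame is already provided by a tracker such as the marker detector shown earlier (canvas size and colour are arbitrary):

    import cv2
    import numpy as np

    canvas = np.zeros((480, 640, 3), dtype=np.uint8)  # blank digital canvas
    previous = None

    def paint(fingertip, colour=(0, 255, 0)):
        """Draw a stroke from the previous fingertip position to the current one."""
        global previous
        if fingertip is not None and previous is not None:
            cv2.line(canvas, previous, fingertip, colour, thickness=3)
        previous = fingertip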

Fig. 2. and Fig. 3.: Finger Painting

C. Piano Application
In this application, the system tracks the finger's position and the switch it points at. With the help of the finger pointing, we can create the sound of the piano [19].

Fig. 4. and Fig. 5.: Piano Application using Finger Tracking

D. Free Air Finger Tracking
Free air finger tracking basically comprises zoom-in, zoom-out and clicking events. With finger tracking we can drag pictures and also change their size. We can click on and open data with the help of free air finger tracking [20].

Fig. 6. and Fig. 7.: Free Air Finger Tracking

E. Remote Control
Remote control is an application to control a TV set. By moving a finger, the user can select channels. We can also use this remote control application for a CD player or any other remotely controlled device [3].

Fig. 8. Remote Control

F. Interacting with a Virtual Environment
In a virtual environment, we interact with virtual objects. The user feels that he/she is actually interacting with real objects, while in fact interacting with digital objects inside the computer system. Nowadays, with the help of virtual environments, we create virtual games, driving simulations, animation movies, etc. [19].

Fig. 9. and Fig. 10.: Virtual Reality based Game

G. Augmented Reality
Augmented Reality has been demonstrated widely in the real world and is used in many different applications such as games, navigation and reference. Augmented reality applications have become increasingly interactive, building on the successful demonstration of direct free-hand gestures [15].

Fig. 11.: Augmented Reality based scaling

Fig. 12.: Augmented Reality on the 4x4 grid board used to track the hand

H. Finger Mouse

The Finger Mouse system makes it possible to control a standard mouse pointer with the bare hand. If the user moves an outstretched forefinger in front of the camera, the mouse pointer follows the finger in real time. Keeping the finger in the same position for one second generates a single mouse click. An outstretched thumb invokes the double-click command; the mouse wheel is activated by stretching out all five fingers (see Fig. 13.). The application mainly demonstrates the capabilities of the tracking mechanism. The mouse pointer is a simple and well-known feedback system that permits us to show the robustness and responsiveness of the finger tracker. It is also interesting to compare the finger-based mouse-pointer control with the standard mouse as a reference; this way the usability of the system can easily be tested [10].
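A rough sketch of this pointer and dwell-click logic is given below, assuming the tracker already reports the fingertip position in screen coordinates together with flags for the thumb and five-finger gestures each frame; pyautogui is used here only as a convenient way to drive the OS pointer, and the thresholds are illustrative:

    import time
    import pyautogui

    DWELL_SECONDS = 1.0   # keep the finger still this long to click
    JITTER_PX = 10        # movement below this still counts as "still"

    still_since = None
    last_pos = None

    def update(fingertip, thumb_out=False, all_fingers_out=False):
        """Drive the OS pointer from one tracker observation; fingertip is (x, y) in pixels."""
        global still_since, last_pos
        if fingertip is None:
            still_since = None
            return
        x, y = fingertip
        pyautogui.moveTo(x, y)            # the pointer follows the finger in real time
        if all_fingers_out:
            pyautogui.scroll(20)          # five outstretched fingers activate the wheel
            return
        if thumb_out:
            pyautogui.doubleClick()       # an outstretched thumb double-clicks
            return
        if last_pos is None or abs(x - last_pos[0]) > JITTER_PX or abs(y - last_pos[1]) > JITTER_PX:
            still_since, last_pos = time.time(), (x, y)
        elif still_since is not None and time.time() - still_since >= DWELL_SECONDS:
            pyautogui.click()             # dwelling for one second gives a single click
            still_since = None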


Fig. 13.: The Finger Mouse on a projected screen:
a. Moving the mouse pointer
b. Double-clicking with an outstretched thumb
c. Scrolling up and down with all five fingers outstretched

There are two scenarios where tasks might be better solved
with the Finger Mouse than with a standard mouse:
1) Projected Screens:
Similar to the popular touch-screens, projected screens could
become “touchable” with the Finger Mouse. Several persons
could work simultaneously on one surface, and logical objects, such as buttons and sliders, could be manipulated directly without the need for a physical object as an intermediary [10].
2) Navigation:
For standard workplaces it is hard to beat the point-and-click feature of the mouse. But for other mouse functions, such as navigating a document, the Finger Mouse could offer additional usability. It is easy to switch between the different modes by stretching out fingers, and the hand movement is similar to the one used to move papers around on a table (a larger possible magnitude than with a standard mouse). For projected surfaces the Finger Mouse is easier to use because the fingertip and mouse pointer are always in the same place. Fig. 14. shows such a setup. A user can "paint" directly onto the wall with his/her finger by controlling the Windows Paint application with the Finger Mouse [10].

Fig. 14.: Controlling Windows Paint with the bare finger.
I. Free Hand Present

The second system is built to demonstrate how simple hand
gestures can be used to control an application. A typical
scenario where the user needs to control the computer from a
certain distance is during a presentation. Several projector
manufacturers have recognized this need and built remote
controls for projectors that can also be used to control
applications such as Microsoft PowerPoint [10].

Our goal is to build a system that can do without remote
controls. The user's hand will become the only necessary
controlling device. The interaction between human and
computer during a presentation is focused on navigating
between a set of slides. The most common command is “Next
Slide”. From time to time it is necessary to go back one slide
or to jump to a certain slide within the presentation. The Free
Hand Present system uses simple hand gestures for the three
described cases. Two fingers shown to the camera invoke the
“Next Slide” command; three fingers mean “Previous Slide”;
and a hand with all five fingers stretched out opens a window
that makes it possible to directly choose an arbitrary slide with
the fingers[10].
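Sketching only the command mapping (the gesture recognition itself is out of scope here): assuming the per-frame finger count is already available, the three gestures could be turned into presentation key presses roughly as follows; pyautogui is assumed, and the slide-chooser window is left as a stub:

    import pyautogui

    def handle_gesture(finger_count):
        """Map the three Free Hand Present gestures to presentation commands (illustrative)."""
        if finger_count == 2:
            pyautogui.press("right")      # two fingers: next slide
        elif finger_count == 3:
            pyautogui.press("left")       # three fingers: previous slide
        elif finger_count == 5:
            open_slide_chooser()          # five fingers: choose an arbitrary slide

    def open_slide_chooser():
        # Placeholder for the slide-selection window described in the paper.
        pass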
J. Brain Storm

The Brain Storm system is built for a brainstorming scenario. During the idea generation phase, users type their thoughts on a wireless keyboard and attach colours to their input. The computer automatically distributes the user input on the screen, which is projected onto the wall. The resulting picture on the wall resembles the old paper-pinning technique but has the big advantage that it can be saved at any time [10].
For the second phase of the process, the finger-tracking system comes into action. To rearrange the items on the wall, the participants just walk up to the wall and move the text lines around with a finger. Fig. 15.b-d shows the arranging process. First an item is selected by placing a finger next to it for a second. The user is notified of the selection with a sound and a colour change. Selected items can be moved freely on the screen. To let go of an item, the user has to stretch out the outer fingers as shown in Fig. 15.d [10].
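A compact sketch of this selection logic, assuming a per-frame fingertip position, a release-gesture flag, and a helper that finds the item nearest to the finger; all of these names and the item interface are made up for the sketch:

    import time

    DWELL_SECONDS = 1.0   # the finger must stay next to an item this long to select it

    selected = None
    dwell_item = None
    dwell_since = None

    def update(fingertip, release_gesture, nearest_item):
        """One tracker observation: select by dwelling, move while selected, release on gesture."""
        global selected, dwell_item, dwell_since
        if selected is not None:
            selected.move_to(fingertip)            # a selected item follows the finger
            if release_gesture:                    # outer fingers stretched out: let go
                selected = None
            return
        if nearest_item is not dwell_item:
            dwell_item, dwell_since = nearest_item, time.time()
        elif dwell_item is not None and time.time() - dwell_since >= DWELL_SECONDS:
            selected = dwell_item                  # notify with sound / colour change here
            dwell_item = None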

Fig. 15.: The Brain Storm system:
a) Idea generation phase with projected screen and wireless keyboard
b) Selecting an item on the wall
c) Moving the item
d) Unselecting the item

IV. CONCLUSION
Human Computer Interaction (HCI) through computer vision is a more convenient way to interact with a device. A number of applications for finger tracking have been developed by researchers, who always try to minimize the number of peripheral devices needed to interact with the computer. This makes it especially useful for physically disabled people. We expect that researchers will produce extraordinary applications as an outcome of this effort.
REFERENCES
[1] Asanterabi Malima, Erol Özgür, and Müjdat Çetin, "A Fast Algorithm for Vision-Based Hand Gesture Recognition for Robot Control", Faculty of Engineering and Natural Sciences, Sabancı University, Tuzla, İstanbul, Turkey, 2006.
[2] Shardul Agravat and Gopal Pandey, "Finger Painting using Computer Vision", International Research Journal of Computer Science Engineering and Applications, ISSN 2319-8672, Vol. 2, Issue 2, February 2013.
[3] Dejan Chandra Gope, "Hand Gesture Interaction with Human-Computer", Global Journal of Computer Science and Technology, Volume 11, Issue 23, Version 1.0, December 2011.
[4] Salil Batra and Chandra Prakash, "Commanding Computer Using Gesture Based Patterns", International Journal of Engineering and Advanced Technology (IJEAT), ISSN 2249-8958, Volume 1, Issue 5, June 2012.
[5] James L. Crowley, François Bérard, and Joëlle Coutaz, "Finger Tracking as an Input Device for Augmented Reality", IMAG-LIFIA and IMAG-LGI, France, 1992.
[6] Wendy H. Chun and Tobias Höllerer, "Real-time Hand Interaction for Augmented Reality on Mobile Phones", IUI '13, March 19–22, 2013.
[7] Klaus Dorfmüller-Ulhaas and Dieter Schmalstieg, "Finger Tracking for Interaction in Augmented Environments", Proceedings of the IEEE and ACM International Symposium on Augmented Reality, IEEE, 2001.
[8] Vivek Veeriah and P. L. Swaminathan, "Robust Hand Gesture Recognition Algorithm for Simple Mouse Control", International Journal of Computer and Communication Engineering, Vol. 2, No. 2, March 2013.
[9] "Real Time Finger Tracking for Interaction", 5th International Symposium on Image and Signal Processing and Analysis (ISPA 2007), 27-29 September 2007.
[10] "Finger Tracking in Real Time Human Computer Interaction".
[11] Richard Harper, Tom Rodden, Yvonne Rogers, and Abigail Sellen, "Being Human: Human-Computer Interaction in the Year 2020", Microsoft Research Ltd, 2008.
[12] Siddharth Swarup Rautaray and Anupam Agrawal, "A Real Time Hand Tracking System for Interactive Applications", International Journal of Computer Applications (0975-8887), Volume 18, No. 6, March 2011.
[13] Vladimir I. Pavlovic, Rajeev Sharma, and Thomas S. Huang, "Visual Interpretation of Hand Gestures for Human-Computer Interaction: A Review", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 19, No. 7, July 1997.
[14] Global Journal of Computer Science and Technology, Global Journals Inc. (USA), Volume 11, Issue 23, Version 1.0, December 2011.
[15] James L. Crowley, François Bérard, and Joëlle Coutaz, "Finger Tracking as an Input Device for Augmented Reality", Proceedings of the International Workshop on Face and Gesture Recognition, Zurich, Switzerland, June 1995.
[16] Branislav Kisačanin, Vladimir Pavlović, and Thomas S. Huang, "Program Chairs' Introduction to the 2004 IEEE Workshop on Real-Time Vision for Human-Computer Interaction at the 2004 IEEE CVPR Conference, Washington, D.C.", Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPRW'04), IEEE, 2004.
[17] Asanterabi Malima, Erol Özgür, and Müjdat Çetin, "A Fast Algorithm for Vision-Based Hand Gesture Recognition for Robot Control", Faculty of Engineering and Natural Sciences, Sabancı University, Tuzla, İstanbul, Turkey.
[18] "Technology Reforming 11 Ideas", Global Journal of Computer Science and Technology, Volume 11 (Ver. 1.0).

ONLINE WEB SITES
[19] http://www.openni.org/files/finger-precise-tracking/
[20] http://www.youtube.com/watch?v=QPy6uy3aNGg
