Human Computer Interaction Using Hand Data Glove and Wii Remote
Prof. Humera N. Syed
Abstract- A real-time human-computer interaction system based on a hand data glove for gesture recognition is proposed. The proposed system captures hand gestures through a hand data glove equipped with sensors that sense the movement of the hand and pass those movements to the computer, via the Wii Remote's IR camera, in the form of continuous signals. Using the Wii Remote we can calibrate the working area and project the output on almost any surface by using a projector. Gestures are classified as clicking, double clicking, and dragging. On recognizing these gestures, relevant actions are taken, such as drawing in a paint application, sending messages and dialing calls through a GSM modem, and interacting with a GUI application. The results show that the glove is better for interaction than a normal static keyboard and mouse, as the interaction process is more accurate and dynamic in a natural environment.
Keywords- Human Computer Interaction, Data Glove, Wii Remote, IR camera, Virtual Reality, Hand Gesture Recognition.
I. INTRODUCTION
Virtual Reality [1] has made its way into increasing the power of computers, making them realistic and interactive. Virtual Reality has replaced the traditional use of the keyboard, mouse and joystick in the HCI environment. The data glove is one such sensing device used in Virtual Reality for hand gesture recognition [2] in HCI, and it follows a sensor-based approach. A hand data glove [3][4] is an electronic device equipped with sensors that sense the movements of the fingers and pass those movements to the computer continuously in the form of signals. Hand data gloves are used in applications including virtual reality [5], gaming [6], robotics [7], character recognition and verification, and shopping applications; the most important use is in medical surgery, where it is employed in practice because it is highly accurate. Here, the hand data glove is used to provide a user interface for painting, call dialing, message sending, and also for opening any icon on the computer.
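The paper does not detail how the GSM modem is driven for call dialing and message sending; the sketch below is only an illustration, assuming a serial-attached modem that accepts the standard Hayes AT command set (ATD for dialing, AT+CMGS for SMS). The port name, baud rate, phone number and timings are placeholders.

```python
# Illustrative sketch only: drives a serial GSM modem with standard AT
# commands. Port name, baud rate, and phone number are placeholders and
# the exact command behaviour may vary between modems.
import time
import serial  # pyserial

def dial_call(port, number):
    with serial.Serial(port, 9600, timeout=1) as modem:
        modem.write(b"ATD" + number.encode() + b";\r")  # start a voice call
        time.sleep(10)                                  # let it ring
        modem.write(b"ATH\r")                           # hang up

def send_sms(port, number, text):
    with serial.Serial(port, 9600, timeout=1) as modem:
        modem.write(b"AT+CMGF=1\r")                     # switch to text mode
        time.sleep(0.5)
        modem.write(b'AT+CMGS="' + number.encode() + b'"\r')
        time.sleep(0.5)
        modem.write(text.encode() + bytes([26]))        # Ctrl+Z ends the message

# Example (hypothetical port and number):
# dial_call("COM3", "+911234567890")
# send_sms("COM3", "+911234567890", "Hello from the data glove")
```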
This paper presents a special interactive system which consists of a Bluetooth-enabled computer running calibration utility software, a projector connected to the computer, a GSM modem connected to the computer for call dialing and message sending, a whiteboard on which the projection can be done, a hand data glove equipped with sensors, and a Nintendo Wii Remote for capturing the signals sent by the hand data glove. Some recent works include: i) Tarchanidis et al. [8] developed a data glove based on force sensors attached to each finger. This data glove was used to detect tactile sensation, but it had limitations in precise recognition. ii) Johnny Chung Lee [9] described a classical IR pen or stylus used together with the Wii Remote's IR camera to build a low-cost interactive whiteboard.
II. PROPOSED SYSTEM
Figure 5. IR LED
III. METHODOLOGY
means of calibration. Calibration [15] is also essential to make the glove measurements unaffected by differences in users' hands, finger length and thickness. This calibration is performed by the calibration utility software, where the user has to flex their hand four times.
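The paper does not specify what mapping is computed from the four calibration points. A common choice in Wii Remote whiteboard-style setups is a planar homography between IR-camera coordinates and screen coordinates, and the sketch below shows one way such a four-point calibration could be computed with NumPy; all function names are illustrative, not taken from the authors' software.

```python
# Sketch of a four-point calibration as a planar homography (an assumption,
# not the authors' documented method). camera_pts are the four IR-camera
# readings, screen_pts the four known screen positions clicked during calibration.
import numpy as np

def fit_homography(camera_pts, screen_pts):
    """Fit a 3x3 homography H so that screen ~ H * camera (4 point pairs)."""
    A, b = [], []
    for (x, y), (u, v) in zip(camera_pts, screen_pts):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.extend([u, v])
    h = np.linalg.solve(np.array(A, dtype=float), np.array(b, dtype=float))
    return np.append(h, 1.0).reshape(3, 3)

def camera_to_screen(H, x, y):
    """Map one IR-camera coordinate to a screen coordinate."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w
```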
Whenever a click event is performed with the hand data glove, the IR sensor [16] transmits a signal to the Wii Remote's IR camera. The IR camera receives the signal and, wherever the click event is performed, changes the color of that particular pixel to black while all the remaining pixels are set to grey. The IR camera also finds the X and Y coordinates of that particular pixel. The Nintendo Wii Remote's IR camera [17][18] is capable of providing accurate positional information. The IR camera is connected to the laptop via Bluetooth and sends the coordinate data to the Wii library or Wii drivers. The Wii library or Wii drivers return the calculated X and Y coordinates to the calibration utility software, which is connected to the Windows operating system. The coordinate data is then sent to the OS. The OS hence controls the mouse cursor, whose effects are seen in the form of output shown through the use of a projector.
Figure 9. User clicking on the first, second, third and fourth calibration
points respectively and performing freehand drawing.
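A minimal sketch of the coordinate pipeline described above, assuming Windows: an IR point reported by the Wii library or driver is mapped through the calibration and handed to the OS cursor via the Win32 SetCursorPos call. Here read_ir_point() is only a placeholder for whatever the Wii library actually delivers over Bluetooth, and camera_to_screen() is the calibration mapping sketched earlier.

```python
# Sketch of the IR-to-cursor loop (assumes Windows for the cursor call).
# read_ir_point() stands in for the Wii library/driver output; it should
# return an (x, y) tuple in IR-camera coordinates, or None if no blob is seen.
import ctypes

def move_cursor(x, y):
    # Win32 API call that positions the mouse cursor in screen pixels.
    ctypes.windll.user32.SetCursorPos(int(x), int(y))

def run_pointer_loop(H, read_ir_point):
    while True:
        point = read_ir_point()
        if point is None:
            continue                      # no IR signal this frame
        sx, sy = camera_to_screen(H, *point)   # calibration mapping (see above)
        move_cursor(sx, sy)
```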
IV. RESULTS
V. CONCLUSION AND FUTURE WORK
The results show that interaction using the hand data glove is more accurate and dynamic than the conventional static keyboard and mouse. High-dimensional applications could also be tried on this system, and all the main controls required for running such applications can be provided through the hand data glove. Using the hand data glove, a combination of two or more gestures can form a new complex gesture for a complex task.
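As an illustration of how primitive gestures could be composed into complex ones, the sketch below maps short sequences of the recognized primitives (click, double click, drag) to higher-level actions through a lookup table; the gesture names and the actions they trigger are hypothetical, not part of the system described above.

```python
# Hypothetical sketch: composing primitive glove gestures into complex ones.
# The primitive names and the composite actions are illustrative only.
COMPOSITE_GESTURES = {
    ("click", "drag"): "select_and_move_icon",
    ("double_click", "drag"): "draw_shape",
    ("click", "click", "drag"): "drag_and_drop_group",
}

def resolve_gesture(recent_primitives):
    """Return the composite action matching the most recent primitives, if any."""
    for length in (3, 2):                 # prefer longer matches first
        key = tuple(recent_primitives[-length:])
        if key in COMPOSITE_GESTURES:
            return COMPOSITE_GESTURES[key]
    return None

# Example: resolve_gesture(["click", "double_click", "drag"]) -> "draw_shape"
```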
REFERENCES
[1] Manolya Kavakli and Dilshan Jayarathna, "Virtual Hand: An Interface for Interactive Sketching in Virtual Reality," CIMCA-IAWTIC'05, 2005, IEEE.
[2]
[3] Hanan Teleb and George Chang, "Data Glove Integration with 3D Virtual Environments," International Conference on Systems and Informatics, 2012, IEEE, 107-112.
[4] Chin-Shyurng Fahn and Herman Sun, "Development of a Data Glove With Reducing Sensors Based on Magnetic Induction," IEEE Transactions on Industrial Electronics, vol. 52, no. 2, April 2005, 585-594.
[5] Tuukka M. Takala, Roberto Pugliese, Päivi Rauhamaa and Tapio Takala, "Reality-based User Interface System (RUIS)," IEEE Symposium on 3D User Interfaces 2011, IEEE, 141-142.
[6] T. Shiratori and J. Hodgins, "Accelerometer-based user interfaces for the control of a physically simulated character," ACM Transactions on Graphics, vol. 27, no. 5, 2008.
[7] C. Lee and Y. Xu, "Interactive Learning of Gestures for Human/Robot Interfaces," IEEE International Conference on Robotics and Automation, pp. 2982-2987, 1996.
[8] K. N. Tarchanidis and J. N. Lygouras, "Data Glove with a Force Sensor," IEEE Transactions on Instrumentation and Measurement, vol. 52, no. 3, June 2003, pp. 984-989.
[9] Johnny Chung Lee, "Hacking the Nintendo Wii Remote," Pervasive Computing, vol. 7, no. 3, 2008, pp. 39-45.
[10] D. Balakrishna, P. V. Sailaja, R. V. V. Prasad Rao, and Bipin Indurkhya, "A Novel Human Robot Interaction using the Wiimote," Proceedings of the 2010 IEEE International Conference on Robotics and Biomimetics, December 14-18, 2010, Tianjin, China, 645-650.
[11] Sven Olufs and Markus Vincze, "A Simple Inexpensive Interface for Robots using the Nintendo Wii Controller," 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, October 11-15, 2009, 473-479.
[12] Riyeth P. Tanyag, Marc Jordan G. Angco and Rowel O. Atienza, "InCtrl HD: Intuitive User Interface Control using Wii Remote for High Definition Videoconferencing," 2009 International Conference on Information and Multimedia Technology, 99-103.
[13] Luigi Gallo and Mario Ciampi, "Wii Remote-enhanced Hand-Computer Interaction for 3D Medical Image Analysis," 2009, IEEE.
[14] Leonidas Deligiannidis and John Larkin, "Navigating Inexpensively and Wirelessly," May 25-27, 2008, IEEE.
[15] Sung-Yeol Kim, Woon Cho, Andreas Koschan and Mongi A. Abidi, "Depth Data Calibration and Enhancement of Time-of-flight Video-plus-Depth Camera," 2011, IEEE.
[16] Ryan Connaughton and Matthew Modlin, "A Modular and Extendable Robotics Platform for Education," 39th ASEE/IEEE Frontiers in Education Conference, 2009, IEEE, T2G-1 - T2G-4.
[17] Deliang Zhu, Zhiquan Feng, Bo Yang, Yan Jiang and Tiantian Yang, "The Design and Implementation of 3D Hand-based Human-Computer Interaction Platform," 2010 International Conference on Computer Application and System Modeling, IEEE, V2-485 - V2-489.
[18] Dan Ionescu, "Gesture Control and the New and Intelligent Man-Machine Interface," 6th IEEE International Symposium on Applied Computational Intelligence and Informatics, May 19-21, 2011, IEEE.