Hand Gesture Based Virtual Mouse
https://doi.org/10.22214/ijraset.2023.51731
International Journal for Research in Applied Science & Engineering Technology (IJRASET)
ISSN: 2321-9653; IC Value: 45.98; SJ Impact Factor: 7.538
Volume 11 Issue V May 2023- Available at www.ijraset.com
Abstract: Ever since the invention of the PC, new ways of creating an interaction cycle between humans and computers have been under development. The mouse was a truly innovative piece of HCI (Human-Computer Interaction). Such gadgets remain in use even as wireless and Bluetooth mouse technology continues to develop, but a Bluetooth mouse still requires a dongle and a battery for connectivity, and these additional components make it more difficult to use. The suggested mouse framework goes beyond this limitation. This study suggests an HCI-based virtual mouse framework that makes use of computer vision and hand signals. Gestures are captured with a built-in camera or webcam and interpreted through color segmentation and a detection method, so the user will be able to exercise control over basic mouse functions without any physical device.
Keywords: OpenCV, NumPy, PyAutoGUI, Image Processing, Virtual Mouse
I. INTRODUCTION
The size of devices is decreasing as a result of these improvements. Some devices have become wireless, while others have fallen out of use. The paradigm suggested in this paper could eventually lead to the dormancy of some HCI (Human-Computer Interaction) devices as well.
The goal is to create a virtual mouse that utilises gesture recognition: the idea is to use a simple camera, as opposed to a traditional or standard mouse device, to control mouse cursor capabilities. All that is required for a virtual mouse to function as a conduit between the user and the system is a camera. It permits mouse control and facilitates human interaction with a physical machine devoid of any mechanical or physical mechanism, which is exactly what such a gesture detection system makes conceivable.
This framework utilises the OpenCV package, which is built for computer vision, and is created in the Python programming language. This system may replace both the conventional mouse and the remote machine controller. The only hindrance is the lighting situation: because many computers are used in low-light environments, the framework cannot yet fully replace the conventional mouse.
V. WORKING METHODOLOGY
Sensors are a basic requirement of any such framework, so a sensor (a webcam) is used here to connect with the environment. Its function is to capture the live footage in which the user makes a hand signal. This information is passed to OpenCV, which turns the live video into individual picture frames; the term "cutting of video" truly refers to this action. These frames are then processed for color recognition, and only the pixels whose colors are specified in the code are maintained; the framework gets rid of everything left over. The speed at which the output images are displayed corresponds to the speed of the captured video.
VII. RESULTS
VIII. CONCLUSION
A real-time camera is used to guide the mouse cursor and carry out its tasks with the help of this virtual gesture-controlled mouse. The system supports mouse movement, symbol selection, and most common actions such as left-clicking, right-clicking, double-clicking, and scrolling. To track mouse movements and identify symbols, it relies on image comparison and motion detection. Examining the outcomes, it is generally believed that, given adequate lighting and a decent camera, the system can operate in any location; under those conditions the structure performs at its best. In the future, more functions will be integrated using the palm and other fingers, including interaction across multiple windows, window expansion and contraction, window closure, and so on. This project may help reduce the amount of workspace needed and the dependence on extra equipment. Since it is more agile than other contemporary frameworks for PC interaction, it should hold its own in the majority of circumstances, and it brings the user and the workspace closer than before because it removes the weight of extra gadgets.
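The cursor-guidance step described in this conclusion amounts to mapping the marker position in the camera frame onto screen coordinates, with some smoothing so the cursor does not jitter. The sketch below is an assumption about how such a mapping could look, not the paper's exact implementation; the actual move would then be performed with `pyautogui.moveTo(x, y)` on the smoothed result (PyAutoGUI is kept out of the snippet itself so it stays self-contained).

```python
def map_to_screen(pt, frame_size, screen_size):
    """Scale an (x, y) point in camera coordinates to screen coordinates."""
    fx, fy = frame_size
    sx, sy = screen_size
    x, y = pt
    return x * sx / fx, y * sy / fy


class Smoother:
    """Exponential moving average over cursor positions:
    new = alpha * raw + (1 - alpha) * previous."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.prev = None

    def update(self, pt):
        if self.prev is None:
            self.prev = pt  # first sample: nothing to smooth against
        else:
            self.prev = tuple(self.alpha * n + (1 - self.alpha) * p
                              for n, p in zip(pt, self.prev))
        return self.prev
```

In the main loop one would write, for example, `x, y = smoother.update(map_to_screen(centroid, (640, 480), (1920, 1080)))` (both sizes are assumptions) and then hand `(x, y)` to `pyautogui.moveTo`.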