
Volume 8, Issue 7, July 2023 International Journal of Innovative Science and Research Technology

ISSN No:-2456-2165

Motion-Sensing Home Automation with Vocalization


Lalitha S1, Madhusudhan K. N.2, Shivam Sharma3, Saurav Baid4, Sanyam Mishra5, Vishal R Teggi6
1,2,3,4,5,6 Dept. of Electronics and Communication Engineering, BMS College of Engineering, Bangalore, India

Abstract:- Effective communication is essential for conveying information and connecting with others. However, for people who are unable to speak due to physical disabilities, expressing thoughts and ideas can be challenging. This is particularly true for individuals who are Deaf and Mute. Many people do not know sign language, which further limits communication opportunities for those who rely on it. While some devices exist that can convert sign language into text and speech in English, there are few solutions that combine this conversion with the control of other devices. In our work, we propose a system that recognizes a customized sign language. The system aims to provide a more inclusive communication experience by enabling seamless interaction between Mute individuals and others, fostering better understanding and accessibility. The system is implemented on an Android platform, allowing for easy access and widespread use.

Keywords:- Arduino UNO.

I. INTRODUCTION

In a diverse world where communication is a fundamental part of human interaction, there are individuals who face challenges in expressing themselves verbally. These individuals, commonly referred to as nonverbal or non-speaking, have unique communication needs and experiences. Although they might not use spoken language as a primary means of communication, it is crucial to understand and support their ability to connect with others and participate in society.

Statistical data specifically addressing the population of nonverbal individuals is difficult to obtain due to the variety of conditions and circumstances that can lead to non-verbalism. However, it is estimated that approximately 1% to 2% of the general population is affected by conditions that result in significant speech and language difficulties.

Nonverbal people may include those with conditions like autism spectrum disorder, developmental disabilities, neurological disorders, hearing impairments, or physical disabilities affecting speech production. Each individual's communication abilities and needs can vary greatly, requiring a personalized approach to support and accommodate their unique circumstances.

Alternative forms of communication, such as sign language, augmentative and alternative communication (AAC) devices, pictorial systems, or assistive technology, are essential tools for nonverbal individuals to express themselves and interact with others effectively. These tools enable them to convey their thoughts, needs, and emotions, facilitating meaningful connections and easier socialization with the people around them.

Understanding and accepting nonverbal individuals' diverse modes of communication fosters inclusion and ensures that their opinions are heard. It is important to promote accessible environments, support the development and use of appropriate communication strategies, and advocate for their inclusion in education, employment, and social settings.

By embracing the unique communication needs of nonverbal individuals and providing appropriate resources and support, we can build a society that values and respects the various ways in which individuals express themselves.

Broadly, this problem can be addressed in two ways: vision-based detection, or sensor-based conversion of the sign language.

We have chosen the sensor-based approach and given it the highest priority because of the limitations of vision-based recognition. Poor lighting conditions, visual obstructions, or complex backgrounds can hinder the accuracy and reliability of vision-based systems. In such cases, sensor-based recognition may offer more reliable and consistent results, since it relies on data sources that are not affected by visual limitations.

In our project, we have incorporated a customized sign language system instead of utilizing an existing, standardized sign language. This decision was driven by the specific requirements and goals of our project, in addition to the unique context in which it operates. By developing a tailored sign language, we were able to propose a communication system that directly aligns with the requisites and capabilities of our intended users. This customized approach permitted us to create a more accessible and efficient means of communication, taking into consideration the specific gestures, symbols, and expressions that best suit the targeted audience. Through this approach, we aim to bridge communication barriers and empower individuals to express themselves effectively within the structure of our project.

II. PROPOSED SOLUTION

The block diagram presented in Fig. 1 illustrates the proposed system, outlining the various interconnected components and their functional relationships. The proposed system is based on various sign-language hand actions and customized sentences. The system uses several components: an Arduino UNO, an RF encoder/decoder pair, an HC-05 Bluetooth module, flex sensors, and an MPU-6050 module.



A. Flex sensor
A flex sensor measures the amount of bending; its resistance changes with the bending movement, so it is sometimes called a flexible potentiometer. The flex sensors used here have a resistance range of roughly 10 kΩ to 35 kΩ.

B. HC-05 Module
The HC-05 module is a Bluetooth module. Bluetooth communication operates in the 2.4 GHz band, making it a widely used RF communication technology. With a range of approximately 10 meters, Bluetooth is often used for applications such as data transfer, audio systems, hands-free devices, and computer peripherals. The fact that it facilitates short-range wireless communication is a factor in its popularity.

C. MPU-6050 module
The MPU-6050 is a popular sensor module used for motion tracking and orientation sensing. It combines a 3-axis accelerometer and a 3-axis gyroscope into a single integrated circuit (MPU stands for Motion Processing Unit). The MPU-6050 is commonly used in robotics, drones, gaming applications, and other projects that require motion-sensing capabilities. It provides measurements of acceleration, angular velocity, and temperature.
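For concreteness, the short sketch below shows one common way these parts are read from an Arduino UNO: the flex sensor sits in a voltage divider on an analog pin, and the MPU-6050 raw accelerometer registers are read over I2C. The pin choice, divider wiring, and register usage follow typical conventions and are assumptions, not the exact circuit of this paper.

```cpp
// Minimal sketch (assumed wiring): one flex sensor in a voltage divider on A0,
// MPU-6050 on the I2C bus (SDA = A4, SCL = A5 on an Arduino UNO).
#include <Wire.h>

const int FLEX_PIN = A0;          // flex sensor + fixed resistor divider (assumption)
const uint8_t MPU_ADDR = 0x68;    // default MPU-6050 I2C address

void setup() {
  Serial.begin(9600);
  Wire.begin();
  // Wake the MPU-6050 (it powers up in sleep mode): clear PWR_MGMT_1 (0x6B).
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);
  Wire.write(0);
  Wire.endTransmission(true);
}

void loop() {
  // Flex sensor: the ADC value shifts as the 10k-35k resistance changes with bending.
  int flexRaw = analogRead(FLEX_PIN);           // 0..1023

  // MPU-6050: read the six raw accelerometer bytes starting at register 0x3B.
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, (uint8_t)6, (uint8_t)true);
  int16_t ax = (Wire.read() << 8) | Wire.read();
  int16_t ay = (Wire.read() << 8) | Wire.read();
  int16_t az = (Wire.read() << 8) | Wire.read();

  Serial.print("flex="); Serial.print(flexRaw);
  Serial.print(" ax="); Serial.print(ax);
  Serial.print(" ay="); Serial.print(ay);
  Serial.print(" az="); Serial.println(az);
  delay(100);
}
```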

Fig. 1: A block diagram representing the hand gloves side system.

Fig. 1 shows the input-side (glove-side) circuit block diagram, in which the Arduino UNO receives data from four flex sensors and an MPU-6050, a module with a 3-axis accelerometer and a 3-axis gyroscope. The four flex sensors provide four different values, and the combination of all these values defines a unique action; the firmware was coded in accordance with these unique values. On the glove side, the Arduino receives data from the sensors, processes the values accordingly, and sends the information to the mobile phone through the HC-05 Bluetooth module, where a purpose-built application runs on the mobile device.

In addition to this speech path, the RF encoder encodes the information from the Arduino Uno and sends the encoded data to the home automation board, which is shown in Fig. 2.
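A minimal sketch of this glove-side flow is given below, assuming placeholder pins, thresholds, and sentences; it only illustrates the pattern described above of reading the four flex sensors plus the accelerometer axes, matching a combination, and writing the corresponding text to the HC-05 over a serial link.

```cpp
// Illustrative glove-side flow (pins, thresholds, and messages are placeholders).
#include <SoftwareSerial.h>

SoftwareSerial bt(10, 11);                       // HC-05: RX, TX (assumed pins)
const int FLEX_PIN[4] = {A0, A1, A2, A3};        // four flex sensors (assumed)

// Filled from the MPU-6050 over I2C, e.g. as in the previous sketch.
int16_t ax = 0, ay = 0;
void readAccel() { /* read ax, ay from the MPU-6050 (see earlier sketch) */ }

void setup() {
  bt.begin(9600);                                // typical HC-05 default baud rate
}

void loop() {
  int flex[4];
  for (int i = 0; i < 4; i++) flex[i] = analogRead(FLEX_PIN[i]);
  readAccel();

  // Each unique combination of bent fingers and hand tilt maps to one message.
  if (flex[0] < 400 && ay < 0) {                 // placeholder thresholds
    bt.println("I need water");                  // placeholder sentence
  } else if (flex[1] < 400 && ax > 0) {
    bt.println("Please call for help");
  }
  // ... remaining gesture combinations ...
  delay(200);
}
```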

Fig. 2: A block diagram representing the home automation side system

Fig. 2 shows the output-side (home automation board side) block diagram, in which multiple output appliances are connected; this side receives the data from the glove over the RF link. An RF decoder on the home automation board decodes the data received from the RF encoder, and an Arduino board connected to the automation board processes the data and turns the appliances, which are connected through a relay module, on or off.
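The receiving side can be sketched as an Arduino that watches the data lines of the RF decoder and switches the matching relay channels. The decoder pins, relay pins, and the active-low relay convention used below are assumptions for illustration; the paper does not give the exact wiring.

```cpp
// Illustrative home-automation-side sketch: an RF decoder (assumed here to expose
// four parallel data outputs, HT12D-style) drives four Arduino inputs; each bit
// toggles one relay channel.
const int DECODER_PIN[4] = {2, 3, 4, 5};   // decoder data outputs D0-D3 (assumed)
const int RELAY_PIN[4]   = {6, 7, 8, 9};   // relay module inputs (assumed)

void setup() {
  for (int i = 0; i < 4; i++) {
    pinMode(DECODER_PIN[i], INPUT);
    pinMode(RELAY_PIN[i], OUTPUT);
    digitalWrite(RELAY_PIN[i], HIGH);      // many relay boards are active LOW
  }
}

void loop() {
  for (int i = 0; i < 4; i++) {
    // When the decoder latches a '1' on a data line, energise the matching relay.
    bool on = (digitalRead(DECODER_PIN[i]) == HIGH);
    digitalWrite(RELAY_PIN[i], on ? LOW : HIGH);
  }
  delay(50);
}
```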



D. Flow chart

Fig. 3: Flow chart of the proposed system

III. LITERATURE SURVEY

Our project aims to review existing research and studies relating to the concept of hand movements, specifically aimed at converting Customized Sign Language into text and speech. The aim of this review is to identify and comment on the status of the techniques, algorithms, and approaches used in this area.
 "A Survey of Vision-Based Hand Gesture Recognition" by Rahmati et al. (2018)
 The paper focuses on vision-based hand movement recognition techniques. It explores various approaches, including depth cameras and infrared sensors, for capturing hand movements. The survey covers gesture recognition algorithms, datasets, and challenges. Although not specific to Customized Sign Language, it provides valuable insights applicable to the proposed research.
 Prakash, C., & Kumar, M. (2020). Real-time customized sign language recognition system using deep learning techniques: A review. Journal of Ambient Intelligence and Humanized Computing, 11(6), 2431-2446.
 Real-time customized sign language recognition systems that use deep learning techniques are the focus of this paper. It investigates the use of deep neural networks, such as CNN and LSTM networks, for the recognition of hand gestures. An in-depth analysis of state-of-the-art methods, datasets, and evaluation metrics has been provided by the authors.
 "Customized Sign Language Recognition: A Comprehensive Review" by Pandey et al. (2021)
 This paper focuses on Customized Sign Language recognition techniques. It discusses various approaches, including sensor-based systems, vision-based systems, and glove-based systems. The authors have explored various challenges in recognizing customized sign language gestures and have suggested possible solutions. It serves as a valuable resource for understanding the specific requirements of sign language translation.
 Saraf, K., & Gandhi, T. (2017). Hand gesture recognition using computer vision techniques: A review. Journal of Intelligent Systems, 26(2), 351-372.
 Hand movement recognition using computer vision techniques, including the recognition of hand motions in sign language, is covered in this paper. It gives an insight into the progress and challenges in this area through various approaches, such as template matching, appearance-based methodologies, and machine learning methods.
 Kumar, A., & Devi, V. (2019). Hand gesture recognition for Sign Language using computer vision: A review. Multimedia Tools and Applications, 78(3), 3699-3724.
 This paper focuses on recognising hand motions in sign language using computer vision techniques. It collects different feature extraction methods and classification algorithms related to the recognition of hand gestures.
 Gabel, T., & Lange, B. (2017). A systematic review of wearable sensors and devices for recording hand-specific data. Sensors, 17(12), 2693.
 This systematic review gives an overview of sensors and devices for recording hand- and wrist-specific data. It examines a variety of sensor technologies, for example accelerometers, gyroscopes, and bend sensors, as well as their application in hand gesture recognition. The design and use of sensors in the proposed system can be influenced by findings from this review.



 "Hand gesture Recognition Using Wearable Sensors for 524). IEEE. Various methods and techniques used in the
Sign Language Translation" by Li et al. (2017) field of hand movement recognition such as feature
 A system for recognition of movements on the wrist using extraction vein segmentation and classification are
intrinsic sensors has been described in this paper. Using discussed in this paper. The survey also points to the
machine learning, the system records hand movements, significance of making intercultural considerations part of
finger movements, and recognition of gestures. The study the design of sign language recognition systems. The
discusses the accuracy of sign language gesture recognition author specifically discussed the various technologies and
and the possibility of extending the system to Customize concepts related to 'finger gesture recognition,' as well as
sign language. presented an insight into how different approaches differ in
 Arora, A., & Kaushik, S. (2020). Sign language recognition results.
using deep learning: A survey. Journal of King Saud
University-Computer and Information Sciences, 32(3), IV. RESULTS AND ANALYSIS
285-292.[1] In the fig below the actual hardware implemented
 The author of this essay has employed various ‘deep prototype is shown .Till now we have figured out 12 gesture
learning techniques’ for sign language recognition. It uses movement which is given customized message provided by
Convolutional Neural Networks (CNNs) and Recurrent us. According to accelerometer (mpu6050) which is used as
Neural Networks (RNNs) for recognizing sign language gyroscope that provide 3 axis x,y,z as the main axis and that
gestures, which is the core part of this project. The author can accustomed to provide direction to the hand that
has specifically created models to capture the gestures and accustomed to a different gesture but in our case we have used
run them through databases via Image Processing only 2 axis that is x axis and yaxis. The +ve x,-ve x and +ve
Techniques. y and -ve y axis in this are used for different signals to be
 "Sign Language Recognition using Machine Learning: A forwarded after selecting the angle we have decided on what
Review" by Maheshwari et al. (2020) angle the hardware should give the pre-defined result being
 This paper presents an overview of Sign language set by us.
understanding techniques using machine learning
algorithms. It discusses different aspects related to the We typically used 160 and 180 degree as our benchmark
concept such as such as gloves, accelerometers, and flex and controlling the flex's angle of attack We are giving the
sensors, for capturing hand gestures or movements in required results. The result of this or we can say result is
general. The author has talked about the importance of being obtained on android app which converts the signal
accurate recognition for effective sign language translation provided by bluetooth module - HC05 to the application in
and communication. terms of text is then converted into speech.
 Prasanna, S. R., & Geetha, T. V. (2020). A comprehensive
survey on Customize sign language recognition systems. In The table defined below shows the condition on which
2020 2nd International Conference on Innovative the the output of our system is constrained on ,we have pre-
Mechanisms for Industry Applications (ICIMIA) (pp. 519- fixed the values so that whenever the system triggers the
value it show the correspondant output.

Table 1: The Corresponding Output
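As an illustration, one such condition, the "flex 1 < 180 and y-axis < -5" case that produces the final output shown later in Fig. 8, could be checked roughly as follows. The pin assignment and the way the sensor values are obtained are hypothetical; only the threshold values and the transmitted sentence come from the results reported below.

```cpp
// Hypothetical check for the Fig. 8 condition: flex sensor 1 below 180
// and accelerometer y-axis value below -5.
#include <SoftwareSerial.h>

SoftwareSerial bt(10, 11);   // HC-05 on pins 10/11 (assumed, as in the glove sketch)

void checkHelloGesture(int flex1, float accelY) {
  if (flex1 < 180 && accelY < -5) {
    // The Android app receives this string over Bluetooth, shows it as text,
    // and speaks it aloud (the reported final output).
    bt.println("hello I am shivam this is my project");
  }
}

void setup() { bt.begin(9600); }

void loop() {
  // flex1 and accelY would normally come from the sensor-reading code shown
  // earlier; fixed sample values are used here only to keep the sketch self-contained.
  checkHelloGesture(170, -6.0);
  delay(1000);
}
```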

The figure below shows the prototype, which needs to be worn on the hand so that the actual movements are shared with the Arduino through the flex sensors; the corresponding result is then transmitted via the Bluetooth module.




Fig. 4: Hand glove with sensor

We are integrating our model with home automation so that the appliances in the home can be controlled by hand gestures. The figure below represents the home automation circuit, consisting of the RF encoder and decoder, which receives the signal and acts accordingly.

Fig. 5: Home automation

We display our result using an Android application that connects through the Bluetooth module (HC-05), receives the signal, converts it into text, and then converts the text into speech. The figure below shows the UI of our application.

Fig. 6: Application view (Bluetooth text to speech)




Fig. 7: Glove sensor with Home Automation

Fig. 8: Final Output (flex 1 < 180 & y-axis < -5)

The output of our hand gesture is displayed on the screen, with a voice stating "hello I am shivam this is my project" from our Android application.

V. CONCLUSIONS

In conclusion, hand gesture vocalization technology holds tremendous potential for individuals who are unable to speak. By providing an alternative means of communication, it empowers those with speech disabilities to express themselves, interact with technology, and engage with the world around them.

For individuals who cannot speak, hand gesture vocalization offers a valuable channel for self-expression and interaction. It allows them to convey their thoughts, needs, and emotions through intuitive hand movements, enabling effective communication with others. This technology bridges the communication gap that exists for individuals who rely on non-verbal forms of expression.

Moreover, hand gesture vocalization can significantly enhance the quality of life for individuals with speech disabilities. It enables them to independently control various devices and systems, such as home automation or assistive technologies, using intuitive gestures and vocal commands. This level of autonomy fosters a sense of empowerment, enabling individuals to navigate their environment and perform tasks with greater independence.

Additionally, the integration of hand gesture vocalization with other assistive technologies, such as text-to-speech or augmentative and alternative communication (AAC) devices, further expands communication possibilities. By combining these modalities, individuals who are unable to speak can access a broader range of communication tools, enabling them to effectively express themselves and engage in meaningful interactions.

As hand gesture vocalization technology continues to advance, it holds promise for further customization and adaptation to individual needs. This personalization aspect is particularly crucial for individuals with speech disabilities, as it allows them to tailor the system to their specific gestures and communication preferences.



Overall, hand gesture vocalization technology offers a transformative solution for individuals who cannot speak, empowering them to communicate, control their environment, and participate fully in society. By bridging the communication gap and providing alternative means of expression, this technology enhances the quality of life and promotes inclusivity for individuals who face speech-related challenges.

The integration of hand gesture vocalization and home automation has the potential to revolutionize the way we interact with our living spaces. By combining the power of gesture recognition technology with voice control capabilities, individuals can effortlessly control various home automation devices and systems using intuitive hand movements and vocal commands.

This approach enhances accessibility and convenience, particularly for individuals with physical disabilities or those who prefer hands-free interaction. Hand gesture vocalization enables users to seamlessly navigate through different control options, adjust lighting levels, regulate temperature settings, control entertainment systems, and more, all with simple gestures and voice commands.

The integration of hand gesture vocalization and home automation not only enhances the user experience but also promotes a more inclusive environment. Individuals who may have limited mobility or difficulties with traditional control interfaces can now have greater independence and control over their surroundings. Moreover, the system's adaptability to different languages and its potential for customization ensure that it can cater to diverse user needs and preferences.

VI. FUTURE WORK

Future work in the field of hand gesture vocalization technology involves expanding its language capabilities to include a wider range of languages, addressing the communication needs of diverse populations. This would require developing gesture recognition models and language processing algorithms specific to each language, enabling individuals from different linguistic backgrounds to benefit from this technology. Additionally, efforts can be directed towards miniaturizing the system and integrating it into a chip or compact device, making it more portable and easily accessible. Creating a prototype that is cost-effective and user-friendly would further promote the widespread adoption of hand gesture vocalization technology, ensuring that individuals with speech disabilities can utilize it in various settings without significant financial barriers. Overall, future research and development should focus on broadening language support, optimizing the technology for portability and affordability, and conducting user studies to assess its effectiveness and usability in real-world scenarios. By advancing these aspects, hand gesture vocalization technology can become a widely accessible and inclusive tool for individuals who cannot speak, empowering them to communicate and engage with the world around them more effectively.

REFERENCES

[1.] S. M. Metev and V. P. Veiko, Laser Assisted Microtechnology, 2nd ed., R. M. Osgood, Jr., Ed. Berlin, Germany: Springer-Verlag, 1998.
[2.] J. Breckling, Ed., The Analysis of Directional Time Series: Applications to Wind Speed and Direction, ser. Lecture Notes in Statistics. Berlin, Germany: Springer, 1989, vol. 61.
[3.] S. Zhang, C. Zhu, J. K. O. Sin, and P. K. T. Mok, "A novel ultrathin elevated channel low-temperature poly-Si TFT," IEEE Electron Device Lett., vol. 20, pp. 569-571, Nov. 1999.
[4.] M. Wegmuller, J. P. von der Weid, P. Oberson, and N. Gisin, "High resolution fiber distributed measurements with coherent OFDR," in Proc. ECOC'00, 2000, paper 11.3.4, p. 109.
[5.] R. E. Sorace, V. S. Reinhardt, and S. A. Vaughn, "High-speed digital-to-RF converter," U.S. Patent 5 668 842, Sept. 16, 1997.
[6.] (2002) The IEEE website. [Online]. Available: http://www.ieee.org/
[7.] M. Shell. (2002) IEEEtran homepage on CTAN. [Online]. Available: http://www.ctan.org/tex-archive/macros/latex/contrib/supported/IEEEtran/
[8.] FLEXChip Signal Processor (MC68175/D), Motorola, 1996.
[9.] "PDCA12-70 data sheet," Opto Speed SA, Mezzovico, Switzerland.
[10.] Karnik, "Performance of TCP congestion control with rate feedback: TCP/ABR and rate adaptive TCP/IP," M. Eng. thesis, Indian Institute of Science, Bangalore, India, Jan. 1999.
[11.] J. Padhye, V. Firoiu, and D. Towsley, "A stochastic model of TCP Reno congestion avoidance and control," Univ. of Massachusetts, Amherst, MA, CMPSCI Tech. Rep. 99-02, 1999.
[12.] Wireless LAN Medium Access Control (MAC) and Physical Layer (PHY) Specification, IEEE Std. 802.11, 1997.

