Part 1
1. Introduction
Problem Definition
For decades, the keyboard and mouse have been the main input devices for computers. However, with the
rising popularity of ubiquitous and ambient devices (e.g. gaming consoles such as the PlayStation) and equipment
that allows users to grasp virtual objects, hand and body gestures are becoming essential: gesture controls have
become central to human-computer interaction. The goal of this project is to develop a full-screen,
gesture-controlled Android app that has embedded videos and supports basic features of navigation,
viewing the creator's profile and subscribing to the creator [1].
On opening the home screen, the first video starts playing.
On swiping up: the video embedded below the current video starts playing.
On swiping down: the video embedded above the current video starts playing.
On swiping right: a dummy profile of the video creator is shown.
On swiping left: a pop-up appears saying "Subscribed to" followed by the content creator's name
(a sketch of this swipe-to-action mapping is given below).
2. Literature Review
The app takes inspiration from the popular video streaming application TikTok, which follows the layout
of a full-screen video application in which the user can consume videos from various creators on the
platform and switch between those videos with the help of gesture controls [3].
3. Related Theories and Algorithms
Java: Java is a popular, general-purpose programming language. It is used to develop mobile apps, web apps,
desktop apps, games and much more.
XML: Extensible Markup Language is a markup language and file format for storing, transmitting, and
reconstructing arbitrary data. It defines a set of rules for encoding documents in a format that is both
human-readable and machine-readable [4].
Android Studio: Android Studio is the official integrated development environment for Google's
Android operating system, built on JetBrains' IntelliJ IDEA software and designed specifically for Android
development. It is available for download on Windows, macOS and Linux-based operating systems.
Fundamental algorithms
The app implements the following features: touch detection, splash screen implementation and
multiple activity implementation. The app first detects any touch that occurs on the screen of the
application, using touch events (delivered to the app as MotionEvent objects), which capture any type of
touch made on the screen. The splash screen is the screen shown while the app loads, displaying the name
and logo of the application. The app also uses the concept of implementing multiple activities; a minimal
sketch of the splash screen and activity navigation is given after the list below.
Touch Detection
Splash screen implementation
Multiple activity implementation
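As a minimal sketch of the splash screen and multiple activity implementation, a splash activity can display the logo briefly and then start the main activity with an Intent. The class names SplashActivity and MainActivity, the layout resource and the two-second delay below are assumptions for illustration.

// A minimal splash-screen sketch using a delayed Intent.
// SplashActivity is assumed to be the launcher activity and MainActivity
// the activity holding the video feed; both names are illustrative.
import android.app.Activity;
import android.content.Intent;
import android.os.Bundle;
import android.os.Handler;
import android.os.Looper;

public class SplashActivity extends Activity {

    private static final long SPLASH_DELAY_MS = 2000; // assumed delay

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_splash); // assumed layout showing the app name and logo

        // After a short delay, move on to the main activity and close the splash.
        new Handler(Looper.getMainLooper()).postDelayed(() -> {
            startActivity(new Intent(SplashActivity.this, MainActivity.class));
            finish();
        }, SPLASH_DELAY_MS);
    }
}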
4. Proposed model/algorithm
Detecting touch on the screen using GestureDetector - it recognises the gestures made on the
screen [6].
Reading the action event's value using getAction() - it identifies which kind of touch action
occurred, such as a press, a release or a movement [7].
For up and down swipes we use the MotionEvent references ACTION_DOWN and ACTION_UP - these mark
where the touch began and where it was released, and the vertical distance between the two points
determines whether the swipe went up or down.
For left and right swipes we calculate the difference between the horizontal and vertical swipe values -
we take the absolute values of the x and y displacements and compare them to decide which swipe occurred.
On the basis of these values we determine the type of gesture and perform the corresponding action,
which has been set up in a switch-case construct; a code sketch of this logic is given below.
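The sketch below shows one way this logic could be implemented with GestureDetector.SimpleOnGestureListener: the horizontal and vertical displacements between the start and end of a fling are compared to decide the swipe direction, and the result is dispatched through the hypothetical SwipeRouter sketched in the problem definition. The threshold values are assumptions.

// A sketch of the swipe detection described above, using GestureDetector.
// Thresholds and the SwipeRouter call are illustrative assumptions.
import android.content.Context;
import android.view.GestureDetector;
import android.view.MotionEvent;
import android.view.View;

public class SwipeDetector implements View.OnTouchListener {

    private static final int SWIPE_THRESHOLD = 100;           // minimum distance in pixels
    private static final int SWIPE_VELOCITY_THRESHOLD = 100;  // minimum fling speed

    private final GestureDetector gestureDetector;

    public SwipeDetector(Context context, SwipeRouter router) {
        gestureDetector = new GestureDetector(context, new GestureDetector.SimpleOnGestureListener() {
            @Override
            public boolean onDown(MotionEvent e) {
                return true; // claim the ACTION_DOWN so the rest of the gesture is delivered
            }

            @Override
            public boolean onFling(MotionEvent e1, MotionEvent e2, float velocityX, float velocityY) {
                float diffX = e2.getX() - e1.getX(); // horizontal movement between down and up
                float diffY = e2.getY() - e1.getY(); // vertical movement between down and up

                // Compare absolute values to decide whether the swipe was
                // mainly horizontal or mainly vertical.
                if (Math.abs(diffX) > Math.abs(diffY)) {
                    if (Math.abs(diffX) > SWIPE_THRESHOLD && Math.abs(velocityX) > SWIPE_VELOCITY_THRESHOLD) {
                        router.onSwipe(diffX > 0 ? SwipeRouter.SwipeDirection.RIGHT
                                                 : SwipeRouter.SwipeDirection.LEFT);
                        return true;
                    }
                } else if (Math.abs(diffY) > SWIPE_THRESHOLD && Math.abs(velocityY) > SWIPE_VELOCITY_THRESHOLD) {
                    router.onSwipe(diffY > 0 ? SwipeRouter.SwipeDirection.DOWN
                                             : SwipeRouter.SwipeDirection.UP);
                    return true;
                }
                return false;
            }
        });
    }

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        return gestureDetector.onTouchEvent(event);
    }
}

Such a detector would typically be attached to the root view of the video activity with setOnTouchListener, so that every touch is forwarded to the GestureDetector.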
5. Simulation Results and Performance analysis
Experimental setup
Experimental results
The gestures are detected as intended.
The toast displaying "Subscribed" works without any delay.
The dummy profiles show the basic information of the creators.
The videos can be switched in a cycle: the first video can be reached from the last and the last from
the first (a sketch of this circular navigation is given below).
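A minimal sketch of this circular switching, assuming the videos are held in a simple list and navigated with modular indexing; the class name and video identifiers are illustrative.

// Circular navigation over a list of videos: moving past the last video
// wraps to the first, and moving before the first wraps to the last.
import java.util.Arrays;
import java.util.List;

public class CircularPlaylist {

    private final List<String> videos; // illustrative video identifiers
    private int current = 0;

    public CircularPlaylist(List<String> videos) {
        this.videos = videos;
    }

    public String next() {
        current = (current + 1) % videos.size();                 // last -> first
        return videos.get(current);
    }

    public String previous() {
        current = (current - 1 + videos.size()) % videos.size(); // first -> last
        return videos.get(current);
    }

    public static void main(String[] args) {
        CircularPlaylist feed = new CircularPlaylist(Arrays.asList("video1", "video2", "video3"));
        System.out.println(feed.next());     // video2
        System.out.println(feed.previous()); // video1
        System.out.println(feed.previous()); // video3 (wrapped around)
    }
}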
Performance Analysis
In our tests the app performed as intended: every gesture we made was recognised and triggered the correct action.
6. Discussion and Conclusion
Conclusion
This concludes the report on our project, the "Gesture Controlled Video App". Through this project
we have learned the basic implementation of gestures in Android using the
GestureDetector.OnGestureListener class and its various functions, coded in Java. Gesture control
technology shows great potential in education, although at this stage practical usage of the
technology in teaching and learning activities is not widely acknowledged. There are two reasons for
this. Firstly, education content differs from subject to subject. In order to use a gesture control
product in academia, extensive customisation of the software is required, often involving developers. In
most cases, the university does not have such resources to support the work. Secondly, the
effectiveness of gesture control products still needs improving. It takes time to calibrate a product,
and in order to control it, users need to spend time training and practising [8]. Gesture
control products are not yet as intuitive as they claim to be or have the potential to be.
Future Work
The application could be further enhanced with the following features:
User signup and login
Users can upload content
The videos that would be available for streaming would consist of content uploaded by other users
of the platform
Following content creators to see their specific content
Getting a randomized feed in case the user doesn’t follow anyone
Keeping track of the user's interactions with the content
References
1) https://intranet.birmingham.ac.uk/it/innovation/documents/public/Gesture-Control-Technology.pdf
2) https://freebiesui.com/sketch-freebies/sketch-ui-kits/tiktok-ui-kit/
3) https://en.wikipedia.org/wiki/XML
4) https://en.wikipedia.org/wiki/SQL
5) https://developer.android.com/reference/android/view/GestureDetector
6) https://developer.android.com/reference/android/view/MotionEvent
7) https://ieeexplore.ieee.org/document/7489184