A Comprehensive Review of Smart Glasses Technology-Future of Eyewear
2 (2021), 15-26
Research Article
Article History: Received: 11 January 2021; Accepted: 27 February 2021; Published online: 5 April 2021
_____________________________________________________________________________________________________
Abstract: This paper examines the technology behind smart glasses and the advancements in this emerging field, covering the hardware, software, and research that major companies have invested in developing real-life models. It also analyses users' experiences with different glasses and discusses the new benchmarks this technology has set in sectors such as medicine, gaming, corporate work, sports, and entertainment.
Aman a, Bhavesh b and Rahul c
visual input by means of wearable displays. With wrist-worn sensors and a wearable display, PalmType enables typing without requiring users to hold any device and without demanding visual attention to their hands. The latest PalmType design incorporates a QWERTY layout that is 39% faster than current touchpad-based keyboards, and PalmType was preferred by 92% of participants. It uses 15 infrared sensors to identify finger positions and taps, and gives visual feedback by means of Google Glass. [34]
Many companies have also designed light-indicator-based glasses to assist navigation, using near-eye visual cues whose brightness can be adjusted to the user's preference. In feedback trials with elderly users, most of whom suffered from mild dementia, participants completed their navigation tasks with the support of the smart glasses. The approach is also suitable for cyclists, whose hands are not free to interact with hand-held devices.
Speech recognition is another input method for smart glasses. Choice messages, or default input messages, are generated by the glasses using machine learning over user behavior and sensory data, presenting the best-fitting item from the menu as the choice message. If the user fails to interact with the system, the default input message is selected automatically to proceed to the next task. The main drawback of default input messages is the risk of unsatisfactory suggestions, which requires frequent user interaction over time to train the machine-learning algorithm. [28]
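The behaviour-driven default-message mechanism described above can be sketched as a simple frequency model. The class and message names here are illustrative assumptions, not the actual system of [28]:

```python
from collections import Counter

class DefaultMessageSelector:
    """Toy sketch: rank candidate menu messages by how often the
    user has chosen them before, and fall back to the top-ranked
    one when the user does not respond in time."""

    def __init__(self, candidates):
        self.candidates = list(candidates)
        self.history = Counter()          # past explicit user selections

    def record_choice(self, message):
        # Called whenever the user explicitly picks a message.
        self.history[message] += 1

    def suggest(self):
        # Most frequently chosen candidate; first candidate if no history.
        ranked = sorted(self.candidates, key=lambda m: -self.history[m])
        return ranked[0]

selector = DefaultMessageSelector(["call home", "navigate", "take photo"])
selector.record_choice("navigate")
selector.record_choice("navigate")
selector.record_choice("call home")
print(selector.suggest())  # "navigate", the most frequent past choice
```

A real system would also weight sensory context (location, time of day), but the fallback-on-timeout behaviour is the same: the top-ranked message is applied when the user stays silent.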
One of the efficient ways to provide input is head-gesture recognition, which enables the use of simple head gestures as input and remains accurate across various wearer activities regardless of noise. GlassGesture achieves a gesture-recognition accuracy near 96%. For authentication, GlassGesture accepts authorized users in nearly 92% of trials and rejects attackers in nearly 99% of trials. It relies on the motion sensors (accelerometer and gyroscope) on Glass, which can measure and detect all kinds of head movements thanks to their high electromechanical sensitivity. In some situations it may be considered inappropriate, or even rude, to operate Glass through the provided touchpad or voice commands; in those scenarios a head-gesture system has an advantage over other input methods, since head gestures can be small and barely noticeable, mitigating the social awkwardness. In addition, the head-gesture user interface can authenticate users, which increases the security of the device. [23]
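A minimal sketch of how gyroscope data might be used to spot a nod gesture, in the spirit of the system above; the threshold and window length are assumed, illustrative values rather than those used in [23]:

```python
def detect_nod(pitch_rate, threshold=1.5, max_gap=10):
    """Toy IMU-based head-gesture spotting: a nod is a downward
    pitch spike followed by an upward spike within `max_gap`
    samples. pitch_rate is angular velocity in rad/s."""
    for i, v in enumerate(pitch_rate):
        if v > threshold:                       # head pitches down sharply
            for j in range(i + 1, min(i + 1 + max_gap, len(pitch_rate))):
                if pitch_rate[j] < -threshold:  # head pitches back up
                    return True
    return False

# Simulated gyroscope pitch-rate trace containing one nod
trace = [0.1, 0.2, 2.0, 0.3, -0.2, -1.8, 0.1]
print(detect_nod(trace))                  # True
print(detect_nod([0.1, 0.0, -0.1] * 5))   # False: no spike pair
```

Production systems use learned templates rather than fixed thresholds, which is what makes them robust across wearer activities; this sketch only shows why a gyroscope alone is enough to make head gestures detectable.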
Another interaction method, based on Google Glass, uses a camera and an optical head-mounted display (OHMD). The captured images are sent to a computer over the local network because of the processing limitations of Google Glass. An L-shaped crosshair is placed at the lower-left corner of the OHMD so that it interferes minimally with the user's vision. The pointing system is established using the camera-eye geometry and a distance-pixel curve, and two methods are developed to estimate the viewing point on a planar surface. Experiments on this method showed that an angular error of less than 0.32° can be achieved. [4]
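The relationship between a pixel offset on the display and a viewing angle can be illustrated with a basic pinhole-camera calculation; the focal length used here is an assumed calibration value, not one reported in [4]:

```python
import math

def pixel_to_angle(dx_pixels, focal_length_pixels):
    """Convert a horizontal pixel offset from the optical centre
    into a viewing angle (degrees), using the pinhole-camera model:
    angle = atan(offset / focal_length)."""
    return math.degrees(math.atan2(dx_pixels, focal_length_pixels))

# A 5-pixel offset with an (assumed) 900-pixel focal length
angle = pixel_to_angle(5, 900)
print(round(angle, 3))  # ≈0.318 degrees
```

This is why sub-degree angular errors correspond to only a handful of pixels of calibration error: at a 900-pixel focal length, one pixel subtends roughly 0.06°.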
Table 1.1 Different input methods used in smart glasses [2] [8] [11] [20] [23] [24]

SR No. | Method | Challenges | Advantages | Technology Used
1 | Voice recognition | Can be noisy in shared environments | Provides a hands-free experience | Microphone
2 | Hand-held devices such as a smartphone | Need for extra equipment | Less chance of error in providing input to the glasses or accessing information | Depends on the device being used
3 | Touch | User must tap on a body part or wearable device | No need to carry any extra equipment | Touchpad
4 | Non-touch | Getting rid of errors in input, and power issues | No need for a surface to interact | Absolutely free-hand technique
5 | Head movement | Accuracy and effectiveness, as a limited number of inputs can be given | No need for additional sensors or hardware | Camera and sensor-based face-tracking system
6 | PalmType | Feasibility | Provides visual feedback and detects the user's finger position | Network of infrared sensors mounted on the wrist
4. Components of Smart Glasses
Smart glasses comprise many components, starting with the display: the two major display types used by most glasses today are shown in Table 1.2. The displayed data is collected using a camera sensor, which also plays a vital role in other functions such as hand-gesture recognition and feed capture. The camera types used in smart glasses include RGB, depth, and infrared cameras, which also support different computer vision tasks. [12]
Due to the computational limitations of smart glasses, the computer vision algorithms used must be efficient and able to work precisely with low-frame-rate video. Eye-blink detection has applications in health care, human-computer interaction, and driving safety. As eyelid closure is an important
step in detecting a blink, an eigen-eye approach is used to detect eyelid closure, and a Gradient Boosting (GB) algorithm is then trained on eye-blink patterns based on the results of the eigen-eye approach. [31]
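A toy sketch of the eigen-eye idea: principal components of eye crops are computed via SVD, and a classifier operates on the projections. For brevity a nearest-centroid rule stands in for the Gradient Boosting stage used in [31], and the synthetic 8x8 "eye crops" are illustrative stand-ins for camera frames:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for 8x8 grayscale eye crops: "open" eyes have a
# bright iris region, "closed" eyes are flatter.
def make_eye(closed):
    img = rng.normal(0.5, 0.05, (8, 8))
    if not closed:
        img[3:5, 3:5] += 0.4          # bright iris region when open
    return img.ravel()

train = np.stack([make_eye(closed=c) for c in [0, 0, 0, 1, 1, 1]])
labels = np.array([0, 0, 0, 1, 1, 1])  # 0 = open, 1 = closed

# Eigen-eye step: principal components of the training crops via SVD.
mean = train.mean(axis=0)
U, S, Vt = np.linalg.svd(train - mean, full_matrices=False)
components = Vt[:2]                    # top-2 "eigen-eyes"

def project(x):
    return components @ (x - mean)

# Nearest-centroid rule on the eigen-eye projections (stands in for GB).
centroids = {c: project(train[labels == c].mean(axis=0)) for c in (0, 1)}

def classify(x):
    p = project(x)
    return min(centroids, key=lambda c: np.linalg.norm(p - centroids[c]))

print(classify(make_eye(closed=True)))   # 1 (closed)
print(classify(make_eye(closed=False)))  # 0 (open)
```

The point of the eigen-eye stage is dimensionality reduction: the classifier sees a 2-dimensional projection instead of a 64-pixel image, which keeps the per-frame cost low enough for wearable hardware.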
Table 1.2 Displays used in smart glasses [1] [8] [9] [22]

S No. | Methods | Challenges | Advantages
The camera is placed so that it points towards the corresponding eye to capture its activity. A mini computer, the MK802, is attached to the smart glasses to provide processing capability; it uses an Allwinner A10 1.0 GHz Cortex-A8 CPU with 512 MB of RAM and supports Android ICS (4.0) and Linaro Linux as operating systems. Figure 2.1 shows a design of smart glasses used to detect blinking of the human eye.
Various sensors within the device collect and process data depending on the application. For voice input, for example, a microphone converts sound into an electrical signal that can then be processed by speech-recognition algorithms; recent advancements in this field have made such processing more accurate and responsive. [28]
Fig. 2.1 Camera setup in smart glasses for blink detection [31] [35]
In our study we also came across many features that bring smart-glass technology very close to smartphones. To support location-based applications such as navigation and real-time tracking, the Global Positioning System (GPS) is used. A trackpad or external controller provides an easy input method and serves as a medium for interacting with the digital interface of the optical display. An accelerometer measures the rate of change of velocity of the device in its own instantaneous rest frame, which helps in recognising the wearer's activity, such as walking, sitting, running, or particular hand gestures. A gyroscope measures angular velocity and head orientation. Eye tracking takes the user experience to a whole new level: its objective is to spot and locate the object the user intends to select, driven by eye movement. Magnetometers measure the strength and direction of magnetic fields, which is important for precise navigation and maps. To carry the device around, a battery and a charging system are needed; battery specifications differ from device to device, as the energy requirements of the embedded sensors differ. [12]
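As an example of how accelerometer data supports activity recognition, a crude variance-based classifier might look as follows; the thresholds are illustrative assumptions, not calibrated values:

```python
import math

def activity_from_accel(samples, g=9.81):
    """Crude activity guess from accelerometer readings (m/s^2):
    the variance of the acceleration magnitude grows with movement
    intensity. Thresholds here are illustrative, not calibrated."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    if var < 0.05:
        return "sitting"
    if var < 2.0:
        return "walking"
    return "running"

still = [(0.0, 0.0, 9.81)] * 50                            # device at rest
moving = [(0.0, 0.0, 9.81 + (i % 4)) for i in range(50)]   # periodic jolts
print(activity_from_accel(still))   # sitting
print(activity_from_accel(moving))  # walking
```

Real systems extract richer features (frequency content, axis correlations) and feed them to a trained classifier, but the underlying signal is the same: a resting device shows near-constant magnitude, a moving one does not.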
Among the other use cases of smart glasses, detecting the wearer's heart rate with a PPG sensor is a feature that can be added. Reflectance-mode sensors are used for long-term HR monitoring: light falls on the skin and is reflected by the subcutaneous tissue beneath it. In these sensors, an LED and a photodetector are situated close to each other on the skin surface, at a location with a good concentration of blood vessels beneath the skin. [29]
Several factors contribute to the absorption of light during PPG. Biological substances such as bone, skin, tissue, and non-pulsatile venous and arterial (vascular) blood absorb light at a constant level, represented by the DC level of the plethysmogram. The cardiac cycle of diastole and systole is represented by the alternating (AC) signal caused by pulsatile arterial blood flow. [21] [25]
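The DC/AC decomposition can be illustrated on a synthetic plethysmogram: subtracting the constant (DC) level leaves the pulsatile (AC) component, whose dominant frequency gives the heart rate. The sampling rate, amplitudes, and the 1.2 Hz pulse are assumed values for the sketch:

```python
import numpy as np

fs = 100                        # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)    # 10 s synthetic recording

# Synthetic PPG: a constant (DC) level from skin, bone and
# non-pulsatile blood, plus a pulsatile (AC) component at 1.2 Hz.
signal = 2.0 + 0.1 * np.sin(2 * np.pi * 1.2 * t)

dc = signal.mean()              # DC level: constant absorption
ac = signal - dc                # AC component: pulsatile arterial flow

# Heart rate from the dominant frequency of the AC component.
spectrum = np.abs(np.fft.rfft(ac))
freqs = np.fft.rfftfreq(len(ac), d=1 / fs)
hr_bpm = freqs[np.argmax(spectrum)] * 60
print(hr_bpm)  # ≈72 beats per minute (1.2 Hz pulse)
```

A real PPG trace also contains motion artifacts and baseline drift, so practical monitors band-pass filter the signal before estimating the rate; the decomposition into DC and AC terms, however, is exactly the one described above.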
Pulse-Glasses is an example of such a system; it consists of a pulse sensor, a rechargeable battery, and a microcontroller. The pulse sensor sends analog PPG values to the microcontroller, which transfers the data to an Android phone using the Bluetooth Low Energy (BLE) protocol. The pulse sensor is placed on one of the two nose-pads and consists of a green LED that emits light, which is detected by a detector facing the skin. All connections from the pulse sensor to the microcontroller and battery run through the plastic frame of the glasses. The microcontroller is placed on one side of the frame and the
rechargeable battery (3.7 V, 3700 mAh) is placed on the other side. The Arduino Blend Micro (ABM), manufactured by Red Bear Ltd, serves as the development board of the Pulse-Glasses; the Atmel ATmega32U4 microcontroller is hosted on the ABM's single-board circuit.
Fig. 2.2 Microprocessor and architecture of a smart glass (minimum viable project) [11]
5. Advances in Smart Glasses Technology
To assist blind people in navigation tasks, one method calculates the distance between the user and obstacles through smart glasses using deep learning and stereo cameras. In this design, the stereo cameras measure distance while a gyro sensor differentiates between obstacles and the floor. A buzzer and a vibration motor are attached to the device at three positions: front, left, and right. They operate according to the distance between the obstacle and the wearer: the buzzer and vibration motor on the side where an obstacle is present are signaled to turn on. [5]
For example, if an obstacle approaches from the left, the buzzer and vibration motor on the left side operate. CdS (photoresistor) sensors detect day and night, so that at night an LED turns on to indicate the wearer's position to others and prevent accidents. To identify the type of obstacle, YOLOv3 is used as the deep-learning algorithm. This task requires substantial processing power and cannot be performed on a low-end MCU, so the image data is sent to a server over wireless communication; the server performs all the processing and sends signals directly back to the actuators via the microcontroller. [18]
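A minimal sketch of the stereo-distance and side-alert logic described above; the focal length, baseline, and warning distance are assumed, illustrative values, not those of the system in [5] [18]:

```python
def stereo_distance(disparity_px, focal_px=700.0, baseline_m=0.06):
    """Pinhole stereo model: depth = f * B / disparity. The focal
    length and baseline here are assumed calibration values."""
    return focal_px * baseline_m / disparity_px

def choose_alert(obstacle_x_norm, distance_m, warn_within_m=2.0):
    """Pick which buzzer/vibration motor to trigger, mimicking the
    front/left/right layout described above. obstacle_x_norm is the
    obstacle's horizontal image position in [0, 1]."""
    if distance_m > warn_within_m:
        return None                    # nothing close enough to warn about
    if obstacle_x_norm < 0.33:
        return "left"
    if obstacle_x_norm > 0.67:
        return "right"
    return "front"

d = stereo_distance(disparity_px=30)       # 700 * 0.06 / 30 = 1.4 m
print(round(d, 2), choose_alert(0.2, d))   # 1.4 left
```

Note the inverse relationship: nearer obstacles produce larger disparities, which is what makes stereo depth reliable precisely at the short ranges where a warning matters.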
Beyond measuring the distance between a blind user and obstacles, there are also systems that detect the presence of humans at night and in poor lighting using thermal images. These do not perform up to the mark during the day, however, because of the reduced thermal contrast between people and the environment. Therefore, to detect pedestrians in difficult weather conditions, thermal images are augmented with their saliency maps, which serve as an attention mechanism for detecting people on the street; deep learning combined with saliency maps can then be used for pedestrian detection during the day as well as at night.
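The saliency-as-attention idea can be illustrated by weighting a thermal image with its saliency map, suppressing background pixels relative to warm person regions. The tiny arrays and the max-normalised "saliency" stand-in are illustrative assumptions, not the actual saliency model:

```python
import numpy as np

# Toy thermal image with a warm "person" in the lower-right region.
thermal = np.array([[0.1, 0.2, 0.1],
                    [0.2, 0.9, 0.8],
                    [0.1, 0.8, 0.9]])

# Crude saliency stand-in: brighter (warmer) pixels are more salient.
saliency = thermal / thermal.max()

# Soft attention: multiply the image by its saliency map, so a
# downstream detector sees background suppressed relative to people.
attended = thermal * saliency

print(attended[1, 1], attended[0, 0])  # person pixel stays strong
```

The effect is multiplicative: a background pixel at 0.1 is attenuated by its own low saliency, while a person pixel near the maximum passes through almost unchanged, sharpening the contrast that daytime thermal images lack.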
[Table fragment, entry 10: Meta 2 — 420 g; CPU: Intel Core i7-6700HQ / AMD FX-9590; AR display; integrated optical and inertial sensors; RGB camera for positional tracking; hand interactions and positional tracking through the front-facing camera; HDMI; 720p; sensor array; four built-in surround-sound speakers; microphone.]
The authors of [3] present a design of smart glasses with a head-mounted display (HMD) to assist senior citizens in their day-to-day navigational tasks. The system contains LEDs that act as indicators and Bluetooth to connect with mobile devices. To assist the user in navigation, various LED blinking combinations guide the user along a certain path. The glasses connect to an Android device via Bluetooth, which receives commands from a remote caretaker over an internet connection and blinks the LEDs so as to guide the user along a particular path.
The glasses are also equipped with a Bluetooth microphone to enable communication between the user and the remote caretaker, and a camera that can send videos and pictures from the user's field of view to the caretaker for increased protection. The target audience of this prototype is senior citizens suffering from memory loss, so the blinking patterns of the LEDs are kept simple, making navigational assistance easy. [3] [26]
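The caretaker-command-to-LED-pattern mapping might be sketched as a small lookup table; the command names and blink patterns are illustrative assumptions, not the patterns used in [3]:

```python
# Map a remote caretaker command to a blink instruction:
# (which LED to drive, how many blinks). Patterns are illustrative.
PATTERNS = {
    "go_left":     ("left_led", 2),
    "go_right":    ("right_led", 2),
    "go_straight": ("center_led", 1),
    "stop":        ("all_leds", 3),
}

def handle_command(command):
    """Return the blink instruction for a caretaker command, falling
    back to the 'stop' pattern for anything unrecognised, so a
    garbled transmission never sends the user the wrong way."""
    return PATTERNS.get(command, PATTERNS["stop"])

print(handle_command("go_left"))    # ('left_led', 2)
print(handle_command("garbled?!"))  # fail-safe: ('all_leds', 3)
```

Keeping the vocabulary this small is deliberate: for users with memory loss, a handful of distinct, easily learned blink patterns matters more than expressive routing commands.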
7. Challenges
A challenge in using smart glasses is reading on the go: walking has an adverse effect on reading, whether on a smartphone or on smart glasses, affecting both comprehension and workload. Studies have shown that this effect can be reduced by overlaying the text in front of the user's eye in the middle
of the glass. However, further research is needed to find the best way to make reading a seamless experience. [14]
Regular consumers look to smart glasses as a mode of entertainment and experience enhancement. The other main group is professional consumers, who are envisioned to benefit from the technology's 'hands-free' features. So far, however, no company has met these criteria together with a cost-efficient, compact design. A further risk is that non-sustainable augmentation or enhancement may reduce some cognitive capacities, such as navigation skills, as they are "outsourced" to technology, or may delegate crucial tasks to less skilled personnel. [13]
8. Scope of the Technology
Smart glasses are used in various sectors: in education; in health (as fitness trackers counting steps and measuring heart and respiration rate); in tourism (giving tours of a place); in retail stores; and for entertainment. Research has also been conducted on game input for smart glasses. Results show that users significantly preferred non-touch and non-handheld interaction, such as in-air gestures, over handheld input devices. Additionally, for touch input without handheld devices, users preferred interacting with their palms over wearable devices (51% versus 20%). Likewise, users preferred interactions that are less noticeable, owing to concerns about social acceptance, and preferred in-air gestures in front of the torso rather than in front of the face (63% versus 37%). [33] [34] [36]
Smart glasses can also highlight the route for the user, and if the user is driving a car they can propose a speed. In warehouses, they can guide employees to the products they need to transport, highlighting multiple products of the same order in the same color.
Eye-tracking technology can also be used to monitor an employee's eye movement, helping to determine whether the employee is tired and needs a break, or has finished all the work and is sitting idle. On construction sites, smart glasses can overlay the building design, helping engineers find mistakes and helping workers prevent accidents such as drilling through a water pipe. These are only a fraction of the possible scenarios for smart-glass applications, and it is clear that each brings a series of ethical questions that need answering. [26]
References
A. Bujnowski, J. Ruminski, P. Przystup, K. Czuszynski, and T. Kocejko, “Self diagnostics using smart
glasses - Preliminary study,” Proc. - 2016 9th Int. Conf. Hum. Syst. Interact. HSI 2016, pp. 511–517,
2016, doi: 10.1109/HSI.2016.7529682.
A. Firouzian, P. Pulli, M. Pleva, J. Juhar, and S. Ondas, “Speech interface dialog with smart glasses,” ICETA
2017 - 15th IEEE Int. Conf. Emerg. eLearning Technol. Appl. Proc., 2017, doi:
10.1109/ICETA.2017.8102483.
A. Firouzian, Z. Asghar, J. Tervonen, P. Pulli, and G. Yamamoto, “Conceptual design and implementation
of Indicator-based Smart Glasses: A navigational device for remote assistance of senior citizens suffering
from memory loss,” Int. Symp. Med. Inf. Commun. Technol. ISMICT, vol. 2015-May, pp. 153–156,
2015, doi: 10.1109/ISMICT.2015.7107518.
B. Hofmann, D. Haustein, and L. Landeweerd, “Smart-Glasses: Exposing and Elucidating the Ethical
Issues,” Sci. Eng. Ethics, vol. 23, no. 3, pp. 701–721, 2017, doi: 10.1007/s11948-016-9792-z.
C. Y. Wang, W. C. Chu, P. T. Chiu, M. C. Hsiu, Y. H. Chiang, and M. Y. Chen, “Palm type: Using palms
as keyboards for smart glasses,” MobileHCI 2015 - Proc. 17th Int. Conf. Human-Computer Interact. with
Mob. Devices Serv., pp. 153–160, 2015, doi: 10.1145/2785830.2785886.
1. H. Le, T. Dang, and F. Liu, “Eye blink detection for smart glasses,” Proc. - 2013 IEEE Int. Symp.
Multimedia, ISM 2013, pp. 305–308, 2013, doi: 10.1109/ISM.2013.59.
2. H. Schweizer, “Smart glasses: technology and applications,” Ubiquitous Comput. Semin., pp. 1–5,
2014, [Online]. Available: https://www.vs.inf.ethz.ch/edu/UCS/reports/HermannSchweizer_SmartGlassesTechnologyApplications_report.pdf.
3. H. Zhang, “Head-mounted display-based intuitive virtual reality training system for the mining
industry,” Int. J. Min. Sci. Technol., vol. 27, no. 4, pp. 717–722, 2017, doi:
10.1016/j.ijmst.2017.05.005.
I. Belkacem, I. Pecci, and B. Martin, “Pointing task on smart glasses: Comparison of four
interaction techniques,” 2019, [Online]. Available: http://arxiv.org/abs/1905.05810.
4. J. H. Kim, S. K. Kim, T. M. Lee, Y. J. Lim, and J. Lim, “Smart glasses using deep learning and stereo
camera,” 2019 IEEE 8th Glob. Conf. Consum. Electron. GCCE 2019, vol. 2, pp. 294–295, 2019, doi:
10.1109/GCCE46687.2019.9015357.
5. J. Häkkilä, V. Vahabpour, A. Colley, J. Väyrynen, and T. Koskela, “Design probes study on user perceptions of a smart glasses concept,” ACM Int. Conf. Proceeding Ser., vol. 30-Novembe, no. Mum, pp. 223–233, 2015, doi: 10.1145/2836041.2836064.
6. J. Ham, J. Hong, Y. Jang, S. H. Ko, and W. Woo, “Poster: Wearable input device for smart glasses
based on a wristband-type motion-aware touch panel,” IEEE Symp. 3D User Interfaces 2014, 3DUI
2014 - Proc., pp. 147–148, 2014, doi: 10.1109/3DUI.2014.6798863.
7. J. Ruminski, A. Bujnowski, K. Czuszynski, and T. Kocejko, “Estimation of respiration rate using an
accelerometer and thermal camera in eGlasses,” Proc. 2016 Fed. Conf. Comput. Sci. Inf. Syst.
FedCSIS 2016, vol. 8, pp. 1431–1434, 2016, doi: 10.15439/2016F329.
8. J. Ruminski, M. Smiatacz, A. Bujnowski, A. Andrushevich, M. Biallas, and R. Kistler, “Interactions
with recognized patients using smart glasses,” Proc. - 2015 8th Int. Conf. Hum. Syst. Interact. HSI
2015, pp. 187–194, 2015, doi: 10.1109/HSI.2015.7170664.
9. K. Matsumoto, W. Nakagawa, H. Saito, M. Sugimoto, T. Shibata, and S. Yachida, “AR visualization
of thermal 3D model by hand-held cameras,” VISAPP 2015 - 10th Int. Conf. Comput. Vis. Theory
Appl. VISIGRAPP, Proc., vol. 3, pp. 480–487, 2015, doi: 10.5220/0005290904800487.
10. K. Tanaka, S. Ishimaru, K. Kise, K. Kunze, and M. Inami, “Nekoze!-Monitoring and detecting head
posture while working with laptop and mobile phone,” Proc. 2015 9th Int. Conf. Pervasive Comput.
Technol. Heal. PervasiveHealth 2015, pp. 237–240, 2015, doi:
10.4108/icst.pervasivehealth.2015.260226.
11. L. H. Lee and P. Hui, “Interaction Methods for Smart Glasses: A Survey,” IEEE Access, vol. 6, no.
February, pp. 28712–28732, 2018, doi: 10.1109/ACCESS.2018.2831081.
12. M. Kristo, M. Ivasic-Kos, and M. Pobar, “Thermal Object Detection in Difficult Weather Conditions
Using YOLO,” IEEE Access, vol. 8, pp. 125459–125476, 2020, doi:
10.1109/ACCESS.2020.3007481.
13. M. Spitzer, I. Nanic, and M. Ebner, “Distance learning and assistance using smart glasses,” Educ.
Sci., vol. 8, no. 1, 2018, doi: 10.3390/educsci8010021.
14. N. Constant, O. Douglas-Prawl, S. Johnson, and K. Mankodiya, “Pulse-Glasses: An unobtrusive,
wearable HR monitor with Internet-of-Things functionality,” 2015 IEEE 12th Int. Conf. Wearable
Implant. Body Sens. Networks, BSN 2015, 2015, doi: 10.1109/BSN.2015.7299350.
15. N. Kommera, F. Kaleem, and S. M. S. Harooni, “Smart augmented reality glasses in cybersecurity
and forensic education,” IEEE Int. Conf. Intell. Secur. Informatics Cybersecurity Big Data, ISI 2016,
pp. 279–281, 2016, doi: 10.1109/ISI.2016.7745489.
16. P. A. Rauschnabel, “Virtually enhancing the real world with holograms: An exploration of expected
gratifications of using augmented reality smart glasses,” Psychol. Mark., vol. 35, no. 8, pp. 557–572,
2018, doi: 10.1002/mar.21106.
17. P. A. Rauschnabel, A. Brem, and B. S. Ivens, “Who will buy smart glasses? Empirical results of two
pre-market-entry studies on the role of personality in individual awareness and intended adoption of
Google Glass wearables,” Comput. Human Behav., vol. 49, no. May, pp. 635–647, 2015, doi:
10.1016/j.chb.2015.03.003.
18. P. Alinia, R. Fallahzadeh, C. P. Connolly, and H. Ghasemzadeh, “ParaLabel: Autonomous Parameter
Learning for Cross-Domain Step Counting in Wearable Sensors,” IEEE Sens. J., vol. XX, no. X, pp.
1–1, 2020, doi: 10.1109/jsen.2020.3009231.
19. P. Krzyzanowski, T. Kocejko, J. Ruminski, and A. Bujnowski, “Enhanced eye-Tracking data: A dual
sensor system for smart glasses applications,” Proc. 2016 Fed. Conf. Comput. Sci. Inf. Syst. FedCSIS
2016, vol. 8, pp. 1417–1422, 2016, doi: 10.15439/2016F538.
20. R. Rzayev, P. W. Woźniak, T. Dingler, and N. Henze, “Reading on Smart glasses: The effect of text
position, presentation type and walking,” Conf. Hum. Factors Comput. Syst. - Proc., vol. 2018-April,
2018, doi: 10.1145/3173574.3173619.
21. S. Feng, W. Zheng, and H. Liu, “Demo abstract: Unobtrusive real-time shopping assistance in retail
stores using smart glasses,” 2015 12th Annu. IEEE Int. Conf. Sensing, Commun. Networking,
SECON 2015, pp. 181–183, 2015, doi: 10.1109/SAHCN.2015.7338313.
22. S. Mitrasinovic et al., “Clinical and surgical applications of smart glasses,” Technol. Heal. Care, vol.
23, no. 4, pp. 381–401, 2015, doi: 10.3233/THC-150910.
23. S. Yi, Z. Qin, E. Novak, Y. Yin, and Q. Li, “GlassGesture: Exploring head gesture interface of smart
glasses,” Proc. - IEEE INFOCOM, vol. 2016-July, 2016, doi: 10.1109/INFOCOM.2016.7524542.
24. S. Yi, Z. Qin, E. Novak, Y. Yint, and Q. Li, “GlassGesture: Exploring head gesture interface of smart
glasses,” Proc. - IEEE INFOCOM, vol. 2016-Septe, pp. 1017–1018, 2016, doi:
10.1109/INFCOMW.2016.7562233.
25. V. N. Herzog, B. Buchmeister, A. Beharic, and B. Gajsek, “Visual and optometric issues with smart
glasses in Industry 4.0 working environment,” Adv. Prod. Eng. Manag., vol. 13, no. 4, pp. 417–428,
2018, doi: 10.14743/apem2018.4.300.
26. Y. Abdelrahman, A. S. Shirazi, N. Henze, and A. Schmidt, “Investigation of material properties for
thermal imaging-based interaction,” Conf. Hum. Factors Comput. Syst. - Proc., vol. 2015-April, pp.
15–18, 2015, doi: 10.1145/2702123.2702290.
27. Y. C. Tung et al., “User-Defined game input for smart glasses in public space,” Conf. Hum. Factors
Comput. Syst. - Proc., vol. 2015-April, pp. 3327–3336, 2015, doi: 10.1145/2702123.2702214.
28. Y. H. Chen, P. C. Su, and F. T. Chien, “Air-writing for smart glasses by effective fingertip detection,”
2019 IEEE 8th Glob. Conf. Consum. Electron. GCCE 2019, pp. 381–382, 2019, doi:
10.1109/GCCE46687.2019.9015389.
29. Y. H. Li and P. J. Huang, “An Accurate and Efficient User Authentication Mechanism on Smart
Glasses Based on Iris Recognition,” Mob. Inf. Syst., vol. 2017, 2017, doi: 10.1155/2017/1281020.
30. Y. Y. Hsieh, Y. H. Wei, K. W. Chen, and J. H. Chuang, “A novel egocentric pointing system based
on smart glasses,” 2017 IEEE Vis. Commun. Image Process. VCIP 2017, vol. 2018-Janua, pp. 1–4,
2018, doi: 10.1109/VCIP.2017.8305042.