Computers: An Integrated Mobile Augmented Reality Digital Twin Monitoring System
Article
An Integrated Mobile Augmented Reality Digital Twin
Monitoring System
F. He 1, S. K. Ong 2,* and A. Y. C. Nee 2
1 NUS Graduate School for Integrative Sciences and Engineering, National University of Singapore,
Singapore 119077, Singapore; [email protected]
2 Mechanical Engineering Department, National University of Singapore, 9 Engineering Drive 1,
Singapore 117576, Singapore; [email protected]
* Correspondence: [email protected]; Tel.: +65-65162222
2. Related Works
Interacting with smart devices is not a new topic. Embedded interaction was implemented as an efficient approach early on [4], while the mobile-based approach has become increasingly popular since it can be used ubiquitously. Heun et al. [5] demonstrated a system that integrates a graphical user interface and a tangible user interface to program everyday objects into smarter objects with high flexibility. Mayer et al. [6] presented a model-based interface description scheme to control a smart device with a mobile phone. Yew et al. [7] designed a novel paradigm for interacting with smart objects via an AR interface. Most of the current work focuses on controlling smart devices and smart objects. However, the enormous amount of data amassed creates a new challenge for HCI, as it cannot be read and understood easily by novice, and sometimes even experienced, users. There is a gap to be filled in the interaction between humans and the data associated with smart devices.
DT is a powerful tool for extracting, analyzing, and displaying knowledge from device data. The concept of DT can be traced back to NASA’s Apollo project in the late 1960s, during which two identical space vehicles were built; the one on Earth (called ‘the twin’) mirrored the operating condition of the one performing the mission so that its real-time behavior could be simulated. Later, the emergence of computer-aided design (CAD) tools contributed to the rise of virtual design and simulation. Advancements in computer technology have made real-time simulation a reality, and the conceptual model of “a virtual, digital representation equivalent to a physical product”, or a “digital twin”, was introduced in 2003, which is regarded as the origin of DT [3]. Since then, DT has attracted increasing attention alongside the development of new information technologies, such as cloud and edge computing, the IoT, big data, and artificial intelligence. It shows great potential in fields such as product design, manufacturing, and production management and control [8,9]. With the increasing virtualization of resources and the adoption of cloud services, DT is seen as a pragmatic way to achieve cyber–physical fusion. This creates new opportunities for enhancing HCI with human-in-the-loop considerations [10]. However, current research on the creation of DT has focused primarily on technical features and lacks attention to the end-user experience [11].
AR, on the other hand, has proven successful in improving the understanding of digital information via natural interaction. AR systems have been researched and applied extensively in engineering analysis and simulation [12], as well as in supporting robot control [13], assembly [14–16], and manufacturing applications [17]. Beyond engineering, AR has found good applications in medicine, education, and entertainment [18]. Multiple modalities can
be implemented for improving the AR experience [19]. However, the lack of suitable hard-
ware and software usually constrains AR applications to fixed 2D screens or requires the
use of intrusive markers [20]. In recent years, AR research has seen dramatic improvements due to four main developments [21]:
1. The pervasiveness of low-cost visual sensors, such as smartphone cameras, created
the foundation for AR consumerism.
2. Developments in environmental perception algorithms, such as visual simultaneous localization and mapping (SLAM) and visual-inertial odometry (VIO), made it easier to fuse virtual content with reality [22,23].
3. The availability of consumer-level AR displays based on advances in optics and
mobile processors.
4. The maturity of multimedia techniques has enriched the content and interaction styles
of AR applications [24,25].
Even though many conceptual frameworks for DT systems have been proposed, few
were implemented. This is mainly due to the challenges in high-fidelity modelling and simulation; life-cycle data integration and fusion; and connection and interconnection [26].
Simulating real-time behavior on a high-fidelity virtual model is computationally costly.
A few studies have reported the integration of DT with AR [27–30]. Schroeder
et al. [27] developed an AR user interface (UI) for data visualization of a DT system. Rabah
et al. [28] used DT and AR for predictive maintenance. Zhu et al. [29] reported the use of
AR for customizable visualization of DT data depending on the virtual objects selected.
Liu et al. [30] used AR to provide real-time monitoring of the machining processes on a
CNC machine to the users, with data and simulation from the DT machining system. In these reported works, AR has been used mainly as a visualization tool for monitoring the physical twins. When integrating DT with AR, the rendering of visual content adds load to the computational resources. This makes it even harder to integrate AR and DT on a mobile device. To work around these technical limitations, in this research, DT data processing is
offloaded to a cloud server, and only AR rendering of the computed result is performed
on the mobile device. Utilizing DT data, AR can help users understand the operating
conditions more efficiently. When used remotely, it can reduce cost and time of physical
travel on-site for diagnosis. For a smart device that is malfunctioning, AR can provide an
intuitive alternative to traditional digital numerals on the screen. Overall, the integrated
system should allow ubiquitous retrieval of information and intuitive interaction with the
smart devices.
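The offloading split described above can be sketched as follows. The function names and the placeholder analytics are illustrative assumptions, not the system's actual API; the point is only that raw-data processing stays on the server and the device receives and renders a small computed result.

```python
# Illustrative sketch of the cloud/device split in ARDTMS-style offloading.
# Both functions and the toy analytics inside them are assumptions.

def cloud_compute(sensor_readings: list) -> dict:
    """Runs on the cloud server: heavy DT processing over raw sensor data."""
    # Stand-in for the real DT analytics: map the largest reading to a risk %.
    risk = min(100.0, max(sensor_readings) * 10.0)
    return {"risk_percent": risk, "status": "danger" if risk > 70.0 else "ok"}

def device_render(result: dict) -> str:
    """Runs on the mobile device: turns the compact computed result into an
    AR overlay label; no raw-data processing happens on the device."""
    return f"Risk: {result['risk_percent']:.0f}% ({result['status']})"
```

In the deployed system, the two functions would sit on opposite sides of a network boundary (e.g., an HTTPS endpoint on the server), with only the compact result dictionary crossing it.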
3. System Framework
Figure 1 presents the system architecture for ARDTMS, where a user interacts with the
DT through an AR user interface; the solid lines are data and information flows, and the
dashed lines are interaction and visualization via the AR interface. As the programming of
both DT and AR applications presumes a deep knowledge of the underlying technology,
the goal of such a framework is to simplify the development of the integrated system and
focus on defining the interfaces across each module. As shown in Figure 1, the DT passes
the learned knowledge onto the AR device, which will present the information, and a
user interacts through the same AR interface to control the physical operation and virtual
simulation. The framework is formulated based on the implementation of a model tower
crane case study, but it can be adapted for an industrial machine by replacing the input
data, digital model, as well as the computing engines.
and can mirror its status in the virtual space to display the real-time changes of the
equipment under monitoring. The “live” model can also help users comprehend
operating conditions based on knowledge acquired from the historical data. Various
computation processes can be implemented in the virtual space to integrate advanced
functions, such as energy consumption analysis and inventory management. The
computed results will be used for decision support and event identification that can be
visualized using the AR user interface.
the user respond to disruptions in time. In the following sections, the method to apply the
elements in the ARDTMS framework for general smart devices will be discussed in detail.
needs to be filtered to fill missing values, remove duplicates and garbage, and format data entries. This is a prerequisite for deriving comprehensive knowledge from a large quantity of dynamic and ambiguous raw data.
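The filtering steps just listed (filling missing values, removing duplicates and garbage, and formatting entries) can be sketched with pandas; the column names below are hypothetical stand-ins for the actual sensor log schema.

```python
import pandas as pd

def preprocess(raw: pd.DataFrame) -> pd.DataFrame:
    """Clean raw sensor logs before they enter the learning engine."""
    df = raw.copy()
    # Remove exact duplicate records (e.g., repeated transmissions).
    df = df.drop_duplicates()
    # Fill short gaps in sensor readings with the last known value.
    df = df.ffill()
    # Discard garbage rows that are still incomplete after filling.
    df = df.dropna()
    # Format entries consistently, e.g., parse timestamps.
    df["timestamp"] = pd.to_datetime(df["timestamp"])
    return df.reset_index(drop=True)
```

Real deployments typically add device-specific range checks (discarding physically impossible readings) on top of this generic pipeline.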
prolonged monitoring and assistive tasks. Development tools such as ARKit, ARCore, and Vuforia are the most popular due to the growing number of Android and iOS developers [36]. Although it is only available on the iOS platform, ARKit supports marker-less plane tracking, 3D object tracking, as well as multiuser shared experiences. Its tracking capability is also more reliable due to better software–hardware calibration on iOS devices, which are more standardized than the vast variety of Android devices. Thus, ARKit is chosen for the
prototype of ARDTMS.
4. Case Study
ARDTMS is demonstrated through the monitoring of the operating status of a scaled-down prototype of an industrial tower crane. The implementation process helps to
investigate the benefits, challenges, and limitations of the proposed system in detail. A user
study is conducted to evaluate the advantages and effectiveness of the proposed system
against the live data dashboard. The user study aims to verify whether the integrated sys-
tem yields measurable improvement during the HCI process for users who are unfamiliar
with a particular process.
One of the greatest challenges in industrial tower crane operations is to understand
the wind load. In the prototype, the tower crane will operate in a laboratory environment
under different simulated wind conditions. This aims to simulate the situation that a crane
operator often faces. Although there are general safety guidelines based on measured
wind speed from anemometers, the varying characteristics of the load, the boom, and
even the site can be hard to gauge. In addition, nonlinearity exists in the system, and its dynamics are coupled with the random movements caused by gusts of wind, making it difficult to develop a precise model to describe the system. This makes a model-based approach, which requires structural parameters such as the elastic modulus, density, cross-sectional area, and mass of all parts of the tower crane, difficult to achieve. Although no
precise mathematical model is available, a data-driven method can help with the operation
monitoring process.
Normally, a crane operator needs to judge whether a lifting task can proceed based on his/her own experience and knowledge. A comprehensive live data dashboard is used, from which operating trends can be read to identify anomalies.
However, this can be difficult for novice operators, which could result in fatal accidents. This case study therefore aims to help novice operators understand the current operating conditions and avoid dangerous operations, using real-time data and knowledge learned from historical data as well as from previous operators, presented through the AR interface. Figure 2 shows the overview of the implementation of the monitoring system.
as it does not contribute to the learning engine. After preprocessing, the training data comprises 301 rows of data recorded in the healthy state.
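As one illustrative, deliberately simple form such a learning engine could take — not necessarily the model actually trained here — a similarity-based score can compare each new reading against the healthy-state rows and report how far it has drifted:

```python
import numpy as np

def risk_score(sample: np.ndarray, healthy: np.ndarray) -> float:
    """Map the distance from a new reading to its nearest healthy-state
    neighbour onto a 0-100% risk score (illustrative sketch only)."""
    # Scale each feature by the spread of the healthy data so that no
    # single sensor dominates the distance.
    scale = healthy.std(axis=0) + 1e-9
    d = np.linalg.norm((healthy - sample) / scale, axis=1).min()
    # Squash the distance into [0, 100): 0 at a perfect match, approaching
    # 100 as the sample drifts far from every healthy reading.
    return float(100.0 * (1.0 - np.exp(-d)))
```

A reading identical to a healthy row scores 0%, while readings far outside the healthy envelope approach 100%, which matches the percentage risk levels used by the monitoring logic below.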
Figure 6. Real-time dashboard showing data over the last 3 min; (left): proximity sensor readings; (right): pressure sensor readings.
Figure 7. The mobile user interface and side-by-side comparison of DT and the physical tower crane.
Using the proposed system, when the user is in a remote location, he/she can access
the DT model in his/her current environment to help diagnose remotely before on-site
action is taken. When the computation engine receives a set of sensor readings that
produces a risk score higher than 50%, a warning alert is sent to the technician’s personal
device. The risk level is shown through the UI panel, and the system continues to process
the incoming data. If the risk increases and exceeds 70%, the system determines that the
condition is dangerous and will suspend all control functions to prevent any potential
damage.
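The two thresholds translate directly into a simple policy. The sketch below assumes hypothetical callback names (`send_alert`, `suspend_controls`) standing in for the system's actual alerting and control interfaces:

```python
WARN_THRESHOLD = 50.0    # % risk: push a warning alert to the technician
DANGER_THRESHOLD = 70.0  # % risk: suspend all control functions

def handle_risk(risk: float, send_alert, suspend_controls) -> str:
    """Apply the two-level policy described above. The callback
    parameters are placeholders, not the system's actual API."""
    if risk > DANGER_THRESHOLD:
        # Dangerous condition: prevent any potential damage.
        suspend_controls()
        return "suspended"
    if risk > WARN_THRESHOLD:
        # Elevated risk: alert the technician's personal device,
        # but keep processing incoming data.
        send_alert(f"Risk level {risk:.0f}% exceeds warning threshold")
        return "warned"
    return "normal"
```

Because the risk score is recomputed for every incoming set of sensor readings, the policy naturally escalates from warning to suspension as the condition worsens.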
5. User Study
A user study was conducted to validate whether the integrated system yields measurable improvement during the human–device interaction process for users who are unfamiliar with a particular process. This was achieved through a qualitative and quantitative evaluation of the advantages and effectiveness of providing assistance to the users through the proposed system as compared with the traditional live data dashboard. In the current procedure, a remote operator would monitor the tower crane operation via the variation of the data displayed in the live data dashboard. However, it is difficult for a novice user without extensive knowledge of tower crane operation to identify anomalies by looking at the data. To eliminate this difficulty, ARDTMS takes over the data comprehension based on the historical knowledge. A total of 20 students from the mechanical and civil engineering disciplines were invited to participate in the study. They were divided evenly into a control group (A) and an experimental group (B). Each individual from the control group was paired with another from the experimental group. The former monitored through the live data dashboard while the latter monitored through the AR interface simultaneously. A disruptive event was introduced during the monitoring process, and the users were expected to respond to the event.
The following hypotheses are made prior to conducting the study:
1. Users using ARDTMS will respond faster to events as compared to users using the live data dashboard, as they have assistance from a single, holistic view that focuses the user's attention;
2. Users using ARDTMS will perform more consistently, at a similar level to one another, than users using the live data dashboard, as they are assisted by intuitive AR stimuli;
3. The ARDTMS system can reduce user errors as the historical knowledge that is taken
into account to provide AR assistance is more reliable than the primitive personal
judgement.
means “most negative”. The mean value of the ratings is used to compare each aspect of
the proposed solution to the traditional dashboard.
Pair | Mobility (A / B) | Intuitiveness (A / B) | Efficiency (A / B) | Satisfaction (A / B) | Response Time in s (A / B)
1 | 2 / 5 | 1 / 5 | 3 / 4 | 2 / 5 | 6.6 / 4.1
2 | 5 / 4 | 2 / 4 | 4 / 5 | 3 / 4 | 8.6 / 4.8
3 | 5 / 5 | 4 / 2 | 4 / 4 | 4 / 4 | 9.0 / 5.0
4 | 5 / 4 | 3 / 4 | 3 / 4 | 3 / 4 | 7.38 / 5.2
5 | 2 / 4 | 3 / 5 | 3 / 4 | 4 / 4 | Inf / 7.0
6 | 2 / 4 | 2 / 4 | 1 / 3 | 2 / 4 | Inf / 5.0
7 | 4 / 5 | 4 / 5 | 5 / 5 | 4 / 5 | Inf / 5.5
8 | 5 / 5 | 5 / 5 | 5 / 5 | 5 / 5 | 6.6 / 4.7
9 | 4 / 4 | 3 / 5 | 4 / 5 | 3 / 4 | 2.0 / 4.7
10 | 5 / 4 | 4 / 3 | 5 / 4 | 5 / 5 | 9.0 / 4.7
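From the response-time columns above, the group means can be computed. The snippet below is an illustrative reading of the table in which Inf (a missed event) is simply excluded from Group A's mean — a choice that, if anything, understates the gap between the groups:

```python
import math

# Response times (s) transcribed from the table; Inf marks a missed event.
resp_a = [6.6, 8.6, 9.0, 7.38, math.inf, math.inf, math.inf, 6.6, 2.0, 9.0]
resp_b = [4.1, 4.8, 5.0, 5.2, 7.0, 5.0, 5.5, 4.7, 4.7, 4.7]

def mean_finite(xs):
    """Mean over finite values only; missed events are excluded."""
    finite = [x for x in xs if math.isfinite(x)]
    return sum(finite) / len(finite)

mean_a = mean_finite(resp_a)  # dashboard group, misses excluded
mean_b = mean_finite(resp_b)  # ARDTMS group
```

Even with Group A's three misses excluded, its mean response time (about 7.0 s) remains well above Group B's (about 5.1 s), and every ARDTMS user responded to the event.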
Lastly, the ARDTMS users were asked to rate the remote control function via the
mobile interface using the same four dimensions. Through ARDTMS, users were able to
control the tower crane operation in real-time from their mobile device; they could also
view the motion through the AR model, which fully reflects the operation of the physical
tower crane. The ratings show positive scores (above 3) in all aspects, including mobility, intuitiveness, efficiency, and satisfaction, for the remote control function. This is because the
immersive visualization provides immediate feedback that can help users take appropriate
actions based on the monitoring result.
6. Conclusions
This paper has described a framework for integrating AR and DT to improve HCI with smart devices by allowing remote monitoring and control, extracting knowledge from data, and presenting it in AR. The prototype ARDTMS was demonstrated in a preliminary case study on a
tower crane. Through the user study, all three hypotheses were verified. Additionally, it is
encouraging to observe the significant statistical separation between the mean response
time for users of both systems. The qualitative results show a preference among novice users for ARDTMS as the monitoring system. The study highlights some capabilities
which can be translated into HCI with other smart devices:
1. Utilizing real-time data and historical data to provide assistive knowledge that improves the performance of novice users;
2. Visualizing real-time results in an AR format, which is intuitive and easy to understand;
3. Being implementable with commercially available hardware (mobile phones) and software (Unity3D, Microsoft Azure, ARKit), so that the system can be built for every user cost-effectively.
In the case study, the machine learning model showcases one possible way of using
DT data. The framework is compatible with more advanced algorithms. As an example,
if structural analysis is available for the DT, a finite element model can be integrated. As
a conceptual example, ARFEA presented a real-time simulation method based on a quasi-static approximation that updates the FEA result only in the region where the load is added, based on data from the force sensor [41]. To integrate it into ARDTMS, the same computation algorithm can be preserved and transferred from a local computer to a cloud server. Only the rendered result needs to be transferred to the mobile device, and the interaction scheme should be redesigned to fit the point-and-click interaction of a smartphone.
The scope of the case study is limited by the availability of data. In this work, only
one device is studied, while multi-device interaction will have more complex effects in an
actual application. Further development can be performed by incorporating data from
a production shop floor. Nevertheless, the approach of utilizing real-time and historical
data to simulate and visualize device data using AR and DT is applicable to various smart devices.
Data security is another issue overlooked in this study. Currently, the transmitted data are not encrypted. Additionally, as the XML data and the JSON database are shared by multiple users, there is a higher risk of losing proprietary information if any of the personal devices is lost or stolen. This can endanger the safety of operations and business performance if the data fall into the wrong hands. It is suggested that an in-house or outsourced information security officer be involved when implementing the system for any business entity.
Ultimately, the framework of ARDTMS allows customized utilization and visualiza-
tion of DT data to improve a user’s decision-making process and take further actions. The
user or the service professional can investigate and decide what insight is important and
which analytics technique to integrate based on the application. In this way, ARDTMS
opens new opportunities to harness the power of DT data and AR interface to enhance
HCI for users to work smarter, more efficiently, and more effectively.
Author Contributions: Conceptualization, F.H., S.K.O. and A.Y.C.N.; methodology, F.H.; software,
F.H.; writing—original draft preparation, F.H.; writing—review and editing, F.H., S.K.O. and A.Y.C.N.;
supervision, S.K.O. and A.Y.C.N. All authors have read and agreed to the published version of the
manuscript.
Funding: This research received no external funding.
Institutional Review Board Statement: Not applicable.
Informed Consent Statement: Not applicable.
Data Availability Statement: Data is contained within the article.
Conflicts of Interest: The authors declare no conflict of interest.
References
1. Brown, M.; Coughlan, T.; Lawson, G.; Goulden, M.; Houghton, R.J.; Mortier, R. Exploring Interpretations of Data from the Internet
of Things in the Home. Interact. Comput. 2013, 25, 204–217. [CrossRef]
2. Kang, H.S.; Lee, J.Y.; Choi, S.; Kim, H.; Park, J.H.; Son, J.Y.; Kim, B.H.; Noh, S.D. Smart manufacturing: Past research, present
findings, and future directions. Int. J. Precis. Eng. Manuf. Green Technol. 2016, 3, 111–128. [CrossRef]
3. Grieves, M. Digital Twin: Manufacturing Excellence through Virtual Factory Replication. 2014. Available online:
http://www.themanufacturinginstitute.org/~/media/E323C4D8F75A470E8C96D7A07F0A14FB/DI_2018_Deloitte_MFI_
skills_gap_FoW_study (accessed on 25 June 2021).
4. Kranz, M.; Holleis, P.; Schmidt, A. Embedded interaction: Interacting with the internet of things. IEEE Internet Comput. 2009, 2,
46–53. [CrossRef]
5. Heun, V.; Kasahara, S.; Maes, P. Smarter objects: Using AR technology to program physical objects and their interactions. In
Proceedings of the CHI’13 Extended Abstracts on Human Factors in Computing Systems, Paris, France, 27 April–2 May 2013; pp.
961–966.
6. Mayer, S.; Tschofen, A.; Dey, A.K.; Mattern, F. User interfaces for smart things—A generative approach with semantic interaction
descriptions. ACM Trans. Comput. Hum. Interact. 2014, 21, 12. [CrossRef]
7. Yew, A.W.W.; Ong, S.K.; Nee, A.Y.C. Augmented Reality Interfaces for Smart Objects in Ubiquitous Computing Environments. In
Human-Computer Interfaces and Interactivity: Emergent Research and Applications; Isaias, P., Blashki, K., Eds.; Information Science
Reference: Hershey, PA, USA, 2014; pp. 208–229.
8. Tao, F.; Cheng, J.; Qi, Q.; Zhang, M.; Zhang, H.; Sui, F. Digital twin-driven product design, manufacturing and service with big
data. Int. J. Adv. Manuf. Technol. 2018, 94, 3563–3576. [CrossRef]
9. Zhuang, C.; Liu, J.; Xiong, H. Digital twin-based smart production management and control framework for the complex product
assembly shop-floor. Int. J. Adv. Manuf. Technol. 2018, 96, 1149–1163. [CrossRef]
10. Schirner, G.; Erdogmus, D.; Chowdhury, K.; Padir, T. The future of human-in-the-loop cyber-physical systems. Computer 2013, 1,
36–45. [CrossRef]
11. Ardito, C.; Buono, P.; Desolda, G.; Matera, M. From smart objects to smart experiences: An end-user development approach. Int.
J. Hum. Comput. Stud. 2018, 114, 51–68. [CrossRef]
12. Li, W.; Nee, A.Y.C.; Ong, S.K. A state-of-the-art review of augmented reality in engineering analysis and simulation. Multimodal
Technol. Interact. 2017, 1, 17. [CrossRef]
13. Novak-Marcincin, J.; Barna, J.; Janak, M.; Novakova-Marcincinova, L. Augmented reality aided manufacturing. Procedia Comput.
Sci. 2013, 25, 23–31. [CrossRef]
14. Radkowski, R.; Herrema, J.; Oliver, J. Augmented reality-based manual assembly support with visual features for different
degrees of difficulty. Int. J. Hum. Comput. Interact. 2015, 31, 337–349. [CrossRef]
15. Radkowski, R. Object tracking with a range camera for augmented reality assembly assistance. J. Comput. Inf. Sci. Eng. 2016, 16,
011004. [CrossRef]
16. Wang, X.; Ong, S.K.; Nee, A.Y.C. A comprehensive survey of augmented reality assembly research. Adv. Manuf. 2016, 4, 1–22.
[CrossRef]
17. Nee, A.Y.C.; Ong, S.K.; Chryssolouris, G.; Mourtzis, D. Augmented reality applications in design and manufacturing. CIRP Ann.
2012, 61, 657–679. [CrossRef]
18. Van Krevelen, D.W.F.; Poelman, R. A survey of augmented reality technologies, applications and limitations. Int. J. Virtual Real.
2010, 9, 1–20. [CrossRef]
19. Vazquez-Alvarez, Y.; Aylett, M.P.; Brewster, S.A.; Jungenfeld, R.V.; Virolainen, A. Designing interactions with multilevel auditory
displays in mobile audio-augmented reality. ACM Trans. Comput. Hum. Interact. 2016, 23, 3. [CrossRef]
20. Fiorentino, M.; Uva, A.E.; Gattullo, M.; Debernardis, S.; Monno, G. Augmented reality on large screen for interactive maintenance
instructions. Comput. Ind. 2014, 65, 270–278. [CrossRef]
21. Ling, H. Augmented reality in reality. IEEE MultiMedia 2017, 24, 10–15. [CrossRef]
22. Li, P.; Qin, T.; Hu, B.; Zhu, F.; Shen, S. Monocular visual-inertial state estimation for mobile augmented reality. In Proceedings
of the 2017 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Nantes, France, 9–13 October 2017;
pp. 11–21.
23. Mur-Artal, R.; Montiel, J.M.M.; Tardos, J.D. ORB-SLAM: A versatile and accurate monocular SLAM system. IEEE Trans. Robot.
2015, 31, 1147–1163. [CrossRef]
24. He, F.; Ong, S.K.; Nee, A.Y.C. A Mobile Solution for Augmenting a Manufacturing Environment with User-Generated Annotations.
Information 2019, 10, 60. [CrossRef]
25. Özacar, K.; Hincapié-Ramos, J.D.; Takashima, K.; Kitamura, Y. 3D Selection Techniques for Mobile Augmented Reality Head-
Mounted Displays. Interact. Comput. 2016, 29, 579–591. [CrossRef]
26. Tao, F.; Zhang, M.; Nee, A.Y.C. Digital Twin Driven Smart Manufacturing; Academic Press: Cambridge, MA, USA, 2019.
27. Schroeder, G.; Steinmetz, C.; Pereira, C.E.; Muller, I.; Garcia, N.; Espindola, D.; Rodrigues, R. Visualising the digital twin using
web services and augmented reality. In Proceedings of the 2016 IEEE 14th International Conference on Industrial Informatics,
Poitiers, France, 19–21 July 2016; pp. 522–527.
28. Rabah, S.; Assila, A.; Khouri, E.; Maier, F.; Ababsa, F.; Bourny, V.; Maier, P.; Merienne, F. Towards improving the future of
manufacturing through digital twin and augmented reality technologies. Procedia Manuf. 2018, 17, 460–467. [CrossRef]
29. Zhu, Z.; Liu, C.; Xu, X. Visualisation of the Digital Twin data in manufacturing by using Augmented Reality. Procedia CIRP 2019,
81, 898–903. [CrossRef]
30. Liu, S.; Lu, S.; Li, J.; Sun, X.; Lu, Y.; Bao, J. Machining process-oriented monitoring method based on digital twin via augmented
reality. Int. J. Adv. Manuf. Technol. 2021, 113, 3491–3508. [CrossRef]
31. Gaj, P.; Jasperneite, J.; Felser, M. Computer communication within industrial distributed environment—A survey. IEEE Trans. Ind.
Inform. 2013, 9, 182–189. [CrossRef]
32. Varsaluoma, J.; Väätäjä, H.; Heimonen, T.; Tiitinen, K.; Hakulinen, J.; Turunen, M.; Nieminen, H. Usage Data Analytics for
Human-Machine Interactions with Flexible Manufacturing Systems: Opportunities and Challenges. In Proceedings of the 2017
21st International Conference Information Visualisation (IV), London, UK, 11–14 July 2017; pp. 427–434.
33. Salomon, D. Data Compression: The Complete Reference; Springer: London, UK, 2004.
34. Gandomi, A.; Haider, M. Beyond the hype: Big data concepts, methods, and analytics. Int. J. Inf. Manag. 2015, 35, 137–144.
[CrossRef]
35. Söderberg, R.; Wärmefjord, K.; Carlson, J.S.; Lindkvist, L. Towards a Digital Twin for real-time geometry assurance in individual-
ized production. CIRP Ann. 2017, 66, 137–140. [CrossRef]
36. Linowes, J.; Babilinski, K. Augmented Reality for Developers: Build Practical Augmented Reality Applications with Unity,
ARCore, ARKit, and Vuforia. 2014. Available online: https://www.packtpub.com/web-development/augmented-reality-
developers (accessed on 25 June 2021).
37. Chen, Y.; Garcia, E.K.; Gupta, M.R.; Rahimi, A.; Cazzanti, L. Similarity-based classification: Concepts and algorithms. J. Mach.
Learn. Res. 2009, 10, 747–776.
38. Hornbæk, K. Current practice in measuring usability: Challenges to usability studies and research. Int. J. Hum. Comput. Stud.
2006, 64, 79–102. [CrossRef]
39. Sauro, J.; Lewis, J.R. Quantifying the User Experience: Practical Statistics for User Research; Morgan Kaufmann: Cambridge, MA,
USA, 2016.
40. Nielsen, J. Usability Engineering; Morgan Kaufmann: San Francisco, CA, USA, 1994.
41. Huang, J.M.; Ong, S.K.; Nee, A.Y.C. Real-time finite element structural analysis in augmented reality. Adv. Eng. Softw. 2015, 87,
43–56. [CrossRef]