Enhancing Multi-Modal Perception and Interaction: An Augmented Reality Visualization System for Complex Decision Making
Abstract
1. Introduction
2. Literature Review
2.1. Visualization Systems
2.2. Augmented Reality Technology
2.3. Virtual User Interfaces
3. Materials and Methods
3.1. System Framework
- Member management module: This module covers system tutorials, system experiments, data recording, and data processing and analysis. Through it, participants familiarize themselves with the augmented reality system, conduct interactive experiments, and have their data recorded in real time. The module provides the foundation for analyzing participants' behavior and attention distribution, helping to ensure the accuracy and reliability of experimental results.
- Augmented reality interface module: This module provides researchers with a user-friendly and reliable platform for conducting experiments and refining AR experiences. It uses Unity, the HoloLens headset, the Vuforia platform, and the Mixed Reality Toolkit (MRTK) to create immersive AR scenes, seamlessly integrating virtual objects into real environments and supporting device-locomotion-based tracking of virtual content, recognition of specific images, and multiple interaction modalities. This integration establishes a unified framework that enhances the cohesion and functionality of the overall system.
- User behavior interaction module: This module enables users to interact with the augmented reality environment through various input methods, including voice commands, gestures, and user interfaces. It provides a flexible and intuitive way for users to manipulate virtual objects and navigate the system.
- Eye-tracking data acquisition module: This module stores aggregated gaze data locally, providing the spatial location and timing information needed for subsequent statistical analyses (a minimal logging sketch follows this list). The resulting platform gives researchers accurate, convenient access to users' visual behavior patterns and to interface design issues in virtual environments.
- AR experiment process management module: This module ensures the smooth execution and management of augmented reality experiments. It provides tools for designing and conducting experiments, collecting data, and managing experimental workflows, improving the efficiency and accuracy of experiments and easing researchers' day-to-day work.
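To make the eye-tracking bullet concrete, the following is a minimal sketch of how gaze samples could be persisted locally for later analysis. It is not the authors' implementation: the record layout and the names `GazeSample`, `log_gaze_sample`, and `gaze_log.csv` are assumptions introduced here for illustration.

```python
# Minimal sketch (not the authors' implementation) of local gaze logging.
# The record layout and all names below are assumptions for illustration.
import csv
import time
from dataclasses import astuple, dataclass

@dataclass
class GazeSample:
    timestamp: float  # seconds since experiment start
    x: float          # gaze hit point in scene coordinates (m)
    y: float
    z: float
    target: str       # virtual object currently gazed at, "" if none

def log_gaze_sample(sample: GazeSample, path: str = "gaze_log.csv") -> None:
    """Append one gaze sample as a CSV row."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(astuple(sample))

# Example: record a fixation on a hypothetical settings panel.
log_gaze_sample(GazeSample(time.monotonic(), 0.12, 1.45, 2.30, "SettingsPanel"))
```

Appending one CSV row per sample keeps the logger stateless, so an interrupted session loses at most the row being written.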
3.2. Member Management Module
- System tutorial: Before the visualization experiment, participants first complete a system tutorial. Through it, they gain a detailed understanding of the operational steps and processes of the augmented reality experimental environment on the HoloLens device. The tutorial provides the guidance participants need to become familiar with the system's functionality and interaction methods and to use the system correctly in the subsequent experiments.
- System experiment: After completing the tutorial, participants enter the formal experiment phase, interacting with the augmented reality scenario through actions such as clicking, gazing, and voice commands. The experiment design lets participants freely explore the system's features and characteristics while data are collected throughout.
- Data recording: During the experiment, the system records participants' data in real time, including system interaction videos, eye gaze coordinates, gaze durations, and eye-gaze trajectory heatmaps. Accurate recording of participants' behavior and attention focus provides the necessary foundation for subsequent data analysis.
- Data processing and analysis: After the experiment, each participant's data are processed and analyzed, including experiment replays, analysis of gaze data, and plotting of scatter diagrams representing participants' eye-gaze ranges (a minimal sketch follows this list). Statistical analysis and visualization techniques reveal participants' behavior patterns and attention distributions during the experiment, supporting further analysis and conclusions.
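The sketch below illustrates this processing step. It is a minimal sketch under stated assumptions, not the authors' pipeline: it reuses the hypothetical CSV layout (timestamp, x, y, z, target) from the logging example in Section 3.1, and `load_gaze_log` and the plotting choices are introduced here for illustration.

```python
# Minimal sketch (not the authors' pipeline) of post-experiment gaze analysis,
# assuming the hypothetical CSV layout from the logging sketch in Section 3.1.
import csv
import matplotlib.pyplot as plt

def load_gaze_log(path: str = "gaze_log.csv"):
    """Read logged gaze samples back into typed tuples."""
    with open(path, newline="") as f:
        return [(float(t), float(x), float(y), float(z), tgt)
                for t, x, y, z, tgt in csv.reader(f)]

samples = load_gaze_log()
xs = [s[1] for s in samples]
ys = [s[2] for s in samples]

# Scatter diagram of one participant's eye-gaze range in the scene's x-y plane.
plt.scatter(xs, ys, s=4, alpha=0.4)
plt.xlabel("x (m)")
plt.ylabel("y (m)")
plt.title("Participant gaze positions")
plt.savefig("gaze_scatter.png", dpi=200)
```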
3.3. Augmented Reality Interface Module
3.4. User Behavior Interaction Module
3.5. Eye-Tracking Data Acquisition Module
3.5.1. Data Collection Methods
3.5.2. Data Processing Methods
3.6. AR Experiment Process Management Module
4. Results
4.1. AR Interactive Experiment
4.1.1. Interactive Experiment Design
4.1.2. Experiment Results
4.2. Eye-Tracking Experiment
4.2.1. Eye-Tracking Experiment Design
4.2.2. Experiment Results
5. Discussion
5.1. Analysis of System Advantages and Usability Experiment Results
5.2. System Modules’ Usability Analysis
5.3. Multi-Modal User Data Acquisition Method
5.4. System Limitations and Future Expansion Points
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
Abbreviation | Definition |
---|---|
AR | Augmented reality |
UI | User interface |
TUIs | Tangible user interfaces |
VR | Virtual reality |
GUIs | Graphical user interfaces |
HMDs | Head-mounted displays |
MRTK | Mixed Reality Toolkit |
Scene | Description |
---|---|
Privacy Scene | Participants were asked to imagine a privacy-sensitive scenario and to configure the corresponding settings to minimize the risk of privacy exposure. |
Leaving Scene | Participants were asked to imagine leaving home for work and to set up energy-saving, home-cleaning, and house-safety functions. |
Parlor Scene | Participants were asked to imagine having friends over as guests, to provide a well-lit and comfortable environment, and to make the corresponding settings while chatting at ease. |
Sleeping Scene | Participants were asked to imagine preparing for sleep at night and needing a quiet environment, and to make the corresponding settings without exposing their privacy. |
Scene | Click Counts (M) | Click Counts (SD) | Time Consumed, s (M) | Time Consumed, s (SD) | p | r |
---|---|---|---|---|---|---|
Privacy Scene | 34.08 | 17.04 | 89.20 | 53.69 | <0.0001 | 0.78 |
Leaving Scene | 13.72 | 7.84 | 56.24 | 37.66 | <0.0001 | 0.83 |
Parlor Scene | 12.12 | 8.27 | 58.40 | 39.46 | <0.0001 | 0.71 |
Sleeping Scene | 13.28 | 7.93 | 54.64 | 39.61 | <0.0001 | 0.56 |
Total | 73.20 | 41.08 | 258.48 | 170.42 | <0.0001 | 0.73 |
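The r column alongside p is consistent with a per-scene correlation between participants' click counts and completion times. Assuming Pearson's r (the table does not name the statistic), one row could be computed as in the sketch below; `clicks` and `times` are hypothetical placeholder arrays, not the study's data.

```python
# Minimal sketch of how one table row could be computed, assuming the r
# column is Pearson's correlation between clicks and completion time.
# The arrays are placeholders, not the study's data.
import numpy as np
from scipy import stats

clicks = np.array([28, 41, 19, 52, 33])             # clicks per participant (placeholder)
times = np.array([71.2, 102.5, 48.9, 140.3, 88.0])  # seconds per participant (placeholder)

r, p = stats.pearsonr(clicks, times)
print(f"Clicks: M = {clicks.mean():.2f}, SD = {clicks.std(ddof=1):.2f}")
print(f"Time:   M = {times.mean():.2f} s, SD = {times.std(ddof=1):.2f} s")
print(f"Pearson r = {r:.2f}, p = {p:.4g}")
```

The printed means and sample standard deviations correspond to the M and SD columns of the table.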