
International Journal of Human–Computer Interaction

ISSN: 1044-7318 (Print) 1532-7590 (Online) Journal homepage: https://www.tandfonline.com/loi/hihc20

A Systematic Review of a Virtual Reality System from the Perspective of User Experience

Yong Min Kim, Ilsun Rhiu & Myung Hwan Yun

To cite this article: Yong Min Kim, Ilsun Rhiu & Myung Hwan Yun (2020) A Systematic Review
of a Virtual Reality System from the Perspective of User Experience, International Journal of
Human–Computer Interaction, 36:10, 893-910, DOI: 10.1080/10447318.2019.1699746

To link to this article: https://doi.org/10.1080/10447318.2019.1699746

Published online: 13 Dec 2019.



A Systematic Review of a Virtual Reality System from the Perspective of User Experience

Yong Min Kim (a), Ilsun Rhiu (b), and Myung Hwan Yun (a)

(a) Department of Industrial Engineering and Institute for Industrial System Innovation, Seoul National University, Seoul, South Korea; (b) Division of Big Data and Management Engineering, Hoseo University, Asan, South Korea

ABSTRACT
Virtual reality (VR) is receiving enough attention, in both industry and academia, to be considered as entering a revival age. Since VR systems involve various types of interaction with users, and new types of interaction are constantly being developed, studies investigating the user experience (UX) of VR systems are continuously needed. However, there is still a lack of research on a taxonomy that makes the main characteristics of a VR system recognizable at a glance by reflecting the factors that influence UX. Therefore, we collected and reviewed research related to the UX evaluation of VR systems in order to identify the current research status and to suggest future research directions. To achieve this, a systematic review of UX studies on VR was conducted, and taxonomies of the VR system that include the influencing factors of UX were proposed. A total of 393 unique articles were collected, and 65 articles were selected for review via the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) methodology. The selected articles were analyzed according to the predefined taxonomies. As a result, the current status of research is identified based on the proposed taxonomies. In addition, issues related to VR devices and technology and to research methods are discussed to suggest future research directions.

1. Introduction

Although the concept of virtual reality (VR) has been described using various definitions and discussed for a long time, there are some common properties among the definitions. The common properties of VR are a computer-generated digital environment, interaction, and immersion (Jayaram, Connacher, & Lyons, 1997; Jerald, 2015; Pantelidis, 1993; Pratt, Zyda, & Kelleher, 1995). In other words, the meaning of VR is not limited to an artificial space synthesized in a computer environment. VR is a computer environment in which a user can interact with system components, obtaining a sense of immersion. VR systems are being implemented in a conventional personal computer (PC) environment; on a head-mounted display (HMD) platform, which has been commercialized recently; or in a cave automatic virtual environment (CAVE), which is a wall-sized platform surrounding a user.

Today, VR is receiving enough attention to be considered to be in a revival age. In fact, VR is not a field that has suddenly emerged. The first HMD was developed in the 1960s and introduced as an ultimate display (Sutherland, 1965). Since then, technologies related to the implementation of VR systems have continued to evolve. In particular, the revival of VR accelerated as low-priced immersive HMD-based VRs became commercially available to the public (Wang & Lindeman, 2015). Representative examples of personal HMD VR include the Oculus Rift, HTC Vive, and Sony PlayStation VR, which provide a high degree of immersion by providing a wide field of view and high resolution. In addition, advanced tracking technology with low latency and high accuracy is being implemented and developed. Compared to these HMDs, more affordable HMDs, such as the Samsung Gear or Google Cardboard VR, are also available.

VR systems have applications not only in the field of entertainment but also in various other fields such as medicine, rehabilitation, education, engineering, and the military. Howard (2017) reviewed previous studies using VR-based rehabilitation programs and found these programs to be more effective than traditional ones. More specifically, VR-based rehabilitation has been found to be more effective for motor control of people with stroke (Henderson, Korner-Bitensky, & Levin, 2007; Saposnik, Levin, & Group, 2011) and cerebral palsy (Reid, 2002). Experimental evidence of learning effectiveness in education has also been demonstrated. For example, the use of VR simulators has improved the orthopedic technological skills of surgeons (Aïm, Lonjon, Hannouche, & Nizard, 2016). The VR system can also help children recognize pedestrian safety and improve their crossing behavior (McComas, MacKay, & Pivik, 2002). In addition, Lau and Lee (2015) showed that a VR-based learning platform provides students with a positive learning experience. In the military field as well, VR technology has been successfully utilized (Lele, 2013).


For example, the VR system has been effectively adopted for simulated training (Bhagat, Liou, & Chang, 2016; Koźlak, Kurzeja, & Nawrat, 2013) or for treating anxiety disorders and posttraumatic stress disorder (Botella, Serrano, Baños, & Garcia-Palacios, 2015; Pallavicini, Argenton, Toniazzi, Aceti, & Mantovani, 2016). As such, the effectiveness of VR applications has been recognized in various industries, and thus this area has high prospects.

Studies on systems that actively interact with users (e.g., PC and smartphone) should be conducted from the perspectives of human–computer interaction (HCI) and user experience (UX). From these perspectives, the VR system also requires UX research in the following three aspects. First, the VR system is constructed through a combination of various components, and the interactions performed in the system vary. These can be specified as contextual components (e.g., users, devices, and interactions), which can influence UX (Forlizzi & Battarbee, 2004; Hassenzahl & Tractinsky, 2006). In particular, the VR system has many device combinations and interaction methods that can be adopted. For example, an HMD, a large screen, wall-sized projectors, or conventional monitors can be used for the visual display, and speakers or headphones can be selected for auditory feedback. In the case of a tracking system, devices for full- or local-body tracking, such as head or hand tracking, can be adopted. Besides, the same task can be performed with various interaction techniques. For example, Boletsis (2017) reported that the locomotion techniques implemented so far can be classified into eleven criteria (e.g., real-walking, walking-in-place, controller, teleportation, redirected walking, arm swinging, and human joystick). Furthermore, new VR equipment and interaction techniques are constantly being developed and introduced. Therefore, VR systems specified with various usage contexts can provide different UX, so continuous research on UX in VR systems is needed to understand the effect of a newly adopted usage context and to provide a better experience.

Second, there is a need to strengthen the UX components such as presence, immersion, and engagement that VR seeks. The sense of presence is one of the representative UX components in VR (Schuemie, Van Der Straaten, Krijn, & Van Der Mast, 2001; Takatalo, Nyman, & Laaksonen, 2008), and it can be described as the subjective perception of being in a mediated environment (Slater & Wilbur, 1997; Stanney & Salvendy, 1998). According to Bulu (2012), the presence and immersive tendencies of learners have a positive correlation with satisfaction in the virtual world. In addition, in a VR system that provides a greater sense of presence and immersion, task performance is also increased. For example, tasks are completed more successfully under a stereoscopic display condition, which provides a better sense of immersion compared to a non-stereoscopic display (Loup-Escande, Jamet, Ragot, Erhel, & Michinov, 2017). In the medical field, surgeons can perform tasks in a surgical simulation system more accurately with haptic feedback than without it (Girod, Schvartzman, Gaudilliere, Salisbury, & Silva, 2016). Moreover, the target UX components can differ for each research topic. Therefore, UX study in VR is necessary to effectively achieve the purpose of VR (e.g., presence, immersion, pleasure, learning effect, and training effect).

Third, it is necessary to reduce the side effects caused by the VR experience, which can have a negative impact on the overall UX. Cobb, Nichols, Ramsey, and Wilson (1999) assessed the effects related to health and safety for various virtual environments (VEs), and demonstrated that the following symptoms can be defined as virtual reality-induced symptoms and effects (VRISEs): simulator sickness, postural instability, psychomotor control, perceptual judgment, concentration, stress, and ergonomics effects. Nichols and Patel (2002) reviewed the empirical evidence on health and safety issues in VR, and found VR-induced sickness to be a major problem. The authors suggested an experimental procedure model to manage VRISE at a minimized level, emphasizing that anyone can inevitably be impacted by the VR experience. In addition, Stanney, Mollaghasemi, Reeves, Breaux, and Graeber (2003) suggested a hierarchical model of usability criteria for VEs, in which side effects were declared one of the major elements of VE usability that needs to be minimized. As such, the symptoms caused by the VR experience have been raised since the past; however, these problems remain in immersive VRs today. Jerald (2018), who presented five essential guidelines for human-centered VR design, emphasized that VR developers should understand any type of adverse effect of the VR experience and recognize these issues. Furthermore, there may be potential problems when new VR technologies or platforms are developed. Therefore, UX study in a VR system is required for users to experience a VR system safely and pleasantly.

Prior to reviewing UX evaluation in VR systems, we first suggest a framework for evaluating UX in VR systems, which forms the basis of the review results. A framework-based review can provide a more structured overview of UX evaluation in today's VR systems, and is expected to provide researchers with insights into building VR systems and performing UX assessments.

Previously, VR systems were generally classified or described in terms of the following immersion-level criteria: non-immersive, semi-immersive, and fully immersive (Henderson et al., 2007; Kozhevnikov & Gurlitt, 2013; Kyriakou, Pan, & Chrysanthou, 2017; Ma & Zheng, 2011; Moro et al., 2014). However, such a classification of VR systems has several limitations. This classification criterion is limited to the visual display characteristic, whereas VR is a system composed of various devices. Although the characteristics of the visual display have a dominant influence on the subjective sense of immersion, the immersive feeling can be improved or reduced by factors other than the visual display. In particular, there is no objective criterion for the semi-immersive level. For this reason, the same system can be classified differently. In addition, the immersion level may differ even for the same classification condition. For example, typical desktop VRs are classified as non-immersive VRs. However, there is no clear basis for whether desktop VRs with added tracking technology should be classified as non-immersive.

Although Muhanna (2015) proposed a hierarchical structure for VR systems, the taxonomy did not deviate completely from the existing immersion-based criteria and focused on CAVE. A detailed classification of VR sub-elements has also been performed. Anthes, García-Hernández, Wiedemann, and Kranzlmüller (2016) proposed a structural overview of current input and output devices.

Bowman and Hodges (1999) systematized the interaction methods and techniques in detail. However, in these cases, there is a limitation in understanding the characteristics of each element of a VR system to which various contexts are applied. Therefore, it is necessary to reorganize the systematic classification according to the main characteristics of VR systems.

Therefore, this study aims to organize the various studies focusing on UX as it relates to VR systems, based on the following research purposes:

● To provide a structural methodology for categorizing the current VR studies.
● To classify and summarize studies related to UX in VR.
● To clarify the current research limitations for future research directions.

The remainder of this paper is structured as follows. Section 2 illustrates the classification framework of VR systems from a UX perspective and describes an elaborated taxonomy of VR systems. Section 3 presents the methodology used to extract the articles. Section 4 presents the analysis results of the collected papers. Section 5 contains discussions and suggestions for future research. Finally, Section 6 concludes the paper with a brief summary and remarks.

2. UX framework in VR system

Before evaluating UX, it is necessary to first clarify the factors influencing UX (Schulze & Krömker, 2010). According to Forlizzi and Battarbee (2004), UX in interactive systems can be viewed from product-centric, user-centric, and interaction-centric perspectives. Thüring and Mahlke (2007) proposed a components of user experience (CUE) model of the human–technology interaction system. In this model, the interaction characteristics are influenced by system properties, user characteristics, task, and context, and affect UX components such as instrumental quality (e.g., controllability, effectiveness, and learnability), non-instrumental quality (e.g., visual esthetics and haptic quality), and emotional reactions (e.g., subjective feelings, motor expressions, and physiological reactions). Thus, in VR systems as well, the attributes of users, devices, and interaction can be specified as the major factors influencing UX. In addition, the UX evaluation method should be adopted appropriately in accordance with the research goal and the UX components that need to be observed (Hartson & Pyla, 2012). Even if the UX components are the same, the resulting data type and interpretation may differ if the evaluation method is different.

As a result, "users", "devices", "user activity", and "evaluation" are included in the UX evaluation framework in VR systems as influencing factors of UX (Figure 1). User characteristics include demographic information and the health status of the people who experience VR systems. Devices are the hardware that constitutes a VR system and are divided into input and output devices. User activity includes interaction elements that can be used to identify specific usage contexts, such as task type, interaction partner, and posture. Evaluation factors include attributes of the evaluation methods and characteristics of the acquired data.

As aforementioned, previous studies have suggested detailed classifications for sub-parts of a VR system (Anthes et al., 2016; Boletsis, 2017; Bowman & Hodges, 1999); however, the establishment of a VR system from an overall perspective has hardly been studied. Therefore, this study formulates the details of the framework by comprehensively referring to a well-organized handbook on VR systems and previous research related to 3D interaction and the attributes of UX evaluation (Bowman, Kruijff, LaViola, & Poupyrev, 2001; Burdea & Coiffet, 2003; Jerald, 2015; Rajeshkumar, Omar, & Mahmud, 2013; Vermeeren et al., 2010). In addition, classification criteria are added and some elements are modified. Each factor is detailed as follows.

Figure 1. A suggested UX framework of VR system.
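For illustration only, the framework in Figure 1 can be read as a coding scheme that is applied to every reviewed study. The following Python sketch shows one possible shape of such a record; the class, field names, and sample values are our own illustrative assumptions and are not part of the published framework.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class StudyRecord:
        """Illustrative coding record for one reviewed VR/UX study."""
        # Users: who experienced the VR system
        age_group: str                 # e.g. "20s", "elderly", "no information"
        health_status: str             # e.g. "healthy", "impaired"
        # Device: hardware that constitutes the VR system
        input_devices: List[str] = field(default_factory=list)   # e.g. ["head tracking", "bare hand"]
        output_devices: List[str] = field(default_factory=list)  # e.g. ["assembled HMD", "headphone"]
        # User activity: usage context of the interaction
        task_types: List[str] = field(default_factory=list)      # navigation, selection & manipulation, ...
        interaction_partner: str = "single user"
        posture: str = "sitting"
        # Evaluation: how UX was measured
        measurement: List[str] = field(default_factory=list)     # "subjective", "objective"
        measure: List[str] = field(default_factory=list)         # "quantitative", "qualitative"
        period_of_experience: str = "after usage"

    # A hypothetical coded study (values invented purely for illustration).
    example = StudyRecord(
        age_group="20s",
        health_status="healthy",
        input_devices=["head tracking", "tracked hand-held controller"],
        output_devices=["assembled HMD"],
        task_types=["navigation"],
        measurement=["subjective", "objective"],
        measure=["quantitative"],
    )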



2.1. Users

In the user characteristics, demographic information, knowledge, personality, and cognitive or physical impairment are selected as the major categories. The demographic information contains age, sex, occupation, education level, or race. In addition, there is a domain knowledge gap between end-users and experts, which might result in different perspectives on UX issues. According to Kober and Neuper (2013), individual differences such as personality can lead to different presence experiences. VR systems are also actively applied to physical rehabilitation or trauma treatment in the medical field and may require changes in interaction patterns or assessment methods depending on the patient's physical or perceptual limitations.

2.2. Device

In the device characteristics, input and output devices are the main categories of VR system hardware (Anthes et al., 2016; Li, Yi, Chi, Wang, & Chan, 2018; Zhang, 2017). The input device delivers the physical signal provided by the user in digital form to the VR engine, while the output device provides the user with a specific modality (e.g., visual, auditory, and haptic) in response to the collected information. It is important to select an adequate combination of devices, considering interaction fidelity and the characteristics of the selected devices, to provide a positive VR experience, such as an enhanced sense of immersion or presence and high performance. Figure 2 shows the classification scheme for input and output devices according to the detailed criteria.

2.2.1. Input device
The input device is divided into non-hand and hand input devices depending on whether hands are required. For a non-hand input device, whether tracking technology is applied is added as a detailed criterion. If a tracking system is available, body tracking, head tracking, eye tracking, microphone, and treadmill are included. In the absence of tracking, non-tracked and non-hand devices are added, which can include pedal-type inputs. In the case of whole-body tracking, the entire body of the user is tracked and the user's movement pattern is recognized. Head tracking is generally available when the user wears an HMD such as the Oculus Rift, HTC Vive, or Sony PlayStation VR. Eyeglasses with a tracking sensor can also track head movements. Under this condition, a VR environment corresponding to the head movement of the user is presented. Eye tracking follows the user's eye movement, such as the gaze point. A microphone can be used for system commands by tracking the user's voice information. When a treadmill is used as an input device, the user can actually walk on the treadmill and the gait motion is transmitted to the VR engine, which changes the virtual world according to the user's motion. This includes not only traditional treadmills but also omnidirectional treadmills such as the Omni treadmill and VirtuSphere.

When tracking is applied to a hand input device, it includes the input types of tracked hand-held controllers, hand-worn devices, and bare hands. The tracked hand-held controllers include controllers such as the Oculus Touch, Wii Remote Controller, and Sony Move, which track the hand movement and have operation buttons on their surface. In the case of the hand-worn type, users wear the device on their hand directly, and the data glove is one of the representative examples.

Figure 2. Device taxonomy in VR (up: input devices; down: output devices).
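The branching criteria summarized in Figure 2 can also be read as a simple decision procedure. The sketch below loosely paraphrases the input-device criteria of Section 2.2.1 under our own assumptions; the function name, parameters, and example calls are illustrative and do not come from the paper.

    def classify_input_device(uses_hand: bool, tracked: bool, subtype: str) -> str:
        """Place an input device in the taxonomy of Section 2.2.1 / Figure 2.

        `subtype` is a free-text hint such as "head tracking", "bare hand",
        or "world-grounded"; the labels mirror the categories named in the text.
        """
        if not uses_hand:
            # Non-hand input: tracking-based channels vs. non-tracked devices (e.g. pedals)
            if tracked:
                return "non-hand input device / tracking / " + subtype
            return "non-hand input device / no tracking / non-tracked & non-hand device"
        if tracked:
            # Tracked hand input: hand-held controller, hand-worn, or bare hand
            return "hand input device / tracking / " + subtype
        # Non-tracked hand input: world-grounded (keyboard, mouse) or hand-held (gamepad)
        if subtype == "world-grounded":
            return "hand input device / no tracking / world-grounded device"
        return "hand input device / no tracking / non-tracked hand-held controller"

    # Hypothetical examples (not drawn from the reviewed studies):
    print(classify_input_device(uses_hand=False, tracked=True, subtype="head tracking"))
    print(classify_input_device(uses_hand=True, tracked=True, subtype="bare hand"))
    print(classify_input_device(uses_hand=True, tracked=False, subtype="world-grounded"))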



In the bare-hand type, which is generally known as a natural user interaction method, users do not need to wear any device; either their hand movement is tracked or their hand gestures are recognized. The cases in which the hand input device does not have tracking enabled are classified as the world-grounded type. A world-grounded device includes a keyboard and a mouse, which are generally used in desktop VR systems. If it is not a world-grounded type, it is classified as a non-tracked hand-held controller, and gamepads are typically included in this criterion.

2.2.2. Output device
Output devices can first be sorted based on the sensory cue, and each type includes detailed display types depending on the sub-criteria. Among the sensory cues, the visual cue is essential and can dominantly influence the user's perception of a VR experience. Types of visual display can be broadly divided into two categories depending on whether they are fixed in the world. The world-fixed type is installed in the real world and its position does not change with the user's movement. A conventional monitor and screen- or projector-based displays belong to this category, which can be installed with one or more displays. Multiple displays can provide a wide field of view. If it is not a world-fixed type, it is classified as an HMD type, which can be divided into non-see-through HMDs and video see-through HMDs. There is also the see-through HMD, but this case is excluded from the VR display category because it corresponds to smart glasses applied with augmented reality. The non-see-through HMD is again classified into smartphone-based HMDs (e.g., Samsung Gear) and assembled HMDs (e.g., Oculus Rift and HTC Vive). CAVE is also included in the world-fixed display type. The first CAVE was introduced in 1992 (Cruz-Neira, Sandin, DeFanti, Kenyon, & Hart, 1992) and had a cubic structure of 10 × 10 × 10 ft. A common feature of CAVE is that the displays are wall-sized and surround the users to provide a more immersive experience (Kageyama & Tomiyama, 2016; Manjrekar et al., 2014; Muhanna, 2015).

Auditory feedback can be provided through earphones or headphones. Speakers can also be adopted as a world-fixed type. Both types of auditory displays can provide 3D sound to enhance user immersion. The manner in which haptic feedback is provided can be broadly divided into passive or active types. According to Jerald (2015), passive haptics indicates feedback obtained from structures built in the real world in a VR environment, while active haptics is feedback received from a haptic device. Active haptics is again classified as tactile feedback or proprioceptive force feedback. Tactile feedback is transmitted to the skin through vibration, and proprioceptive force feedback provides force feedback to the user. In addition, independent of the feedback properties, active haptics can be installed in the real world or worn by users. In the handbook of Jerald (2015), a motion platform is classified as a sensory cue that provides feedback, such as motion, to the user. For example, a vehicle mimicking a roller coaster corresponds to a motion platform when it moves in response to a rail slope presented in a VE. Since the motion platform can contribute to the immersive experience of the user as well as cause motion sickness (Jerald, 2015; Riecke, Schulte-Pelkum, Caniard, & Bulthoff, 2005), it can be seen as a key factor in UX evaluation in VR systems. The passive motion platform defines the case where the user is affected by the system, while the active motion platform defines the case where the user operates the motion platform directly.

2.3. User activity

In the user activity characteristics, three main categories of task, environment, and application are selected.

2.3.1. Task
Task type, posture, and interaction partner are selected as detailed classification criteria related to the task (Figure 3).

2.3.1.1. Task type. In a VR system, the main task types that involve interaction are navigation, selection & manipulation, and system control. These tasks are also representative of 3D interaction (Bowman et al., 2001; Reyes-Lecuona & Diaz-Estrella, 2006). Navigation is a core task that is evaluated in the VR system (Santos et al., 2009), and can be implemented in situations where avatars, cars, or airplanes are moving. In selection & manipulation, selection is the task of picking a specific object. Manipulation is the task of transforming or moving an object, and the manipulation task follows the selection task. Thus, the selection and manipulation task types are categorized in one category.

Figure 3. User activity-related component taxonomy in VR.



System control is the task of selecting a menu bar or activating a specific system in a VR environment. The three types of tasks described above are the main tasks of 3D interaction, but we add the watching task type to them. In most cases, except for special cases, users passively accept the visual, auditory, or haptic information provided by the VR system in the watching task. When using an HMD, however, users can actively use visual control by controlling the virtual camera with head movements.

2.3.1.2. Interaction partner. The interaction partner can be categorized into three categories depending on whether users experience the VE while sharing information in the constructed VR system. A single user is a situation where there is no other human partner. In this case, a user only interacts with the VR devices. A situation in which multiple users experience the VE simultaneously in the same VE system can be classified as co-located multi-user. Multiple users who are in different locations may simultaneously connect to one VR and interact with each other, and this case can be classified as remote multi-user.

2.3.1.3. Posture. When experiencing VR, the user's posture can be limited or completely free. In most cases, users experience VR in a sitting or standing posture. Users may feel a higher sense of immersion in a situation where their posture is not limited; however, the posture can be limited to increase interaction fidelity or the degree of real-system imitation. For example, when experiencing a car simulation in a VR system, the user's posture may be limited to a sitting posture that matches the actual driving situation. Therefore, posture selection can be an important consideration for UX when experiencing VR.

2.3.2. Environment
Environment characteristics are classified depending on whether or not a riding platform is provided. The riding platform is generally provided to increase interaction modality and can include car seats, treadmills, or a cockpit. The riding platform is further classified as a non-movable or movable platform. The movable platform corresponds to the motion platform. In addition, CAVE is added to the VR environment category since it can be seen as a visual-spatial platform (Figure 4).

2.3.3. Application
VR applications are classified based on the industrial field and purpose. An official classification of VR applications does not currently exist, but VR is typically used in the following areas: education & training, entertainment, healthcare & therapy, product development, and architectural & urban design. Note that these are representative VR applications and not all applications of the VR system.

2.4. Evaluation

When conducting a UX evaluation, it is important to adopt an appropriate UX evaluation method and understand its attributes (Hartson & Pyla, 2012). Measurement, measure, evaluator, location, system development phase, and period of experience are selected as the main attributes of UX evaluation. The measurement can be classified as subjective or objective. In a VR system, presence, flow, or engagement belongs to subjective measurement, while error rate or task completion time belongs to objective measurement. The collected data can be classified as a quantitative or qualitative measure. Focus group interviews, think-aloud, and in-depth interviews are representative methods for collecting qualitative data, while questionnaires and physiological signals can be used to collect quantitative data. The evaluator refers to the subject who evaluates UX and can include general users and experts. In addition, UX evaluation can be conducted in the laboratory, in the field, or online, and systems can be divided into fully functional systems, functional prototypes, conceptual design ideas in a very early phase, and nonfunctional prototypes depending on the stage of system development. Fully functional systems mean that all devices and content utilized in the VR system are currently commercialized. When a device or content developed by the researchers is included in the VR system, it is defined as a functional prototype. The period of experience can be classified as before usage, during usage, after usage, or over time (Roto, Law, Vermeeren, & Hoonhout, 2011).
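As a rough illustration of how the evaluation attributes of Section 2.4 could be recorded per study, the sketch below simply restates the categories above as fields of a small Python data structure; the structure itself and the sample entry are our own assumptions, not part of the paper.

    from dataclasses import dataclass

    @dataclass
    class EvaluationAttributes:
        measurement: str        # "subjective" or "objective"
        measure: str            # "quantitative" or "qualitative"
        evaluator: str          # "users" or "expert"
        location: str           # "laboratory", "field", or "online"
        development_phase: str  # "fully functional system", "functional prototype",
                                # "conceptual design idea", or "nonfunctional prototype"
        period: str             # "before usage", "during usage", "after usage", or "overtime"

    # Hypothetical example: a presence questionnaire administered in the lab after use.
    presence_questionnaire = EvaluationAttributes(
        measurement="subjective",
        measure="quantitative",
        evaluator="users",
        location="laboratory",
        development_phase="functional prototype",
        period="after usage",
    )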

Figure 4. Environment taxonomy in VR.



Figure 5. Flow diagram of study selection.

3. Method

In this study, a systematic review was conducted according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) (Liberati et al., 2009). The papers were searched through several web databases on May 27, 2019. The publication date was limited to the period from 2009 to 2018. The selection criteria and review procedures are detailed as follows.

3.1. Information source

A total of six web databases were selected: Scopus, Web of Science, ScienceDirect, IEEE Xplore, EBSCO, and ProQuest. These search engines cover a wide spectrum of perspectives, including engineering and medical perspectives (Powers, Bieliaieva, Wu, & Nam, 2015).

3.2. Inclusion and prescreening criteria

Only journal articles in English were selected for the review; short reports, news, proceedings papers, books, and dissertations were excluded. "virtual reality", "virtual environment", "VR", and "VE" were selected as keywords for virtual reality, while "user experience", "UX", and "human experience" were selected for user experience. Therefore, we searched a total of 12 keyword combinations in each search engine.

3.3. Eligibility criteria

After the screening process, papers were selected by reading the entire text. We defined collecting UX-related indicators by evaluating UX on a VR platform as the essential selection condition. Thus, the studies selected should (1) utilize a VR platform and (2) evaluate UX in VR. Except for these two criteria, there were no restrictions (e.g., type of VR system, characteristics of participants, study design, or evaluation methods) on article selection.

3.4. Quality assessment

In addition to the eligibility criteria screening, a quality assessment was performed to select the final papers for review. To assess the collected studies, the QualSyst standard was used (Kmet, Cook, & Lee, 2004). This tool consists of 14 criteria evaluating the appropriateness of the study design, research question, participant selection, sample size, outcomes, and conclusion (see Appendix). Each criterion was graded according to its fulfillment level (2 = yes, 1 = partial, 0 = no). Criteria that do not apply to a particular study design were marked as 'n/a' and were excluded from the summary score calculation. The final score was obtained by dividing the sum of the points by the maximum possible points. For example, if there is one 'n/a', the maximum possible score is 26 points (13 criteria × 2 points = 26 points). Two reviewers (YMK and IR) evaluated each study independently, and disagreements were resolved by consensus or by the third reviewer (MHY). Papers with a score less than 0.55 were rated as weak quality (Van Cutsem et al., 2017) and were excluded from this study.

3.5. Study selection

The review procedure and the number of selected papers at each step are shown in Figure 5. As a result of the keyword search, a total of 635 papers were collected. The numbers of initially collected papers for each search engine were as follows: Scopus (302), Web of Science (148), Science Direct (58), IEEE Xplore (30), EBSCO (32), and ProQuest (65). After removing duplicate papers, 393 papers were left. As a result of the initial screening process, 208 papers remained. After that, we reviewed the full text of these papers by carefully considering the eligibility criteria, and performed a quality assessment for the papers that met the eligibility criteria.
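The QualSyst summary score described in Section 3.4 amounts to dividing the points awarded by the maximum possible points over the applicable criteria. A minimal sketch of that arithmetic, assuming ratings are recorded as 2/1/0 with None for 'n/a' (the function and variable names are ours, not from the paper or the QualSyst tool):

    def qualsyst_summary_score(ratings):
        """Compute the QualSyst summary score.

        `ratings` holds one value per criterion: 2 (yes), 1 (partial), 0 (no),
        or None for 'n/a'. Criteria marked 'n/a' are excluded from both the
        numerator and the maximum possible points.
        """
        applicable = [r for r in ratings if r is not None]
        return sum(applicable) / (2 * len(applicable))

    # Example matching the text: 14 criteria with one 'n/a' leaves 13 criteria,
    # so the maximum possible points are 13 x 2 = 26.
    ratings = [2, 2, 1, 2, 2, 1, 2, 2, 2, 1, 2, 2, 2, None]
    print(qualsyst_summary_score(ratings))  # 23/26 ~= 0.88, above the 0.55 cut-off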

As a result, 65 papers were selected in accordance with the purpose of this study.

4. Results

4.1. User characteristics

As shown in Figure 6, the average age of the experimental groups was concentrated in the 20s. The 30s, 40s, and 70s groups were each studied in only one case, and there were two cases in each of the 10s, 50s, 60s, and 80s. If the age information was provided as a range, or if it was difficult to know the average value because no age information was provided at all, the case was classified as no information.

Figure 7 shows the results of classifying the selected papers based on the subjects' health status and age group. In most cases, studies were conducted on physically and mentally healthy subjects. Only three studies were conducted on people with physical impairments, and there was only one case in which the experiment was conducted on both healthy and impaired people. In addition, most studies concentrated on non-elderly subjects; the elderly were studied in only four papers.

Figure 8 shows the sex ratio of subjects in the experiments. Specifically, it represents the ratio of male subjects to the total number of subjects; exactly 50% means that the numbers of male and female subjects are exactly the same. There were 13 cases in the range of 45–54%, a range in which the numbers of male and female subjects are approximately the same. There were five cases where exactly the same number of men and women participated in the experiment. Although there may not be a significant difference in UX by gender, several studies have shown that gender can be one of the significant factors in the VR experience, such as in cybersickness (Baños et al., 2004) and presence (Narciso, Bessa, Melo, Coelho, & Vasconcelos-Raposo, 2017).

Figure 6. Classification of studies by age.


The total number of cases might be greater than the number of papers selected because there were cases where multiple experiments were performed in one paper.

Figure 7. Classification of studies by type of participant.

Figure 8. Classification by sex ratio in experiment.


There were 17 cases with no information. The total number of cases might be greater than the number of papers selected because there were cases where multiple experiments were performed in one paper.

However, the results showed that the male ratio was most frequently in the range of 75–100%, and the number of cases in which the male ratio was higher than 55% was larger than the number of cases in which it was less than 44%.

4.2. Device – input device characteristics

The types of input devices used in the VR systems are classified in Table 1. In 27 cases, both hand and non-hand input devices were used simultaneously. There were also 21 cases where only a hand input device was used, whereas the frequency of using only a non-hand input device was relatively low.

The results of a detailed classification of the types of input devices are presented in Table 2. For the non-hand input type, the head tracking method was widely used in comparison with other methods. Besides the head, other body parts were also tracked to recognize specific body gestures or to track the trajectory of the body parts. For example, a participant could perform navigation tasks in a VE through specific actions defined by researchers, such as walking in place (Monteiro, Carvalho, Melo, Branco, & Bessa, 2018), placing the right foot in front, or rotating the shoulder (Brade et al., 2017).

In some cases, an eye tracker was used to apply eye movement as an input channel. Vinnikov, Allison, and Fernandes (2017) developed a gaze-contingent display that allows the user to adjust the volume of a specific region on which they are concentrating, using a real-time gaze tracking system. In addition, Lin, Breugelmans, Iversen, and Schmidt (2017) utilized an eye-tracking system as a non-intrusive interaction method for patients with arthritis in the hand, replacing conventional computer devices such as the keyboard and mouse. Electroencephalogram (EEG) signals combined with a brain–computer interface (BCI) application were also used in VR systems. Vourvopoulos and Liarokapis (2014) found that a commercial BCI can be used effectively for robot navigation in a VE. Tidoni et al. (2017) applied BCI and robotics to VR, and found that participants exhibited good performance for BCI within the immersive scenarios. There was one case where text input was enabled by recognizing the voice of the user via a microphone (Pick, Weyers, Hentschel, & Kuhlen, 2016) and one where a pressure sensor and actuators were attached to the shoe insole to provide haptic feedback according to the walking style (Turchet, Burelli, & Serafin, 2013). In addition, there was one case where an omnidirectional treadmill, the VirtuSphere, was used for a virtual navigation task (Monteiro et al., 2018). In the no-tracking condition, pedals were used as a non-tracked device in a car simulation scenario (Georgiou & Demiris, 2017).

For hand input devices, the number of cases in which tracking was not available was larger than the number of cases in which tracking was available. Thus, traditional input devices have so far been adopted more often for UX research in VR systems. Where hand input with tracking technology was applied, tracked hand-held controllers were used in nine cases. While the tracked hand-held controllers were commercial products such as the Wii remote controller (Jonsdottir et al., 2018), trackable pens (Anton, Kurillo, & Bajcsy, 2018; Rieuf & Bouchard, 2017; Son, Shin, Choi, Kim, & Kim, 2018), or a gripper (Morán et al., 2015), researchers have also built trackable controllers by attaching tracking sensors to specific products (Wang & Lindeman, 2015). In addition, the bare-hand type was used in 13 cases, and the Leap Motion device was used in most of these to recognize hand movements or hand gestures. The hand-worn type was used in three cases and included gloves (Lin et al., 2017; Xiong, Wang, Huang, & Xu, 2016) and bracelets (Camporesi & Kallmann, 2016). In the no-tracking condition for hand input devices, the frequencies of using non-tracked hand-held controllers (e.g., gamepads) and world-grounded devices (e.g., mouse and keyboard) were similar. Furthermore, de Jesus Oliveira, Nedel, and Maciel (2018) implemented a touch screen for an articulatory interface by attaching a smartphone to the back of the HMD, and this input device belongs to the non-tracked and non-hand-held devices.

Table 1. Classification by input device used in experiment.
  Non-hand input device only: 16 cases
  Hand input device only: 21 cases
  Both: 27 cases
  No information: 3 cases
  N/A: 2 cases
  (The total number of cases might be greater than the number of papers selected because there were cases where multiple experiments were performed in one paper.)

Table 2. Classification by hand and non-hand input devices.
  Non-hand input device
    Tracking: Head tracking (31), Eye tracking (5), Body tracking – body gesture, upper or lower body movement (8), Microphone (1), Physiological signal – EEG (2), Treadmill (1), Pressure sensor (1)
    No tracking: Non-tracked & non-hand devices (1)
  Hand input device
    Tracking: Tracked hand-held controller (9), Hand-worn (3), Bare hand (13)
    No tracking: Non-tracked hand-held controller (14), World-grounded devices (15), Non-tracked & non-hand-held devices (1)
  No information (4)
  (If no information about the input device corresponding to the two main categories is given, it is classified as no information. The total number of cases might be greater than the number of papers selected because there were cases where multiple experiments were performed in one paper.)

4.3. Device – output device characteristics

The classification by sensory cue used in the VR systems is presented in Figure 9. Visual feedback was provided in all studies, while auditory feedback was provided relatively less often. In addition, haptic feedback was provided in 14 studies, commonly through an input device that utilizes the hands. For example, the Phantom Omni (Culbertson & Kuchenbecker, 2017; Erfanian, Hu, & Zeng, 2017; Schvartzman, Silva, Salisbury, Gaudilliere, & Girod, 2014), Novint Falcon (Ahn, Fox, Dale, & Avant, 2015; Jin, 2013), a steering wheel (Georgiou & Demiris, 2017), or a head band (de Jesus Oliveira et al., 2018), which provide vibration or force feedback, were used.
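The notes under Tables 1 and 2 stress that counts are per experimental case rather than per paper, because one paper can contribute several experiments. A small tally over invented records (illustrative only, not the actual review data) makes that distinction concrete:

    from collections import Counter

    # Each tuple: (paper_id, input-device category used in one experiment).
    # Hypothetical data for illustration only.
    cases = [
        ("P1", "hand input device only"),
        ("P1", "both"),                      # same paper, second experiment
        ("P2", "non-hand input device only"),
        ("P3", "both"),
    ]

    case_counts = Counter(category for _, category in cases)
    paper_count = len({paper for paper, _ in cases})

    print(case_counts)   # counts by category, summing to 4 cases
    print(paper_count)   # only 3 papers, so cases can exceed papers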

Figure 9. Classification by sensory cue.

Notably, Wang and Lindeman (2015) provided tactile feedback by blowing wind through fans to enhance the sense of motion in the VR system.

There was only one case that examined the effect of olfactory feedback on presence (Baus & Bouchard, 2017). The authors identified that an unpleasant odor had a statistically significant effect on the sense of presence and argued that exposure to unpleasant odors may increase presence because there is no obvious visual clue to connect the odor to the visual scene in a VE. In addition, a motion platform was used in three cases. Bian et al. (2016) installed a dynamic seat that provides motion feedback corresponding to the situation presented in the VR. Monteiro et al. (2018) used the VirtuSphere, which allows users to move omnidirectionally in the virtual world. Pedroli et al. (2018) provided a bike-type motion platform integrated into a CAVE system, and the users performed the given task while cycling.

Visual feedback is essential in VR systems, and a detailed classification by visual feedback devices is presented in Table 3. In the non-world-fixed type, which is generally classified as the HMD type, assembled HMDs, including the Oculus Rift and HTC Vive, were most frequently used. On the other hand, there was one case that used an HMD with a smartphone. In particular, Wang and Lindeman (2015) developed a coordinated hybrid VR system to provide more seamless 3D interaction in a VR system. In this system, an assembled HMD and a wearable tablet display were worn on the user's head and non-dominant forearm, respectively. The UX evaluation results from various angles, including subjective assessment, task performance, interviews, and video observation, showed that this system can have a positive impact on UX. In the case of the world-fixed type, a conventional monitor-based type was most frequently adopted for providing visual feedback. Screen and projector types were adopted in 5 and 8 cases, respectively. On the other hand, the CAVE system was used in relatively few cases. A da Vinci skill simulator for virtual robotic surgery was used in one case (Tergas et al., 2013). In addition, Harish and Narayanan (2013) developed a multi-planar display by applying a novel rendering scheme. These authors built a spherical display using multiple polygonal facets and demonstrated that task performance was better on the multi-planar display than on a flat panel display.

Table 3. Classification by types of output devices.
  Non-world-fixed
    Non-see-through HMD: Smartphone HMD (1), Assembled HMD (19)
    Video see-through HMD: – (0)
    Others: Hybrid VR (1)
  World-fixed
    Non-surrounding display: Conventional monitor based (23), Screen based (5), Projector based (8)
    Surrounding display: CAVE (8)
    Others: da Vinci skill simulator, multi-planar display (2)
  No information (7)
  (If no information about the visual display corresponding to the two main categories is given, it is classified as no information. The total number of cases might be greater than the number of papers selected because there were cases where multiple experiments were performed in one paper.)

4.4. User activity – interaction partner, posture, and task type

Table 4 shows the classification by user activity, including interaction partner, posture, and task type. In a VR system, a single-user setting, in which the user interacts only with the input and output devices, is most common; a total of 58 cases involved a single user. In three cases, multiple users utilized one system in the same location. This is the case where each user is given an input device and the task is performed in the same VE.

Table 4. Classification by interaction attributes – interaction partner, posture, and task type.
  Interaction partner: Single user (58), Co-located multi-user (3), Remote multi-user (5)
  Posture: Sitting (28), Standing (22), Sitting/standing (8), No information (8)
  Task type: Passive – watching only (10); Active – navigation, selection, manipulation, or system control (59)
  (The total number of cases might be greater than the number of papers selected because there were cases where multiple experiments were performed in one paper.)

Users can perform a collaborative assembly task in a VE (Erfanian et al., 2017) or explore educational activities together (Alves Fernandes et al., 2016; Naya & Ibáñez, 2015). In five cases, multiple users experienced one VE from different locations, which was classified as remote multi-user. In such systems, users could perform collaborative tasks by continuously sharing information (Anton et al., 2018; de Jesus Oliveira et al., 2018; Oprean, Simpson, & Klippel, 2018), and it was possible to hold a virtual meeting (Sutcliffe & Alrayes, 2012) or a virtual learning session (Vosinakis, Anastassakis, & Koutsabasis, 2018). The sitting posture was adopted more often than the standing posture, and special vehicles were rarely provided. When experiencing VR in a standing posture, users could walk freely in a limited space or their body gestures were tracked. Task type was divided into active and passive tasks, and in most cases an active task type was adopted. This indicates that UX evaluation has so far focused more on active tasks, such as navigation, selection, manipulation, and system control, than on passive tasks.

4.5. Environment

As shown in Figure 10, most studies did not establish a specific platform other than the input and output devices. There were three cases where a movable platform was used; these correspond to the motion platforms described in Section 4.3. For the non-movable platform, a fixed seat was provided for vehicular navigation content such as a driving simulation (Georgiou & Demiris, 2017). In addition, few CAVE systems were used. The movable platform and the CAVE system would be expensive to build and would require technical experts. In particular, sufficient installation space must be ensured for the CAVE system.

4.6. Application

Unexpectedly, the VR systems in most of the abovementioned studies were not established for specific applications, and these cases were classified as the basic application (Table 5), which corresponds to the case where UX evaluation is performed on various factors implemented in a VR system. In other words, the findings of studies classified as basic application can be applied to any VR application. For example, there were cases where UX was evaluated according to the feedback modality, interaction type, and display type. For the cases where the VR system was applied directly to or utilized for a specific field, most were in the education & training field. The numbers of cases for the other VR applications are as follows: healthcare & therapy (5), communication (5), entertainment (5), product development (4), architectural & urban design (2), transportation (1), digital marketing (1), art (1), forestry (1), and exhibition & tour (1).

Table 5. Classification by VR application.
  VR applications: Education & training (10), Healthcare & Therapy (5), Communication (5), Entertainment (5), Product development (4), Architectural & Urban design (2), Transportation (1), Digital marketing (1), Art (1), Forestry (1), Exhibition & tour (1), Basic (27)

4.7. Evaluation

The results of the classification by each attribute of the UX evaluation method are presented in Table 6. In the measurement characteristics, most cases used both subjective and objective measurements, whereas only two cases used only objective measurement. In other words, subjective measurements were evaluated in most studies. This result shows that most studies evaluated subjective feelings such as presence and immersion, which are representative UX components that a VR system pursues to provide to users, rather than evaluating performance measurements of function implementation. In the measure characteristic, a quantitative method was adopted more frequently than a qualitative method to evaluate UX in a VR system.

As a quantitative method, questionnaires were frequently used, and performance was evaluated using the completion time or error rate.

Figure 10. Classification by environmental attributes.



The adapted or entire versions of well-structured questionnaires were mainly used, and the questionnaires varied. For example, to evaluate certain UX components (e.g., presence, immersion, engagement, usability, and simulator sickness), various questionnaires were used, such as the Presence Questionnaire (Witmer & Singer, 1998), the Independent Television Commission's Sense of Presence Inventory (Lessiter, Freeman, Keogh, & Davidoff, 2001), the System Usability Scale (Brooke, 1996), and the Simulator Sickness Questionnaire (Kennedy, Lane, Berbaum, & Lilienthal, 1993).

In addition, most of the studies focused on general users. The laboratory was the most widely used experimental location, and there was one case each in the field and in the online environment. As for the system development phase, all VR systems were in the second half of the development phase. More specifically, most VR systems were functional prototypes. In these cases, self-developed content or tracking systems were implemented in the VR systems. On the other hand, there were few experiments in which both the content and the devices used were already commercialized. In other words, the number of experiments confirming the effect of specific factors in a well-structured experiment was relatively large, and there were few studies that focused on a VR system for a specific application.

Meanwhile, as for the period of experience, there was no case where UX was evaluated before or during the VR experience; in most cases, UX was evaluated after the VR experience. This also means that UX was generally evaluated after completing certain tasks. Only three cases fell under the cumulative UX criterion. These include cases where the degree of rehabilitation progress was assessed after experiencing the VR system several times (Jonsdottir et al., 2018; Schwenk et al., 2014) and a case where evaluation was performed after experiencing the system for a relatively long period (Newe, Becker, & Schenk, 2014).

Table 6. Classification by attributes of UX evaluation method in VR.
  Measurement: Subjective (64), Objective (32), Subjective only (30), Objective only (2), Both (34)
  Measure: Quantitative (65), Qualitative (26), Quantitative only (40), Qualitative only (0), Both (26)
  Evaluator: Users (57), Expert (9)
  Location: Laboratory (64), Field (1), Online (1)
  System development phase: Fully functional system (10), Functional prototypes (55), Conceptual design ideas in very early phases (0), Nonfunctional prototypes (0)
  Period of experience: Before usage – Anticipated UX (0), During usage – Momentary UX (0), After usage – Episodic UX (62), Overtime – Cumulative UX (3)
  (The total number of cases might be greater than the number of papers selected because there were cases where multiple experiments were performed in one paper.)

5. Discussion and Recommendation

The proposed taxonomies were developed before analyzing the collected papers and were used as the basis for reporting the review results. The results were mostly reported in correspondence with the proposed taxonomies. However, there was a case where a VR system used in a previous study could not be assigned to the proposed taxonomy, and this part has been adjusted. For example, in the hand input category, the non-tracked and non-hand-held devices were added based on the analysis of the collected papers. In the future, input devices may be mounted on other VR system components, or a new interaction paradigm may emerge. Thus, although the taxonomies suggested in this study are sufficient to cover the recent studies on UX in VR systems, they will need to be extended and refined in the future according to the development of VR systems.

In this paper, there are two main points of discussion about UX studies in a VR system. The first is related to the implementation of equipment and technology, including input devices, output devices, feedback forms, platforms, and applications; the other is related to research methods, including user characteristics, interactions, and evaluation methods.

5.1. Issues related to VR devices and technology

● In the non-hand input category, assembled HMDs were mainly used, and efforts were made to apply new ways of interaction in addition to head tracking. However, UX research on input methods other than head tracking is still lacking. Non-hand input interaction methods using eye movement, EEG signals, or voice commands need to be actively studied in the future, because people may need to experience VR under conditions where they cannot use their hands freely, for example because of mental or physical impairment. Furthermore, if various interaction methods can be applied in the same VR system, users can experience VR by selecting the interaction type according to their physical ability and preference.

● In the hand input category, there is a high proportion of non-tracked devices. This means that the gamepad, keyboard, and mouse conventionally used in a PC environment are still widely used in VR systems. However, these devices can limit the natural hand movement of the user. On the other hand, bare-hand or hand-worn interaction can realize a natural user interface, which promotes the degree of immersion when performing tasks. Nonetheless, bare-hand interaction still has a main issue in that the system cannot provide haptic feedback to users (Koutsabasis & Vosinakis, 2018). Hand-worn devices, in turn, are encumbered input devices, which means that users need to wear physical hardware (Jerald, 2015). This can cause problems with physical comfort, installation complexity, or tracking reliability. Therefore, continuous effort is required to solve these problems for more seamless and natural interaction in VR systems.

● Furthermore, tracked hand-held controllers have been actively used in VR systems. These controllers inevitably place physical loads on users' body parts; however, studies on the risks that may arise from ergonomic aspects are lacking. Therefore, it is desirable that risk factors such as wrist load due to the weight of the controller and long-term use, excessive force due to design features, or excessive bending during work be studied in combination with the overall UX of a VR system.
● In feedback modality, visual stimulation was provided in all studies, while other stimuli were provided relatively rarely. In particular, there was only one case in which olfactory stimulation was used. Since multisensory feedback helps improve immersion in a VR system (Leonardis, Frisoli, Barsotti, Carrozzino, & Bergamasco, 2014; Mikropoulos & Natsis, 2011), further UX studies on combinations of multiple feedback forms are needed. In addition, although new motion platforms for VR systems have been introduced and developed, their UX has rarely been investigated experimentally. While a motion platform can enhance immersion, it can also cause motion sickness. Thus, overall UX research on motion platforms should be conducted before they become popular in theme parks or VR experience spaces.
● Compared to assembled HMDs, world-fixed displays such as conventional monitors or screen/projector-based displays have been adopted more often as visual displays. However, about 84% of the cases using HMDs occurred after 2016, when the Oculus Rift was launched to the public and HMDs began to gain popularity. As an increasing number of advanced HMDs are constantly being released to the public today, we expect more UX studies on HMDs in the future.
● The CAVE system is one of the most immersive VR systems; however, it has rarely been researched. This is because CAVE systems are not only costly to build but also require a special setup and a large space. Thus, there are practical difficulties for small groups of researchers in building a CAVE system and conducting UX research, so there is a need to expand investment in CAVE systems. The CAVE system has been effectively used in various industries such as the military, education, medicine, and scientific visualization (Muhanna, 2015), and thus the findings of UX research on CAVE systems could be usefully applied in these fields. It is noteworthy that half of the studies that adopted a CAVE were published in 2018. Thus, it is expected that more valuable results of UX evaluation for the CAVE system will be actively presented in the future.
● In addition, studies conducted on UX in VR systems have focused more on factors related to the implementation of VR functions than on specific applications. In other words, research has been conducted to discover and establish empirical evidence on the effects of implementing VR system components and functions. This evidence will be valuable for building safe and enjoyable VR systems. Moreover, since VR systems are expected to be used in various industrial domains, efforts should also be made to maximize the effects of VR applied to specific applications.

5.2. Issues related to the research method

● UX evaluation studies of VR systems are concentrated on younger and healthy adults. However, there is a need to expand the age range of subjects to include the elderly and children, since responses to the same stimulus can differ depending on psychological or physical characteristics. In particular, since elderly people are mentally and physically more vulnerable than younger adults due to aging, the potential problems caused by their declined functional capabilities should be clarified, and researchers should consider these issues. Furthermore, as VR systems are being applied in rehabilitation and therapy, it is necessary to focus on discovering UX issues related to the VR experience of people with disabilities.
● In the interaction partner category, single-user systems were used more frequently than collaborative systems. However, research on collaborative VR systems should be expanded, and the related technologies should also be advanced. In daily life, people communicate and collaborate with each other. Thus, in the future, collaborative VR systems are likely to be implemented in various fields such as experience centers, product design, virtual meetings, and multiuser entertainment. More importantly, such collaborative VR systems can require different interaction patterns and provide a totally different UX compared with single-user systems.
● In the case of task type, active interactions were performed more frequently than passive tasks. Today, users can experience various content photographed with a 360° camera, or videos taken from a bird's-eye view by a drone, simply by wearing an HMD. Hence, UX research can be expanded in this area. Additionally, contrary to our expectation, there were many cases in which the exact tasks performed were not clearly defined. To provide an appropriate reference and direction to researchers who study UX in VR in the future, it is necessary to describe the tasks performed in detail.
● In the evaluation characteristic, qualitative measures are adopted relatively rarely for evaluating UX. However, it would be valuable to evaluate UX through in-depth interviews or observation in terms of obtaining rich evidence on subjective assessments and unexpected contextual issues. In addition, UX was mainly evaluated after the VR experience using questionnaires. However, if negative effects can be predicted by analyzing a user's physiological signals or behavior patterns while experiencing VR, a safer VR experience can be promoted.
Besides, when evaluating users' immersion through a questionnaire, it is difficult to know at what point the feeling was aroused, and there is a possibility that it is evaluated differently from what was actually felt during the VR experience. Therefore, evaluation methodologies that can assess users' subjective states without interfering with their sense of immersion during the VR experience should be continuously developed.
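As an illustration of the kind of unobtrusive, in-experience monitoring discussed above, the following minimal sketch flags a possible adverse response from a streamed heart-rate signal by comparing a short-term average against a baseline established at the start of a session. It is a hypothetical example only: the signal choice, window sizes, and the 20% threshold are assumptions made for this sketch, not values drawn from any of the reviewed studies, and a deployed system would require validated, individually calibrated criteria.

```python
from collections import deque

class AdverseResponseMonitor:
    """Illustrative sketch: flag a sustained rise in heart rate during a VR session.

    Assumptions (not taken from the reviewed studies): samples arrive once per
    second, the first `baseline_s` seconds define the user's resting level, and a
    rise of more than `threshold` sustained over `window_s` seconds is treated as
    a cue to pause the experience and query the user.
    """

    def __init__(self, baseline_s=60, window_s=10, threshold=0.20):
        self.baseline_samples = deque(maxlen=baseline_s)
        self.window = deque(maxlen=window_s)
        self.threshold = threshold

    def update(self, heart_rate_bpm):
        """Feed one heart-rate sample; return True if a possible adverse response is flagged."""
        if len(self.baseline_samples) < self.baseline_samples.maxlen:
            self.baseline_samples.append(heart_rate_bpm)  # still building the baseline
            return False
        self.window.append(heart_rate_bpm)
        if len(self.window) < self.window.maxlen:
            return False
        baseline = sum(self.baseline_samples) / len(self.baseline_samples)
        recent = sum(self.window) / len(self.window)
        return recent > baseline * (1 + self.threshold)


# Example usage with synthetic data: 60 s at ~70 bpm, then a sustained rise to ~90 bpm.
if __name__ == "__main__":
    monitor = AdverseResponseMonitor()
    stream = [70] * 60 + [90] * 15
    for t, hr in enumerate(stream):
        if monitor.update(hr):
            print(f"t={t}s: sustained heart-rate elevation; consider pausing the VR task")
            break
```

In practice, such a rule would be only one input among several, combined with behavioral cues and post hoc questionnaires, and its thresholds would need to be validated per signal and per user group.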
6. Conclusion

This paper proposed systematic taxonomies for classifying the types of VR systems and conducted a systematic review of previous studies from the perspectives of HCI and UX. Today, with the commercialization of HMDs and the introduction of advanced technologies related to VR systems, the prospects of the VR industry are highly regarded. In line with this trend, UX studies on VR systems have increased significantly since 2017. However, in comparison with the development of VR technology, UX research in VR needs to be studied further, mainly from two aspects: issues related to VR devices and technology, and issues related to the research method.

A myriad of usage contexts can be defined from the combination of the components constituting a VR system, and the result of a UX evaluation can differ according to the context. Therefore, there may be unconfirmed negative effects in VR contexts that have not yet been examined. In addition, as new content is continually developed and released, VR is likely to be extended to several industries, such as education, e-commerce, and healthcare, beyond the entertainment market. Furthermore, because information and communications technologies and machine learning are being studied extensively in the academic field, research utilizing the sensor data of VR systems can now be carried out extensively. Hence, comprehensive UX studies of VR systems with diverse use environments should be conducted, and the proposed taxonomies and findings of this study are expected to contribute to future research in this field.

Funding

This work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education [NRF-2017R1D1A3B03034321].

ORCID

Yong Min Kim http://orcid.org/0000-0003-4796-490X
Ilsun Rhiu http://orcid.org/0000-0001-8229-7220
Myung Hwan Yun http://orcid.org/0000-0001-8554-3132

References

Ahn, S. J., Fox, J., Dale, K. R., & Avant, J. A. (2015). Framing virtual experiences: Effects on environmental efficacy and behavior over time. Communication Research, 42, 839–863. doi:10.1177/0093650214534973
Aïm, F., Lonjon, G., Hannouche, D., & Nizard, R. (2016). Effectiveness of virtual reality training in orthopaedic surgery. Arthroscopy: The Journal of Arthroscopic & Related Surgery, 32(1), 224–232. doi:10.1016/j.arthro.2015.07.023
Alves Fernandes, L. M., Cruz Matos, G., Azevedo, D., Rodrigues Nunes, R., Paredes, H., Morgado, L., … Cardoso, B. (2016). Exploring educational immersive videogames: An empirical study with a 3D multimodal interaction prototype. Behaviour & Information Technology, 35, 907–918. doi:10.1080/0144929X.2016.1232754
Anthes, C., García-Hernández, R. J., Wiedemann, M., & Kranzlmüller, D. (2016). State of the art of virtual reality technology. Paper presented at the Aerospace Conference, 2016 IEEE, Big Sky, MT.
Anton, D., Kurillo, G., & Bajcsy, R. (2018). User experience and interaction performance in 2D/3D telecollaboration. Future Generation Computer Systems, 82, 77–88. doi:10.1016/j.future.2017.12.055
Baños, R. M., Botella, C., Alcañiz, M., Liaño, V., Guerrero, B., & Rey, B. (2004). Immersion and emotion: Their impact on the sense of presence. CyberPsychology & Behavior, 7(6), 734–741. doi:10.1089/cpb.2004.7.734
Baus, O., & Bouchard, S. (2017). Exposure to an unpleasant odour increases the sense of presence in virtual reality. Virtual Reality, 21, 59–74. doi:10.1007/s10055-016-0299-3
Bhagat, K. K., Liou, W.-K., & Chang, C.-Y. (2016). A cost-effective interactive 3D virtual reality system applied to military live firing training. Virtual Reality, 20(2), 127–140. doi:10.1007/s10055-016-0284-x
Bian, Y., Yang, C., Gao, F., Li, H., Zhou, S., Li, H., … Meng, X. (2016). A framework for physiological indicators of flow in VR games: Construction and preliminary evaluation. Personal and Ubiquitous Computing, 20, 821–832. doi:10.1007/s00779-016-0953-5
Boletsis, C. (2017). The new era of virtual reality locomotion: A systematic literature review of techniques and a proposed typology. Multimodal Technologies and Interaction, 1(4), 24. doi:10.3390/mti1040024
Botella, C., Serrano, B., Baños, R. M., & Garcia-Palacios, A. (2015). Virtual reality exposure-based therapy for the treatment of post-traumatic stress disorder: A review of its efficacy, the adequacy of the treatment protocol, and its acceptability. Neuropsychiatric Disease and Treatment, 11, 2533. doi:10.2147/NDT
Bowman, D. A., & Hodges, L. F. (1999). Formalizing the design, evaluation, and application of interaction techniques for immersive virtual environments. Journal of Visual Languages & Computing, 10(1), 37–53. doi:10.1006/jvlc.1998.0111
Bowman, D. A., Kruijff, E., LaViola, J. J., Jr, & Poupyrev, I. (2001). An introduction to 3-D user interface design. Presence: Teleoperators & Virtual Environments, 10(1), 96–108. doi:10.1162/105474601750182342
Brade, J., Lorenz, M., Busch, M., Hammer, N., Tscheligi, M., & Klimant, P. (2017). Being there again – Presence in real and virtual environments and its relation to usability and user experience using a mobile navigation task. International Journal of Human Computer Studies, 101, 76–87. doi:10.1016/j.ijhcs.2017.01.004
Brooke, J. (1996). SUS-A quick and dirty usability scale. Usability Evaluation in Industry, 189(194), 4–7.
Bulu, S. T. (2012). Place presence, social presence, co-presence, and satisfaction in virtual worlds. Computers & Education, 58(1), 154–161. doi:10.1016/j.compedu.2011.08.024
Burdea, G. C., & Coiffet, P. (2003). Virtual reality technology. Hoboken, NJ: John Wiley & Sons.
Camporesi, C., & Kallmann, M. (2016). The effects of avatars, stereo vision and display size on reaching and motion reproduction. IEEE Transactions on Visualization and Computer Graphics, 22, 1592–1604. doi:10.1109/TVCG.2015.2440231
Cobb, S. V., Nichols, S., Ramsey, A., & Wilson, J. R. (1999). Virtual reality-induced symptoms and effects (VRISE). Presence: Teleoperators & Virtual Environments, 8(2), 169–186. doi:10.1162/105474699566152
Cruz-Neira, C., Sandin, D. J., DeFanti, T. A., Kenyon, R. V., & Hart, J. C. (1992). The CAVE: Audio visual experience automatic virtual environment. Communications of the ACM, 35(6), 64–73. doi:10.1145/129888.129892
Culbertson, H., & Kuchenbecker, K. J. (2017). Importance of matching physical friction, hardness, and texture in creating realistic haptic virtual surfaces. IEEE Transactions on Haptics, 10, 63–74. doi:10.1109/TOH.2016.2598751
de Jesus Oliveira, V. A., Nedel, L., & Maciel, A. (2018). Assessment of an articulatory interface for tactile intercommunication in immersive virtual environments. Computers & Graphics, 76, 18–28. doi:10.1016/j.cag.2018.07.007
Pick, S., Weyers, B., Hentschel, B., & Kuhlen, T. W. (2016). Design and evaluation of data annotation workflows for CAVE-like virtual environments. IEEE Computer Society, 22, 1452–1461.
Erfanian, A., Hu, Y., & Zeng, T. (2017). Framework of multiuser satisfaction for assessing interaction models within collaborative virtual environments. IEEE Transactions on Human-Machine Systems, 47, 1052–1065. doi:10.1109/THMS.2017.2700431
Forlizzi, J., & Battarbee, K. (2004). Understanding experience in interactive systems. Paper presented at the proceedings of the 5th conference on designing interactive systems: Processes, practices, methods, and techniques, Cambridge, MA.
Georgiou, T., & Demiris, Y. (2017). Adaptive user modelling in car racing games using behavioural and physiological data. User Modeling and User-adapted Interaction, 27, 267–311. doi:10.1007/s11257-017-9192-3
Girod, S., Schvartzman, S. C., Gaudilliere, D., Salisbury, K., & Silva, R. (2016). Haptic feedback improves surgeons' user experience and fracture reduction in facial trauma simulation. Journal of Rehabilitation Research & Development, 53(5), 561–570. doi:10.1682/JRRD.2015.03.0043
Harish, P., & Narayanan, P. J. (2013). Designing perspectively correct multiplanar displays. IEEE Transactions on Visualization and Computer Graphics, 19, 407–419. doi:10.1109/TVCG.2012.135
Hartson, R., & Pyla, P. S. (2012). The UX book: Process and guidelines for ensuring a quality user experience. San Francisco, CA: Elsevier.
Hassenzahl, M., & Tractinsky, N. (2006). User experience-a research agenda. Behaviour & Information Technology, 25(2), 91–97. doi:10.1080/01449290500330331
Henderson, A., Korner-Bitensky, N., & Levin, M. (2007). Virtual reality in stroke rehabilitation: A systematic review of its effectiveness for upper limb motor recovery. Topics in Stroke Rehabilitation, 14(2), 52–61. doi:10.1310/tsr1402-52
Howard, M. C. (2017). A meta-analysis and systematic literature review of virtual reality rehabilitation programs. Computers in Human Behavior, 70, 317–327. doi:10.1016/j.chb.2017.01.013
Jayaram, S., Connacher, H. I., & Lyons, K. W. (1997). Virtual assembly using virtual reality techniques. Computer-aided Design, 29(8), 575–584. doi:10.1016/S0010-4485(96)00094-2
Jerald, J. (2015). The VR book: Human-centered design for virtual reality. New York, NY: Morgan & Claypool.
Jerald, J. (2018). Human-centered VR design: Five essentials every engineer needs to know. IEEE Computer Graphics and Applications, 38(2), 15–21. doi:10.1109/MCG.2018.021951628
Jin, S.-A.-A. (2013). The moderating role of sensation seeking tendency in robotic haptic interfaces. Behaviour and Information Technology, 32, 862–873. doi:10.1080/0144929X.2012.687769
Jonsdottir, J., Bertoni, R., Lawo, M., Montesano, A., Bowman, T., & Gabrielli, S. (2018). Serious games for arm rehabilitation of persons with multiple sclerosis. A randomized controlled pilot study. Multiple Sclerosis and Related Disorders, 19, 25–29. doi:10.1016/j.msard.2017.10.010
Kageyama, A., & Tomiyama, A. (2016). Visualization framework for CAVE virtual reality systems. International Journal of Modeling, Simulation, and Scientific Computing, 7(04), 1643001.
Kennedy, R. S., Lane, N. E., Berbaum, K. S., & Lilienthal, M. G. (1993). Simulator sickness questionnaire: An enhanced method for quantifying simulator sickness. The International Journal of Aviation Psychology, 3(3), 203–220. doi:10.1207/s15327108ijap0303_3
Kmet, L. M., Cook, L. S., & Lee, R. C. (2004). Standard quality assessment criteria for evaluating primary research papers from a variety of fields. Edmonton, AB: Alberta Heritage Foundation for Medical Research.
Kober, S. E., & Neuper, C. (2013). Personality and presence in virtual reality: Does their relationship depend on the used presence measure? International Journal of Human-computer Interaction, 29(1), 13–25. doi:10.1080/10447318.2012.668131
Koutsabasis, P., & Vosinakis, S. (2018). Kinesthetic interactions in museums: Conveying cultural heritage by making use of ancient tools and (re-) constructing artworks. Virtual Reality, 22(2), 103–118. doi:10.1007/s10055-017-0325-0
Kozhevnikov, M., & Gurlitt, J. (2013). Immersive and non-immersive virtual reality system to learn relative motion concepts. Paper presented at the 3rd Interdisciplinary Engineering Design Education Conference (IEDEC), Santa Clara, CA.
Koźlak, M., Kurzeja, A., & Nawrat, A. (2013). Virtual reality technology for military and industry training programs. In A. Nawrat & Z. Kuś (Eds.), Vision based systems for UAV applications (pp. 327–334). Heidelberg, Germany: Springer International Publishing.
Kyriakou, M., Pan, X., & Chrysanthou, Y. (2017). Interaction with virtual crowd in immersive and semi-immersive virtual reality systems. Computer Animation and Virtual Worlds, 28, e1729. doi:10.1002/cav.1729
Lau, K. W., & Lee, P. Y. (2015). The use of virtual reality for creating unusual environmental stimulation to motivate students to explore creative ideas. Interactive Learning Environments, 23(1), 3–18. doi:10.1080/10494820.2012.745426
Lele, A. (2013). Virtual reality and its military utility. Journal of Ambient Intelligence and Humanized Computing, 4(1), 17–26. doi:10.1007/s12652-011-0052-4
Leonardis, D., Frisoli, A., Barsotti, M., Carrozzino, M., & Bergamasco, M. (2014). Multisensory feedback can enhance embodiment within an enriched virtual walking scenario. Presence: Teleoperators and Virtual Environments, 23(3), 253–266.
Lessiter, J., Freeman, J., Keogh, E., & Davidoff, J. (2001). A cross-media presence questionnaire: The ITC-sense of presence inventory. Presence: Teleoperators & Virtual Environments, 10(3), 282–297. doi:10.12968/bjon.2001.10.5.12353
Li, X., Yi, W., Chi, H.-L., Wang, X., & Chan, A. P. (2018). A critical review of virtual and augmented reality (VR/AR) applications in construction safety. Automation in Construction, 86, 150–162. doi:10.1016/j.autcon.2017.11.003
Liberati, A., Altman, D. G., Tetzlaff, J., Mulrow, C., Gøtzsche, P. C., Ioannidis, J. P., … Moher, D. (2009). The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: Explanation and elaboration. PLoS Medicine, 6(7), e1000100. doi:10.1371/journal.pmed.1000100
Lin, Y., Breugelmans, J., Iversen, M., & Schmidt, D. (2017). An Adaptive Interface Design (AID) for enhanced computer accessibility and rehabilitation. International Journal of Human-computer Studies, 98, 14–23. doi:10.1016/j.ijhcs.2016.09.012
Loup-Escande, E., Jamet, E., Ragot, M., Erhel, S., & Michinov, N. (2017). Effects of stereoscopic display on learning and user experience in an educational virtual environment. International Journal of Human–Computer Interaction, 33(2), 115–122. doi:10.1080/10447318.2016.1220105
Ma, M., & Zheng, H. (2011). Virtual reality and serious games in healthcare. In S. Brahnam & L. C. Jain (Eds.), Advanced computational intelligence paradigms in healthcare 6. Virtual reality in psychotherapy, rehabilitation, and assessment (pp. 169–192). Heidelberg, Germany: Springer-Verlag Berlin Heidelberg.
Manjrekar, S., Sandilya, S., Bhosale, D., Kanchi, S., Pitkar, A., & Gondhalekar, M. (2014). CAVE: An emerging immersive technology–A review. Paper presented at the Computer Modelling and Simulation (UKSim), 2014 UKSim-AMSS 16th International Conference on, Washington, DC.
McComas, J., MacKay, M., & Pivik, J. (2002). Effectiveness of virtual reality for teaching pedestrian safety. CyberPsychology & Behavior, 5(3), 185–190. doi:10.1089/109493102760147150
Mikropoulos, T. A., & Natsis, A. (2011). Educational virtual environments: A ten-year review of empirical research (1999–2009). Computers & Education, 56(3), 769–780. doi:10.1016/j.compedu.2010.10.020
Monteiro, P., Carvalho, D., Melo, M., Branco, F., & Bessa, M. (2018). Application of the steering law to virtual reality walking navigation interfaces. Computers & Graphics, 77, 80–87. doi:10.1016/j.cag.2018.10.003
Morán, A., Ramírez-Fernández, C., Meza-Kubo, V., Orihuela-Espina, F., García-Canseco, E., Grimaldo, A. I., & Sucar, E. (2015). On the effect of previous technological experience on the usability of a virtual rehabilitation tool for the physical activation and cognitive stimulation of elders. Journal of Medical Systems, 39, 104. doi:10.1007/s10916-015-0297-0
Moro, S. B., Bisconti, S., Muthalib, M., Spezialetti, M., Cutini, S., Ferrari, M., … Quaresima, V. (2014). A semi-immersive virtual reality incremental swing balance task activates prefrontal cortex: A functional near-infrared spectroscopy study. Neuroimage, 85, 451–460. doi:10.1016/j.neuroimage.2013.05.031
Muhanna, M. A. (2015). Virtual reality and the CAVE: Taxonomy, interaction challenges and research directions. Journal of King Saud University-Computer and Information Sciences, 27(3), 344–361. doi:10.1016/j.jksuci.2014.03.023
Narciso, D., Bessa, M., Melo, M., Coelho, A., & Vasconcelos-Raposo, J. (2017). Immersive 360° video user experience: Impact of different variables in the sense of presence and cybersickness. Universal Access in the Information Society, 1–11. doi:10.1007/s10209-017-0581-5
Naya, V. B., & Ibáñez, L. A. H. (2015). Evaluating user experience in joint activities between schools and museums in virtual worlds. Universal Access in the Information Society, 14(3), 389–398. doi:10.1007/s10209-014-0367-y
Newe, A., Becker, L., & Schenk, A. (2014). Application and evaluation of interactive 3D PDF for presenting and sharing planning results for liver surgery in clinical routine. PloS One, 9. doi:10.1371/journal.pone.0115697
Nichols, S., & Patel, H. (2002). Health and safety implications of virtual reality: A review of empirical evidence. Applied Ergonomics, 33(3), 251–271. doi:10.1016/S0003-6870(02)00020-0
Oprean, D., Simpson, M., & Klippel, A. (2018). Collaborating remotely: An evaluation of immersive capabilities on spatial experiences and team membership. International Journal of Digital Earth, 11(4), 420–436. doi:10.1080/17538947.2017.1381191
Pallavicini, F., Argenton, L., Toniazzi, N., Aceti, L., & Mantovani, F. (2016). Virtual reality applications for stress management training in the military. Aerospace Medicine and Human Performance, 87(12), 1021–1030. doi:10.3357/AMHP.4596.2016
Pantelidis, V. S. (1993). Virtual reality in the classroom. Educational Technology, 33(4), 23–27.
Pedroli, E., Greci, L., Colombo, D., Serino, S., Cipresso, P., Arlati, S., … Goulene, K. (2018). Characteristics, usability, and users experience of a system combining cognitive and physical therapy in a virtual environment: Positive bike. Sensors, 18(7), 2343. doi:10.3390/s18072343
Powers, J. C., Bieliaieva, K., Wu, S., & Nam, C. S. (2015). The human factors and ergonomics of P300-based brain-computer interfaces. Brain Sciences, 5(3), 318–356. doi:10.3390/brainsci5030318
Pratt, D. R., Zyda, M., & Kelleher, K. (1995). Virtual reality: In the mind of the beholder. Computer, 28, 17–19.
Rajeshkumar, S., Omar, R., & Mahmud, M. (2013). Taxonomies of user experience (UX) evaluation methods. Paper presented at the Research and Innovation in Information Systems (ICRIIS), 2013 International Conference on, Kuala Lumpur, Malaysia.
Reid, D. T. (2002). Benefits of a virtual play rehabilitation environment for children with cerebral palsy on perceptions of self-efficacy: A pilot study. Pediatric Rehabilitation, 5(3), 141–148. doi:10.1080/1363849021000039344
Reyes-Lecuona, A., & Diaz-Estrella, A. (2006). New interaction paradigms in virtual environments. Paper presented at the electrotechnical conference. MELECON 2006. IEEE Mediterranean, Malaga, Spain.
Riecke, B. E., Schulte-Pelkum, J., Caniard, F., & Bulthoff, H. H. (2005). Towards lean and elegant self-motion simulation in virtual reality. Paper presented at the Virtual Reality, 2005. Proceedings. VR 2005. IEEE, Arles, Camargue-Provence, France.
Rieuf, V., & Bouchard, C. (2017). Emotional activity in early immersive design: Sketches and moodboards in virtual reality. Design Studies, 48, 43–75. doi:10.1016/j.destud.2016.11.001
Roto, V., Law, E., Vermeeren, A., & Hoonhout, J. (2011). Bringing clarity to the concept of user experience. Retrieved from www.allaboutux.org/files/UX-WhitePaper.pdf
Santos, B. S., Dias, P., Pimentel, A., Baggerman, J.-W., Ferreira, C., Silva, S., & Madeira, J. (2009). Head-mounted display versus desktop for 3D navigation in virtual reality: A user study. Multimedia Tools and Applications, 41(1), 161. doi:10.1007/s11042-008-0223-2
Saposnik, G., & Levin, M., & Group, S. O. R. C. W. (2011). Virtual reality in stroke rehabilitation: A meta-analysis and implications for clinicians. Stroke, 42, 1380–1386. doi:10.1161/STROKEAHA.110.605451
Schuemie, M. J., Van Der Straaten, P., Krijn, M., & Van Der Mast, C. A. (2001). Research on presence in virtual reality: A survey. CyberPsychology & Behavior, 4(2), 183–201. doi:10.1089/109493101300117884
Schulze, K., & Krömker, H. (2010). A framework to measure user experience of interactive online products. Paper presented at the proceedings of the 7th international conference on methods and techniques in behavioral research, Eindhoven, The Netherlands.
Schvartzman, S. C., Silva, R., Salisbury, K., Gaudilliere, D., & Girod, S. (2014). Computer-aided trauma simulation system with haptic feedback is easy and fast for oral-maxillofacial surgeons to learn and use. Journal of Oral and Maxillofacial Surgery, 72, 1984–1993. doi:10.1016/j.joms.2014.05.007
Schwenk, M., Grewal, G. S., Honarvar, B., Schwenk, S., Mohler, J., Khalsa, D. S., & Najafi, B. (2014). Interactive balance training integrating sensor-based visual feedback of movement performance: A pilot study in older adults. Journal of Neuroengineering and Rehabilitation, 11, 164. doi:10.1186/1743-0003-11-164
Slater, M., & Wilbur, S. (1997). A framework for immersive virtual environments (FIVE): Speculations on the role of presence in virtual environments. Presence: Teleoperators & Virtual Environments, 6(6), 603–616. doi:10.1162/pres.1997.6.6.603
Son, H., Shin, S., Choi, S., Kim, S.-Y., & Kim, J. R. (2018). Interacting automultiscopic 3D with haptic paint brush in immersive room. IEEE Access, 6, 76464–76474. doi:10.1109/Access.6287639
Stanney, K., & Salvendy, G. (1998). Aftereffects and sense of presence in virtual environments: Formulation of a research and development agenda. International Journal of Human-computer Interaction, 10(2), 135–187. doi:10.1207/s15327590ijhc1002_3
Stanney, K. M., Mollaghasemi, M., Reeves, L., Breaux, R., & Graeber, D. A. (2003). Usability engineering of virtual environments (VEs): Identifying multiple criteria that drive effective VE system design. International Journal of Human-computer Studies, 58(4), 447–481. doi:10.1016/S1071-5819(03)00015-6
Sutcliffe, A., & Alrayes, A. (2012). Investigating user experience in second life for collaborative learning. International Journal of Human Computer Studies, 70, 508–525. doi:10.1016/j.ijhcs.2012.01.005
Sutherland, I. E. (1965). The ultimate display. In Multimedia: From Wagner to virtual reality (pp. 506–508). London: Macmillan and Co.
Takatalo, J., Nyman, G., & Laaksonen, L. (2008). Components of human experience in virtual environments. Computers in Human Behavior, 24(1), 1–15. doi:10.1016/j.chb.2006.11.003
Tergas, A. I., Sheth, S. B., Green, I. C., Giuntoli, R. L., II, Winder, A. D., & Fader, A. N. (2013). A pilot study of surgical training using a virtual robotic surgery simulator. JSLS-Journal of the Society of Laparoendoscopic Surgeons, 17, 219–226. doi:10.4293/108680813X13654754535872
Thüring, M., & Mahlke, S. (2007). Usability, aesthetics and emotions in human–technology interaction. International Journal of Psychology, 42(4), 253–264. doi:10.1080/00207590701396674
Tidoni, E., Abu-Alqumsan, M., Leonardis, D., Kapeller, C., Fusco, G., Guger, C., … Aglioti, S. M. (2017). Local and remote cooperation with virtual and robotic agents: A P300 BCI study in healthy and people living with spinal cord injury. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 25, 1622–1632. doi:10.1109/TNSRE.2016.2626391
Turchet, L., Burelli, P., & Serafin, S. (2013). Haptic feedback for enhancing realism of walking simulations. IEEE Transactions on Haptics, 6, 35–45. doi:10.1109/TOH.2012.51
Van Cutsem, J., Marcora, S., De Pauw, K., Bailey, S., Meeusen, R., & Roelands, B. J. S. M. (2017). The effects of mental fatigue on physical performance: A systematic review. Sports Medicine (Auckland, N.Z.), 47(8), 1569–1588. doi:10.1007/s40279-016-0672-0
Vermeeren, A. P., Law, E. L.-C., Roto, V., Obrist, M., Hoonhout, J., & Väänänen-Vainio-Mattila, K. (2010). User experience evaluation methods: Current state and development needs. Paper presented at the proceedings of the 6th Nordic conference on human-computer interaction: Extending boundaries, Reykjavik, Iceland.
Vinnikov, M., Allison, R. S., & Fernandes, S. (2017). Gaze-contingent auditory displays for improved spatial attention in virtual reality. ACM Transactions on Computer-Human Interaction, 24, 1–38. doi:10.1145/3067822
Vosinakis, S., Anastassakis, G., & Koutsabasis, P. (2018). Teaching and learning logic programming in virtual worlds using interactive microworld representations. British Journal of Educational Technology, 49(1), 30–44. doi:10.1111/bjet.2018.49.issue-1
Vourvopoulos, A., & Liarokapis, F. (2014). Evaluation of commercial brain-computer interfaces in real and virtual world environment: A pilot study. Computers and Electrical Engineering, 40, 714–729. Elsevier Ltd.
Wang, J., & Lindeman, R. (2015). Coordinated hybrid virtual environments: Seamless interaction contexts for effective virtual reality. Computers & Graphics, 48, 71–83. doi:10.1016/j.cag.2015.02.007
Witmer, B. G., & Singer, M. J. (1998). Measuring presence in virtual environments: A presence questionnaire. Presence, 7(3), 225–240. doi:10.1162/105474698565686
Xiong, W., Wang, Q.-H., Huang, Z.-D., & Xu, Z.-J. (2016). A framework for interactive assembly task simulation in virtual environment. International Journal of Advanced Manufacturing Technology, 85, 955–969. doi:10.1007/s00170-015-7976-3
Zhang, H. (2017). Head-mounted display-based intuitive virtual reality training system for the mining industry. International Journal of Mining Science and Technology, 27(4), 717–722. doi:10.1016/j.ijmst.2017.05.005

About the Authors

Yong Min Kim is a PhD candidate in the Department of Industrial Engineering at Seoul National University, South Korea. He received his BS degree in Biosystems Engineering from Seoul National University in 2014. His research focuses on human-computer interaction, user-centered design, and user experience, especially virtual reality and augmented reality systems.

Ilsun Rhiu is currently an assistant professor in the Division of Big Data and Management Engineering at Hoseo University, South Korea. He received his PhD degree in Industrial Engineering from Seoul National University, South Korea, in 2015. His research interests include human-computer interaction, user-centered design, and user research methods.

Myung Hwan Yun is a Professor in the Department of Industrial Engineering at Seoul National University. He received his PhD degree in Industrial and Manufacturing Engineering at Penn State University, USA, in 1994. His research interests include human factors, user-centered design, affective engineering, and intelligent human–machine interfaces.
Appendix
C1: Question/objective sufficiently described?
C2: Study design evident and appropriate?
C3: Method of subject/comparison group selection or source of information/input variables described and appropriate?
C4: Subject (and comparison group, if applicable) characteristics sufficiently described?
C5: If interventional and random allocation was possible, was it described?
C6: If interventional and blinding of investigators was possible, was it reported?
C7: If interventional and blinding of subjects was possible, was it reported?
C8: Outcome and (if applicable) exposure measure(s) well defined and robust to measurement/misclassification bias? Means of assessment reported?
C9: Sample size appropriate?
C10: Analytic methods described/justified and appropriate?
C11: Some estimate of variance is reported for the main results?
C12: Controlled for confounding?
C13: Results reported in sufficient detail?
C14: Conclusions supported by the results?
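For readers applying a checklist such as the one above, the following minimal sketch shows one common way of turning item ratings into a summary quality score, assuming the scoring convention described by Kmet, Cook, and Lee (2004): each applicable item is rated 2 (yes), 1 (partial), or 0 (no), items judged not applicable are excluded, and the summary score is the obtained total divided by the maximum obtainable total. The function name and the example ratings are illustrative only.

```python
def kmet_summary_score(ratings):
    """Compute a summary quality score from per-criterion ratings.

    `ratings` maps criterion IDs (e.g., "C1".."C14") to 2 (yes), 1 (partial),
    0 (no), or None (not applicable). Not-applicable items are excluded from
    both the obtained and the maximum possible totals, following the scoring
    convention of Kmet, Cook, and Lee (2004).
    """
    applicable = {k: v for k, v in ratings.items() if v is not None}
    if not applicable:
        raise ValueError("No applicable criteria were rated.")
    obtained = sum(applicable.values())
    possible = 2 * len(applicable)
    return obtained / possible


# Illustrative example: C5-C7 rated not applicable (non-interventional study).
example = {f"C{i}": 2 for i in range(1, 15)}
example.update({"C5": None, "C6": None, "C7": None, "C9": 1, "C11": 1})
print(round(kmet_summary_score(example), 2))  # prints 0.91 for this hypothetical rating
```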