eISSN: 2093-8462 http://jesk.or.kr
Open Access, Peer-reviewed
Junyoung Ahn, Seungho Choi, Minjae Lee, Kyungdoh Kim
DOI: 10.5143/JESK.2017.36.4.267 | Epub 2017 August 29
Abstract
Objective: The aim of this study is to investigate key user experience factors of interactions for Head Mounted Display (HMD) devices in the Virtual Reality Environment (VRE).
Background: Research on virtual reality interaction has been conducted steadily as interaction methods and virtual reality devices have improved. Recently released virtual reality devices are all head mounted display (HMD) based, and HMD-based interaction types include remote controller, head tracking, and hand gesture. However, there are few studies on the usability evaluation of virtual reality, and the usability of HMD-based virtual reality in particular has not been investigated. It is therefore necessary to study the usability of HMD-based virtual reality.
Method: HMD-based VR devices released recently offer only three interaction types: 'Remote Controller', 'Head Tracking', and 'Hand Gesture'. We reviewed 113 studies to identify the user experience factors or evaluation scales used for each interaction type. The key user experience factors and relevant evaluation scales were then summarized according to how frequently they were used across the studies.
Results: Each interaction type has its own key user experience factors. The remote controller's key factors are 'Ease of learning', 'Ease of use', 'Satisfaction', 'Effectiveness', and 'Efficiency'. Head tracking's key factors are 'Sickness', 'Immersion', 'Intuitiveness', 'Stress', 'Fatigue', and 'Ease of learning'. Finally, hand gesture's key factors are 'Ease of learning', 'Ease of use', 'Feedback', 'Consistent', 'Simple', 'Natural', 'Efficiency', 'Responsiveness', 'Usefulness', 'Intuitiveness', and 'Adaptability'.
Conclusion: We identified key user experience factors for each interaction type through literature review. However, we did not consider objective measures because each study adopted different performance factors.
Application: The results of this study can be used when evaluating HMD-based interactions in virtual reality in terms of usability.
Keywords
Virtual reality, Interaction, User experience factors, Literature review
VR (Virtual Reality) refers to an environment in which a person feels present even though he/she is not physically located there, and a VR device is a means of sending a person to such a reality (Rebelo et al., 2012). A VR device provides not only visual immersion but also feedback such as sound and tactile sensation (Desai et al., 2014). Diverse types of devices exist to implement VR, and various classifications have been proposed. Buxton and Fitzmaurice (1998) classified VR systems into Head Mounted Display (HMD) VR, Caves, and Chameleon-style VR. HMD VR, first suggested by Sutherland (1968), refers to wearing a helmet or goggles on the head. Caves, first proposed by Cruz-Neira et al. (1992), show 3D virtual reality on the walls of a cube-shaped chamber; a person typically enters the chamber to view the VR. Lastly, Chameleon-style VR, first suggested by Fitzmaurice (1993), refers to experiencing virtual reality through a handheld-display VR device.
Meanwhile, Moon (2014) classified VR system environments into four types. First, immersive VR is a system that immerses the user in a vivid environment using special devices such as an HMD, data glove, and data suit. Second, tele-robotics VR combines an immersion system with a robot, giving the user the effect of being present in a distant space through the robot. Third, desktop VR is a VR system that a user can access relatively easily, using devices such as 3D glasses and a joystick with an ordinary desktop monitor. Lastly, third-person VR is a system that lets a user feel as if he/she existed in the VR by projecting his/her image into the virtual space of a computer through a video camera.
Among the diverse VR systems, all VR devices commercialized as of 2016 are of the HMD type. Representative HMD-based VR devices are the Samsung Gear VR, Oculus Rift, HTC Vive, and Sony PlayStation VR. With VR devices commercialized by manufacturers such as Oculus and HTC, users can experience VR in a more sophisticated and cost-effective way than in the past (Castaneda and Pacampara, 2016; Rhee and Kim, 2016).
The most remarkable feature of the HMD is full immersion. The concept of VR with immersion was first reported by Sutherland (1965); full immersion means offering a feeling of 'being in', rather than just 'looking at' (Shneiderman, 1998). According to Kjeldskov (2001), the interaction methods of an HMD can be divided into six (head tracking, joystick, trackball, position tracking, virtual hand, and virtual pointer), and interaction in the VR environment can be classified into three types (orientating, moving, and acting). Orientating, the first interaction type, is looking around the VR; head tracking, a joystick, or a trackball is used while wearing an HMD. Moving, the second interaction type, is moving around the VR; it uses position tracking, a joystick, or a trackball. When the virtual space is larger than the real space available while wearing an HMD, position tracking, which maps the user's actual body movement to movement in VR, can be problematic because of the limited physical space. Lastly, acting, the third interaction type, refers to picking up, moving, and rotating objects; a virtual hand or a virtual pointer is used while wearing an HMD.
Although many studies on VR interactions were carried out in the past (Bowman, 1999; Moeslund, 2001; Poupyrev et al., 1996), recently commercialized VR devices are all of the HMD type, and there have been changes, including newly added interaction methods. Through such changes, better VR experiences can be offered to users than in the past. Despite the diversification of interactions and contents, research on their usability remains scarce. Recent studies have focused mainly on VR performance; however, research is needed on which factors of HMDs and interaction devices affect user experience. This study aims to organize, for each interaction used in VR, the factors affecting user experience and the important factors to consider in design, so that they can be used in future studies. The key user experience factors organized here are significant in that they were derived through a literature review grounded in VR devices, and in that the presented factors can be applied as scales in future HMD-based VR interaction evaluation studies.
This study was conducted through the following procedure (Figure 1). Based on the interactions classified by Kjeldskov (2001) and the currently commercialized major VR devices, the current VR interactions were classified. The major devices considered were the Oculus Rift, HTC Vive, Samsung Gear VR, and Sony PlayStation VR. The interaction types defined in this study are remote controller, head tracking, and hand gesture. For the remote controller, manipulation types such as touch pad, joystick, and trackball were grouped together, since manipulation modes differ by manufacturer.
To examine studies corresponding to each interaction type, 113 studies were collected. VR-related experimental studies were gathered, and non-VR studies were included when they were applicable to HMD-type VR devices. From the collected studies, usability studies corresponding to each interaction type were identified. In this process, a screening step validated whether the usability factors concerned are important factors for the interaction. Following the method applied by Kim et al. (2012), usability factors or evaluation scales adopted in two or more studies were judged to be important, as illustrated in the sketch below.
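As an illustration of this screening rule, the following minimal sketch counts how often each factor appears across studies and keeps those adopted by two or more. The study names and factor assignments are illustrative placeholders, not the actual coding data of this review.

```python
from collections import Counter

# Illustrative placeholder data: which factors each reviewed study reports.
studies = {
    "Fernandes et al. (2010)": ["ease of learning", "ease of use", "satisfaction"],
    "Wang et al. (2011)": ["efficiency", "effectiveness", "satisfaction"],
    "Boulay et al. (2011)": ["efficiency", "effectiveness", "satisfaction"],
}

# Count how many studies mention each factor.
counts = Counter(f for factors in studies.values() for f in factors)

# Keep only factors adopted by two or more studies (the screening rule).
key_factors = sorted(f for f, n in counts.items() if n >= 2)
print(key_factors)  # ['effectiveness', 'efficiency', 'satisfaction']
```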
3.1 Remote controller
Although new interactions such as head tracking and hand gesture have been introduced in the HMD environment, the controller type, in which users operate a dedicated auxiliary device, is still widely used. In the representative HMD devices, the Oculus Rift and HTC Vive, a remote controller is provided as the basic interaction, and in the Samsung Gear VR a touch pad on the side of the device is used as the basic user interaction.
The remote controller is a familiar interaction type that users have employed for a long time. Existing usability evaluation studies on remote controllers mainly examined objective evaluation factors, such as task time and error rate, which prioritize recognition ability. This study aims to identify salient remote controller usability evaluation factors in the HMD environment by also including studies that measured subjective factors through questionnaires or interviews. Previous studies on remote controller usability evaluation are as follows:
Brown et al. (2015) conducted their work after noticing that, while HCI studies on computer games had increased, comprehensive studies linking games and game controllers were insufficient. McNamara and Kirakowski (2006) identified what needs to be measured in the experience, usability, and functionality aspects of game controllers, and made comparative measurements after case studies with gamepad, keyboard, and steering-wheel controller technologies widely used as game controllers. In the usability aspect, efficiency, effectiveness, and satisfaction were used: efficiency was measured through a subjective mental effort questionnaire, effectiveness with lap time as the scale, and satisfaction using the consumer product questionnaire (CPQ). Wang et al. (2011) developed an interface combining a powered wheelchair used by long-term-care residents with visual, auditory, and haptic feedback, and carried out a study to prevent the wheelchair from colliding with obstacles through improved interactions. In the usability aspect, efficiency, effectiveness, and satisfaction were again used: task performance was observed through the power-mobility indoor driving assessment (PIDA) for efficiency, effectiveness was measured through personal ratings using the National Aeronautics and Space Administration Task Load Index (NASA-TLX), and satisfaction was measured using the Quebec User Evaluation of Satisfaction with Assistive Technology (QUEST) and the Psychosocial Impact of Assistive Devices Scale (PIADS). Boulay et al. (2011) studied therapy for dementia patients who sang or played songs selected by pointing at a virtual keyboard with a WiiMote controller; efficiency, effectiveness, and satisfaction were used as usability evaluation factors in that study as well.
Many studies used various usability evaluation factors as controller usability evaluation methods. Rupp et al. (2013) compared existing joystick and keyboard interactions with Xbox controller interaction; performance, workload, and usability were compared between interaction types, and the System Usability Scale (SUS) was used for usability. Fernandes et al. (2010) performed a usability evaluation of 3D controllers in the 3D avatar game 'Second Life', using ease of learning, ease of remembering, ease of use, frequency of errors, and subjective satisfaction as usability evaluation factors. McMahan et al. (2010) compared a vehicle steering-wheel-shaped natural interaction controller with a conventionally shaped non-natural interaction controller in a Mario Kart game; usability evaluation factors included most liked, ease of use, most fun, and performance. Kurniawan et al. (2003) evaluated a joystick-operated full-screen magnifier for visually impaired people, using overall usability itself as the evaluation factor.
Therefore, effectiveness, efficiency, satisfaction, ease of learning, ease of use, ease of remembering, frequency of errors, most liked, most fun, and performance can be regarded as representative user experience factors adopted in existing studies.
3.2 Head tracking
Kjeldskov (2001) notes that head tracking in an HMD is a very intuitive and natural interaction method, because a user can look in a desired direction just by turning his/her head. In recently used HMDs, including the Gear VR, head tracking is mainly used to change one's view by turning one's head.
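To make this view-control mechanism concrete, the sketch below maps head yaw and pitch angles from an HMD's orientation sensor to a camera forward vector. The function name and the angle convention are illustrative assumptions, not taken from any of the reviewed systems.

```python
import math

def view_direction(yaw_deg: float, pitch_deg: float) -> tuple[float, float, float]:
    """Convert head yaw/pitch (degrees) into a unit 'forward' vector (x, y, z).

    Assumes yaw = 0 looks down +z and pitch is positive when looking up;
    a VR renderer would point its camera along the returned vector.
    """
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.sin(yaw),  # x: left/right component
            math.sin(pitch),                  # y: up/down component
            math.cos(pitch) * math.cos(yaw))  # z: forward component

print(view_direction(90.0, 0.0))  # head turned 90 degrees right -> roughly (1, 0, 0)
```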
With regard to head tracking, studies have mainly examined abnormal symptoms caused by head movement; for example, motion sickness symptoms such as dizziness from turning one's head have been measured through interviews or questionnaires. There have also been studies of subjective evaluation factors, for instance measuring the immersion, presence, concentration, stress, and preference felt while experiencing VR visually, through interviews or questionnaires. In addition, some studies measured only objective evaluation factors, such as task time and error rate.
However, more studies examined two or three of these categories together (symptoms caused by head movement, subjective factors, and objective factors) than measured only one of them, and the detailed factors examined (e.g., immersion, dizziness, task time) differed from study to study. The head tracking studies examined in this review are as follows:
Howarth and Costello (1997) measured differences in simulator sickness symptoms, visual symptoms, and a malaise scale between a VDU (visual display unit) environment and an HMD-wearing environment through questionnaires, and identified dropout causes through interviews with participants who gave up during the experiment. First, the Pensacola Simulator Sickness Questionnaire (SSQ) of Kennedy et al. (1993) was used to measure simulator sickness symptoms; it rates the following on a Likert scale: general discomfort, fatigue, boredom, drowsiness, headache, eyestrain, sweating, claustrophobia, disorientation, nausea, and difficulty in concentration. Second, the questionnaire of Howarth and Istance (1985) was used for visual symptoms; it rates the following on a Likert scale: tired eyes, sore or aching eyes, irritated eyes, watering or runny eyes, dry eyes, burning eyes, blurred vision, double vision, and general visual discomfort. Third, the Malaise Rating (MR) Scale questionnaire was used for the malaise scale, scored on a 5-point Likert scale: 1 indicates no symptoms, 2 some symptoms but no nausea, 3 mild nausea, 4 moderate nausea, and 5 severe nausea. Lastly, the dropout causes measured through interviews were three cases of headache and one case of nausea. Their study suggests that identifying physical and visual symptoms and the malaise scale can be important.
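Since several of the reviewed studies score sickness with the SSQ, a brief sketch of its scoring may help. The subscale weights below (nausea 9.54, oculomotor 7.58, disorientation 13.92, total scaled by 3.74) are the published values from Kennedy et al. (1993); the item groupings in the example are abbreviated for illustration, whereas the real instrument has 16 items, some of which load on more than one subscale.

```python
# Published subscale weights from Kennedy et al. (1993).
WEIGHTS = {"nausea": 9.54, "oculomotor": 7.58, "disorientation": 13.92}

def ssq_scores(ratings: dict[str, list[int]]) -> dict[str, float]:
    """ratings: subscale name -> item ratings (0 = none ... 3 = severe)."""
    scores = {name: sum(items) * WEIGHTS[name] for name, items in ratings.items()}
    # The total severity score scales the sum of the three raw sums by 3.74.
    scores["total"] = sum(sum(items) for items in ratings.values()) * 3.74
    return scores

# Example: a participant with mild nausea-related and oculomotor symptoms.
print(ssq_scores({"nausea": [1, 0, 2], "oculomotor": [1, 1, 0], "disorientation": [0]}))
```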
Pal et al. (2016) compared three types of head tracking: complete 6-DOF (degrees of freedom), rotational 3-DOF, and translational 3-DOF. For the comparison, task time and task errors were evaluated as objective measurements, while reported usability and presence were evaluated as subjective measurements. Simulator sickness was also evaluated using the SSQ, as in Howarth and Costello (1997). For usability, a subjective measurement, the System Usability Scale (SUS) of Brooke (1996) was used, and presence was evaluated using the Slater-Usoh-Steed Presence Questionnaire (SUS PQ) of Slater et al. (1995). Unlike Howarth and Costello (1997), their study evaluated both subjective and objective measurements; notably, they scored not only simulator sickness through the SSQ but also usability and presence through the SUS and SUS PQ.
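The three tracking conditions compared by Pal et al. (2016) can be expressed compactly: a 6-DOF pose carries three translational and three rotational degrees of freedom, and each 3-DOF condition discards the other half. The sketch below illustrates that reading; the type and field names are assumptions for illustration.

```python
from dataclasses import dataclass, replace

@dataclass
class HeadPose:
    """A 6-DOF head pose: 3 translational plus 3 rotational degrees of freedom."""
    x: float; y: float; z: float           # position in meters
    yaw: float; pitch: float; roll: float  # orientation in degrees

def rotational_3dof(pose: HeadPose) -> HeadPose:
    """Rotation-only tracking condition: discard head translation."""
    return replace(pose, x=0.0, y=0.0, z=0.0)

def translational_3dof(pose: HeadPose) -> HeadPose:
    """Translation-only tracking condition: discard head rotation."""
    return replace(pose, yaw=0.0, pitch=0.0, roll=0.0)

full = HeadPose(x=0.1, y=1.6, z=-0.2, yaw=35.0, pitch=-10.0, roll=0.0)
print(rotational_3dof(full))     # orientation kept, position zeroed
print(translational_3dof(full))  # position kept, orientation zeroed
```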
Ragan et al. (2016) compared differences between the HMD and Cave environments under three amplified views of the virtual scene (1x amplification with a 360° horizontal display range, 1.5x amplification with a 270° range, and 4x amplification with a 120° range). Performance and errors were measured as objective measurements, and sickness was scored using the SSQ as a subjective measurement. In addition, participants' 3D video game experience and their impressions after the experiment were collected through interviews. Their study is notable in that, beyond SSQ scoring, it examined prior experience and found a correlation: the more 3D video game experience participants had, the lower their SSQ scores and the higher their task performance.
Yano et al. (2014) compared widening, shifting, and diverging, three methods of expanding the field of view in VR while wearing an HMD. The experimental task was to find targets in VR through head tracking. The number of detected targets and the accumulated head rotation in degrees were evaluated as objective measurements. Subjective measurements were also taken: the naturalness of viewing and dizziness were rated on a 10-point Likert scale. Workload was evaluated in addition, using the NASA-TLX (NASA Task Load Index) of Hart and Staveland (1988), whose workload evaluation factors are mental demand, physical demand, temporal demand, effort, performance, and frustration level.
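For reference, the weighted NASA-TLX score combines the six subscale ratings using weights obtained from 15 pairwise comparisons between subscales. The sketch below shows that standard computation; the example ratings and weights are illustrative values only.

```python
# Weighted NASA-TLX scoring (after Hart and Staveland, 1988).
SUBSCALES = ["mental demand", "physical demand", "temporal demand",
             "performance", "effort", "frustration"]

def nasa_tlx(ratings: dict[str, float], weights: dict[str, int]) -> float:
    """ratings: 0-100 per subscale; weights: wins in 15 pairwise comparisons."""
    assert sum(weights.values()) == 15, "weights must sum to 15 comparisons"
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

ratings = dict(zip(SUBSCALES, [70, 30, 55, 40, 60, 35]))
weights = dict(zip(SUBSCALES, [4, 1, 3, 2, 3, 2]))
print(f"Overall workload: {nasa_tlx(ratings, weights):.1f} / 100")  # 53.7
```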
Bruck and Watters (2011) measured cybersickness in a 3D-image VR environment while participants wore shutter glasses rather than an HMD. The SSQ of Kennedy et al. (1993) and the Anxiety Questionnaire of Kim et al. (2005) were used, and respiratory rate and heart rate were also measured. Through these measurements, cybersickness, a vision factor, an arousal factor, and fatigue were identified.
These previous studies mainly measured physical or visual sickness through the SSQ or NASA-TLX; however, some measured only particular symptoms, such as dizziness or stress, through Likert-scale questionnaires or interviews. Among the head tracking studies, some assessed only objective or subjective measurements without evaluating physical or visual sickness, unlike the mainstream studies.
Van Nieuwenhuizen et al. (2016) compared image interfaces (inside view, outside view) and search methods (scrolling plus head tracking, head tracking only) for viewing a cylinder-shaped 3D image while wearing an HMD. As objective measurements, the average speed of task performance, number of errors, and rotation in degrees were measured. As subjective measurements, immersion, ease of viewing, intuitiveness, ease of learning, ease of remembering, physical or mental stress, and enjoyment of the interface were measured on Likert scales. Although their study evaluated physical and mental stress as subjective measurements, sickness was not measured with an instrument such as the SSQ or NASA-TLX; only Likert scales were used, which distinguishes it from the studies examined above.
In the study of Martel et al. (2016), in which an HMD and a mouse were used as controllers in a VR game environment, task time, number of errors, and number of successes were measured as objective measurements. As a subjective measurement, immersion was evaluated using selected questions from the Immersive Experience Questionnaire (IEQ) of Jennett et al. (2008).
Meanwhile, Mourant and Jaeger (2000) compared results for different HOV (high-occupancy vehicle) signage using a virtual-environment driving simulator while participants wore a helmet-mounted display containing a gyro sensor. For the comparison, only objective measurements were used: driving speed, driving distance, and driving time.
3.3 Hand gesture
Murthy and Jadon (2009) viewed VR as a domain that can be manipulated with a deeper sense of reality through 3D gesture interactions, and they divided gestures into vision-based and glove-based according to the recognition method. This study organizes the gesture literature according to that classification.
3.3.1 Vision-based gesture
Murthy and Jadon (2009) explained that vision-based gestures must be technically efficient: interactions become more effective and cost-effective when the technology or algorithms that recognize gestures are efficient. They also asserted that a user should be able to adapt easily, which is important for scalability to environments whose scale differs from reality, such as the desktop, as well as to the virtual environment. Krum et al. (2002) studied multimodal interactions operating the VGIS (virtual geographic information system) proposed by Lindstrom et al. (1997); in particular, responsiveness, ease of learning, and ease of use were presented as scales and standards for gesture manipulation. In other words, how quickly a user can respond should be examined, and whether a novice can learn easily and the recognition load from the user's perspective need to be considered. Baudel and Beaudouin-Lafon (1993) proposed remote control using vision-based gestures and identified two issues with gesture input. The first is fatigue caused during gesture interaction, which involves more muscles than keyboard or voice interaction; fatigue should therefore be minimized by enhancing efficiency, for example by providing simple and quick gestures. The second concerns ease of learning: the gestures used for interaction should be ones a user already knows, so simpler, more natural, and more consistent gestures need to be presented. They also noted that immediate feedback on gestures is important. Song et al. (2012) studied vision-based gesture interactions for manipulating 3D objects in the virtual environment, using the manipulation of a handle bar with both hands in the real environment as a metaphor, thereby offering more effective and reality-friendly interaction experiences. Through such manipulation, they reported that users can interact intuitively, and that the interaction needs controllability that is intuitive and easy to remember. Kim et al. (2005) presented gesture interactions for 3D modeling and reported that visual feedback improved interaction accuracy: visually rendering a 3D model of the user's hand enables more accurate interactions than not doing so. They pointed out that the usability of the interaction between a user and a virtual model is a key issue in VR, and reported that gestures that are easy to use enable more meaningful modeling.
3.3.2 Glove-based gesture
Murthy and Jadon (2009) asserted that easy and natural interactions are important for gesture interaction using a data glove, presumably because wearing a data glove and a cable connected to a computer can cause inconvenience, unlike the vision-based mode, in which relatively natural interaction is possible. Yang and Kim (2002) proposed a motion-training system using head tracking in the HMD VR environment and identified multimodal feedback as a key objective in implementing the system; for example, they explained that haptic feedback needs particular consideration for a motion like rowing. They also evaluated the proposed virtual environment system against the existing real environment, validating whether a user can learn a motion efficiently and quickly by measuring motion speed and error rate. Bowman et al. (2001) studied interaction design for 3D interfaces such as virtual environments and classified interaction devices into input and output devices. They emphasized that input devices require natural, efficient, and task-harmonizing input, and that for output devices, feedback such as haptics will be increasingly important in 3D UIs in VR. Regarding interaction skills, they identified user training and learning as key guidelines in interaction processes including navigation. Stanney et al. (1998) organized the ergonomic issues in virtual environments and reported that maximizing performance efficiency in the virtual world is a basic requirement for an effective virtual environment; from the navigation perspective, a user's task performance will be severely limited if navigation within VR is not conducted effectively. They saw haptic feedback as an important issue in the virtual environment with respect to the human senses; haptic feedback in VR is required for better performance and presence (Biocca, 1992).
The previous studies above were examined according to the two classifications. Regardless of gesture classification, some usability factors apply to VR interaction devices in other interaction methods besides gestures. Feedback is mentioned as a key usability factor in numerous previous studies on gestures. Welch et al. (1996) saw the feedback between a user's motion and the display's motion as a key factor that can positively affect the vividness of interaction, and suggested that feedback delay needs to be at most 200~220ms for vivid interaction (see the sketch below). Schuemie et al. (2001) studied presence in the virtual environment and insisted that tools need to be ready-to-hand so that a user can use them immediately, and that a user should always feel their usefulness when performing any task. Kjeldskov (2001) compared interactions by classifying VR displays into fully immersive and partially immersive displays; in particular, he stressed that interaction via a virtual hand within the virtual environment, for close-by interaction, needs to be intuitive and natural.
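The 200~220ms feedback-delay bound from Welch et al. (1996) can be treated as a simple latency budget. The sketch below is a hypothetical instrumentation hook, not part of any VR runtime's API: it compares the timestamp of a sensed user motion against the moment its result is displayed and warns when the gap exceeds the budget.

```python
import time

# Upper bound on feedback delay for vivid interaction (Welch et al., 1996).
MAX_FEEDBACK_DELAY_S = 0.220

def check_feedback_delay(input_timestamp: float) -> float:
    """Return the motion-to-display delay and warn if it exceeds the budget.

    input_timestamp is a time.monotonic() reading taken when the user's
    motion was sensed; call this right after the corresponding frame is shown.
    """
    delay = time.monotonic() - input_timestamp
    if delay > MAX_FEEDBACK_DELAY_S:
        print(f"feedback delay {delay * 1000:.0f} ms exceeds the ~200-220 ms budget")
    return delay
```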
Based on the previous studies investigated above, the key user experience factors for each interaction have been summarized. Differing terms used across the literature were revised for consistency, and only the factors adopted by two or more studies are summarized below.
User experience factors for controllers are presented in Table 1. The controller is an interaction type that has been adopted and used for longer than the others; for this reason, the controller type using a dedicated auxiliary device is still widely used in the new HMD environment, and it is very familiar to users. For controllers, the traditional usability evaluation factors used for a long time are judged to be relatively more suitable than for the other interaction types. Satisfaction with the interaction was the most used usability evaluation factor. Since satisfaction concerns effects on users, a controller that allows natural, situation-appropriate interactions is required so that users do not experience inconvenience and can maintain positive attitudes. However, satisfaction is an important factor for various platforms and interaction types, not only for controllers, so this study excluded satisfaction from the controller-specific usability factors. Next, ease of use and efficiency (how efficiently a controller mediates between the user and the VR environment) are regarded as important; since ease of use and efficiency may differ depending on the type of controller, the selection of the controller device becomes a key issue in evaluating usability in the VR environment. Usability evaluation factors of effectiveness and ease of learning were also confirmed to be important.
Table 1. Usability factors adopted in remote controller studies (Rupp et al., 2013; Fernandes et al., 2010; Brown et al., 2015; Kurniawan et al., 2003; Wang et al., 2011; Boulay et al., 2011; McMahan et al., 2010)

Factor | Studies adopting it
Ease of learning | 2 of 7
Ease of use | 4 of 7
Satisfaction | 5 of 7
Effectiveness | 3 of 7
Efficiency | 4 of 7
In conclusion, the key user experience factors for the controller are ease of learning, ease of use, effectiveness, and efficiency.
The key user experience factors for head tracking are sickness, immersion, fatigue, stress, ease of learning, and intuitiveness (Table 2). Because the user moves his/her head in head tracking, unlike with a controller or gestures, sickness was mentioned in the most papers and appears to be the most important factor. However, the questionnaires used to measure sickness, including the SSQ and the Malaise Rating scale, are not unified, so consideration of which questionnaire is most suitable seems necessary. Immersion and intuitiveness also deserve particular attention for head tracking, since the visual change seen through an HMD when turning one's head is larger than with a controller or gestures. Immersion can be evaluated using the IEQ in addition to Likert scales, and intuitiveness can be assessed through the SUS questionnaire, Likert scales, and interviews. Because the user must move his/her head and the motion becomes larger, stress and fatigue also need to be included in usability evaluation; the NASA-TLX questionnaire can be used for stress, and fatigue can be evaluated through Likert scales or interviews. Finally, when a user performs a movement task through head tracking, how easy it is to learn the view changes that follow head movement can be a usability evaluation factor, and it can be evaluated through Likert-scale questionnaires and interviews.
Table 2. Usability factors adopted in head tracking studies (Howarth and Costello, 1997; Pal et al., 2016; Ragan et al., 2016; Yano et al., 2014; Bruck and Watters, 2011; Van Nieuwenhuizen et al., 2016; Martel et al., 2016)

Factor | Studies adopting it
Sickness | 5 of 7
Immersion | 2 of 7
Intuitiveness | 2 of 7
Stress | 3 of 7
Fatigue | 2 of 7
Ease of learning | 2 of 7
There are 11 key user experience factors for gestures (Table 3). Feedback is the factor emphasized and adopted most in the previous studies and can be considered the most important factor for gestures. For gesture accuracy in the VR environment, visual feedback needs to be provided, and haptic feedback should also be offered so that users can have the same experiences in the virtual environment as in the real world. After feedback, ease of learning and ease of use were confirmed to be important factors in the interaction process. Although these factors are emphasized for all interaction types, they matter particularly for gestures because, unlike the controller mode, gestures must induce interactions that feel as natural as in the real environment. To offer ease of learning and ease of use, a gesture vocabulary that is familiar and already known from the real environment should be applied to make adaptation easy (adaptability), and gestures need to be natural ones reflecting users' mental models of existing manipulation methods. In this context, a consistent, simple, and intuitive gesture vocabulary, with a number of gestures small enough to remember given users' cognitive load, needs to be presented. From the manipulation-command perspective, a gesture should be useful for the command to which it is mapped (usefulness), and only commands for which gesture manipulation is efficient should be assigned gestures (efficiency). Lastly, from the responsiveness perspective, there should be no gap between the user's manipulation in the real environment and the response speed in the virtual environment.
Table 3. Usability factors adopted in hand gesture studies (Argyros and Lourakis, 2006; Krum et al., 2002; Baudel and Beaudouin-Lafon, 1993; Bowman et al., 2001; Kim et al., 2005; Stanney et al., 1998; Yang and Kim, 2002; Schuemie et al., 2001; Welch et al., 1996; Wachs et al., 2011; Murthy and Jadon, 2009; Song et al., 2012)

Factor | Studies adopting it
Ease of learning | 5 of 12
Ease of use | 7 of 12
Feedback | 6 of 12
Consistent | 2 of 12
Simple | 2 of 12
Natural | 4 of 12
Efficiency | 3 of 12
Responsiveness | 2 of 12
Usefulness | 2 of 12
Intuitiveness | 3 of 12
Adaptability | 2 of 12
The usability factors above are those mentioned in at least two papers. There were also usability factors mentioned in only one paper: ease of remembering and frequency of errors for the remote controller; presence, naturalness, and precision for head tracking; and wayfinding for gesture. These factors may still be needed for usability evaluation of each interaction method. However, since each was remarked on in only one of the 29 papers, they were excluded, as it was judged unreasonable to regard them as essential user experience factors adopted by most studies.
As described above, the key user experience factors for the three representative interaction types in the VR environment were organized. In doing so, factors detected in common across the three interaction types were identified and arranged separately. The common factors are ease of learning, ease of use, and satisfaction, and these are presumed to be user experience factors essential to all interactions; they should be used as scales for all interaction types.
This study presented the key user experience factors for the three interaction types adopted in HMD-based VR devices. To this end, existing studies related or applicable to VR were investigated, and important usability factors were derived according to the framework of this study.
In conclusion, the key user experience factors specialized for each interaction are as follows: two factors (effectiveness and efficiency) for the remote controller, five factors (sickness, immersion, fatigue, stress, and intuitiveness) for head tracking, and nine factors (feedback, adaptability, natural, consistent, simple, intuitiveness, usefulness, efficiency, and responsiveness) for gesture. Ease of learning, ease of use, and satisfaction are the common key user experience factors.
In reviewing the usability evaluation literature, numerous previous studies were found that quantitatively measured the technical aspects of the controller, the traditional interaction type, whereas studies of the other interaction types dealt with technical issues relatively little.
Future studies should carry out diversified usability evaluations that consider both the technical factors affecting usability for each interaction type and qualitative scales grounded in user experience. Because usability evaluation methods can differ depending on contents and services, usability should also be studied across various contexts.
Although numerous technical studies on implementing VR exist, usability evaluation studies are still insufficient. Diverse usability studies are needed to achieve the best interaction experiences in the VR environment. This study is significant in that the user experience factors presented here can be used as evaluation scales.
References
1. Argyros, A. and Lourakis, M., Vision-based interpretation of hand gestures for remote control of a computer mouse, Computer Vision in Human-Computer Interaction, 40-51, 2006.
2. Baudel, T. and Beaudouin-Lafon, M., Charade: remote control of objects using free-hand gestures, Communications of the ACM, 36(7), 28-35, 1993.
3. Biocca, F., Communication within virtual reality: Creating a space for research, Journal of Communication, 42(4), 5-22, 1992.
4. Boulay, M., Benveniste, S., Boespflug, S., Jouvelot, P. and Rigaud, A.S., A pilot usability study of MINWii, a music therapy game for demented patients, Technology and Health Care, 19(4), 233-246, 2011.
5. Bowman, D.A., Interaction techniques for common tasks in immersive virtual environments, Diss. Georgia Institute of Technology, 1999.
6. Bowman, D.A., Kruijff, E., LaViola Jr, J.J. and Poupyrev, I., An introduction to 3-D user interface design, Presence: Teleoperators and Virtual Environments, 10(1), 96-108, 2001.
7. Brooke, J., SUS - A quick and dirty usability scale, Usability Evaluation in Industry, 189(194), 4-7, 1996.
8. Brown, M., Kehoe, A., Kirakowski, J. and Pitt, I., Beyond the gamepad: HCI and game controller design and evaluation, In Game User Experience Evaluation, Springer International Publishing, 263-285, 2015.
9. Bruck, S. and Watters, P.A., The factor structure of cybersickness, Displays, 32(4), 153-158, 2011.
10. Buxton, B. and Fitzmaurice, G.W., HMDs, caves & chameleon: a human-centric analysis of interaction in virtual space, ACM SIGGRAPH Computer Graphics, 32(4), 69-74, 1998.
11. Castaneda, L. and Pacampara, M., Virtual Reality in the Classroom: An Exploration of Hardware, Management, Content and Pedagogy, In Society for Information Technology & Teacher Education International Conference, 2016(1), 527-534, 2016.
12. Cruz-Neira, C., Sandin, D.J., DeFanti, T.A., Kenyon, R.V. and Hart, J.C., The CAVE: Audio Visual Experience Automatic Virtual Environment, Communications of the ACM, 35(6), 64-73, 1992.
13. Desai, P.R., Desai, P.N., Ajmera, K.D. and Mehta, K., A review paper on Oculus Rift - a virtual reality headset, International Journal of Engineering Trends and Technology, 13(4), 175-179, 2014.
14. Fernandes, P., Ferreira, C., Cunha, A. and Morgado, L., Usability of 3D controllers in Second Life, DSAI, 25-26, 2010.
15. Fitzmaurice, G.W., Situated information spaces and spatially aware palmtop computers, Communications of the ACM, 36(7), 39-49, 1993.
16. Hart, S.G. and Staveland, L.E., Development of NASA-TLX (Task Load Index): Results of empirical and theoretical research, Advances in Psychology, 52, 139-183, 1988.
17. Howarth, P.A. and Costello, P.J., The occurrence of virtual simulation sickness symptoms when an HMD was used as a personal viewing system, Displays, 18(2), 107-116, 1997.
18. Howarth, P.A. and Istance, H.O., The association between visual discomfort and the use of visual display units, Behaviour & Information Technology, 4(2), 131-149, 1985.
19. Jennett, C., Cox, A.L., Cairns, P., Dhoparee, S., Epps, A., Tijs, T. and Walton, A., Measuring and defining the experience of immersion in games, International Journal of Human-Computer Studies, 66(9), 641-661, 2008.
20. Kennedy, R.S., Lane, N.E., Berbaum, K.S. and Lilienthal, M.G., Simulator sickness questionnaire: An enhanced method for quantifying simulator sickness, The International Journal of Aviation Psychology, 3(3), 203-220, 1993.
21. Kim, H., Albuquerque, G., Havemann, S. and Fellner, D.W., Tangible 3D: Hand gesture interaction for immersive 3D modeling, In IPT/EGVE, 191-199, 2005.
22. Kim, K., Proctor, R.W. and Salvendy, G., The relation between usability and product success in cell phones, Behaviour & Information Technology, 31(10), 969-982, 2012.
23. Kim, Y.Y., Kim, H.J., Kim, E.N., Ko, H.D. and Kim, H.T., Characteristic changes in the physiological components of cybersickness, Psychophysiology, 42(5), 616-625, 2005.
24. Kjeldskov, J., Interaction: Full and partial immersive virtual reality displays, IRIS24, 587-600, 2001.
25. Krum, D.M., Omoteso, O., Ribarsky, W., Starner, T. and Hodges, L.F., Speech and gesture multimodal control of a whole Earth 3D visualization environment, 2002.
26. Kurniawan, S., King, A., Evans, D.G. and Blenkhorn, P., Design and user evaluation of a joystick-operated full-screen magnifier, In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM, 25-32, 2003.
27. Lindstrom, P., Koller, D., Ribarsky, W., Hodges, L.F., Op den Bosch, A. and Faust, N.L., An integrated global GIS and visual simulation system, 1997.
28. Martel, E., Hassan, A., Su, F., Girouard, A., Gerroir, J. and Muldner, K., Diving Head-First into Virtual Reality - Evaluating HMD Control Schemes for VR Games, http://cil.csit.carleton.ca/wordpress/wp-content/papers/FDG2015.Martel.HMD-VR.pdf (retrieved November 25, 2016).
29. McMahan, R.P., Alon, A.J.D., Lazem, S., Beaton, R.J., Machaj, D., Schaefer, M., Silva, M.G., Leal, A., Hagan, R. and Bowman, D.A., Evaluating natural interaction techniques in video games, In 3D User Interfaces (3DUI), 2010 IEEE Symposium, 11-14, 2010.
30. McNamara, N. and Kirakowski, J., Functionality, usability, and user experience: three areas of concern, Interactions, 13(6), 26-28, 2006.
31. Moeslund, T.B., Interacting with a virtual world through motion capture, Virtual Interaction: Interaction in Virtual Inhabited 3D Worlds, Springer, 2001.
32. Moon, J.S., The study on the applicability of virtual reality headset to space design field through focus group interviews, Journal of Integrated Design Research, 13(1), 33-44, 2014.
33. Mourant, R.R. and Jaeger, B.K., Dynamic evaluation of pre-entry HOV signage using a virtual environments driving simulator, Vision in Vehicles VIII, Elsevier Science BV, North-Holland, 2000.
34. Murthy, G.R.S. and Jadon, R.S., A review of vision based hand gestures recognition, International Journal of Information Technology and Knowledge Management, 2(2), 405-410, 2009.
35. Pal, S.K., Khan, M. and McMahan, R.P., The benefits of rotational head tracking, 2016 IEEE Symposium on 3D User Interfaces (3DUI), 31-38, IEEE, 2016.
36. Poupyrev, I., Billinghurst, M., Weghorst, S. and Ichikawa, T., The go-go interaction technique: non-linear mapping for direct manipulation in VR, Proceedings of the 9th Annual ACM Symposium on User Interface Software and Technology, 79-80, 1996.
37. Ragan, E.D., Scerbo, S., Bacim, F. and Bowman, D.A., Amplified head rotation in virtual reality and the effects on 3D search, training transfer, and spatial orientation, IEEE Transactions on Visualization and Computer Graphics, 2016.
38. Rebelo, F., Noriega, P., Duarte, E. and Soares, M., Using virtual reality to assess user experience, Human Factors, 54(6), 964-982, 2012.
39. Rhee, B.A. and Kim, J.S., The suitability of VR artwork as an immersive learning tool, Proceedings of the Korean Society of Computer Information Conference, 24(1), 223-226, 2016.
40. Rupp, M.A., Oppold, P. and McConnell, D.S., Comparing the performance, workload, and usability of a gamepad and joystick in a complex task, In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, SAGE Publications, 57(1), 1775-1779, 2013.
41. Schuemie, M.J., Van Der Straaten, P., Krijn, M. and Van Der Mast, C.A., Research on presence in virtual reality: A survey, CyberPsychology & Behavior, 4(2), 183-201, 2001.
42. Shneiderman, B., Designing the User Interface: Strategies for Effective Human-Computer Interaction, 3rd ed., McGraw-Hill, 1998.
43. Slater, M., Usoh, M. and Steed, A., Taking steps: the influence of a walking technique on presence in virtual reality, ACM Transactions on Computer-Human Interaction (TOCHI), 2(3), 201-219, 1995.
44. Song, P., Goh, W.B., Hutama, W., Fu, C.W. and Liu, X., A handle bar metaphor for virtual object manipulation with mid-air interaction, In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1297-1306, 2012.
45. Stanney, K.M., Mourant, R.R. and Kennedy, R.S., Human factors issues in virtual environments: A review of the literature, Presence: Teleoperators and Virtual Environments, 7(4), 327-351, 1998.
46. Sutherland, I.E., A head-mounted three dimensional display, In Proceedings of the December 9-11, 1968, Fall Joint Computer Conference, Part I, 757-764, 1968.
47. Sutherland, I.E., The ultimate display, Multimedia: From Wagner to Virtual Reality, 1965.
48. Van Nieuwenhuizen, K., Schaap, G.J. and Hürst, W., 3D HMD image browsers: Optimal user placement and exploration mechanisms, http://www.johnnyschaap.com/files/3D%20HMD%20Image%20Browsers%20Optimal%20User%20Placement%20and%20Exploration.pdf (retrieved November 23, 2016).
49. Wachs, J.P., Kölsch, M., Stern, H. and Edan, Y., Vision-based hand-gesture applications, Communications of the ACM, 54(2), 60-71, 2011.
50. Wang, R., Mihailidis, A., Dutta, T. and Fernie, G., Usability testing of multimodal feedback interface and simulated collision-avoidance power wheelchair for long-term-care home residents with cognitive impairments, Journal of Rehabilitation Research and Development, 48(7), 801, 2011.
51. Welch, R.B., Blackmon, T.T., Liu, A., Mellers, B.A. and Stark, L.W., The effects of pictorial realism, delay of visual feedback, and observer interactivity on the subjective sense of presence, Presence: Teleoperators and Virtual Environments, 5(3), 263-273, 1996.
52. Yang, U. and Kim, G.J., Implementation and evaluation of "Just Follow Me": An immersive, VR-based, motion-training system, Presence: Teleoperators and Virtual Environments, 11(3), 304-323, 2002.
53. Yano, Y., Kiyokawa, K., Sherstyuk, A., Mashita, T. and Takemura, H., Investigation of dynamic view expansion for head-mounted displays with head tracking in virtual environments, Proc. of the 24th International Conference on Artificial Reality and Telexistence (ICAT 2014), 2014.