Article

Examining User Perception of the Size of Multiple Objects in Virtual Reality

Australian Research Centre for Interactive and Virtual Environments, University of South Australia, Mawson Lakes 5095, Australia
Appl. Sci. 2020, 10(11), 4049; https://doi.org/10.3390/app10114049
Submission received: 15 May 2020 / Revised: 4 June 2020 / Accepted: 7 June 2020 / Published: 11 June 2020
(This article belongs to the Special Issue Applications of Virtual, Augmented, and Mixed Reality)

Abstract

This article presents a user study into user perception of an object’s size when presented in virtual reality. Users’ perception of the size of virtual objects is critical to their understanding of virtual worlds. This article is concerned with virtual objects that are within arm’s reach of the user, such as the virtual buttons, dials and levers that users manipulate to control a virtual reality application. It explores a user’s ability to judge the size of an object relative to a second object of a different colour. The results determined that the points of subjective equality for height and width judgement tasks, with targets ranging from 10 to 90 mm, were all within 1.4 mm of the target values; that is, participants were able to match heights and widths very close to the targets. The just-noticeable differences were all less than 1.5 mm for the height judgement task and less than 2.3 mm for the width judgement task.

1. Introduction

The IVE: Australian Research Centre for Interactive and Virtual Environments at the University of South Australia has been working with CADwalk Global (Gerhard Kimenkowski and Stephen Walton, Project Managers for CADwalk Global—https://www.cadwalk.global/) on the development of visualisation tools for large-scale designs [1]. Manufactured high-end instrumented facilities, such as command centres and control panels (as depicted in Figure 1), are almost entirely designed virtually [2]. Understanding the size and shape of the virtual controls is important for stakeholders’ understanding of the final design [2]. Figure 2 depicts an example of a 3D virtual push button. Understanding the height of these buttons off the control panel is critical for understanding the design as a whole, and the faithful representation of the size of a button, switch, dial or lever is critical for stakeholder understanding of control system designs [3]. Immersive Virtual Reality (VR) with a Head Mounted Display (HMD) and large stereo displays are the most common methods of presenting designs [4]. This study is motivated by the use of HMDs in the design process and by the need to understand an HMD’s ability to correctly represent the size of a virtual control relative to the other virtual controls in the user’s view.
This article presents the results of a user study to determine a user’s perception of the size of a cylinder-shaped object when presented in VR. The cylinder shape was chosen as an abstract representation of a button or dial, such as the 3D virtual push-button control depicted in Figure 2. The button has an apparent scale in ℝ³. This article explores how users perceive the size of these virtual objects in all three dimensions when viewed in a VR HMD. There have been numerous studies [5,6] comparing the size of a virtual object in VR with a physical object; in other words, how do users judge the size of the virtual object in terms of real-world dimensions? Eggleston, Janson and Aldrich [7] found that there may be a difference between VR and the natural environment in the way they affect the perception of size changes.
This article investigates whether virtual objects of different colours appear to be different sizes to the user in a VR HMD. Measuring the amount of difference perceived by users is critical for understanding their perception of the various virtual objects in a virtual scene. In the design domain we are investigating, these virtual objects would be the individual controls on a control panel. Still, this insight extends to other VR applications that require users to interact with virtual objects, such as picking up or touching them.
The study presented in this article employs the concepts of Point of Subjective Equality and Just-Noticeable Difference as a means to understand the participants’ ability to judge the sizes of the virtual objects. These concepts were first proposed in 1910 by Urban [8]. The Point of Subjective Equality (PSE) is reached when the two stimuli, the standard and the comparison, appear subjectively the same; it is the point at which the user is equally likely (a 50% probability) to judge the comparison stimulus as greater or as less than the standard stimulus. The point of subjective equality has been used to detect the sizes of objects for redirected haptics in VR [9], system latency [10] and rotational gain for redirected walking [11]. The method employed in this study presents to the participant (in VR, wearing an HMD) a standard virtual object and a second comparison virtual object.
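Formally, one common formalisation (the notation here is illustrative, not taken from the article) states the PSE in terms of the psychometric function:

```latex
% Let P(x) be the probability that a comparison stimulus of size x is
% judged larger than the standard. The PSE is the size at which the
% judgement is at chance:
\[
  P(\mathrm{comparison} > \mathrm{standard} \mid x = \mathrm{PSE}) = 0.5 .
\]
```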
The experiment employs the method of adjustment for the participants to interact with the stimulus. This is a psychophysical technique in which the participant manipulates a variable stimulus to match a fixed stimulus, or standard [12]. In this experiment, the participant is shown a standard visual stimulus (cylinder) of a specific size and a comparison stimulus (cylinder) to match to the height or width of the standard. The participant can continuously adjust one parameter of the comparison virtual object’s size (either the height or the diameter) to match the same parameter of the standard virtual object.
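As a concrete illustration, the sketch below simulates one such adjustment trial. It is a minimal approximation under stated assumptions: the actual study ran as a Unity 3D application with a Logitech F710 joystick, and the gain, frame time and simulated confirmation rule here are invented for illustration.

```python
import math
import random

# Minimal, self-contained simulation of one method-of-adjustment trial.
# GAIN_MM_PER_S and the perceptual-noise confirmation rule are assumptions,
# not parameters reported in the article.
GAIN_MM_PER_S = 20.0   # assumed size change per second at full joystick deflection
DT = 1.0 / 90.0        # one frame at the Vive Pro's 90 Hz refresh rate

def run_trial(standard_mm: float, start_mm: float, noise_sd_mm: float = 1.0) -> float:
    """Return the comparison size at which the simulated participant confirms."""
    # The participant confirms when the comparison reaches the size that
    # *looks* equal to them; Gaussian noise shifts that point per trial.
    perceived_match = standard_mm + random.gauss(0.0, noise_sd_mm)
    comparison_mm = start_mm
    step = GAIN_MM_PER_S * DT
    while abs(comparison_mm - perceived_match) > 1e-6:
        delta = perceived_match - comparison_mm
        # Joystick up grows, down shrinks; small moves give slow adjustment.
        comparison_mm += math.copysign(min(step, abs(delta)), delta)
    return comparison_mm

# One "grow" trial: the comparison starts smaller than a 50 mm standard.
print(run_trial(standard_mm=50.0, start_mm=50.0 * 0.8))
```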
Just-Noticeable Difference (JND) is a psychophysical measure: the amount by which a stimulus must be changed in order for a difference to be noticeable [12]. This noticeable difference is detectable at least half the time by the observer. The measure provides insight into how much a stimulus can change before a user detects the change. This study measures both the point of subjective equality and the just-noticeable difference for a range of different-sized cylinders.
This article investigates the following research questions.
  • What is the point of subjective equality for two different coloured cylinders of varying height and width?
  • What is the just-noticeable difference for two different coloured cylinders of varying height and width?

1.1. Structure of Article

This section finishes with a review of the relevant literature for the study presented in this article. The experimental design is then presented and describes the equipment, tasks and measures taken. The results of the study are then presented. This section is followed by a discussion of the results. The article finishes with some concluding remarks and future directions.

1.2. Background

Perception of depth and size with different 3D displays is an essential area of investigation and well studied. Cutting and Vishton [13] investigated the human perception of layout and distance. They identify nine sources of information for people to determine layout and their relative utility at different distances: occlusion, relative size, relative density, height in the visual field, aerial perspective, motion perspective, binocular disparities, convergence and accommodation.
A common technique for investigating size perception with an HMD is having a participant perform a size-matching or size-judgement task. The general findings of Eggleston et al. [7] suggested that there was a difference in the participants’ size judgements between VR and physical environments. Results of experiments by Luo et al. [14] were consistent across the participant population and showed that scene complexity and stereovision can significantly affect users’ ability to make precise judgements of the size of virtual objects in virtual environments. Contrary to other studies, they found that motion parallax, whether from the virtual environment or from user movement, might not be a significant factor in users’ performance. Stefanucci et al. [6] employed size-judgement tasks to measure the perceived size of virtual objects displayed on a large screen monitor. They determined that the size of an object in the VE is underestimated compared to an object in the real world, and that this difference was smaller when viewed in stereo. Egocentric distances in virtual environments are frequently under-perceived, by up to 50% of the actual distance [15]. Kelly [15] investigated these effects while walking through a VE and found an under-perception of egocentric distance. This section will explore these concepts in more detail.

1.2.1. Size Judgement

The study of users’ ability to judge the size of virtual objects on a computer monitor provides insight into the issues concerning these judgements in VR, and it has been investigated by numerous researchers. Charras and Lupiáñez [16] employed a standard computer monitor (resolution not reported) and performed a user study with point of subjective equality and just-noticeable difference tasks for a vertical and a horizontal line. Vertical lines were overestimated by the participants when compared to horizontal lines, and the vertical overestimation became more significant from the L to the T configuration. An L configuration is when one end of the vertical line meets an end point of the horizontal line; in a T configuration, the vertical line intersects midway along the horizontal line. A factor in this difference appears to be that the horizontal line is bisected by the vertical one in the T configuration.
Linkenauger, Witt and Proffitt [17], using a laptop display, employed point of subjective equality to examine, with right-handed participants, whether the visually perceived size of graspable objects is scaled to the individual’s apparent grasping ability. They found virtual objects looked smaller when held by the participant’s right hand compared to their left. In a second experiment, participants perceived objects to be smaller when their hand was visually scaled larger than when their hand was a normal size. In an evaluation [6] with high-quality computer graphics and precisely matched geometry, virtual objects displayed on a computer monitor appeared smaller than equivalent real objects, specifically when stereo information was not provided.

1.2.2. Virtual Reality Technologies

Virtual Reality is the display technology employed in the study presented in this article. Although this article is concerned with size estimations, distance estimations can provide insight into users’ size judgement ability in VR. Jackson and Cormack [18] found participants slightly overestimated distance judgements, by a similar amount in both their horizontal and vertical estimates, in VR with an HMD. This section starts with a review of the literature concerning users’ size judgement in HMD-based VR, followed by a discussion of other display technologies employed to study users’ size judgement in VR.
Size Judgement HMD: This section describes some of the experiments conducted to understand a user’s ability to estimate the size of virtual objects in VR using an HMD. HMDs can be categorised into eight major categories, as shown by Schmalstieg and Höllerer [19]. HMDs have different optical parameters and characteristics [20], including method of augmentation, vergence and accommodation, ocularity, stereoscopy, focus, occlusion, resolution, refresh rate, depth of field, viewpoint offset, brightness, contrast, distortions and aberrations, latency, ergonomics and social acceptance (social weight [21]).
When viewing VR in HMDs, people frequently underestimate egocentric distances (the measured distance from themselves to a target). Underestimates range from 50% to 90% of the displayed distance [22,23,24]. Leroy et al. conducted a series of tests to compare the influence of different factors on the perception of shapes displayed on stereoscopic displays; they found that head tracking has a more significant impact than the other parameters [25].
Geuss et al. [26] investigated display attributes that affect a user’s perception of the scale of virtual objects: (1) stereoscopy, (2) tracking of the user’s viewpoint and (3) the field of view (FOV). Participants directly estimated the size of a gap by matching the distance between their hands to the gap width, and judged their ability to pass unimpeded through the gap, in one of five typical implementations of three display technologies: (1) a wide-FOV nVisor SX111 HMD, (2) an nVisor SX60 HMD and (3) a back-projected display wall. The results showed estimated gap width was similar for the HMD conditions and the back-projected display wall with stereo and tracking; monocular viewing conditions showed an underestimation of size. It has been shown that users employing an HMD perceive objects around them relative to the size of their body. Ogawa, Narumi and Hirose [27] examined how, in VR, a virtual avatar of the user affects perceived graspable object sizes as the size of the avatar’s hand changes. They manipulated the realism of the hand to be high, medium or low, changed the size of the avatar’s hand to be smaller or enlarged, and measured the user’s perceived size of a virtual cube they held. They found that the size of the cube was perceived to be reduced when the avatar’s hand was enlarged, and this held for all realism conditions.
Eggleston et al. [7] investigated the effects of six factors on size judgement in an HMD VR system: (1) binocular vs. stereoscopic viewing; (2) image resolution, 1280 × 1024 vs. 640 × 480; (3) FOV, 60° × 60° vs. 100° × 60°; (4) scene contrast, a single luminance level for every surface vs. multiple luminance levels; (5) a consistent vs. an inconsistent contrast environment; and (6) target distances of 10, 60 and 120 feet. The general findings from this study suggest that viewing conditions in VR differ from natural viewing conditions in how they affect size judgements. Katzakis et al. [28] investigated visual-haptic size estimation in HMD VR. They found participants demonstrated a central tendency effect: they overestimated smaller haptic sizes but underestimated larger haptic sizes.
Size Judgement Other VR Display Technology: There are numerous display technologies for VR aside from HMDs. This section covers three such technologies that have been employed for size judgement experiments: CAVE, large displays and Fish Tank Virtual Reality. Kenyon et al. [5] investigated size constancy in a VR CAVE environment; the best participants in the study achieved a 10% error in the height of the objects of interest. Murgia and Sharkey [29] determined there is distance underestimation in VEs with rich cue conditions (realised with textured background surfaces) and even greater underestimation in poor cue conditions (limiting the rich cues of the background). In a six-sided CAVE (2.93 m × 2.93 m × 2.93 m), Ponto et al. [30] employed a perceptual calibration procedure derived from geometric models to improve users’ depth acuity, distance estimation and perception of shape.
Using a large-screen immersive display, Elner and Wright [31] investigated a new method of comparing physical and virtual size and user depth perception that relies on participants’ involuntary responses to varying stimuli in their FOV. This avoids relying on the participants’ skills in judging size, reaching or directed walking, and it allows systematic errors to be determined. Luo et al. [14] investigated scene complexity, stereovision and motion parallax on size constancy in a single-wall CAVE virtual environment. The participants were instructed to manipulate the size of a virtual 2-litre Coke bottle, using a wireless joystick to increase or decrease it, until it matched a physical 2-litre Coke bottle visible during the experiment. The authors found that stereo vision improved the participants’ ability to estimate the size of the virtual object, but motion parallax did not. In a virtual environment displayed on a large-screen (152.5 cm) monoscopic display with high-quality computer graphics and precisely matched geometry, displayed virtual objects appeared smaller than equivalent real objects [6]. Wuillemin et al. [32] performed a sequence of three experiments in which participants adjusted the sizes of spheres, presented in the haptic (Phantom) and visual (4D Vision Screen) modalities, to be equal. The spheres were either physical or virtual; the physical spheres were collections of ball bearings of different sizes that could be both viewed and touched. Contrasting the judgements made between the sensory modalities, virtual spheres of a particular size were perceived as significantly larger than their physical ball-bearing counterparts in the vision condition, but not in the haptics condition.
Zhou et al. [33] investigated Fish Tank Virtual Reality (FTVR) displays for their compelling 3D spatial effect with viewer head-tracking. They found participant performance on a size-matching task was significantly better on a spherical FTVR than on a flat FTVR, with a size error of 2.3 mm for participants employing the spherical FTVR.

1.2.3. Augmented Reality Technologies

With AR, scenery and augmentations can become significantly distorted and partly abstracted, and a user’s size perception of virtual objects can be affected [34]. Distortions of the AR-generated FOV in a video see-through (VST) viewing condition can also affect a user’s size perception of virtual objects. Ahn et al. [35] investigated size perception of an augmented object relative to a physical object for three representative AR displays: (1) hand-held, (2) VST HMD and (3) optical see-through (OST) HMD. They found a statistically significant effect of display type: between hand-held and VST, and between hand-held and OST. Participants generally overestimated the augmented object in the hand-held condition, exhibited a significant level of underestimation with the OST display, and achieved near 1:1 scale matching with VST. Benko et al. [36] investigated Dyadic, a projected spatial augmented reality system, examining the ability of participants to differentiate between three distances (near, mid and far) and three sizes (small, medium and large). The participants rated the size of the virtual objects correctly 70.0% of the time.

2. Materials and Methods

The experiment is conducted entirely with an HTC Vive Pro HMD. The participant is presented with two cylinders approximately 500 mm in front of them and 300 mm below eye level; see Figure 3. The participants were able to move their head to achieve motion parallax by bending at the waist, but they were not to walk from the prescribed standing position. The cylinder on the left is the target (standard) and is red; see Figure 4. The cylinder on the right is the one the participant adjusts (comparison). The goal of the experiment is to determine the point of subjective equality and just-noticeable difference for users observing two cylinders that approximate buttons, dials and levers on a control panel.
Two colours—red (255,0,0) and yellow (255,255,0)—were employed to highlight a user’s ability to judge size differences when controls are of different materials. Previous studies found user depth estimations varied significantly between the colour pairs blue-yellow and blue-green [37,38]; their results indicated that users were more accurate in a depth-matching task with yellow and green virtual objects than with blue ones. In the CIELAB colour space (http://colorizer.org/), the lightness of yellow (255,255,0) is 97 and the lightness of red (255,0,0) is 53; therefore, yellow virtual cylinders appear brighter than red ones. The principle is that, among similarly sized and equally distant objects, brighter objects appear closer than dimmer ones. Hagtvedt and Brasel [39] demonstrated that the perceived size of products depends on the saturation of their colour. In six experiments using physical objects and products with different shapes and hues, they determined that increasing colour saturation increases size perceptions, and they indicate the influence may be explained by the inclination of saturated colours to capture a user’s attention. There is a vast number of options for the choice of these two colours; the space could have been broken down into a YUV (Y’CbCr) colour space and the luminosity controlled. Red and yellow were chosen as saturated colours that are commonly employed for physical controls.
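For reference, the lightness values quoted above can be reproduced with the standard sRGB-to-CIELAB L* conversion (D65 white point). This snippet is a self-contained check, not part of the study’s software:

```python
# Verify the CIELAB lightness (L*) figures quoted above for sRGB red
# (255,0,0) and yellow (255,255,0), via sRGB -> linear RGB -> relative
# luminance Y -> L*.

def srgb_to_lightness(r: int, g: int, b: int) -> float:
    def linearize(v: int) -> float:
        c = v / 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    # Relative luminance from linear sRGB (Rec. 709 primaries, D65 white)
    y = 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b)
    # CIE L*: cube-root response above the dark threshold, linear below it
    f = y ** (1.0 / 3.0) if y > (6 / 29) ** 3 else y / (3 * (6 / 29) ** 2) + 4 / 29
    return 116.0 * f - 16.0

print(round(srgb_to_lightness(255, 0, 0)))    # ~53 (red)
print(round(srgb_to_lightness(255, 255, 0)))  # ~97 (yellow)
```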
The experiment examines both adjusting the height and width of the cylinder separately. The ten conditions of the standard cylinder are as follows.
  • With a fixed diameter of 30 mm:
    -
    height 10 mm,
    -
    height 30 mm,
    -
    height 50 mm,
    -
    height 70 mm and
    -
    height 90 mm.
  • With a fixed height 50 mm:
    -
    diameter 10 mm,
    -
    diameter 20 mm,
    -
    diameter 30 mm,
    -
    diameter 40 mm and
    -
    diameter 50 mm.
With a fixed-diameter task, the participant adjusted the height of the comparison object; with a fixed height, the participant adjusted its width. The starting height or diameter of the comparison cylinder was set to be noticeably different from the standard cylinder: it was set to a random scale factor of the target size, between 0.75 and 0.9 for the grow tasks and between 1.1 and 1.25 for the shrink tasks, as sketched below. These scale factors provided very noticeable starting points relative to the target size without requiring the participants to make extremely large adjustments.
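A minimal sketch of this starting-size rule follows; the function name and the uniform sampling are assumptions, as the article specifies only the scale-factor ranges:

```python
import random

# Comparison starting size: a random scale factor of the target,
# 0.75-0.9 for grow tasks (start smaller), 1.1-1.25 for shrink tasks
# (start larger).
def starting_size_mm(target_mm: float, task: str) -> float:
    if task == "grow":
        scale = random.uniform(0.75, 0.90)
    elif task == "shrink":
        scale = random.uniform(1.10, 1.25)
    else:
        raise ValueError("task must be 'grow' or 'shrink'")
    return target_mm * scale

print(starting_size_mm(50.0, "grow"))  # e.g., 42.1 mm for a 50 mm standard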
Each participant performed the following.
  • 10 tasks per trial,
  • 2 trials per condition (two grow and two shrink),
  • 10 conditions,
    -
    5 heights fixed diameter, and
    -
    5 diameters fixed height.
The experiment was conducted with 10 participants: eight males and two females, aged 23 to 35. The number of participants was based on local standards [40], i.e., guidelines derived from similar or analogous studies that have previously been published; previous point of subjective equality and just-noticeable difference studies have run with populations of ~10 participants [41,42]. All participants were computer science students and experienced users of virtual reality with the HTC VIVE technology. The selection criteria required that participants be between eighteen and forty years of age, right-handed, with normal or corrected-to-normal vision, and in good health. No simulator sickness was reported during the experiment.
The experiment was conducted in the following manner. The participants were first welcomed and asked to read the information sheet. All participants gave their informed consent for inclusion before they participated in the study. The study was conducted in accordance with the Declaration of Helsinki, and the protocol (201695) was approved by the Ethics Committee of the University of South Australia on 19 December 2019. Each participant’s interpupillary distance was then measured, and they were asked to adjust the Vive’s IPD setting accordingly when in use. The participants were shown how to grow and shrink the virtual objects with the game controller; see Figure 5. They modified one virtual object in one dimension to match the target object: pushing the left or right joystick up made the object taller or wider, and pulling it down made the object shorter or thinner. Small movements of the joystick provided slow adjustments, and the participants could make as many adjustments as they required. When they judged the size to be correct, the participants pressed the top front button on the controller. The task was not timed, and participants were asked to “Please be as accurate as possible”. The participants were then asked to put on the VR headset, and a short training set of eight tasks was presented: four adjustments of height (two grow and two shrink) and four adjustments of width (two grow and two shrink). Once the training tasks were completed and the participants had no more questions, the experiment was started.
The experiment was conducted in a quiet office with only the participant and experimenter present during the study, and the total time of the experiment was less than one hour. Once the tasks started, the participants stood for the duration of the experiment. A Unity 3D application (version 2018.4.14f1) was written to produce the stimulus for the participants. The application ran on a Windows 10 workstation with an Intel i7-9700 CPU at 3.00 GHz, 16 GB of memory and a GeForce RTX 2070 graphics card. The Unity 3D application drove a Vive Pro HMD with a resolution of 1440 × 1600 pixels per eye (2880 × 1600 pixels combined), a refresh rate of 90 Hz and a field of view of 110 degrees. A Logitech F710 wireless game controller was employed for the participants to interact with the experiment; see Figure 5 and Figure 6.

3. Results

The results are examined first for the point of subjective equality, followed by the just-noticeable difference results. The analysis was performed with IBM SPSS Statistics version V26.

3.1. Point of Subjective Equality

The point of subjective equality is calculated as the mean of all adjustments performed by the participants [12]. The height judgement task is presented first, followed by the width adjustment task. As previously mentioned, the point of subjective equality is the point at which the two cylinders look subjectively the same in height or width, and thus the participant would choose either at a frequency of 50%.
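In code, this calculation is simply a mean over the recorded settings; the sketch below uses made-up numbers, not the study’s data:

```python
from statistics import mean

# PSE for one condition: the mean of the participants' final settings [12].
final_settings_mm = [10.3, 9.9, 10.1, 9.8, 10.2, 10.4]  # illustrative values only
pse_mm = mean(final_settings_mm)
print(f"PSE = {pse_mm:.4f} mm")  # compare against the 10 mm target
```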

3.1.1. Height Point of Subjective Equality

For the height adjustment task, the width of the cylinder is set to 30 mm. Table 1 shows the mean continuous height adjustment for the five different target heights. Figure 7 graphs the mean values against the target heights; the fit is close to linear along the optimal values. Descriptive statistics are shown in Table 2 for the grow and shrink interactions of the five height point of subjective equality tasks. The results show points of subjective equality within approximately −0.27 mm to 0.61 mm of the target sizes.

3.1.2. Width Point of Subjective Equality

For the width adjustment task, the height of the cylinder is set to 50 mm. An overview of the results for the five different target widths of the width judgement points of subjective equality is shown in Table 3. The points of subjective equality lie within a range of ~−1.38 mm to ~0.36 mm of the target sizes. Figure 8 depicts a graph of the mean values; the fit is close to linear along the optimal values, though not as close as for the height adjustment. A more in-depth view of the width judgement points of subjective equality is provided with descriptive statistics in Table 4.

3.2. Just-Noticeable Difference

As previously defined, just-noticeable difference is a psychophysical measure of the amount a stimulus must be changed in order for a difference to be noticeable. Using the z score for a probability of 0.75, which equals 0.674, the JND is calculated as JND = SD × 0.674, where SD is the standard deviation of the adjustments [12]. Table 5 and Table 6 show the JND values for the adjustments of both height and width for the ten target values. For the continuous height adjustment, the JNDs ranged from 0.6462 mm to 1.4500 mm; for the continuous width adjustment, from 0.5234 mm to 2.2207 mm. These results are similar to what Zhou et al. [33] found for Fish Tank Virtual Reality displays, where participants employing a spherical FTVR had a size error of 2.3 mm.
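The corresponding calculation, again with illustrative numbers rather than the study’s data:

```python
from statistics import stdev

# JND for one condition [12]: the z score for p = 0.75 is ~0.674,
# so JND = 0.674 x SD of the participants' final settings.
final_settings_mm = [10.3, 9.9, 10.1, 9.8, 10.2, 10.4]  # illustrative values only
jnd_mm = 0.674 * stdev(final_settings_mm)
print(f"JND = {jnd_mm:.4f} mm")
```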

4. Discussion

A closer investigation into the mean values of the points of subjective equality, expressed as the difference of the mean values from the target values, is presented in Table 7. These values fall well below 1 mm. Figure 9 depicts these values as a graph, and they fit into a tight band. The points of subjective equality for the height of a 30 mm diameter cylinder, for heights of 10 mm, 30 mm, 50 mm, 70 mm and 90 mm, are all very close to the target values, as also shown in Table 8 as the percentage of final to target values for the height adjustment conditions. Users can accurately judge the height of a virtual object when compared to a second virtual object: all points of subjective equality are within 1 mm of the target for the height judgement task. These results show users can understand the height of control objects, such as dials and levers.
Examining the width results, the difference of the mean values from the target values is presented in Table 9. These values are greater than those for the height adjustment, but they fall below an absolute value of 1.4 mm. Figure 10 depicts these values as a graph, and they also fall into a tight band. The points of subjective equality for the width of a 50 mm tall cylinder, for widths of 10 mm, 30 mm, 50 mm, 70 mm and 90 mm, are close to the target values, as also shown in Table 10 as the percentage of final to target values for the width adjustment conditions. Users are able to accurately judge the width of a virtual object when compared to a second virtual object: the deviations are all less than 1.4 mm and less than 2% of the target width. These results support the findings of the height judgement tasks, namely that virtual reality is a suitable design tool for allowing users to understand the width of control objects, such as dials and levers.
Just-noticeable difference is the amount by which the height or diameter must be changed in order for a difference to be noticeable, detectable at least half the time by the participants. The just-noticeable differences for both height and width indicate users notice a greater difference for larger sizes; see Figure 11 and Figure 12. For the height judgement, the just-noticeable difference was less than 1.6 mm; for the width judgement, less than 2.4 mm. As previously mentioned, these results are similar to those found by Zhou et al. [33] for Fish Tank Virtual Reality displays. One possible explanation is that both HMD-based and Fish Tank VR support users changing their point of view when performing the size judgements; motion parallax is a powerful distance cue, and users might be relying on this form of perception in their size judgements. As with the point of subjective equality, the two colours did not appear to affect the just-noticeable difference results.
For the height judgement, the mean JND across all heights is 1.1981 mm (SD 0.3293); for the width judgement, it is 1.3990 mm (SD 0.7134). For the smaller objects, the noticeable difference was less than 1 mm; for the larger objects, it was approximately 2.5% of the object’s target size. This gives confidence that differently coloured objects will be interpreted within an acceptable range, allowing controls of different colours to be placed on a control panel with confidence that the user will understand the spatial relationships between the controls.
There is a question of why the just-noticeable difference for the width judgement was slightly worse than for the height judgement. One possible explanation is that the stereo and head-parallax depth cues may function more effectively for the height judgements; it was noticed that some users moved their head more during the width adjustment tasks. The viewing angle might be a second reason for this difference: while users viewed the cylinders at an approximately 45° viewing angle (equal for both tasks), this position might favour the height adjustment task by making height more apparent to the participants.

5. Conclusions

This article presented a user study into user perception of an object’s size when presented in VR. The study examined participants’ perception when matching the size of two differently coloured virtual objects. The study was conducted with virtual objects within arm’s reach of the user; examples of where such virtual objects are essential in the design process include virtual controls such as buttons, dials and levers that users manipulate to control the VR application. This article explored a participant’s ability to judge the size of a virtual object in VR relative to a second virtual object of a different colour in the same VR scene.
This article was able to answer the proposed research questions:
  • What is the point of subjective equality for two different coloured cylinders of varying height and width?
  • What is the just-noticeable difference for two different coloured cylinders of varying height and width?
Research question 1 was answered with results that determined the points of subjective equality for the height judgement tasks as all less than 1 mm, or less than 1.13% of the target heights ranging from 10 mm to 90 mm. For the width judgement tasks, also ranging from 10 mm to 90 mm, they were all less than 1.4 mm, or less than 2% of the target width.
Research question 2 was answered through the determination of the just-noticeable difference for the height and width adjustment tasks. The just-noticeable differences for the height judgement task were all less than 1.5 mm, and for the width judgement task less than 2.3 mm. The two colours did not appear to affect the results. This finding supports the concept of users being able to understand the size of different virtual controls that have different colours; further investigations are required to support this more fully.
There are a number of future experiments (point of subjective equality and just-noticeable difference) to explore new aspects of human perception of the size of virtual objects in an HMD VR experience. The resolution of the HTC VIVE Pro employed in this study may have impacted the results, and future studies into the effect of display resolution on size judgement are required. Another experiment is to extend the adjustments the participant can make to both height and width, which becomes more a test of the participant’s ability with volume or massing [43]. A further experiment is to repeat the experiment with objects smaller than 10 mm, making them more representative of the sizes of controls such as buttons and dials; in line with this is performing studies with objects that resemble these controls. Finally, we intend to perform experiments that investigate objects that are not aligned perpendicular to the viewing direction [44].

Funding

This work has been supported by Jumbo Vision Pty Ltd/CADwalk Global Pty Ltd through the Innovative Manufacturing Cooperative Research Centre, whose activities are funded by the Australian Commonwealth Government’s Cooperative Research Centres Programme. The project is titled “Visualisation Tools for the Design of Manufactured High End Instrumented Facilities”.

Acknowledgments

The author would like to thank Anna Ma-Wyatt for her insights into the concepts of point of subjective equality and just-noticeable difference. I would like to acknowledge James Baumeister for his patience with all my questions.

Conflicts of Interest

The author declares no conflict of interest.

Appendix A. Graphs of Individual Adjustment Tasks

Figure A1. Shrink height adjustment for 10 mm.
Figure A2. Grow height adjustment for 10 mm.
Figure A3. Shrink height adjustment for 30 mm.
Figure A4. Grow height adjustment for 30 mm.
Figure A5. Shrink height adjustment for 50 mm.
Figure A6. Grow height adjustment for 50 mm.
Figure A7. Shrink height adjustment for 70 mm.
Figure A8. Grow height adjustment for 70 mm.
Figure A9. Shrink height adjustment for 90 mm.
Figure A10. Grow height adjustment for 90 mm.
Figure A11. Shrink width adjustment for 10 mm.
Figure A12. Grow width adjustment for 10 mm.
Figure A13. Shrink width adjustment for 30 mm.
Figure A14. Grow width adjustment for 30 mm.
Figure A15. Shrink width adjustment for 50 mm.
Figure A16. Grow width adjustment for 50 mm.
Figure A17. Shrink width adjustment for 70 mm.
Figure A18. Grow width adjustment for 70 mm.
Figure A19. Shrink width adjustment for 90 mm.
Figure A20. Grow width adjustment for 90 mm.

References

  1. Irlitti, A.; Piumsomboon, T.; Jackson, D.; Thomas, B.H. Conveying spatial awareness cues in xR collaborations. IEEE Trans. Vis. Comput. Graph. 2019, 25, 3178–3189.
  2. Thomas, B.H.; Von Itzstein, G.S.; Vernik, R.; Porter, S.; Marner, M.R.; Smith, R.T.; Broecker, M.; Close, B.; Walker, S.; Pickersgill, S.; et al. Spatial augmented reality support for design of complex physical environments. In Proceedings of the 2011 IEEE International Conference on Pervasive Computing and Communications Workshops (PERCOM Workshops), Seattle, WA, USA, 21–25 March 2011; pp. 588–593.
  3. Aghina, M.A.C.; Mól, A.C.A.; Jorge, C.A.F.; Pereira, C.M.; Varela, T.F.; Cunha, G.G.; Landau, L. Virtual control desks for nuclear power plant simulation: Improving operator training. IEEE Comput. Graph. Appl. 2008, 28, 6–9.
  4. Berg, L.P.; Vance, J.M. Industry use of virtual reality in product design and manufacturing: A survey. Virtual Real. 2017, 21, 1–17.
  5. Kenyon, R.; Sandin, D.; Smith, R.; Pawlicki, R.; DeFanti, T. Size-constancy in the CAVE. Presence 2007, 16, 172–187.
  6. Stefanucci, J.K.; Creem-Regehr, S.H.; Thompson, W.B.; Lessard, D.A.; Geuss, M.N. Evaluating the accuracy of size perception on screen-based displays: Displayed objects appear smaller than real objects. J. Exp. Psychol. Appl. 2015, 21, 215.
  7. Eggleston, R.G.; Janson, W.P.; Aldrich, K.A. Virtual reality system effects on size-distance judgements in a virtual environment. In Proceedings of the IEEE 1996 Virtual Reality Annual International Symposium, Santa Clara, CA, USA, 30 March–3 April 1996; pp. 139–146.
  8. Urban, F.M. The method of constant stimuli and its generalizations. Psychol. Rev. 1910, 17, 229.
  9. Bergström, J.; Mottelson, A.; Knibbe, J. Resized Grasping in VR: Estimating Thresholds for Object Discrimination. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology, New Orleans, LA, USA, 20–23 October 2019; ACM: New York, NY, USA, 2019; pp. 1175–1183.
  10. Ellis, S.R.; Mania, K.; Adelstein, B.D.; Hill, M.I. Generalizeability of latency detection in a variety of virtual environments. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, New Orleans, LA, USA, 20–24 September 2004; SAGE Publications: Los Angeles, CA, USA, 2004; Volume 48, pp. 2632–2636.
  11. Paludan, A.; Elbaek, J.; Mortensen, M.; Zobbe, M.; Nilsson, N.C.; Nordahl, R.; Reng, L.; Serafin, S. Disguising rotational gain for redirected walking in virtual reality: Effect of visual density. In Proceedings of the 2016 IEEE Virtual Reality (VR), Greenville, SC, USA, 19–23 March 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 259–260.
  12. Kuroda, T.; Hasuo, E. The very first step to start psychophysical experiments. Acoust. Sci. Technol. 2014, 35, 1–9.
  13. Cutting, J.E.; Vishton, P.M. Perceiving layout and knowing distances: The integration, relative potency, and contextual use of different information about depth. In Perception of Space and Motion; Elsevier: Amsterdam, The Netherlands, 1995; pp. 69–117.
  14. Luo, X.; Kenyon, R.; Kamper, D.; Sandin, D.; DeFanti, T. The effects of scene complexity, stereovision, and motion parallax on size constancy in a virtual environment. In Proceedings of the 2007 IEEE Virtual Reality Conference, Charlotte, NC, USA, 10–14 March 2007; IEEE: Piscataway, NJ, USA, 2007; pp. 59–66.
  15. Kelly, J.W.; Donaldson, L.S.; Sjolund, L.A.; Freiberg, J.B. More than just perception–action recalibration: Walking through a virtual environment causes rescaling of perceived space. Atten. Percept. Psychophys. 2013, 75, 1473–1485.
  16. Charras, P.; Lupiáñez, J. Length perception of horizontal and vertical bisected lines. Psychol. Res. PRPF 2010, 74, 196–206.
  17. Linkenauger, S.A.; Witt, J.K.; Proffitt, D.R. Taking a hands-on approach: Apparent grasping ability scales the perception of object size. J. Exp. Psychol. Hum. Percept. Perform. 2011, 37, 1432–1441.
  18. Jackson, R.E.; Cormack, L.K. Reducing the presence of navigation risk eliminates strong environmental illusions. J. Vis. 2010, 10, 9.
  19. Schmalstieg, D.; Höllerer, T. Augmented Reality: Principles and Practice; Addison-Wesley Professional: Boston, MA, USA, 2016.
  20. Kiyokawa, K. An introduction to head mounted displays for augmented reality. In Emerging Technologies of Augmented Reality; Billinghurst, M., Thomas, B., Eds.; IGI Global: Pennsylvania, PA, USA, 2008.
  21. Toney, A.; Mulley, B.; Thomas, B.H.; Piekarski, W. Social weight: Designing to minimise the social consequences arising from technology use by the mobile professional. Pers. Ubiquitous Comput. 2003, 7, 309–320.
  22. Kunz, B.; Wouters, L.; Smith, D.; Thompson, W.; Creem-Regehr, S. Revisiting the effect of quality of graphics on distance judgments in virtual environments: A comparison of verbal reports and blind walking. Atten. Percept. Psychophys. 2009, 71, 1284–1293.
  23. Knapp, J.; Loomis, J. Visual Perception of Egocentric Distance in Real and Virtual Environments. Virtual Adapt. Environ. 2003, 11, 21–46.
  24. Richardson, A.R.; Waller, D. Interaction with an Immersive Virtual Environment Corrects Users’ Distance Estimates. Hum. Factors 2007, 49, 507–517.
  25. Leroy, L.; Fuchs, P.; Paljic, A.; Moreau, G. Experiments on shape perception in stereoscopic displays. In Stereoscopic Displays and Applications XX; International Society for Optics and Photonics: New York, NY, USA, 2009; Volume 7237, p. 723717.
  26. Geuss, M.N.; Stefanucci, J.K.; Creem-Regehr, S.H.; Thompson, W.B.; Mohler, B.J. Effect of Display Technology on Perceived Scale of Space. Hum. Factors 2015, 57, 1235–1247.
  27. Ogawa, N.; Narumi, T.; Hirose, M. Object Size Perception in Immersive Virtual Reality: Avatar Realism Affects the Way We Perceive. In Proceedings of the 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Reutlingen, Germany, 18–22 March 2018; pp. 647–648.
  28. Katzakis, N.; Chen, L.; Mostajeran, F.; Steinicke, F. Peripersonal Visual-Haptic Size Estimation in Virtual Reality. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 1010–1011.
  29. Murgia, A.; Sharkey, P.M. Estimation of distances in virtual environments using size constancy. Int. J. Virtual Real. 2009, 8, 67–74.
  30. Ponto, K.; Gleicher, M.; Radwin, R.G.; Shin, H.J. Perceptual Calibration for Immersive Display Environments. IEEE Trans. Vis. Comput. Graph. 2013, 19, 691–700.
  31. Elner, K.W.; Wright, H. Phenomenal regression to the real object in physical and virtual worlds. Virtual Real. 2015, 19, 21–31.
  32. Wuillemin, D.; Van Doorn, G.; Richardson, B.; Symmons, M. Haptic and visual size judgements in virtual and real environments. In Proceedings of the First Joint Eurohaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, World Haptics Conference, Pisa, Italy, 18–20 March 2005; pp. 86–89.
  33. Zhou, Q.; Hagemann, G.; Fafard, D.; Stavness, I.; Fels, S. An Evaluation of Depth and Size Perception on a Spherical Fish Tank Virtual Reality Display. IEEE Trans. Vis. Comput. Graph. 2019, 25, 2040–2049.
  34. Kruijff, E.; Swan, J.E.; Feiner, S. Perceptual issues in augmented reality revisited. In Proceedings of the 2010 IEEE International Symposium on Mixed and Augmented Reality, Seoul, Korea, 13–16 October 2010; IEEE: Piscataway, NJ, USA, 2010; pp. 3–12.
  35. Ahn, J.G.; Ahn, E.; Min, S.; Choi, H.; Kim, H.; Kim, G.J. Size Perception of Augmented Objects by Different AR Displays. In HCI International 2019—Posters; Stephanidis, C., Ed.; Springer International Publishing: Cham, Switzerland, 2019; pp. 337–344.
  36. Benko, H.; Wilson, A.D.; Zannier, F. Dyadic projected spatial augmented reality. In Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology, Honolulu, HI, USA, 5–8 October 2014; ACM: New York, NY, USA, 2014; pp. 645–655.
  37. Coules, J. Effect of photometric brightness on judgments of distance. J. Exp. Psychol. 1955, 50, 19.
  38. Singh, G.; Ellis, S.; Swan, J. The Effect of Focal Distance, Age, and Brightness on Near-Field Augmented Reality Depth Matching. IEEE Trans. Vis. Comput. Graph. 2018, 26, 1385–1398.
  39. Hagtvedt, H.; Brasel, S.A. Color saturation increases perceived product size. J. Consum. Res. 2017, 44, 396–413.
  40. Caine, K. Local Standards for Sample Size at CHI. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, San Jose, CA, USA, 7–12 May 2016; Association for Computing Machinery: New York, NY, USA, 2016; pp. 981–992.
  41. Chen, K.; Ponto, K.; Sesto, M.; Radwin, R. Influence of altered visual feedback on neck movement for a virtual reality rehabilitative system. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2014, 58, 693–697.
  42. Liu, D.; Wang, Y.; Chen, Z. Joint Foveation-Depth Just-Noticeable-Difference Model for Virtual Reality Environment. J. Vis. Commun. Image Represent. 2018, 56.
  43. Sun, Q.; Lin, J.; Fu, C.W.; Kaijima, S.; He, Y. A multi-touch interface for fast architectural sketching and massing. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Paris, France, 27 April–2 May 2013; pp. 247–256.
  44. Peillard, E.; Thebaud, T.; Normand, J.M.; Argelaguet, F.; Moreau, G.; Lécuyer, A. Virtual Objects Look Farther on the Sides: The Anisotropy of Distance Perception in Virtual Reality. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 227–236.
Figure 1. Control room example. Image owned by Michael Elleray and is under Creative Commons https://creativecommons.org/licenses/by/2.0/.
Figure 2. 3D virtual button example. Image owned by needpix.com and is under CC0 Creative Commons License.
Figure 3. View of a participant during the experiment.
Figure 4. The participant’s view during the experiment.
Figure 5. Logitech F710 wireless game controller.
Figure 6. View of the participant using the Logitech F710.
Figure 7. Point of subjective equality for the height judgement task.
Figure 8. Mean width adjustments in mm.
Figure 9. Difference of final and target height in mm for height adjustment (error bars depict standard error of the overall mean values).
Figure 10. Difference from the final mean and target in mm for width adjustments (error bars depict standard error of the overall mean values).
Figure 11. Just-noticeable difference height in mm.
Figure 12. Just-noticeable difference width in mm.
Table 1. Mean heights in mm for the height judgement task.

| Target Height | PSE |
|---|---|
| 10 mm | 10.1117 mm |
| 30 mm | 29.7259 mm |
| 50 mm | 50.0455 mm |
| 70 mm | 70.6167 mm |
| 90 mm | 90.3284 mm |
Table 2. Height judgement for tasks of 10 mm to 90 mm: height point of subjective equality.

| Condition | PSE | Std. Dev. | Minimum | Maximum | Range | Graph |
|---|---|---|---|---|---|---|
| Height 10 mm Shrink | 10.2752 mm | 0.9137 | 7.3866 mm | 12.8147 mm | 5.4281 mm | Figure A1 |
| Height 10 mm Grow | 9.9483 mm | 1.0037 | 7.0460 mm | 13.1067 mm | 6.0607 mm | Figure A2 |
| Height 30 mm Shrink | 29.9609 mm | 1.4836 | 25.9661 mm | 33.2025 mm | 7.2365 mm | Figure A3 |
| Height 30 mm Grow | 29.4910 mm | 1.9318 | 24.7537 mm | 34.1079 mm | 9.3542 mm | Figure A4 |
| Height 50 mm Shrink | 50.1502 mm | 1.4550 | 47.3878 mm | 53.6212 mm | 6.2334 mm | Figure A5 |
| Height 50 mm Grow | 49.9408 mm | 2.4995 | 44.8474 mm | 56.1118 mm | 11.2644 mm | Figure A6 |
| Height 70 mm Shrink | 70.7484 mm | 1.9915 | 67.0715 mm | 76.6390 mm | 9.5675 mm | Figure A7 |
| Height 70 mm Grow | 70.4851 mm | 2.3110 | 64.9422 mm | 75.7677 mm | 10.8255 mm | Figure A8 |
| Height 90 mm Shrink | 90.3409 mm | 1.9692 | 85.2156 mm | 94.6688 mm | 9.4531 mm | Figure A9 |
| Height 90 mm Grow | 90.3160 mm | 2.2171 | 85.0963 mm | 95.1726 mm | 10.0763 mm | Figure A10 |
Table 3. Mean widths in mm for the width judgement task.

| Target Width | PSE |
|---|---|
| 10 mm | 10.0013 mm |
| 30 mm | 29.6365 mm |
| 50 mm | 50.9861 mm |
| 70 mm | 71.3845 mm |
| 90 mm | 90.2119 mm |
Table 4. Width judgement for tasks of 10 mm to 90 mm: width point of subjective equality.

| Condition | PSE | Std. Dev. | Minimum | Maximum | Range | Graph |
|---|---|---|---|---|---|---|
| Width 10 mm Shrink | 29.7026 mm | 1.1528 | 26.9380 mm | 35.3506 mm | 8.4125 mm | Figure A11 |
| Width 10 mm Grow | 29.5704 mm | 1.2033 | 26.2231 mm | 33.0424 mm | 6.8192 mm | Figure A12 |
| Width 30 mm Shrink | 29.7026 mm | 1.1528 | 26.9380 mm | 35.3506 mm | 8.4125 mm | Figure A13 |
| Width 30 mm Grow | 29.5704 mm | 1.2033 | 26.2231 mm | 33.0424 mm | 6.8192 mm | Figure A14 |
| Width 50 mm Shrink | 51.0674 mm | 2.9364 | 43.6741 mm | 61.0000 mm | 17.3259 mm | Figure A15 |
| Width 50 mm Grow | 50.9046 mm | 1.9781 | 46.4467 mm | 55.9687 mm | 9.5219 mm | Figure A16 |
| Width 70 mm Shrink | 71.2485 mm | 2.8233 | 65.9305 mm | 78.2263 mm | 12.2957 mm | Figure A17 |
| Width 70 mm Grow | 71.5205 mm | 2.5211 | 65.4601 mm | 75.9389 mm | 10.4786 mm | Figure A18 |
| Width 90 mm Shrink | 89.1925 mm | 3.1377 | 81.8442 mm | 97.2569 mm | 15.4127 mm | Figure A19 |
| Width 90 mm Grow | 91.2312 mm | 3.4518 | 85.2544 mm | 101.3365 mm | 16.0821 mm | Figure A20 |
Table 5. Just-noticeable difference in mm for height judgements.

| Height | 10 | 30 | 50 | 70 | 90 |
|---|---|---|---|---|---|
| JND | 0.6462 | 1.1510 | 1.3327 | 1.4500 | 1.4108 |
| Shrink | 0.6158 | 0.9999 | 0.9806 | 1.3423 | 1.3273 |
| Grow | 0.6765 | 1.3020 | 1.6847 | 1.5576 | 1.4943 |
Table 6. Just-noticeable difference in mm for width adjustments.

| Width | 10 | 30 | 50 | 70 | 90 |
|---|---|---|---|---|---|
| JND | 0.5234 | 0.7938 | 1.6562 | 1.8011 | 2.2207 |
| Shrink | 0.5244 | 0.7766 | 1.9791 | 1.9029 | 2.1148 |
| Grow | 0.5224 | 0.8110 | 1.3332 | 1.6992 | 2.3265 |
Table 7. Difference from the final mean and target height in mm for height adjustment.

| Height | 10 | 30 | 50 | 70 | 90 |
|---|---|---|---|---|---|
| Total | 0.1117 | −0.2741 | 0.0455 | 0.6168 | 0.3284 |
| Shrink | 0.2752 | −0.0391 | 0.1502 | 0.7484 | 0.3409 |
| Grow | −0.0517 | −0.5090 | −0.0592 | 0.4851 | 0.3160 |
Table 8. Percentage difference between the final mean and target height for height adjustment.

| Height | 10 | 30 | 50 | 70 | 90 |
|---|---|---|---|---|---|
| Total | 1.12% | 0.91% | 0.09% | 0.88% | 0.36% |
| Shrink | 2.75% | 0.13% | 0.30% | 1.07% | 0.38% |
| Grow | 0.52% | 1.70% | 0.12% | 0.69% | 0.35% |
Table 9. Difference from the final mean and target width in mm.

| Width | 10 | 30 | 50 | 70 | 90 |
|---|---|---|---|---|---|
| Total | 0.0013 | −0.3635 | 0.9861 | 1.3845 | 0.2119 |
| Shrink | 0.1697 | −0.2974 | 1.0676 | 1.2485 | −0.8075 |
| Grow | −0.1670 | −0.4296 | 0.9046 | 1.5205 | 1.2312 |
Table 10. Percentage difference between the final mean and target width.

| Width | 10 | 30 | 50 | 70 | 90 |
|---|---|---|---|---|---|
| Total | 0.013% | 1.21% | 1.97% | 1.98% | 0.25% |
| Shrink | 1.70% | 0.99% | 2.14% | 1.78% | 0.90% |
| Grow | 1.67% | 1.43% | 1.81% | 2.17% | 1.37% |
