
Manipulating Underfoot Tactile Perceptions of Flooring Materials in Augmented Virtuality

1 School of Product Design, University of Canterbury, Christchurch 8041, New Zealand
2 HIT Lab NZ, University of Canterbury, Christchurch 8041, New Zealand
* Author to whom correspondence should be addressed.
Appl. Sci. 2023, 13(24), 13106; https://doi.org/10.3390/app132413106
Submission received: 16 November 2023 / Revised: 6 December 2023 / Accepted: 7 December 2023 / Published: 8 December 2023
(This article belongs to the Special Issue Cross Applications of Interactive System and Extended Reality)

Abstract

Underfoot haptics is a largely unexplored area, yet it offers tactile information nearly as rich as that of hand-based interactions. Haptic feedback gives a sense of physicality to virtual environments, making for a more realistic and immersive experience. Augmented Virtuality makes it possible to render virtual materials on a physical object, or haptic proxy, without the user being aware of the object’s physical appearance, while still allowing them to see their own body. In this research, we investigate how the visual appearance of physical objects can be altered virtually to affect the tactile perception of those objects. An Augmented Virtuality system was developed for this purpose, and two tactile perception experiments with 18 participants were conducted. Specifically, we explore whether changing the visual appearance of materials affects a person’s underfoot tactile perception, and which tactile perception is most affected, through a within-subjects experiment. Additionally, we examine whether people are aware of changes in visual appearance when focused on other tasks through a between-subjects experiment. The study showed that a change in visual appearance significantly affects the tactile perception of roughness, and that matching the visual appearance to the physical material increases awareness of tactile perception.

1. Introduction

Haptic research has explored providing physicality to virtual environments over the last five decades. Head-mounted displays (HMDs) are capable of providing high-fidelity visuals and audio to the wearer; however, the displayed virtual environment lacks physicality. Haptic feedback can provide physicality to the virtual environment through a variety of methods, whether a static proxy object [1,2]; a dynamic, often hand-held device [3,4,5]; a mid-air device [6,7]; or a vibrotactile device [8,9,10]. Visual feedback is often superimposed over these devices by the HMD, which allows control over what the wearer sees when interacting with the haptic device.
Proxy objects provide an inexpensive and uncomplicated method of delivering tactile feedback to the user. Nonetheless, they are limited in either their usability or their variability. Simpler proxies can cover a broader range of use cases but may provide lower-quality haptic feedback; complex proxies can offer high-quality haptic feedback but suit only specific use cases. One significant advantage of proxy objects is that they do not require a specific visual appearance, as a virtual representation can be offered, so a proxy object can take many forms. Alternatively, an entire environment can be changed visually while physically remaining the same [11]. Nevertheless, further development of proxy objects is needed to expand their applications and minimise the number of required objects.
While research has examined the interaction between visual and tactile feedback [12,13,14,15,16], little attention has been given to how specific tactile feedback is affected by a change in visual appearance. Work by Hirano et al. [17] and Punpongsanon et al. [18] investigated how softness perception can be influenced by visually exaggerating how much a material deforms. Other research has simulated various tactile properties without fully considering how visual input affects perception [10,19,20,21].
Previous research has shown that altering the visual appearance of a material alters a person’s perception of it; however, this change has not been quantified. If the change were quantified, a single physical material could simulate a range of materials by altering its visual appearance to shift a person’s perception. The level of visual augmentation required varies from material to material; quantifying this level would give a better understanding of how to simulate materials.
Rather than simulating tactile properties using virtual textures, we focus on how the tactile properties can be impacted by altering the virtual textures. We aim to explore how the tactile properties of roughness, hardness, and stiffness are impacted when the visual appearance of a material is altered. We focus on underfoot tactile experiences due to the lack of research in this area and the richness of the tactile experiences that are still available. To achieve this, we use indoor flooring materials as proxy objects. A user study consisting of two experiments was conducted to explore this further.
In this study, participants experienced tactile feedback of flooring materials under various virtual visual appearances to determine how the augmentation influences their tactile perception. This research aims to provide insight into how visual feedback can be controlled to impact how haptic feedback is perceived. The first experiment explored how participants’ perception of tactile properties—namely, roughness, hardness, and stiffness—was affected when the visual feedback was altered. We hypothesise that altering the visual appearance of materials will affect a person’s tactile perception of the material [H1]. We also hypothesise that altering the visual appearance of materials with extreme tactile properties will have a less significant impact on tactile perception [H2]. The second experiment explored how aware participants were of the tactile feedback based on the given visual feedback when focused on another task. For this, we hypothesise that the use of virtual appearances that match the various physical materials will increase participants’ awareness of tactile feedback [H3]. The findings from our study are intended to open further research into feet-based haptics and simulating a range of materials by changing visual appearances. Our contributions include the following:
  • An Augmented Virtuality system was devised to facilitate feet-centric interactions with virtual textures, mimicking indoor flooring materials.
  • Two perceptual experiments were conducted to investigate the influence of visual augmentation of flooring materials on an individual’s underfoot tactile perception.
  • We interpreted the experimental outcomes and their implications in the context of foot-based tactility and haptic feedback.

2. Related Work

2.1. Defining Tactile Properties

The definition of tactile dimensionality for physical properties and our perception of the environment through touch is a subject of ongoing debate. Tactile properties are experienced when making contact with a material or texture, and each material exhibits a unique combination of tactile properties. As a result, a universally agreed-upon set of properties has not been established. Research has attempted to limit the number of defining tactile properties while still describing all material groups. Tactile dimensions such as roughness/smoothness and hardness/softness are considered the most influential properties in material identification [22,23]. Additional suggested tactile properties include sticky/slippery, warm/cold, bumpy/flat, and wet/dry [24]. Okamoto et al. [25] reviewed other studies to determine prominent tactile properties and suggest possible dimensional structures, identifying five potential tactile dimensions: Macro Roughness, Fine Roughness, Warmness/Coldness, Hardness, and Friction. Sakamoto and Watanabe [26] identified six major dimensions, four of which overlapped with the findings of Okamoto et al. Their work investigated Japanese sound-symbolic words (SSW) and how these words describe tactile properties; the following six major dimension groups were chosen: Affective Evaluation and Friction, Compliance, Surface, Volume, Temperature, and Naturalness.
Tactile properties can be categorised and measured through various methods, but these are predominantly limited to specific groups of materials. The Shore A Hardness Scale quantifies the hardness of materials in the rubber group, facilitating the use of measured values in haptic studies of rubber materials [27]. However, not all materials can be accurately measured and categorised on a single scale. Instead, subjective measurement is necessary to identify general trends.
For our research, we focused on three tactile pairs: Roughness/Smoothness, Hardness/Softness, and Stiffness/Compliance. We chose these properties because they were the most notable for carpet-based materials: an initial pilot study found that users noted these tactile properties more than the others we investigated from the work of Zhezhova et al. [28]. Properties such as Hot/Cold, Wet/Dry, and Sticky/Slippery were not suited to the carpet materials.

2.2. Visual and Tactile Perception

Vision plays a critical role in our perception of objects and our expectations when interacting with them. Visual input makes up more than 70% of our sensory input [29]; even before we touch an object, vision provides information about it. As we continue to interact with objects, we develop mental models of the materials we come into contact with, and the visual appearance of these objects is essential in identifying them and determining how to interact with them [12,13]. However, the bimodal relationship between the inputs varies with the object being interacted with. For instance, a photograph offers visual information but little tactile information, whereas for other objects, such as sandpaper, tactile information dominates because the visual texture offers little [30]. Our expectations of the physical environment are shaped by what we see and by past experience. Even if the virtual world does not match the physical space, the wearer tends to avoid virtual areas that would normally have consequences in the physical space [31]. This control of visual sensations allows for haptic illusions: perceived sensations that do not exist physically, which can also be created through auditory control [32].
In some cases, both the visual and tactile modalities are necessary to accurately identify a material. For example, Bergmann Tiest and Kappers [14] explored how participants perceived the roughness of a range of materials through a single modality, either vision or touch, and found that the modality affected the participants’ perception of the materials. Visual feedback alone is not always enough to identify a material, and a combination of visual and tactile feedback is needed to provide sufficient information [15,16]. While one modality may influence our perception of an object more than others, a combination of modalities is often preferred and provides the most positive experience [33]. Moreover, the relationship between visual and tactile feedback can be used to alter tactile perception by changing the visual appearance. Research has shown that superimposing digital imagery on objects can change a user’s perception of shape, roughness, hardness, and stiffness [34,35]. The type of display also influences how an object is perceived. A study by Gaffary et al. [36] found that a piston shown in VR was perceived as stiffer than the same piston shown in AR in 60% of cases, even though its physical stiffness remained constant. Other factors, such as variations of a material and past experience, can affect haptic impressions when combined with altered visual textures [37]. Dynamic visuals that change differently from the physical object have also been found to alter tactile perception; for instance, virtually displaying an object deforming more than it physically does makes the object feel softer [17,18]. Additionally, Günther et al. [38] explored the perceived roughness of a range of materials under various virtual displays and found that the full range of virtual materials could be represented using only two physical materials.
Research has shown that there is a relationship between visual and tactile feedback and that changing visual feedback can affect tactile perception. This led us to ask: could multiple flooring materials be simulated using a single material with multiple visual appearances? To help answer this, we focus on quantifying the degree to which tactile perception is affected by a change in visual appearance.

2.3. Feet-Based Haptics

Haptic and touch research focuses largely on hand interactions due to the high density of mechanoreceptors in our fingertips and their constant everyday use [39], yet our feet also contain a significant number of mechanoreceptors [40]. Our feet constantly experience tactile sensations while walking and moving; however, foot-based haptics have not been explored as extensively as hand-based haptics. Foot haptic devices have been developed for specific tasks, such as stepping up [41], walking through deep snow [42], or simulating slipperiness [43] and friction [44]. Visell et al. [45] designed a floor-tile-based haptic device replicating various ground materials using vibrotactile feedback; combined with realistic audio and visual input, it can simulate a range of materials over a walkable area corresponding to the device’s size. Similarly, Hansen et al. [46] designed a floor-tile-based haptic device that simulated texture and shape similar to the textured paving tiles used in public spaces. This work provided insight into directional haptic cues to the foot and how they should be applied to provide the most perceivable haptics.
Foot-based haptics can differ greatly from hand-based haptics due to the interaction methods and the environments in which interaction occurs. Different terrains provide different haptics with unique interactions, such as water environments like rivers. Ke et al. [47] created PropelWalker, a propeller-based haptic device that applies resistance to the lower leg to simulate wading through knee-high water. Han et al. [48] took this a step further, creating a haptic environment that uses actual water to provide feedback, offering the unique interactions that come with liquid-based environments.
Auditory feedback plays a crucial role in underfoot interactions, influencing the overall tactile experience. As we walk, we constantly receive information about the material beneath our feet through audio cues or their absence. Research has investigated foot-based tactile devices that deliver audio and vibrotactile haptic feedback for walking in virtual spaces [49,50,51]. While the added feedback did not significantly alter overall perception, it enhanced the user experience, with participants preferring the added feedback and reporting improved realism.
Carpets, like hair and fur, are composed of a multitude of fibrillar structures packed together that behave differently from most objects. To our knowledge, the tactile properties of carpet flooring have not been investigated. Nevertheless, Lee et al. [52] developed HairTouch, a system that offers stiffness and roughness tactile feedback using adjustable brush-hair structures. Lowering the height of the fibres increases rigidity, making the material feel stiffer and rougher due to reduced fibre bending. Similarly, Degraen et al. [53] employed 3D-printed hair structures of varying lengths, combined with visual dominance, to enhance virtual textures.
Tactile perception studies have largely focused on hand-based interactions. Given the constant interaction of our feet on the floor, feet-based tactile interactions can provide insightful information on the tactile input we perceive walking around day to day. We aim to open up research on feet-based haptics by investigating how the tactile perception of our feet is impacted by changing visuals.
In this research, we investigate how changes in visual appearance influence the tactile perception of materials and develop a system that displays various overlays of flooring materials in Augmented Virtuality.

3. Materials and Methods

After a pilot study to select materials and tactile properties, we conducted two experiments, each with its own focus: how is tactile perception influenced by a change in visual appearance, and does matching the virtual appearance to the physical material increase awareness of tactile feedback?
In experiment 1, participants were asked to rate the roughness, hardness, and stiffness of four materials under four different virtual textures (see Figure 1). A 9-point Likert scale was used for each rating, with 1 being the lowest and 9 the highest. A within-subjects design was employed, in which each participant rated the same four materials with the four virtual textures (see Figure 2). A within-subjects design was used so that each participant’s results could be analysed separately, as their perception of the materials is unique to them. The Likert scale allowed perceptual variance to be recorded without a zero-point reference material. The order of the material/visual pairs was randomised and counterbalanced using a Latin square, and no material or visual was repeated in succession, ensuring participants had a different visual and tactile experience in each condition; a sketch of this counterbalancing is given below.
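As an illustration of this counterbalancing, the following R sketch builds a standard balanced Latin square over the 16 material/visual pairs and checks that no material or visual repeats in succession. This construction is one common approach; the paper does not specify its exact procedure, and the re-randomisation step is our assumption.

```r
# Sketch: balanced Latin square ordering of the 16 material/visual pairs.
materials <- c("Chunky", "Plain", "Rug", "Vinyl")
visuals   <- c("Wood", "Thick", "Short", "Medium")
pairs     <- expand.grid(material = materials, visual = visuals)

balanced_latin_square <- function(n) {
  stopifnot(n %% 2 == 0)  # this simple construction assumes an even n
  # First row follows the pattern 1, 2, n, 3, n-1, 4, ...
  first <- integer(n); lo <- 1; hi <- n
  for (k in seq_len(n)) {
    if (k <= 2 || k %% 2 == 0) { first[k] <- lo; lo <- lo + 1 }
    else                       { first[k] <- hi; hi <- hi - 1 }
  }
  # Each subsequent row is the previous row shifted by +1 (mod n)
  t(sapply(0:(n - 1), function(r) (first + r - 1) %% n + 1))
}

# TRUE if no two consecutive trials share a material or a visual
no_successive_repeats <- function(idx) {
  m <- as.character(pairs$material[idx]); v <- as.character(pairs$visual[idx])
  all(head(m, -1) != tail(m, -1)) && all(head(v, -1) != tail(v, -1))
}

square <- balanced_latin_square(nrow(pairs))
# Assign each participant one row; rows violating the no-repeat rule
# would be re-randomised (an assumption; the paper does not say how).
apply(square, 1, no_successive_repeats)
```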
In experiment 2, participants walked around a small area, stepping on virtual objects that appeared one at a time. During the experiment, participants walked between two different flooring materials and were later asked how many times they had felt a crossing between the materials. A between-subjects design was used, in which participants experienced either the same virtual appearance across the whole floor or virtual appearances that matched the underlying materials. A between-subjects design was chosen because participants would have become aware of the task had it been repeated under the other condition. Participants were not informed of the purpose of this experiment and were only told what they had to do for the task.
To provide a high-quality MR experience, we used the Varjo XR-3 HMD to display the virtual environment over the green-screened experiment area. Its high-resolution display and chroma keying allowed participants to see their own bodies in the virtual environment (VE).

3.1. Pilot Study

A pilot study was initially conducted to provide better insight into which questions should be asked, which materials should be included, and which tactile properties should be investigated. Participants were asked to rate the tactile properties of several flooring materials under three conditions: blindfolded, no visual alteration, and a visual overlay from an HMD. For the visual overlay condition, the appearance was designed to mimic the physical appearance. Initially, eight materials were chosen for the pilot study; these were narrowed down to the four used in the experiments due to similarities between some materials. Six tactile pairs were chosen from the work of Zhezhova et al. [28]: Rough/Smooth, Cool/Warm, Hard/Soft, Coarse/Fine, Stiff/Pliable, and Rigid/Quiet. These were narrowed down to three pairs, Rough/Smooth, Hard/Soft, and Stiff/Compliant, for the experiments, due to similarities between some pairs and time constraints.

3.2. Setup

Both experiments took place in a closed 3 × 3 m office area situated in the corner of the room (see Figure 3). The floor of the experiment area was covered with rubber tiles, and green-screen fabric was placed on top of the tiles and the adjacent walls. A chair was positioned in the centre of the experiment area, facing one of the green-screened walls, with the other wall to the left of the participant. The virtual display was provided by the Varjo XR-3 HMD powered by a Windows 10 PC (Intel Core i9-10900KF 3.70 GHz, Nvidia RTX 3080, 32 GB RAM). The HMD is capable of chroma keying, displaying the virtual content against the green screen and allowing the wearer to see their own body in the VE. Positional tracking for the Varjo HMD was achieved using three SteamVR 2.0 Base Stations, placed in the corner between the two walls, behind the participant on a stand, and to the participant’s right on a bookcase. The Unity game engine (2021.3.3f1) rendered the XR graphics displayed on the Varjo XR-3 HMD, handling all positional processing, virtual texture display, and physics involved in interactions with virtual materials. Interactive virtual materials were created using Hair Designer 1.10.3, a Unity asset by Kalagaan. To interact with the virtual materials, participants’ feet were tracked using two Vive trackers attached to socks worn throughout the experiment.

3.3. Materials

We used flooring materials acquired from local flooring stores to provide tactile feedback to participants in both experiments. The materials were chosen to cover those found in most homes and were cut to 0.5 × 0.5 m for consistency. Based on the pilot study, four materials were selected for the experiments, referred to as “Chunky”, “Plain”, “Rug”, and “Vinyl”, as shown in Figure 1. The materials were coloured green to support the chroma keying for the Augmented Virtuality effects visualised through the Varjo XR-3. Four virtual textures, “Wood”, “Thick”, “Short”, and “Medium”, were created to be similar in appearance to the four materials used in the study, as shown in Figure 2. Due to software limitations, the four materials could not be fully replicated; instead, each virtual texture is inspired by a physical material (“Wood”→“Vinyl”, “Thick”→“Rug”, “Short”→“Chunky”, and “Medium”→“Plain”). The carpet-like textures were coloured similarly to avoid any bias. They were created using Hair Designer, which allowed for dynamic 3D textures that deformed when interacted with; that is, the virtual carpet strands deformed under the presence of the physical foot.

3.4. Procedure

Upon arrival, participants were seated and given an oral introduction to the study. They were given an information sheet and consent form to read and sign. Afterwards, participants completed a demographic survey and were informed that they would answer questions verbally during the experiment. To enable foot tracking for interacting with the VE, participants were provided with tracker-equipped socks to wear over their own, and their purpose was explained. They were instructed on how to adjust the Varjo HMD and, once it was properly worn, the virtual experiment space replaced the green screen. While participants wore the HMD and sat in the experiment space, the materials were placed under their feet, and they were instructed to interact with each material with both feet. To prevent participants from seeing each new physical material, the environment was switched to a fully immersive virtual environment instead of Augmented Virtuality whenever materials were exchanged, and this was explained to participants to avoid confusion.
For each material/visual pair, participants were asked to rate the roughness, hardness, and stiffness on a scale of 1 to 9 and to give their answers verbally. A 9-point Likert scale was chosen as it provides a greater range of answers than a 7-point scale; the pilot study also showed that participants were reluctant to use the extreme values, since materials exist that would be harder or softer than those in the experiment. Once experiment 1 was completed, participants were asked to keep the HMD on but to stand up, as they would be walking in experiment 2. For safety, the chair was removed, and the virtual environment was adjusted so that the virtual walls matched the physical walls, preventing participants from walking into a wall. It was explained to participants that they would walk around the experiment area and would see small virtual objects on the ground, which they had to step on; each object disappeared when stepped on, and the next one then appeared. The process repeated until all twelve objects had been stepped on. The study lasted approximately 45 min.

3.5. Participants

We recruited 18 participants (age range 21–40, x̄ = 28, SD = 6.1; 10 male, 7 female, and 1 non-binary) for both experiments of our study. No inclusion criteria were applied; however, we sought participants with experience in product design or engineering, as individuals in these fields have typically studied materials and their tactile properties during their education. We were also interested in the amount of time spent in VR, as those with more experience are often more comfortable in VR environments. We therefore asked participants about their years of experience in product design or engineering and the number of hours spent in virtual reality. Experience ranged from 0 to 20 years (x̄ = 6.4, SD = 5.3), and time in VR ranged from 0 to 200 h (x̄ = 30, SD = 51.8). Participants were recruited from the local university campus through email; the study size was limited by the response to the advertising emails. Recruitment focused on the product design and engineering areas of the university; beyond this, the participant sample was random and based purely on availability.

4. Results

For experiment 1, we gathered ratings for four materials, four virtual textures, and three properties from 18 participants, giving 864 data points. A Shapiro–Wilk normality test showed that the normality assumption was violated (W = 0.96, p < 0.001), so an Aligned Rank Transform Analysis of Variance (ART-ANOVA), suited to non-parametric ranked data, was performed [54]. A separate ART-ANOVA was conducted for each tactile property (roughness, hardness, and stiffness), as the properties are independent of each other. An ART-ANOVA was then conducted on each material to measure the impact of visual appearance on it separately. Post-hoc tests were conducted using the “art.con” function from the ARTool package for R [55], with Tukey’s method for the pairwise contrasts. A sketch of this pipeline is given below.
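The following R sketch illustrates the pipeline described above, assuming a long-format data frame `ratings` with columns `participant`, `material`, `visual`, `property`, and `rating`; the column names and data layout are our assumptions, not taken from the paper.

```r
# Sketch of the analysis pipeline; assumes `ratings` is already loaded
# and converts the grouping columns to factors, as ARTool requires.
library(ARTool)

ratings$participant <- factor(ratings$participant)
ratings$material    <- factor(ratings$material)
ratings$visual      <- factor(ratings$visual)

# Normality check over all 864 ratings (reported: W = 0.96, p < 0.001)
shapiro.test(ratings$rating)

# One ART-ANOVA per tactile property, e.g. roughness, with participant
# as a random intercept
rough   <- subset(ratings, property == "roughness")
m_rough <- art(rating ~ material * visual + (1 | participant), data = rough)
anova(m_rough)  # main effects of material and visual, plus their interaction
```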

4.1. Tactile Properties

The ART-ANOVA conducted for roughness showed that this tactile property is significantly influenced by the change in visual appearance (F = 3.73, p < 0.01). For hardness and stiffness, it was inconclusive whether changing the visual appearance influenced tactile perception (F = 1.45, p = 0.23 and F = 0.96, p = 0.41, respectively). The ANOVAs for all three tactile properties showed that material choice affects tactile perception, suggesting that ratings tracked the physical materials. No significant interaction effect was found between material and visual for any of the properties. Table 1 summarises the ART-ANOVA for materials and visuals. These results weakly support our hypothesis [H1] that altering visuals will affect a person’s tactile perception of the material, as only one tactile property was significantly affected.

4.2. Materials

The ART-ANOVA conducted for the “Plain” material showed that the perception of this material was significantly impacted by the change in visual appearance (F = 3.00, p = 0.03). No significant interaction effect was found between the tactile property and visual factors for any of the four materials. The results are illustrated in Table 2.
For each material, we gathered 1–9 ratings for each of the three tactile properties under each of the four visual textures. The averages for each material are shown in Figure 4. The materials “Vinyl” and “Rug” exhibited more extreme tactile properties; their ratings were closer to the ends of the Likert scale than those of “Chunky” and “Plain”. This supports our hypothesis [H2] that altering the visual appearance of materials with extreme tactile properties has less impact on tactile perception.

4.3. Post-Hoc Tests

Post-hoc pairwise tests were conducted between visual pairs, both for the tactile properties and for each material, using the “art.con” function with Tukey’s method (a sketch is given below). For roughness, a significant difference was found between the visuals “Thick” and “Short” (t = 3.12, p < 0.01); no significant differences were found for any other pair (p > 0.05). For the “Plain” material, a significant difference was found between the visuals “Medium” and “Thick” (t = −2.81, p = 0.03); again, no other pair differed significantly (p > 0.05). The results of the post-hoc tests are shown in Table 3 and Table 4.
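Continuing the sketch above, the post-hoc contrasts can be obtained as follows; as before, the data frame and column names are illustrative assumptions.

```r
# Pairwise contrasts between visual appearances for roughness,
# Tukey-adjusted (Table 3 reports these per tactile property)
art.con(m_rough, "visual", adjust = "tukey")

# Per-material analysis, e.g. the "Plain" material (Tables 2 and 4)
plain   <- subset(ratings, material == "Plain")
m_plain <- art(rating ~ property * visual + (1 | participant), data = plain)
anova(m_plain)
art.con(m_plain, "visual", adjust = "tukey")
```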

4.4. Experiment 2

After completing the walking task in experiment 2, participants were asked to estimate how often they had crossed between the two flooring materials. As participants were not informed of this question beforehand, it was presented as multiple choice with the options 1–3, 4–6, and 7–9 crossings rather than asking for a specific number; the results are shown in Figure 5. Participants were expected to cross between the materials seven times during the experiment. Participants who experienced visuals matching the physical materials reported crossing counts closer to the actual number (1–3: 2, 4–6: 4, 7–9: 3) than those in the mismatched-visuals condition (1–3: 5, 4–6: 4, 7–9: 0). This supports our hypothesis [H3] that virtual appearances matching the physical materials increase participants’ awareness of tactile feedback.

4.5. iGroup Presence Questionnaire

We used the iGroup Presence Questionnaire (iPQ) [56] to measure the level of presence participants experienced in the virtual environment. The questionnaire comprises 14 items, with one general question and the remaining thirteen divided into three subcategories: Spatial Presence, Involvement, and Experienced Realism. Questions were answered on a 9-point Likert scale ranging from “Fully Disagree” to “Fully Agree”. In our XR experiment, the average scores for the three subcategories were as follows: Spatial Presence (x̄ = 6.2, SD = 1.6), Involvement (x̄ = 4.4, SD = 2.2), and Experienced Realism (x̄ = 4.4, SD = 2.2). These are shown in Figure 6.
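For readers reproducing this scoring, the R sketch below averages the thirteen subscale items into the three subcategory scores. The item names, the item-to-subscale mapping, and the reverse-coded items are placeholders, so the official iPQ key at igroup.org should be consulted for the actual assignment.

```r
# Sketch: iPQ subscale scoring on a 9-point scale (1..9).
# The mapping and reverse-coded items below are placeholders, not the
# official iPQ key.
ipq_subscales <- list(
  spatial_presence    = c("sp1", "sp2", "sp3", "sp4", "sp5"),
  involvement         = c("inv1", "inv2", "inv3", "inv4"),
  experienced_realism = c("real1", "real2", "real3", "real4")
)
reverse_coded <- c("inv3", "real1")  # hypothetical; check the official key

score_ipq <- function(answers, scale_max = 9) {
  # answers: named numeric vector with one 1..9 response per item
  answers[reverse_coded] <- (scale_max + 1) - answers[reverse_coded]
  sapply(ipq_subscales, function(items) mean(answers[items]))
}
```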

5. Discussion

Experiment 1 indicated a significant effect of visual appearance modification on the tactile perception of roughness, but no comparable effect was found for the perception of hardness and stiffness. The roughness finding aligns with prior research by Bergmann Tiest and Kappers [14], which underscored the influence of visual feedback on roughness perception. However, our results did not confirm the findings of Hirano et al. [17] and Punpongsanon et al. [18], who suggested that visual augmentation could affect the tactile perception of softness/hardness. The divergence may be attributed to differences in the chosen materials and the visual augmentation methods between our study and the previous work; certain materials may be augmented more effectively by particular methods. Participants interacted with the materials with both feet. While limb dominance has been shown to affect perception [57], footedness was not considered in this study. Participants’ ratings were compared only within subjects, which reduced the impact of subjective factors, such as sensory numbness and footedness, that differ between individuals.

5.1. Extreme Tactile Properties

The “Rug” and “Vinyl” materials used in the first experiment exhibited what we call “extreme tactile properties”, meaning their tactile ratings were closer to the ends of the Likert scale (1 and 9). Alterations in the visual representation of these materials had a less significant impact on tactile perception [H2]. This can be attributed to the dominant tactility of the material, which may supersede the influence of visual feedback.

5.2. Tactile Awareness

The first experiment examined the effect of visual modification on tactile perception, while the second assessed the prominence of this perception when attention was diverted. Results from the second experiment suggested that congruence between the visual representation and the physical material enhanced perception when participants were engaged in other tasks. Participants exposed to matched visuals reported transitions between the materials more accurately than those with mismatched visuals. The experimental design anticipated that participants would cross between the two materials seven times while walking to the virtual objects. However, variations in navigation led to some participants experiencing only six transitions, an unexpected complication, as the correct response then lay between two answer brackets (4–6 and 7–9). Consistent with previous findings that visual input constitutes the majority of our sensory input [29], our results support the conclusion that matched visuals increase tactile awareness.

5.3. iPQ

The iPQ, incorporated into our post-experiment survey, was utilised to evaluate user experience during the experiments. According to the iPQ data, our system performed commendably in Spatial Presence, but Involvement and Experienced Realism fell short. The Augmented Virtuality context, employing Varjo’s chroma-keying, allowed participants to observe their bodies within the VE, which likely contributed to the high ratings of Spatial Presence. However, the comparative aspects of Involvement (user’s perception of the real world when in the VE) and Experienced Realism (how authentic the VE feels compared to the real world) received lower scores. This might be attributed to the concurrent visibility of the real world and the VE, enabling immediate comparison and exposing the VE’s relative lack of realism.

5.4. Implications for Foot-Based Haptic Feedback

The findings of our study underscore the significance of both visual and tactile feedback in the design of foot-oriented haptic devices. To achieve a convincing simulation of materials, tactile input should be meticulously crafted to emulate the targeted material, complemented by high-resolution visual input. Visuals can be altered from the physical material to simulate new materials; however, different visuals should be used for each material. It is crucial to align virtual visuals with the physical environment to ensure the user’s safety and prevent unexpected sensations when interacting with unknown objects in the virtual landscape. Devices offering foot-based haptics should promote unhindered user mobility, enabling individuals to freely navigate the virtual environment and fostering a more naturalistic experience. Considering the continuous influx of visual information we receive about our surroundings, new experiences can be created by altering the visual appearance of materials, yet these should not be the user’s main focus. This would allow users to navigate the VE intuitively, focusing less on their movements and more on the immersive experience.

5.5. Sustainability and Industrial Implications

While the results of our study are limited, they highlight an area for continued work and for reducing resource use. Our study focused only on the perception of indoor flooring materials; further work is needed to explore whether the methods discussed here can be applied to other materials. The methods could also be applied to other areas of the body: work by Shokur et al. [58] has shown that the tactile perception of flooring materials can be conveyed through the arms. This would allow future haptic technologies to focus less on the materials used and more on how visuals can be controlled to simulate various materials. Such a reduction in the materials used is also environmentally beneficial, as fewer resources are needed to produce the haptic technologies.

6. Limitations

Our exploration was confined to four materials in order to curtail the duration of the user study. This was a reduction from the eight materials featured in the pilot study, following the exclusion of similar materials, yet it inevitably resulted in a somewhat restricted data range from the user study. The inclusion of a greater variety of materials would have facilitated a more comprehensive comparison and provided a richer understanding of the overall trends we sought to investigate. Given our focus on a limited subset of indoor flooring materials and only three tactile pairs, it is plausible that the gathered results do not represent a comprehensive scope of all possible materials. Future research should endeavour to broaden the spectrum of materials and tactile pairs examined.
The findings of experiment 2 suggested that aligning the visual appearance with the physical surroundings heightens one’s awareness of tactile feedback. However, the data are derived from a single experiment involving 18 participants split between two conditions; hence, the inferential capacity of the results is limited. The experiment did not examine which materials influence a person’s awareness or how overlaying different visual appearances might affect it. It merely determined whether one condition elevated awareness compared with the other, providing a binary answer without quantifying the degree to which awareness was affected by the physical material or the virtual overlay. Additionally, some participants did not cross between the two materials the anticipated seven times, resulting in a range of correct answers; coupled with the answer brackets, this introduced ambiguity in the overall trend.

7. Conclusions

In this research, we conducted a pilot study and two perception experiments in Augmented Virtuality to investigate the effects of changing visual appearance on tactile perception. In the first experiment, our findings showed that altering the virtual texture of flooring materials significantly influenced a person’s tactile perception of roughness, but not of hardness or stiffness. It was also observed that materials with more pronounced tactile properties were less affected by alterations in visual appearance. The second experiment revealed that aligning the visual representation with the physical material enhanced a person’s tactile awareness, particularly when their attention was directed towards another task. Despite the somewhat limited scope of materials examined, this research can catalyse further investigation into how the manipulation of visual feedback shapes tactile perception. Future systems should avoid limiting user movement and should provide accurate, high-quality visuals to create the most natural experience.

Author Contributions

Conceptualization, J.T., T.P., S.L. and E.C.; methodology, J.T.; software, J.T.; validation, J.T., T.P., S.L. and E.C.; formal analysis, J.T.; investigation, J.T.; resources, T.P.; data curation, J.T.; writing—original draft preparation, J.T.; writing—review and editing, J.T., T.P., S.L. and E.C.; visualization, J.T. and T.P.; supervision, T.P., S.L. and E.C.; project administration, T.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Human Research Ethics Committee of the University of Canterbury (HREC 2022/61/LR), 26 August 2022.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study. Written informed consent was obtained from the participants to publish this paper.

Data Availability Statement

The data are available on request due to restrictions.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

ANOVA—Analysis of Variance; AR—Augmented Reality; ART—Aligned Rank Transform; AV—Augmented Virtuality; HMD—Head-Mounted Display; iPQ—iGroup Presence Questionnaire; MR—Mixed Reality; PC—Personal Computer; RAM—Random Access Memory; VE—Virtual Environment; VR—Virtual Reality.

References

  1. Strandholt, P.L.; Dogaru, O.A.; Nilsson, N.C.; Nordahl, R.; Serafin, S. Knock on wood: Combining redirected touching and physical props for tool-based interaction in virtual reality. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; pp. 1–13. [Google Scholar]
  2. Cheng, L.P.; Chang, L.; Marwecki, S.; Baudisch, P. iturk: Turning passive haptics into active haptics by making users reconfigure props in virtual reality. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–26 April 2018; pp. 1–10. [Google Scholar]
  3. Teng, S.Y.; Li, P.; Nith, R.; Fonseca, J.; Lopes, P. Touch&Fold: A Foldable Haptic Actuator for Rendering Touch in Mixed Reality. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, CHI ’21, Virtual, 8–13 May 2021. [Google Scholar] [CrossRef]
  4. Zenner, A.; Krüger, A. Drag: On: A virtual reality controller providing haptic feedback based on drag and weight shift. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–12. [Google Scholar]
  5. McClelland, J.C.; Teather, R.J.; Girouard, A. Haptobend: Shape-changing passive haptic feedback in virtual reality. In Proceedings of the 5th Symposium on Spatial User Interaction, Brighton, UK, 16–17 October 2017; pp. 82–90. [Google Scholar]
  6. Hwang, I.; Son, H.; Kim, J.R. AirPiano: Enhancing Music Playing Experience in Virtual Reality with Mid-Air Haptic Feedback. In Proceedings of the 2017 IEEE World Haptics Conference (WHC), Munich, Germany, 6–9 June 2017. [Google Scholar] [CrossRef]
  7. Sodhi, R.; Poupyrev, I.; Glisson, M.; Israr, A. AIREAL: Interactive tactile experiences in free air. ACM Trans. Graph. TOG 2013, 32, 1–10. [Google Scholar] [CrossRef]
  8. Bau, O.; Poupyrev, I.; Israr, A.; Harrison, C. Teslatouch: Electrovibration for Touch Surfaces. In Proceedings of the UIST ’10: 23rd Annual ACM Symposium on User Interface Software and Technology, New York, NY, USA, 3–6 October 2010; pp. 283–292. [Google Scholar] [CrossRef]
  9. Choi, I.; Zhao, E.; González, E.; Follmer, S. Augmenting Perceived Softness of Haptic Proxy Objects Through Transient Vibration and Visuo-Haptic Illusion in Virtual Reality. IEEE Trans. Vis. Comput. Graph. 2020, 27, 4387–4400. [Google Scholar] [CrossRef]
  10. Yem, V.; Okazaki, R.; Kajimoto, H. Vibrotactile and pseudo force presentation using motor rotational acceleration. In Proceedings of the 2016 IEEE Haptics Symposium (HAPTICS), Philadelphia, PA, USA, 8–11 April 2016; pp. 47–51. [Google Scholar]
  11. Simeone, A.L.; Velloso, E.; Gellersen, H. Substitutional Reality: Using the Physical Environment to Design Virtual Reality Experiences. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, CHI ’15, Seoul, Republic of Korea, 18–23 April 2015; pp. 3307–3316. [Google Scholar] [CrossRef]
  12. Fleming, R.W. Visual perception of materials and their properties. Vis. Res. 2014, 94, 62–75. [Google Scholar] [CrossRef]
  13. Posner, M.I.; Nissen, M.J.; Klein, R.M. Visual dominance: An information-processing account of its origins and significance. Psychol. Rev. 1976, 83, 157–171. [Google Scholar] [CrossRef]
  14. Bergmann Tiest, W.M.; Kappers, A.M. Haptic and visual perception of roughness. Acta Psychol. 2007, 124, 177–189. [Google Scholar] [CrossRef]
  15. Baumgartner, E.; Wiebel, C.B.; Gegenfurtner, K.R. Visual and haptic representations of material properties. Multisens. Res. 2013, 26, 429–455. [Google Scholar] [CrossRef]
  16. Whitaker, T.A.; Simoes-Franklin, C.; Newell, F.N. Vision and touch: Independent or integrated systems for the perception of texture? Brain Res. 2008, 1242, 59–72. [Google Scholar] [CrossRef]
  17. Hirano, Y.; Kimura, A.; Shibata, F.; Tamura, H. Psychophysical influence of mixed-reality visual stimulation on sense of hardness. In Proceedings of the 2011 IEEE Virtual Reality Conference, Singapore, 19–23 March 2011; pp. 51–54. [Google Scholar] [CrossRef]
  18. Punpongsanon, P.; Iwai, D.; Sato, K. Softar: Visually manipulating haptic softness perception in spatial augmented reality. IEEE Trans. Vis. Comput. Graph. 2015, 21, 1279–1288. [Google Scholar] [CrossRef]
  19. Benko, H.; Holz, C.; Sinclair, M.; Ofek, E. Normaltouch and texturetouch: High-fidelity 3d haptic shape rendering on handheld virtual reality controllers. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology, Tokyo, Japan, 16–19 October 2016; pp. 717–728. [Google Scholar]
  20. Choi, I.; Culbertson, H.; Miller, M.R.; Olwal, A.; Follmer, S. Grabity: A wearable haptic interface for simulating weight and grasping in virtual reality. In Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology, Quebec, QC, Canada, 22–25 October 2017; pp. 119–130. [Google Scholar]
  21. Murakami, T.; Person, T.; Fernando, C.L.; Minamizawa, K. Altered touch: Miniature haptic display with force, thermal and tactile feedback for augmented haptics. In Proceedings of the SIGGRAPH ’17: ACM SIGGRAPH 2017 Emerging Technologies, Los Angeles, CA, USA, 30 July–3 August 2017; pp. 1–2. [Google Scholar]
  22. Hollins, M.; Bensmaia, S.; Karlof, K.; Young, F. Individual differences in perceptual space for tactile textures: Evidence from multidimensional scaling. Percept. Psychophys. 2000, 62, 1534–1544. [Google Scholar] [CrossRef]
  23. Hollins, M.; Faldowski, R.; Rao, S.; Young, F. Perceptual dimensions of tactile surface texture: A multidimensional scaling analysis. Percept. Psychophys. 1994, 54, 697–705. [Google Scholar] [CrossRef]
  24. Chen, X.; Shao, F.; Barnes, C.; Childs, T.; Henson, B. Exploring Relationships between Touch Perception and Surface Physical Properties. Int. J. Des. 2009, 3, 67–76. [Google Scholar]
  25. Okamoto, S.; Nagano, H.; Yamada, Y. Psychophysical Dimensions of Tactile Perception of Textures. IEEE Trans. Haptics 2013, 6, 81–93. [Google Scholar] [CrossRef]
  26. Sakamoto, M.; Watanabe, J. Exploring Tactile Perceptual Dimensions Using Materials Associated with Sensory Vocabulary. Front. Psychol. 2017, 8, 569. [Google Scholar] [CrossRef]
  27. Bergmann Tiest, W.M.; Kappers, A. Cues for Haptic Perception of Compliance. IEEE Trans Haptics 2009, 2, 189–199. [Google Scholar] [CrossRef] [PubMed]
  28. Zhezhova, S.; Jordeva, S.; Golomeova-Longurova, S.; Vangja, D.K.; Dimova, T. Tactile properties of fabrics. Tekst. Ind. 2019, 67, 4–10. [Google Scholar] [CrossRef]
  29. Hutmacher, F. Why is there so much more research on vision than on any other sensory modality? Front. Psychol. 2019, 10, 2246. [Google Scholar] [CrossRef] [PubMed]
  30. Heller, M.A. Visual and tactual texture perception: Intersensory cooperation. Percept. Psychophys. 1982, 31, 339–344. [Google Scholar] [CrossRef]
  31. Simeone, A.L.; Mavridou, I.; Powell, W. Altering User Movement Behaviour in Virtual Environments. IEEE Trans. Vis. Comput. Graph. 2017, 23, 1312–1321. [Google Scholar] [CrossRef]
  32. Kang, N.; Sah, Y.J.; Lee, S. Effects of visual and auditory cues on haptic illusions for active and passive touches in mixed reality. Int. J. Hum.-Comput. Stud. 2021, 150, 102613. [Google Scholar] [CrossRef]
  33. Balaji, M.S.; Raghavan, S.; Jha, S. Role of tactile and visual inputs in product evaluation: A multisensory perspective. Asia Pac. J. Mark. Logist. 2011, 23, 513–530. [Google Scholar] [CrossRef]
  34. Iesaki, A.; Somada, A.; Kimura, A.; Shibata, F.; Tamura, H. Psychophysical Influence on Tactual Impression by Mixed-Reality Visual Stimulation. In Proceedings of the 2008 IEEE Virtual Reality Conference, Reno, NV, USA, 8–12 March 2008; pp. 265–266. [Google Scholar] [CrossRef]
  35. Villa, S.; Pacchierotti, C.; Girouliere, X.; Maciel, A.; Marchal, M. Altering the Stiffness, Friction, and Shape Perception of Tangible Objects in Virtual Reality Using Wearable Haptics. IEEE Trans. Haptics 2020, 13, 167–174. [Google Scholar] [CrossRef]
  36. Gaffary, Y.; Le Gouis, B.; Marchal, M.; Argelaguet, F.; Arnaldi, B.; Lécuyer, A. AR feels “softer” than VR: Haptic perception of stiffness in augmented versus virtual reality. IEEE Trans. Vis. Comput. Graph. 2017, 23, 2372–2377. [Google Scholar] [CrossRef]
  37. Kitahara, I.; Nakahara, M.; Ohta, Y. Sensory Properties in Fusion of Visual/Haptic Stimuli Using Mixed Reality. In Advances in Haptics; IntechOpen: London, UK, 2012. [Google Scholar] [CrossRef]
  38. Günther, S.; Rasch, J.; Schön, D.; Müller, F.; Schmitz, M.; Riemann, J.; Matviienko, A.; Mühlhäuser, M. Smooth as Steel Wool: Effects of Visual Stimuli on the Haptic Perception of Roughness in Virtual Reality. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, CHI ’22, New Orleans, LA, USA, 29 April–5 May 2022; pp. 1–17. [CrossRef]
  39. Purves, D.; Augustine, G.J.; Fitzpatrick, D.; Hall, W.; LaMantia, A.S.; White, L. Neurosciences; De Boeck Supérieur: Paris, France, 2019. [Google Scholar]
  40. Mancini, F.; Bauleo, A.; Cole, J.; Lui, F.; Porro, C.; Haggard, P.; Iannetti, G. Whole-Body Mapping of Spatial Acuity for Pain and Touch. Ann. Neurol. 2014, 75, 24179. [Google Scholar] [CrossRef]
  41. Schmidt, D.; Kovacs, R.; Mehta, V.; Umapathi, U.; Köhler, S.; Cheng, L.P.; Baudisch, P. Level-Ups: Motorized Stilts that Simulate Stair Steps in Virtual Reality. In Proceedings of the 33rd Annual ACM Conference, Seoul, Republic of Korea, 18–23 April 2015; pp. 359–362. [Google Scholar] [CrossRef]
  42. Yokota, T.; Ohtake, M.; Nishimura, Y.; Yui, T.; Uchikura, R.; Hashida, T. Snow Walking: Motion-Limiting Device That Reproduces the Experience of Walking in Deep Snow. In Proceedings of the 6th Augmented Human International Conference AH ’15, Singapore, 9–11 March 2015; pp. 45–48. [Google Scholar] [CrossRef]
  43. Millet, G.; Otis, M.; Horodniczy, D.; Cooperstock, J. Design of Variable-Friction Devices for Shoe-Floor Contact. Mechatronics 2017, 46, 5. [Google Scholar] [CrossRef]
  44. Tsao, C.A.; Wu, T.C.; Tsai, H.R.; Wei, T.Y.; Liao, F.Y.; Chapman, S.; Chen, B.Y. FrictShoes: Providing Multilevel Nonuniform Friction Feedback on Shoes in VR. IEEE Trans. Vis. Comput. Graph. 2022, 28, 2026–2036. [Google Scholar] [CrossRef] [PubMed]
  45. Visell, Y.; Cooperstock, J.R.; Giordano, B.L.; Franinovic, K.; Law, A.; Mcadams, S.; Jathal, K.; Fontana, F. A Vibrotactile Device for Display of Virtual Ground Materials in Walking. In Proceedings of the 6th International Conference on Haptics: Perception, Devices and Scenarios, EuroHaptics ’08, Madrid, Spain, 10–13 June 2008; pp. 420–426. [Google Scholar] [CrossRef]
  46. Hansen, K.L.; Jensen, U.S.; Johansson, S.P.; Papachristos, E.; Skov, M.B.; Vertegaal, R.; Merritt, T. FeetBack: Providing Haptic Directional Cues Through a Shape-changing Floor. In Proceedings of the Nordic Human-Computer Interaction Conference, Aarhus, Denmark, 8–12 October 2022; pp. 1–10. [Google Scholar]
  47. Ke, P.; Cai, S.; Gao, H.; Zhu, K. Propelwalker: A leg-based wearable system with propeller-based force feedback for walking in fluids in vr. IEEE Trans. Vis. Comput. Graph. 2022, 29, 5149–5164. [Google Scholar] [CrossRef]
  48. Han, P.H.; Wang, T.H.; Chou, C.H. GroundFlow: Liquid-based Haptics for Simulating Fluid on the Ground in Virtual Reality. IEEE Trans. Vis. Comput. Graph. 2023, 29, 2670–2679. [Google Scholar] [CrossRef] [PubMed]
  49. Nilsson, N.C.; Nordahl, R.; Turchet, L.; Serafin, S. Audio-Haptic Simulation of Walking on Virtual Ground Surfaces to Enhance Realism. In Proceedings of the HAID 2012: Haptic and Audio Interaction Design, Lund, Sweden, 23–24 August 2012. [Google Scholar]
  50. Nordahl, R.; Berrezag, A.; Dimitrov, S.; Turchet, L.; Hayward, V.; Serafin, S. Preliminary Experiment Combining Virtual Reality Haptic Shoes and Audio Synthesis. In Proceedings of the EuroHaptics 2010: Haptics: Generating and Perceiving Tangible Sensations, Amsterdam, The Netherlands, 8–10 July 2010. [Google Scholar]
  51. Turchet, L.; Burelli, P.; Serafin, S. Haptic Feedback for Enhancing Realism of Walking Simulations. IEEE Trans. Haptics 2013, 6, 35–45. [Google Scholar] [CrossRef]
  52. Lee, C.J.; Tsai, H.R.; Chen, B.Y. HairTouch: Providing Stiffness, Roughness and Surface Height Differences Using Reconfigurable Brush Hairs on a VR Controller. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, CHI ’21, Virtual, 8–13 May 2021. [Google Scholar] [CrossRef]
  53. Degraen, D.; Zenner, A.; Krüger, A. Enhancing Texture Perception in Virtual Reality Using 3D-Printed Hair Structures. In Proceedings of the CHI ’19: CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–12. [Google Scholar] [CrossRef]
  54. Wobbrock, J.O.; Findlater, L.; Gergle, D.; Higgins, J.J. The Aligned Rank Transform for Nonparametric Factorial Analyses Using Only Anova Procedures. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vancouver, BC, Canada, 7–12 May 2011; CHI ’11. pp. 143–146. [Google Scholar] [CrossRef]
  55. Elkin, L.A.; Kay, M.; Higgins, J.J.; Wobbrock, J.O. An aligned rank transform procedure for multifactor contrast tests. In Proceedings of the 34th Annual ACM Symposium on User Interface Software and Technology, Virtual, 10–14 October 2021; pp. 754–768. [Google Scholar]
  56. Igroup Presence Questionnaire (IPQ) Overview. Available online: https://www.igroup.org/pq/ipq/download.php (accessed on 18 June 2022).
  57. Squeri, V.; Sciutti, A.; Gori, M.; Masia, L.; Sandini, G.; Konczak, J. Two hands, one perception: How bimanual haptic information is combined by the brain. J. Neurophysiol. 2012, 107, 544–550. [Google Scholar] [CrossRef]
  58. Shokur, S.; Gallo, S.; Moioli, R.C.; Donati, A.R.C.; Morya, E.; Bleuler, H.; Nicolelis, M.A. Assimilation of virtual legs and perception of floor texture by complete paraplegic patients receiving artificial tactile feedback. Sci. Rep. 2016, 6, 32293. [Google Scholar] [CrossRef]
Figure 1. The four materials used for the user study, coloured green to blend in with the green screen.
Figure 2. The four virtual textures used for the user study, displayed over green colours in the experiment area.
Figure 3. Experimental setup for the user study.
Figure 4. Average ratings for each tactile property for each material.
Figure 5. Number of times participants crossed between the two floorings, with the correct answer circled in red.
Figure 6. Average iPQ results for the virtual environment.
Table 1. ART-ANOVA results for Materials and Visuals.

| Effect          | Roughness (F, p) | Hardness (F, p) | Stiffness (F, p) | df |
|-----------------|------------------|-----------------|------------------|----|
| Material        | 209.34, 0.01     | 185.68, 0.01    | 60.88, 0.01      | 3  |
| Visual          | 3.73, 0.01       | 1.45, 0.23      | 0.96, 0.41       | 3  |
| Material:Visual | 1.47, 0.16       | 0.42, 0.92      | 0.76, 0.65       | 9  |
Table 2. ART-ANOVA results for Properties and Visuals, per material.

| Effect          | Chunky (F, p) | Plain (F, p) | Vinyl (F, p) | Rug (F, p) | df |
|-----------------|---------------|--------------|--------------|------------|----|
| Property        | 81.74, 0.01   | 22.02, 0.01  | 214.96, 0.01 | 7.19, 0.01 | 2  |
| Visual          | 2.29, 0.08    | 3.00, 0.03   | 0.48, 0.69   | 0.18, 0.91 | 3  |
| Property:Visual | 0.59, 0.74    | 1.34, 0.24   | 1.04, 0.40   | 0.51, 0.80 | 6  |
Table 3. Post-hoc pairwise tests between visual appearances.

| Visual pair | Roughness (t, p) | Hardness (t, p) | Stiffness (t, p) | df  |
|-------------|------------------|-----------------|------------------|-----|
| Med–Thick   | −0.61, 0.93      | −1.83, 0.26     | −1.23, 0.61      | 255 |
| Med–Short   | 2.51, 0.06       | −1.19, 0.63     | −0.21, 1.00      | 255 |
| Med–Wood    | 1.05, 0.72       | −0.23, 1.00     | 0.39, 0.98       | 255 |
| Thick–Short | 3.12, 0.01       | 0.63, 0.92      | 1.02, 0.74       | 255 |
| Thick–Wood  | 1.66, 0.35       | 1.59, 0.38      | 1.63, 0.37       | 255 |
| Short–Wood  | −1.46, 0.46      | 0.96, 0.77      | 0.61, 0.93       | 255 |
Table 4. Post-hoc pairwise tests between visual appearances for each material.

| Visual pair | Chunky (t, p) | Plain (t, p) | Vinyl (t, p) | Rug (t, p)  | df  |
|-------------|---------------|--------------|--------------|-------------|-----|
| Med–Thick   | −2.21, 0.12   | −2.81, 0.03  | 0.42, 0.98   | 0.50, 0.96  | 187 |
| Med–Short   | −0.83, 0.84   | −0.53, 0.95  | 1.08, 0.71   | 0.68, 0.91  | 187 |
| Med–Wood    | 0.11, 1.00    | −0.91, 0.80  | 0.93, 0.79   | 0.58, 0.94  | 187 |
| Thick–Short | 1.39, 0.51    | 2.28, 0.11   | 0.66, 0.91   | 0.18, 1.00  | 187 |
| Thick–Wood  | 2.32, 0.10    | 1.90, 0.23   | 0.52, 0.95   | 0.08, 1.00  | 187 |
| Short–Wood  | 0.93, 0.79    | −0.38, 0.98  | −0.14, 1.00  | −0.10, 1.00 | 187 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
