Article

Evaluating the Effect of Multi-Sensory Stimulation on Startle Response Using the Virtual Reality Locomotion Interface MS.TPAWT

Department of Mechanical Engineering, University of Utah, Salt Lake City, UT 84112, USA
* Authors to whom correspondence should be addressed.
Virtual Worlds 2022, 1(1), 62-81; https://doi.org/10.3390/virtualworlds1010005
Submission received: 12 August 2022 / Revised: 1 September 2022 / Accepted: 5 September 2022 / Published: 9 September 2022

Abstract

The purpose of the study was to understand how various aspects of virtual reality and extended reality, specifically environmental displays (e.g., wind, heat, smell, and moisture), audio, and graphics, can be exploited to cause a good startle or to prevent one. The TreadPort Active Wind Tunnel (TPAWT) was modified to include several haptic environmental displays: heat, wind, olfactory, and mist, resulting in the Multi-Sensory TreadPort Active Wind Tunnel (MS.TPAWT). In total, 120 participants played a VR game that contained three startling situations. Audio and environmental effects were varied in a two-way analysis of variance (ANOVA) study. Muscle activity levels of the orbicularis oculi, sternocleidomastoid, and trapezius were measured using electromyography (EMG). Participants then answered surveys on their perceived levels of startle for each situation. We show that adjusting audio and environmental levels can alter participants' physiological and psychological responses to the virtual world. Notably, audio is key for eliciting stronger responses and perceptions of the startling experiences, but environmental displays can be used to either amplify or diminish those responses. The results also highlight that traditional eye muscle measurements of startle may not be valid for measuring startle responses to strong environmental displays, suggesting that alternate muscle groups should be used. In practice, these findings will allow designers to control participants' responses by adjusting these settings.

1. Introduction

What startles you? The startle response is an involuntary muscular reaction that protects the body from an unexpected stimulus. Startles are common and desirable in horror movies, video games, and psychology research, but generally undesirable in the VR-based teleoperation of robots and remote systems. Although it is well established that higher audio levels cause greater startle responses [1] and that adding more haptic feedback devices for multi-stimulation in VR generally corresponds to higher realism [2,3], what remains unknown is how the environment (e.g., wind, heat, smell, and moisture) combined with graphics and audio affects a startling situation. This knowledge is important to understand as researchers develop new VR worlds. Hence, the goal of this study was to determine how various aspects of VR and extended reality (XR) can be exploited to cause a good startle or to prevent one.
To address this research, a multi-sensory virtual reality system was created. The goal was to create a system capable of providing natural full-body haptic stimulation mimicking natural environmental effects without interfering with the graphical display of the virtual world. We also wanted to allow users to locomote through the virtual world while experiencing environmental display, which would allow the results of this study to be extended to creating realistic training for first responders, gait therapy sessions for people with Parkinson’s disease [4], strokes, or spinal cord injury [5].
VR headsets and walking in place are one option; however, we elected to modify an existing system to provide this capability. The result is the Multi-Sensory TreadPort Active Wind Tunnel (MS.TPAWT, pronounced Ms. Teapot), a modified version of the TreadPort Active Wind Tunnel (TPAWT). The TreadPort was originally a VR environment that consisted of a large treadmill locomotion interface with a CAVE display [6], which lets users walk around and explore a VR world. A wind display was added via a large wind tunnel built around the TreadPort to create steerable wind that appeared to come from the graphical displays [7]. The TPAWT was modified for this study to include several haptic environmental displays: heat, olfactory, and mist. An overview of the system is presented in Figure 1. Combined, the system created an extended reality (XR) experience that allowed a user to walk through a virtual world while physically experiencing haptic displays portraying environmental aspects of the VR world. This system also allowed us to investigate the effects of environment on warnings provided to users wearing protective robots during physical activity [8], which enabled safe controlled experimental conditions for such experiments, one of the true benefits of virtual reality systems.
We designed three startles (i.e., bird, beam, and thunder) with different levels of visual, audio, and environmental feedback aimed at eliciting a variety of psychological and physiological responses. Some of the startles were surprising and others offered premonition. Some startles were small and discrete, whereas others were large. This range enabled evaluation of how startling events can be designed to elicit a strong startle or to negate the effects of a startle. The former is important for developing VR horror games, psychology studies, and gait therapy, whereas the latter is important for developing VR-based interfaces for teleoperation.
To better understand how environmental stimuli, graphics (i.e., size of startling event), and audio affected responses to the startles, the audio level and environmental display were varied as the participants experienced the different startles. Graphic levels were naturally varied because each startle was visually different. Qualitative startle responses were measured using a survey, and quantitative startle responses were determined by measuring the activation of several muscles associated with the different startles. Video footage provided an evaluation of participant reactions.
The results highlight both expected and unexpected findings. As expected, audio was usually able to elicit a strong startle response, but we found that enabling or disabling the environmental display could have a significant impact on the physiological responses and psychological perception of the startles. Likewise, the findings indicate that physiological and psychological responses do not necessarily agree. In some cases, muscle responses can be used to indicate a startle, whereas in others, particularly the eye, they are a reaction to the environmental display itself; however, this depends on the magnitude of the environmental display. Additionally, we found that visually larger startles elicited stronger responses, both psychologically and physiologically, than smaller startles. When comparing video responses to muscle activation, we found that the eye muscle is not suitable for understanding the effects of environmental displays, generally because the added effects of wind and moisture directed at the face cause extra activation. This suggests that future studies should use other muscle groups when evaluating environmental displays. Lastly, we found that the environment was vital to creating a premonition effect; in our case, even more so than graphics.
This paper therefore makes several contributions. It presents a comprehensive XR system, MS.TPAWT, designed to simulate outdoor virtual environments, which is also useful for training scenarios and therapy. An XR game is presented with several startling events shown to be capable of eliciting a range of startle responses. The subject study demonstrates how multi-sensory display can be leveraged to create or cancel the effects of startling situations. We show that adjusting audio and environmental levels can alter participants' physiological and psychological responses to the virtual world, and that the environment can be used to increase or reduce startle responses depending on how the virtual world is designed and how classical audio and graphical startle techniques are affected by environmental displays. The results also suggest that eye muscle measurements may not be valid for measuring startle response when environmental displays are enabled because strong displays may cause squinting even when startle responses are suppressed. These findings should be useful for designers of virtual worlds and haptic systems aiming to create a startle or to prevent one.

2. Related Work

2.1. Startles, Fear, and Premonition

Startles are widely used in psychology as a physiological means of measuring psychological characteristics [9,10,11,12,13]. The review by Blumenthal et al. [14] provides an excellent overview of the established techniques for eliciting a startle response, as well as the methods used for measuring the response. Eye twitches are one of the oldest metrics used in psychology research. EMG measurements of eye muscle activation are typically used to measure eye muscle activity as an indicator of a response to stimuli [14], although eyeblinks can be measured by cameras or mechanical devices [15], yielding similar results. In this research, we used EMG measurements of muscle activity near the eye, back, and neck, because the different startles were expected to activate muscles in those areas based upon the participant's whole-body physical response (e.g., jumping, looking up, and blinking) to the startles. Our results suggest, however, that the classical eye squint response measurements may not be valid for measuring responses to strong environmental displays because such displays naturally induce squinting.
In psychology research, startle responses are elicited using loud audio, electrical stimulation, magnetic stimulation, light flashes, or mechanical stimulation [14]. One can imagine that some of these techniques may be more desirable for VR. Audio stimulation, the most common technique, consists of a short burst of white noise (~50 ms) played at high sound levels (~100 dBA) to ensure that a startle response is elicited, although research indicates that audio levels as low as 50 dBA can elicit a startle response [16]. Audio is a natural part of a VR experience, but bursts of white noise are not. Hence, researchers must decide if their goal is to elicit a clinically accepted startle response, or if the goal is to evaluate the effect of different startle audio. Clinically accepted white noise audio bursts have been used in VR research to evaluate the effects of conditioning [17], although some do use natural sounds from the environment, such as the sounds of explosions and sirens in VR training simulators [18]. Our results highlight that enabling environmental display appropriately can actually magnify or diminish the startle effect of loud audio.
To drown out background noise (e.g., machinery), noise-cancelling headphones have been used to mask the sounds of apparatus [19]. The role of artificial background noise is debated because background noises in the 65–75 dBA range have been shown to reduce startle responses (i.e., pre-pulse inhibition) while masking the sound of apparatus [20]. In contrast, louder background levels, such as 75 dBA or 85 dBA, have been shown to increase startle responses [21]. In this study, we used noise-cancelling headphones to reduce sounds from the apparatus and then used background sounds that would naturally occur in the virtual world; startle sounds were also natural, corresponding to the event that the user experienced.
Visual stimulation, such as bright lights with increasing intensity [22] or rapidly approaching threatening objects [23], is also used by psychologists to elicit startle reflexes. Psychologists sometimes use still images to evoke fear [24], but videos have been shown to be more effective for generating fear than static images [25]. Similarly, light and darkness have been shown to elicit stress in participants akin to real life as they drive cars through tunnels in VR [26]. VR simulations for emergency responders use graphical representations of a car exploding coupled with the sound of an explosion, instead of bursts of noise [18]. Startles consisting of loud white noise were used in [27] as a user travelled through different parts of a virtual world. Scary games use surprising events, such as falling furniture or ghouls that appear suddenly [28], which are also combined with startle audio characteristic of those phenomena. Our virtual world was most similar to the former in terms of visual startle variety, but our phenomena were typical of real life; we employed a small bird that suddenly fluttered across the display, a beam that fell from the ceiling of a barn, and a large flash of light due to lightning, all of which included audio appropriate to each startle.
A variety of other startle stimulation techniques have been used. Electrical stimulation is achieved by the application of an electrical potential above the threshold of detection and below the threshold of pain [29]. Magnetic stimulation can also be used to elicit eyeblink responses [30]. Air stimulation can elicit startle responses using brief puffs of low-pressure air [31,32], typically applied to the forehead or temporal regions of the head [33]. Mechanical stimulation, such as tapping or ballistic impacts, has also been used [34]. The study presented here used wind as an environmental display, but this is quite different from the air pulses used to generate startles in the abovementioned studies. When enabled, wind was used as a steady haptic display until the thunder startle, whereupon wind was increased as a premonition of an impending storm. Others have used haptic displays, such as vibrating interfaces, to elevate the heartrate in VR, but this does not result in a startle response [35].
The usage of startles in VR research is often focused on psychology. Studies have used startle responses in VR to study phobia reactions [17], the motivational mechanisms of cravings [36], and to improve the effectiveness of post-traumatic stress disorder treatments [37]. These types of studies measure startle responses to evaluate the efficacy of their therapies. Studies using VR to study startle responses have found that users who were startled while involved in a complex task exhibited smaller startle responses [18], and that social anxiety is associated with a greater startle response [38]. Other studies have also used startle responses in VR to study the effects of extinction learning [27], the effectiveness of certain trauma treatment methods [28], and conditioning for phobias [39,40]. In contrast, startles [41] and audio warnings [42] have also been examined as a means of alerting a user to an impending impact so that they can tense in reaction and better protect themselves; although these were not focused on VR, the latter used a treadmill interface similar to the one used in this research in order to deliver warnings at precise instances during a running gait. One of the long-term goals of this research is to augment that work to evaluate the effectiveness of different audio and visual warnings in controlled realistic VR environments as users wear protective gear such as smart helmets [8]. Finally, it is worth noting that not all VR environments promote startles or rely on fears; work by Noronhona used a natural outdoor VR world, as we do, but focused on promoting calmness via peaceful graphical presentations and soothing sounds as a means of therapy [43]. Our VR world featured pleasing natural environments, such as a walk along a river, through the forest, and in the mountains; however, we did not focus on promoting peacefulness, which could be a focus of future research given the advanced haptic displays presented here.
Premonitions created by VR were also a feature of this study. Specifically, the thunder startle was designed to feature graphics and environmental displays which cued the participant that a startle was about to occur. This is typical of extinction learning [44], where repeated exposure to a stimulus is used to reduce the effect of the stimulus. In this case, we relied upon associations of the stimuli with the thunder startle to diminish its impact. Previous VR studies have used imagery of spiders as the unconditioned stimulus [39] and colored light as the conditioned stimulus reducing the fear response. Images of fierce dogs and falling walls have also been used as unconditioned stimuli [24]. The thunder stimulus is fundamental to child development to the point that it is known to have become “extinct through awareness” rather than requiring any specific conditioning [45], which is why it was selected for this study. We have yet to find any papers that deal with environmental display or graphical display in VR as cues (e.g., the conditioned stimuli) for thunder startles. Hence, we believe that this paper highlights advancements regarding the ability of graphical and environmental displays in virtual worlds to leverage these natural premonitions engrained in humans from childhood.
It is important to note that premonition is different from exposure therapy, where a user is repeatedly exposed to a stimulus to reduce its effect, such as the treatment of acrophobia [40]. In fact, this study only presents each stimulus once. Likewise, psychological research focuses on pre-pulse inhibition for a startle, meaning that things that happen moments before the startle can diminish the magnitude of the startle. Pre-pulse tones, for example, can diminish the intensity of startle response to loud white-noise pulses [1], but as indicated by [46], tones that occur 15 to 400 milliseconds before the startle inhibit reactions, whereas longer pre-pulse periods (e.g., 2 s) become ineffective. This is in contrast to a fear-potentiated startle, such as the dark haunted house used in [28] to potentiate startles, which is typical of fear-potentiated VR research. In this study, however, we created a startling situation, a thunderstorm, and examined the effect of an environmental display (wind, mist, heat, and odor) and graphical display (darkening skies) several seconds before the startle to attenuate the startle response instead of potentiating it.

2.2. Simulator Technology

Advances in computer graphics technology in the 1960s and 1970s led to the early development of VR simulators for training pilots [47]. As indicated by [48], contemporary simulators range from a computer screen and joystick to high-fidelity mockups with 6 DOF motion platforms and graphical displays to create realistic experiences [49]. Similar difficulties operating the simulations have been reported across the range of simulators; however, neurological activity is noted to be significantly increased with VR-based simulators. VR simulators provide an improved sense of presence, and many use haptic interactions with remote systems and simulations, leading to extended reality (XR), which is used in applications such as piloting remotely operated vehicles [50,51] and surgical robotics [52]. Haptic feedback is often an important part of this process so that users can better perceive the conditions and interactions of a remote system or simulation. We have not found any simulators that evaluate the effect of environmental display on their training results, although it is common for simulators to make users respond to the effect of adverse environmental conditions, suggesting that environmental displays could add a new dimension to these training scenarios.
To enhance realism in VR, many studies have used sensory feedback systems such as olfactory, heat, and wind. Few, however, combine as many as MS.TPAWT. Examples of olfactory feedback systems include the Lotus system, which uses a directional mist system, and the VE VIREPSE, which uses a fan-based system [53,54]. Both systems differ from ours in that they focus solely on the effects of olfactory feedback.
Warmth has also been used across many types of VR systems to induce a greater feeling of presence. Examples of this being implemented include using several heating/cooling elements to enable a seated user to feel changes in temperature as they explore a space with their hand [55], or through heatpads worn with a mobile headset [56]. Our system allows ambient heat to be felt by the user in a way realistic to sunlight without limiting locomotion of the user.
An example of a wind system is the WindCube [57], which can provide wind from several directions but intrudes heavily on the virtual environment, and thus, the total immersion [58]. Simpler, less intrusive wind systems include the VR Scooter [59] and Sensorama [60], which both use fixed fans hidden from the user to simulate speed. Ambiotherm combines a thermal display with a wind display as an accessory for head-mounted displays [56]. These setups minimize intrusion on the VR experience, but do not allow for multidirectional wind or user movement. Our system created multidirectional wind from a device hidden from the user and allowed for user movement.
Head-mounted displays have become popular commodities for VR research and home applications due to their affordability and portability; however, CAVE display systems are still a common occurrence for research-related VR systems [4,61,62] and commercial locomotion systems [61]. Without locomotive input, they have comparable effectiveness in some scenarios to higher-end mobile displays such as the Samsung Gear VR [62]. However, CAVE displays combined with locomotion systems are very popular for locomotion studies and gait therapy [4,63]. The user can interact with their physical environment, for example, using the railings on a treadmill or via advanced haptic displays that render terrain features [4], slopes [64,65,66,67], or inertial forces [6,68], while also experiencing their virtual world unencumbered. Physical therapists can more easily interact with the users, helping to guide their therapy. Bodyweight support systems [5] allow for safety in applications such as gait therapy for spinal cord rehabilitation, or in simulations of reduced gravity. Tether-based systems also allow for perturbations to be applied during gait [43,69]. Such systems often provide sufficient space for perturbations and for motion capture systems that can be used for characterizing gait properties and controlling interactions with the user. CAVE displays also remove the added weight of a head-mounted display, which would perturb the user's kinematics, allowing the user to move more freely and naturally. Although MS.TPAWT provides all of the above features, only a subset was applied in this research. Future research would expand on the results from this paper to evaluate the effects of other haptic interactions and user experiences.
Most of the related studies described above use up to two haptic feedback devices. MS.TPAWT used four haptic feedback devices coupled with a locomotive input and visual/audio display. These effects are designed to be non-intrusive, allowing the user to freely interact with their environment. The combination of these systems enables the effective study of startling experiences which would be difficult or dangerous were they to be performed without VR.

3. System Description

MS.TPAWT is a VR system that couples graphical and locomotion interfaces with several haptic displays including wind, olfactory, moisture, heat, and audio, as shown in Figure 1. This section further elaborates on the displays and details the game design used in this study.

3.1. CAVE Display and Locomotion

The CAVE display presented a 180° view of the virtual world to the participant. There were four projections: left, right, front, and ground. The first three projections were angled 120° to the front, whereas the ground projection was projected on the white treadmill and its surrounding floorboards [7].
The TreadPort consisted of a treadmill-style locomotion interface and used a robotic tether to display inertial forces for realistic walking [70]. In addition to the tether, participants were attached to a fall arrest harness to prevent falling.

3.2. Environmental Display

The environmental display consisted of wind, olfactory, moisture, heat, and audio. Wind was generated with a Greenheck QEID 33 mixed flow fan [7]; the airflow was split along two channels located on the sides of the system and exited from both sides of the CAVE display [7]. The airflow then followed the curvature of the screens before merging and being redirected towards the participant. In effect, the user felt wind as though it were coming straight from the screen. The angle of wind experienced by the user could be adjusted by changing the ratio of air flow between the two channels. For this study, the wind direction was fixed directly towards the participant.
Two scents were administered in this study: an ambient lavender scent and a rain scent used in situations when the participant encountered water. The VR environment consisted of forested areas and flowers; therefore, we used a lavender scent developed by P&J Trading. The oil was applied to a cloth and placed on the exit of the wind system which dispersed the scent. Air collection charcoal filters were installed at the back of MS.TPAWT, which prevented the participants from being exposed to other scents. For the rain scent, a Numatics 236-102B solenoid control valve released pressurized air which flowed over a rain-scented (P&J Trading) cotton ball and exited an SUF1 air atomizing nozzle (Spray Systems Co., Glendale Heights, IL, USA) in front of the user [71].
Moisture was created by spraying water particles into the wind just in front of the participant. The wind then carried the moisture directly in front of them. The system was identical to the olfactory system, except that it was attached to a water source.
To simulate sun exposure, heat was provided overhead by an RPH-208-A Infrared Heater (Fostoria). It was turned on when the user was in sunlight and turned off when the user was not. Lastly, we used Cowin E7 active noise-cancelling Bluetooth headphones to provide audio. These headphones can produce volumes above the game's sound levels, which were 95 dB SPL or lower.

3.3. VR Game

Participants experienced a series of startle events that were each designed to elicit a variety of responses. There were three startles: a bird flying in front of the participant (bird), a beam crashing down in front of the participant (beam), and a thunder strike that landed in front of them (thunder). Images for each startle are presented in Figure 2, Figure 3 and Figure 4, respectively.
The bird startle was designed for participants to track a graphically subtle but fast object moving past them. In contrast, the beam startle was designed to be large and easily noticed. The thunder startle incorporated the environment as part of its design. Before the thunder strike, the heat was turned off, wind speed was increased, and mist and rain scents were activated. The thunder then struck in front of them, creating a loud noise.
At the start of the game, participants started on top of a hill near a brick house. They ventured downhill, following a dirt path, with occasional markers to indicate the way. After some time, they crossed through a shallow river, during which, if environmental effects were on, olfactory and moisture systems were activated. This experience took approximately 5 min and was intended for users to become immersed in the game.
After traversing the river, participants crossed a bridge leading to a farm. A few steps past the bridge, the first startle was activated and a blue bird flew from the bottom right of the display and out to the left. Participants continued along the marked path through the farmland for 2–3 min. Eventually, they entered a barn, triggering the second startle, in which the beam crashed in front of them, creating a loud sound. After exiting the barn and continuing along the path for an additional 2–3 min, the thunder startle was activated. The system then returned to normal and the participants walked for 1–2 min before the game was ended by the researcher.
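As an illustration of this sequencing, the sketch below shows one way the trigger logic could be organized in code; it is a minimal hypothetical example rather than the actual game implementation, and the trigger names, effect lists, and controller interface are assumptions based on the description above.

```python
# Hypothetical sketch of the startle/effect sequencing described above.
# Trigger names, effect lists, and the controller functions are illustrative
# assumptions, not the actual game implementation.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class GameEvent:
    trigger: str                                        # world location that activates the event
    effects: List[str] = field(default_factory=list)    # environmental displays (used only if Env On)
    startle: Optional[str] = None                       # startle stimulus, if any

SEQUENCE = [
    GameEvent("river_crossing", effects=["rain_scent", "mist"]),
    GameEvent("past_bridge", startle="bird"),           # small visual object, fluttering audio
    GameEvent("enter_barn", startle="beam"),            # large visual object, loud crash
    GameEvent("pre_storm", effects=["wind_increase", "heat_off", "rain_scent", "mist"]),
    GameEvent("storm", startle="thunder"),              # bright flash followed by thunderclap
]

def on_trigger(name: str, env_on: bool, audio_level: str) -> None:
    """Activate environmental displays (if enabled) and fire any startle at a trigger."""
    for event in SEQUENCE:
        if event.trigger != name:
            continue
        if env_on:
            for effect in event.effects:
                print(f"activate environmental display: {effect}")
        if event.startle is not None:
            print(f"fire startle '{event.startle}' at audio level '{audio_level}'")

# Example: a participant in the Env On / Aud High condition enters the barn.
on_trigger("enter_barn", env_on=True, audio_level="Aud High")
```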

4. Methods and Procedures

The experimental design is detailed in this section. We discuss the environment and startles, statistical design, measures, and participants. The study was completed with University of Utah IRB_00100544 approval.

4.1. Participants

This study enlisted 120 participants (77 male and 43 female) with a mean age of 23 years (range: 18–49 years). USD 10 gift cards were given as compensation for their time.

4.2. Design

To understand how environmental and audio stimuli affect startle response, we varied the levels of each and analyzed participants' responses with a two-way ANOVA. The environment was either on or off, and audio levels were varied between off, medium, and high. All configurations are presented in Table 1. Participants were randomly assigned to a configuration upon arrival.
To understand the effects of graphics on participant response, we performed a one-way ANOVA across the startles, which had varying levels of graphics. Participants with the environment enabled were excluded from this analysis to avoid any possible interaction.
To understand the effects of the environment on premonition, we compared the beam and thunder startles, because these both contained large visual elements but had varying levels of premonition. Two-way ANOVA was used to analyze the main effects of environment and premonition, as well as the interactions between them.
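As a minimal sketch of the 2 (environment) × 3 (audio) between-subjects design, the following example enumerates the six configurations and assigns participants to them at random; the condition labels and the balanced assignment procedure are assumptions for illustration, not the study's actual randomization code.

```python
# Sketch of the 2 (environment) x 3 (audio) between-subjects design.
# The condition labels and the balanced random assignment are illustrative
# assumptions, not the study's actual randomization procedure.
import itertools
import random

ENVIRONMENT = ["Env Off", "Env On"]
AUDIO = ["Aud Off", "Aud Med", "Aud High"]
CONDITIONS = list(itertools.product(ENVIRONMENT, AUDIO))  # six configurations (cf. Table 1)

def assign_participants(n_participants: int, seed: int = 0) -> dict:
    """Randomly assign participants to the six configurations, roughly balanced."""
    rng = random.Random(seed)
    slots = CONDITIONS * (n_participants // len(CONDITIONS))
    rng.shuffle(slots)
    return dict(enumerate(slots, start=1))  # participant id -> (environment, audio)

assignment = assign_participants(120)       # 120 participants, 20 per configuration
```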

4.3. Measures

Two measures were used in the study: EMG and survey. EMG was captured using a Trigno Avanti Platform. The sensors were placed on muscle groups associated with startle activation. These included the orbicularis oculi (eye), sternocleidomastoid (neck), and trapezius (back) muscles. The maximum voluntary contraction (MVC) was measured and used to normalize the EMG data. EMG data were passed through a 50 Hz high pass filter, after which a moving RMS envelope of 100 samples was calculated. EMG data affected by equipment malfunctions were removed from the analysis. Values falling outside of three standard deviations were also removed.
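The sketch below shows one way the EMG processing steps above could be implemented (50 Hz high-pass filtering, a 100-sample moving RMS envelope, MVC normalization, and removal of values beyond three standard deviations). The sampling rate, filter order, and the choice of a Butterworth filter are assumptions for illustration; the text specifies only the cutoff frequency and the RMS window length.

```python
# Sketch of the EMG processing pipeline described above. Assumptions: ~2 kHz
# sampling rate and a 4th-order Butterworth high-pass filter; the text specifies
# only the 50 Hz cutoff, the 100-sample moving RMS window, MVC normalization,
# and removal of values beyond three standard deviations.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 2000          # sampling rate in Hz (assumed)
RMS_WINDOW = 100   # moving RMS window length in samples

def process_emg(raw: np.ndarray, mvc: float) -> np.ndarray:
    """Return the moving-RMS envelope of a raw EMG trace, normalized to %MVC."""
    b, a = butter(4, 50, btype="highpass", fs=FS)   # 50 Hz high-pass filter
    filtered = filtfilt(b, a, raw)
    kernel = np.ones(RMS_WINDOW) / RMS_WINDOW
    envelope = np.sqrt(np.convolve(filtered ** 2, kernel, mode="same"))
    return 100.0 * envelope / mvc                   # express as %MVC

def remove_outliers(values: np.ndarray) -> np.ndarray:
    """Drop observations more than three standard deviations from the mean."""
    return values[np.abs(values - values.mean()) <= 3 * values.std()]
```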
The survey assessed the user's perception of each startle. A standard 5-point visual analog scale anchored by "strongly agree" and "strongly disagree" was used for all questions, with scores of "5" and "1" corresponding to the two respective extremes. A score of "3" was neutral, but not labelled as such. The following questions were asked:
Q1. When the bird flew in front of you, did you feel startled?
Q2. When the beam fell in front of you in the barn, did you feel startled?
Q3. When the lightning hit the ground, did you feel startled?

5. Results

This section explains the roles of audio, environment, and graphics on perception and muscle activation in the different scenarios. For each scenario, muscle activation and participant perception were also compared. The results are presented in five subsections. The first three analyze the effects of environment and audio for the bird, beam, and thunder startles separately. The last two subsections analyze the effect of graphics and the effects of environment and premonition on participant responses. EMG and survey data were analyzed using two-way ANOVA. Follow-up tests were conducted using the Tukey–Kramer procedure. We used a standard significance level of p < 0.05 and considered notable effects to be in the range of 0.05 ≤ p < 0.09.
EMG data were slightly skewed; therefore, a Box–Cox transform was applied before analyses [72]. Normality was verified using the Kolmogorov–Smirnov test, validating the use of ANOVA. Statistics are reported using the transformed data, whereas effect sizes are reported using the non-transformed EMG data. Because the units of muscle activation are themselves percentages, reporting effect sizes could be ambiguous; therefore, two metrics were used. The first was the %MVC increase, which was the true effect size between two groups. The other was the percentage increase between the two groups. For example, if group 1 had 10 %MVC and group 2 had 50 %MVC, there was a 40 %MVC increase and an overall 400% increase in muscle activation between the two groups.
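The analysis pipeline described above can be sketched as follows (Box–Cox transform, Kolmogorov–Smirnov normality check, two-way ANOVA with Tukey–Kramer follow-up, and the two effect-size metrics); the column names and the specific scipy/statsmodels calls are our assumptions rather than the authors' analysis code.

```python
# Sketch of the analysis pipeline: Box-Cox transform, Kolmogorov-Smirnov
# normality check, two-way ANOVA with Tukey follow-up, and the two effect-size
# metrics. Column names ("emg", "audio", "environment") are illustrative.
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm
from statsmodels.stats.multicomp import pairwise_tukeyhsd

def analyze(df: pd.DataFrame) -> None:
    # Box-Cox requires strictly positive data.
    df = df[df["emg"] > 0].copy()
    df["emg_bc"], _ = stats.boxcox(df["emg"])

    # Normality check against a normal distribution with the sample's mean and SD.
    z = (df["emg_bc"] - df["emg_bc"].mean()) / df["emg_bc"].std()
    print(stats.kstest(z, "norm"))

    # Two-way ANOVA: main effects of audio and environment plus their interaction.
    model = smf.ols("emg_bc ~ C(audio) * C(environment)", data=df).fit()
    print(anova_lm(model, typ=2))

    # Follow-up pairwise comparisons across audio levels (Tukey-Kramer handles unequal n).
    print(pairwise_tukeyhsd(df["emg_bc"], df["audio"]))

def effect_sizes(group1: np.ndarray, group2: np.ndarray) -> tuple:
    """Return (%MVC increase, percentage increase) between two groups of %MVC values."""
    diff = group2.mean() - group1.mean()    # %MVC increase (true effect size)
    pct = 100.0 * diff / group1.mean()      # relative increase: 10 -> 50 %MVC is 400%
    return diff, pct
```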

5.1. Bird Startle

The bird startle was designed with a small visual focus and larger auditory focus. The startle took place in an outdoor setting with wind and heat activated when the environment was turned on. The bird was a small graphical object flittering across the screen, accompanied by the sound of wings fluttering when the startle audio was enabled (i.e., Aud Med and Aud High). During the bird startle, participants generally moved their neck to track the bird moving across the screen and closed their eyes when the bird flew close to them. The results are presented in Table 2 and Figure 5.
Audio had a significant effect on how startled participants felt, as per survey Q1. Between Aud Off and Aud High, participants responded with an average 1.88-point (115%) increase (t(119) = 7.32, p < 0.001), and Aud Med produced a similar average 1.58-point (93%) increase (t(119) = 5.76, p < 0.001). Both indicated that the startle audio made perceptions of the event more startling. In contrast, turning the environment on or off had no statistically significant effect on participants' feelings towards the startle.
Likewise, audio had a significant effect on neck and eye activation. Between Aud Off and Aud High, there was an average 13.5% MVC (161%) increase in their neck (t(112) = 2.54, p = 0.033), and an 8.21% MVC (85%) increase in their eye (t(112) = 3.17, p = 0.006) activation. Many participants did not even notice the bird with Aud Off (i.e., background noise only). Bird startle audio was clearly important for eliciting a response.
There was an obvious pattern in the EMG data and survey data. As the confidence interval plots indicate, the general trend is that Aud Off resulted in the lowest % MVC and survey scores, regardless of the environment settings. Aud High generally resulted in significantly higher values when compared with Aud Off, whereas responses to Aud Med were generally somewhere in between. Again, this highlights the importance of audio with a small graphical object.

5.2. Beam Startle

The beam startle had an auditory and visual focus. The beam had a large graphical representation coupled with a loud crashing sound as the beam fell, which was only present with Aud Med. and Aud High. Participants with environment on felt wind and heat before entering the barn where the startle occurred, but the environmental effects were identical to those provided by the bird. During the startle, participants typically responded with a backwards flinch, involving a shrugging motion of the shoulders (back activation), looking up at the beam falling in front of them (neck activation), and a widening of the eyes (eye activation). The results are presented in Figure 6 and Table 3.
Audio had a significant effect on the perception of startles, as indicated by Q2. The loud startle sounds, Aud High, resulted in an average 1.62-point (65%) increase compared with Aud Off (t(119) = 6.39, p < 0.001), whereas Aud Med. resulted in an average 1.33-point (47%) increase compared with Aud Off (t(119) = 5.21, p < 0.001). Again, the environmental display did not have a significant impact on the startle perception.
Audio levels had a statistically significant effect on EMG activity, demonstrating significantly increased muscle activity with the addition of Aud Med and Aud High. Between Aud Off and Aud Med, there was a 30.2% MVC (355%) average increase in neck activation (t(104) = 4.17, p < 0.001), an 11.48% MVC (127%) average increase in back activation (t(112) = 2.41, p = 0.045), and a 13.3% MVC (97%) average increase in eye activation (t(111) = 3.12, p = 0.006). Between Aud Off and Aud High, there was a 30.5% MVC (337%) average increase in neck activation (t(104) = 3.99, p < 0.001), a 14.29% MVC (158%) average increase in back activation (t(112) = 3.36, p = 0.003), and a 12.25% MVC (89%) average increase in eye activation (t(111) = 3.01, p = 0.009).
Neck muscle responses also exhibited a significant environmental effect. Enabling the environmental display significantly increased neck activation by an average 23.8% MVC (129%) (t(104) = 2.66, p = 0.009).
Similar to the bird startle, there was a notable pattern in muscle activation and questionnaire responses. Generally, Aud Off resulted in the lowest % MVC and questionnaire response, whereas Aud Med and Aud High generally resulted in significantly higher levels. This highlights the importance of startle audio (Aud Med and Aud High) with the beam startle.

5.3. Thunder Startle

The thunder startle had a visual and auditory focus, but with a focus on creating premonition a few seconds before the startle. All participants experienced dark grey clouds overhead, and reduced sunlight. Those with environment enabled experienced increased wind speed and the heat was turned off; rain scent and mist were activated. The startle then consisted of flashing the screen brightly to simulate lightning, followed by a loud thunderclap. During the startle, participants typically responded by stiffening their neck, looking away from the screen, and squinting their eyes when the lightning and thunder struck. Confidence interval plots and ANOVA results are shown in Figure 7 and Table 4, respectively.
In contrast to the bird and beam startles, audio did not have a statistically significant effect on startle level (Q3), although environment did have a significant effect. Those with environment on reported an average 0.68-point (20.8%) decrease in startle perception compared with those without, reducing the questionnaire results from an average of 3.28 with environment off to 2.6 with environment on (t(119) = 2.87, p = 0.004). These results indicate that enabling the environmental display reduced how startled participants felt, regardless of audio levels.
Audio had a significant effect on neck muscle activation. Between Aud Off and Aud Med, there was an average 9.61% MVC (71%) increase (t(112) = 2.54, p = 0.033). Increased audio had an impact on the stiffening of the neck and looking away from the screen during the startle.
Environment had a notable effect on eye muscle activation. There was an average 8.80% MVC (29.9%) increase in activation when the environment was enabled (t(112) = 1.91, p = 0.058). The participants squinted their eyes more when the environment was enabled. This could have been a result of the startle itself, but the increased wind and the mist likely resulted in a natural physical response to such conditions.
There was a notable decrease in back activation when turning the environment on (t(113) = 1.80, p = 0.074). Back EMG decreased an average of 5.5% MVC (29.6%) with the environment enabled, which highlights that participants shrugged less or jumped back less when it was enabled. This correlated well with survey data indicating that the environmental display resulted in the event being less startling, which resulted in a reduced physical response to the startle.
Patterns in EMG and survey data were more varied in the thunder startle. Neck EMG, which was the only metric to show significance for audio levels, was also the only metric to show the previously mentioned pattern where audio resulted in low activation for Aud Off and significantly higher activation for startle audio (Aud Med. and Aud High). No other metrics demonstrated this trend. This could suggest that the neck response is more of a subconscious physiological startle response, which may be more reliable than the eye squint response in the presence of an environmental display for measuring startle responses.
In fact, audio made no noticeable difference in startle survey data when the environment was enabled, suggesting that the audio made no difference psychologically to the user. EMG data indicated a similar trend, where back, neck, and eye responses generally exhibited blunted responses to Aud High when the environment was on. This lack of audio significance in the neck, back, and eye responses, as well as lack of audio significance in perceptions of the startle, suggest that enabling the environmental display in fact made the startle less effective. This is believed to be due to the premonition that the environmental display created.
An interesting pattern was noted when the environment was on or off. Specifically, when the environment is off, as audio is increased, we observed gradual increases in neck EMG, eye EMG, and startle survey data with each successive level. Otherwise, enabling the environment resulted in lower startle responses compared with environment off. Combined, these results indicate that increased audio yields increased startle responses (both physical and psychological), but that enabling environment provided a premonition that resulted in a lower startle response.

5.4. Graphics

One-way ANOVA was used across startles to measure the effects of varying graphic levels, as shown in Table 5 and Figure 8. The bird was designed to be a small graphical startle, Vis Small; the beam was designed to be a large graphical startle, Vis Large; and the thunder was designed to be a large graphical startle with premonition (the sky darkened before the thunder strike), Vis Large w/Premonition. The bird and beam were both startles with objects of varying size moving towards the user, whereas thunder was a bright flash of light. Categories with the environment enabled were removed from the analysis to avoid any possible interaction; hence, participant responses were truly dependent on graphics. Lastly, only eye muscle activation was analyzed for EMG, because this muscle is the most receptive to visual stimulation.
Visual levels had a significant effect on perceptions of startle. Between Vis Small (bird) and Vis Large (beam), there was an average 0.82-point (28%) increase (t(179) = 3.31, p = 0.002). Similarly, visual levels also had a significant effect on eye muscle activation. Between Vis Small (bird) and Vis Large (beam), there was a 5.76% MVC (40%) increase (t(186) = 2.46, p = 0.036). Additionally, between Vis Small (bird) and Vis Large w/Prem (thunder), there was a 6.20% MVC (43%) increase (t(186) = 3.37, p = 0.002). This generally suggests that larger graphical objects are more startling.
The effects of visual premonition can be compared between the beam and thunder startles, because the beam suddenly appeared whereas the thunder was preceded by darkening skies. Perceived startle decreased by 0.46 points (14%), although this was technically not quite notable (t(179) = 1.05, p = 0.14; Table 5). This suggests that visual premonition may decrease the effect of visually startling events.

5.5. Premonition and Environment

The results presented in Section 5.3 and Section 5.4 hinted that premonition caused by environmental effects decreased participant muscle activation and perception. This section aims to solidify that finding. Levels of graphics were kept constant by comparing the beam and thunder startles, because these both contained large visual elements but had varying levels of premonition. Two-way ANOVA was used to analyze the main effects of the environment and premonition, as well as the interactions between them. As noted in Section 5.3, eye muscle activation was likely caused by wind and mist blowing into the participants' faces when the environment was active; thus, we excluded it from the analyses.
Results of the two-way ANOVA indicated that there was a significant interaction between premonition and environment on participant perception of how startling the event was, as shown in Table 6 and Figure 9, i.e., participants' responses to premonition, or lack thereof, changed depending on whether the environment was activated. For the startle with premonition (thunder), participants on average felt less startled with Env On compared with Env Off, by 0.68 points (26%) (t(239) = 2.84, p = 0.024). In contrast, the environment had no significant effect on the startle without premonition (beam). The thunder startle had the same level of visual premonition across environment activation, which suggests that the added effect of the environment truly decreased perceptions of how startling the event was. This is further supported by the results presented in Section 5.4, where it is reported that the visual element of premonition did not have a significant effect on perception.
Somewhat similar results are reflected in neck muscle activation. For the startle without premonition, there was a notable increase in neck activation when turning the environment on: 23.8% MVC (129%), (t(113) = 1.80, p = 0.074). This is in line with the previous results exploring the effects of Env on the beam startle, although the drop from significant to notably significant was most likely caused by changing the main effects of the analysis. In contrast, the environment had no significant effect for the startle with premonition (thunder). The above results suggest that Env can be used to either amplify a response or blunt it. For startles without premonition, turning on the environment can be used to amplify neck muscle activations. For startles with premonition, the environment can be used to blunt the perception of how startling the event is.

6. Discussion

The goal of the startles was to elicit a variety of psychological and physiological responses. The bird startle was designed for participants to track a graphically subtle, fast, and potentially dangerous object moving past them, but without any dependence on the environment. As the EMG responses show, participants used their neck and eyes to follow the bird. As expected, startle audio was important for making the event startling. Startle perception was increased with startle audio. Startle audio significantly increased EMG activity in the eye and neck as the user noticed the bird. As expected, environment had little effect.
The beam was intended to be a large visual startle that is easily noticed, again with no dependence on environmental display. As expected, startle audio was important to increasing how startling the beam was and for increasing the EMG response. We had expected that the large graphical appearance would reduce the change in effect size, but again, startle audio created large increases in startle effects. As EMG responses show, participants jumped backwards, shrugged, and stiffened their neck significantly more when startle audio was provided. Startle audio correlated to participants widening their eyes more as the beam fell, which contrasted with the bird and thunder startles, where participants instead closed their eyes. Surprisingly, enabling the environment significantly increased neck activation, meaning that the participants looked upwards significantly more with the environmental display, which was not expected. Although startle audio was important for eliciting a startle response, adding the environment had a significant effect on muscle activation, which notably compounded the effectiveness of startle audio. The environmental display did not impact perceptions of the startle; however, it had a clear impact on the physical response to the startle, indicating that audio and environment combined can be used to amplify responses.
Trends for the thunder startle were quite different from those for the bird and the beam, because the goal was to use the environment and graphics to create premonition and reduce startle. The thunder startle had a full-screen graphical effect, darkening followed by a bright flash, combined with loud audio, but the environmental display played an active role. Wind increased, scent and moisture were sprayed, and the heat lamp turned off before the thunder struck. We had anticipated that the use of environment would cue the participant that a startle would occur. We expected participants to be less startled with the environment and more startled with louder startle audio. This would correspond to them closing their eyes and jumping back less with the environment on and more with audio. Instead, participants reacted by squinting their eyes and looking away. As expected, turning the environment on decreased the perceptions of how startling it was. The environmental display amplified the effect of the full-screen graphical display, and warned the participant that something was coming. Surprisingly, the premonition effect was so significant with the environment enabled that the startle audio had little effect on the perceptions of the startle. This was reinforced by the reduced back responses with environment enabled, meaning that the participants jumped less. The environment did increase eye activity, but this was believed to be due to squinting in response to the mist and increased wind speeds, because it contradicted the survey results. It further suggests that the classical eye squint measurements for assessing startle responses may not be valid with strong environmental effects. The startle audio still had a significant impact on neck activity, meaning that the startle audio still caused them to look up more aggressively. Overall, the results highlight that environmental displays can be used to help participants better anticipate startling events, although the natural neck response to the startle audio remains.
Future work could focus on evaluating the subject study using a head-mounted display (HMD). HMDs are very popular in the VR community, but they may interfere with environmental displays. The advantage of the CAVE display used by MS.TPAWT is that it allows a user to view the VR world and experience full-body and full-face contact with the environmental displays. Such interfaces are used in systems with locomotion interfaces enabling large workspaces, such as MS.TPAWT or CAREN [61], teleoperation [73], or theme park entertainment systems [74], whereas HMDs are more common in personal entertainment or gaming VR interfaces.

7. Conclusions

These results have shown that extended reality created by environmental displays can be quite effective for altering the responses of participants and their perceptions of virtual worlds. Without doubt, audio is key for eliciting stronger responses and perceptions of startling experiences, but environmental displays can be used to either amplify or diminish those responses. Environmental displays can be used to create premonitions of startling situations, which can diminish startle responses, both physiologically and psychologically. They can also be used to either increase or decrease physiological responses to surprising events, depending on the type and sequence of events. In conclusion, we can control the response to a startling situation by altering environmental displays and audio. These findings suggest that environmental displays could be investigated as a means of improving awareness during the teleoperation of remote systems.

Author Contributions

Conceptualization, N.G.L., B.R.S. and C.A.G.; methodology, N.G.L., B.R.S. and C.A.G.; software, B.R.S. and T.E.T.; validation, T.E.T., D.D.G., A.B.M.Z. and E.R.E.; formal analysis, T.E.T. and E.R.E.; investigation, T.E.T., A.B.M.Z., D.D.G. and E.R.E.; resources, T.E.T.; data curation, T.E.T., E.R.E. and D.D.G.; writing—original draft preparation, T.E.T., M.A.M., E.R.E. and T.J.H.; writing—review and editing, T.E.T. and M.A.M.; visualization, T.E.T. and E.R.E.; supervision, M.A.M.; project administration, T.E.T.; funding acquisition, M.A.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the US National Science Foundation, grant numbers 1162617 and 1622741.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Institutional Review Board of the University of Utah (IRB_00100544).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Not applicable.

Acknowledgments

The authors would like to acknowledge the Titan graphics cards donated by NVIDIA Corporation and motor control hardware from Advanced Motion Controls that helped make this research possible.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Blumenthal, T.D. Inhibition of the human startle response is affected by both prepulse intensity and eliciting stimulus intensity. Biol. Psychol. 1996, 44, 85–104. [Google Scholar] [CrossRef]
  2. Fröhlich, J.; Wachsmuth, I. The Visual, the Auditory and the Haptic—A User Study on Combining Modalities in Virtual Worlds. In International Conference on Virtual, Augmented and Mixed Reality; Springer: Berlin/Heidelberg, Germany, 2013. [Google Scholar]
  3. Dinh, H.; Walker, N.; Hodges, L.; Song, C.; Kobayashi, A. Evaluating the importance of multi-sensory input on memory and the sense of presence in virtual environments. In Proceedings of the IEEE Virtual Reality (Cat. No. 99CB36316), Houston, TX, USA, 13–17 March 1999. [Google Scholar]
  4. Wang, Y.; Truong, T.E.; Chesebrough, S.W.; Willemsen, P.; Foreman, K.B.; Merryweather, A.S.; Hollerbach, J.M.; Minor, M.A. Augmenting Virtual Reality Terrain Display with Smart Shoe Physical Rendering: A Pilot Study. IEEE Trans. Haptics 2021, 14, 174–187. [Google Scholar] [CrossRef] [PubMed]
  5. Sabetian, P.; Hollerbach, J.M. A 3 wire body weight support system for a large treadmill. In Proceedings of the IEEE International Conference on Robotics and Automation, Singapore, 29 May–3 June 2017; pp. 498–503. [Google Scholar]
  6. Christensen, R.R.; Hollerbach, J.M.; Xu, Y.; Meek, S.G. Inertial-force feedback for the Treadport locomotion interface. Presence Teleoperators Virtual Environ. 2000, 9, 1–14. [Google Scholar] [CrossRef]
  7. Kulkarni, S.D.; Fisher, C.J.; Lefler, P.; Desai, A.; Chakravarthy, S.; Pardyjak, E.R.; Minor, M.A.; Hollerbach, J.M. A Full Body Steerable Wind Display for a Locomotion Interface. IEEE Trans. Vis. Comput. Graph. 2015, 21, 1146–1159. [Google Scholar] [CrossRef] [PubMed]
  8. Aston, J.P.; Benko, N.; Truong, T.; Zaki, A.; Olsen, N.; Eshete, E.; Luttmer, N.G.; Coats, B.; Minor, M.A. Optimization of a Soft Robotic Bladder Array for Dissipating High Impact Loads: An Initial Study in Designing a Smart Helmet. In Proceedings of the 2020 3rd IEEE International Conference on Soft Robotics (RoboSoft), New Haven, CT, USA, 15 May—15 July 2020; p. 8. [Google Scholar]
  9. Bach, D.R.; Melinscak, F. Psychophysiological modelling and the measurement of fear conditioning. Behav. Res. Ther. 2020, 127, 103576. [Google Scholar] [CrossRef]
  10. Colvonen, P.J.; Straus, L.D.; Acheson, D.; Gehrman, P. A Review of the Relationship Between Emotional Learning and Memory, Sleep, and PTSD. Curr. Psychiatry Rep. 2019, 21, 2. [Google Scholar] [CrossRef] [PubMed]
  11. Frumento, S.; Menicucci, D.; Hitchcott, P.K.; Zaccaro, A.; Gemignani, A. Systematic Review of Studies on Subliminal Exposure to Phobic Stimuli: Integrating Therapeutic Models for Specific Phobias. Front. Neurosci. 2021, 15, 654170. [Google Scholar] [CrossRef]
  12. Hyde, J.; Ryan, K.M.; Waters, A.M. Psychophysiological Markers of Fear and Anxiety. Curr. Psychiatry Rep. 2019, 21, 56. [Google Scholar] [CrossRef]
  13. Presseller, E.K.; Patarinski, A.G.G.; Fan, S.C.; Lampe, E.W.; Juarascio, A.S. Sensor technology in eating disorders research: A systematic review. Int. J. Eat. Disord. 2022, 55, 573–624. [Google Scholar] [CrossRef]
  14. Blumenthal, T.D.; Cuthbert, B.N.; Filion, D.L.; Hackley, S.; Lipp, O.V.; Van Boxtel, A. Committee report: Guidelines for human startle eyeblink electromyographic studies. Psychophysiology 2005, 42, 1–15. [Google Scholar] [CrossRef]
  15. Clarkson, M.G.; Keith Berg, W. Bioelectric and Potentiometric Measures of Eyeblink Amplitude in Reflex Modification Paradigms. Psychophysiology 1984, 21, 237–241. [Google Scholar] [CrossRef] [PubMed]
  16. Blumenthal, T.D.; Goode, C.T. The Startle Eyeblink Response to Low Intensity Acoustic Stimuli. Psychophysiology 1991, 28, 296–306. [Google Scholar] [CrossRef] [PubMed]
  17. Mühlberger, A.; Bülthoff, H.; Wiedemann, G.; Pauli, P. Virtual Reality for the Psychophysiological Assessment of Phobic Fear: Responses During Virtual Tunnel Driving. Psychol. Assess. 2007, 19, 340–346. [Google Scholar] [CrossRef] [PubMed]
  18. Haus, M.; Rooney, C.; Barnett, J.; Westley, D.; Wong, B.L. Evaluating the Effect of Startling and Surprising Events in Immersive Training Systems for Emergency Response. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Boston, MA, USA, 22–26 October 2012. [Google Scholar]
  19. Miller, M.W.; Curtin, J.J.; Patrick, C.J. A startle-probe methodology for investigating the effects of active avoidance on negative emotional reactivity. Biol. Psychol. 1999, 50, 235–257. [Google Scholar] [CrossRef]
  20. Blumenthal, T.D. Startle modification: Implications for neuroscience, cognitive science, and clinical science. In Short Lead Interval Startle Modification; Cambridge University Press: Cambridge, UK, 1999; pp. 51–71. [Google Scholar]
  21. Yamasaki, K.; Miyata, Y. Facilitation of human startle eyeblink responses by pure-tone background stimulation. Jpn. Psychol. Res. 1982, 24, 161–164. [Google Scholar] [CrossRef]
  22. Evinger, C.; Manning, K.A. Pattern of extraocular muscle activation during reflex blinking. Exp. Brain Res. 1993, 92, 502–506. [Google Scholar] [CrossRef]
  23. Hackley, S.A.; Boelhouwer, A.J.W. The more or less startling effects of weak prestimulation—revisited: Prepulse modulation of multicomponent blink reflexes. In Attention and Orienting: Sensory and Motivational Processes; Lawrence Erlbaum Associates Publishers: Mahwah, NJ, USA, 1997; pp. 205–227. [Google Scholar]
  24. Quezada-Scholz, V.E.; Laborda, M.A.; San Martín, C.; Miguez, G.; Alfaro, F.; Mallea, J.; Díaz, F. Cued fear conditioning in humans using immersive Virtual Reality. Learn. Motiv. 2022, 78, 101803. [Google Scholar] [CrossRef]
  25. Courtney, C.G.; Dawson, M.E.; Schell, A.M.; Iyer, A.; Parsons, T.D. Better than the real thing: Eliciting fear with moving and static computer-generated stimuli. Int. J. Psychophysiol. 2010, 78, 107–114. [Google Scholar] [CrossRef]
  26. Mühlberger, A.; Wieser, M.J.; Pauli, P. Darkness-enhanced startle responses in ecologically valid environments: A virtual tunnel driving experiment. Biol. Psychol. 2008, 77, 47–52. [Google Scholar] [CrossRef]
  27. Alvarez, R.P.; Johnson, L.; Grillon, C. Contextual-specificity of short-delay extinction in humans: Renewal of fear-potentiated startle in a virtual environment. Learn. Mem. 2007, 14, 247–253. [Google Scholar] [CrossRef]
  28. Cuperus, A.A.; Laken, M.; van den Hout, M.A.; Engelhard, I.M. Degrading emotional memories induced by a virtual reality paradigm. J. Behav. Ther. Exp. Psychiatry 2016, 52, 45–50. [Google Scholar] [CrossRef]
  29. Gandiglio, G.; Fra, L. Further observations on facial reflexes. J. Neurol. Sci. 1967, 5, 273–285. [Google Scholar] [CrossRef]
  30. Bischoff, C.; Liscic, R.; Meyer, B.U.; Machetanz, J.; Conrad, B. Magnetically elicited blink reflex: An alternative to conventional electrical stimulation. Electromyogr. Clin. Neurophysiol. 1993, 33, 265–269. [Google Scholar]
  31. Lissek, S.; Baas, J.M.P.; Pine, D.S.; Orme, K.; Dvir, S.; Nugent, M.; Rosenberger, E.; Rawson, E.; Grillon, C. Airpuff startle probes: An efficacious and less aversive alternative to white-noise. Biol. Psychol. 2005, 68, 283–297. [Google Scholar] [CrossRef]
  32. Berg, W.K.; Balaban, M.T. Startle elicitation: Stimulus parameters, recording techniques, and quantification. In Startle Modification: Implications for Neuroscience, Cognitive Science, and Clinical Science; Cambridge University Press: Cambridge, UK, 1999; pp. 21–50. [Google Scholar]
  33. Grillon, C.; Ameli, R. Effects of threat and safety signals on startle during anticipation of aversive shocks, sounds, or airblasts. J. Psychophysiol. 1998, 12, 329–337. [Google Scholar]
  34. Beise, R.D.; Kohllöffel, L.U.E.; Claus, D. Blink reflex induced by controlled, ballistic mechanical impacts. Muscle Nerve 1999, 22, 443–448. [Google Scholar] [CrossRef]
  35. Ueoka, R.; AlMutawa, A.; Katsuki, H. Emotion hacking VR (EH-VR): Amplifying scary VR experience by accelerating real heart rate using false vibrotactile biofeedback. In Proceedings of the SA 2016—SIGGRAPH ASIA 2016 Emerging Technologies, Macao, China, 5–8 December 2016. [Google Scholar]
  36. Munoz, M.A.; Idrissi, S.; Sanchez-Barrera, M.B.; Fernandez-Santaella, M.C.; Vila, J. Tobacco craving and eyeblink startle modulation using 3D immersive environments: A pilot study. Psychol. Addict. Behav. 2013, 27, 243–248. [Google Scholar] [CrossRef]
  37. Robison-Andrew, E.J.; Duval, E.R.; Nelson, C.B.; Echiverri-Cohen, A.; Giardino, N.; Defever, A.; Norrholm, S.D.; Jovanovic, T.; Rothbaum, B.O.; Liberzon, I.; et al. Changes in trauma-potentiated startle with treatment of posttraumatic stress disorder in combat Veterans. J. Anxiety Disord. 2014, 28, 358–362. [Google Scholar] [CrossRef]
  38. Cornwell, B.R.; Johnson, L.; Berardi, L.; Grillon, C. Anticipation of public speaking in virtual reality reveals a relationship between trait social anxiety and startle reactivity. Biol. Psychiatry 2006, 59, 664–666. [Google Scholar] [CrossRef]
  39. Mertens, G.; Wagensveld, P.; Engelhard, I.M. Cue conditioning using a virtual spider discriminates between high and low spider fearful individuals. Comput. Hum. Behav. 2019, 91, 192–200. [Google Scholar] [CrossRef]
  40. Anton, C.; Mitrut, O.; Moldoveanu, A.; Moldoveanu, F.; Kosinka, J. A serious VR game for acrophobia therapy in an urban environment. In Proceedings of the 2020 IEEE International Conference on Artificial Intelligence and Virtual Reality, AIVR 2020, Utrecht, The Netherlands, 14–16 December 2020; IEEE Computer Society: Los Alamitos, CA, USA, 2020. [Google Scholar]
  41. Homayounpour, M.; Mortensen, J.D.; Merryweather, A.S. Auditory Warnings Invoking Startle Response Cause Faster and More Intense Neck Muscle Contractions Prior to Head Impacts. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting; SAGE Publications: Los Angeles, CA, USA, 2019. [Google Scholar]
  42. Luttmer, N.G.; Truong, T.E.; Boynton, A.M.; Carrier, D.; Minor, M.A. Treadmill Based Three Tether Parallel Robot for Evaluating Auditory Warnings while Running. In Proceedings of the IEEE International Conference on Robotics and Automation, Paris, France, 31 May–31 August 2020; pp. 9135–9142. [Google Scholar]
  43. Noronha, H.; Campos, P. Harnessing Virtual Reality Nature to Promote Well-Being. Interact. Comput. 2021, 33, 353–366. [Google Scholar] [CrossRef]
  44. Hartley, C.A.; Phelps, E.A. Extinction Learning. In Encyclopedia of the Sciences of Learning; Seel, N.M., Ed.; Springer US: Boston, MA, USA, 2012; pp. 1252–1253. [Google Scholar]
  45. Beery, T.; Jørgensen, K.A. Children in nature: Sensory engagement and the experience of biodiversity. Environ. Educ. Res. 2018, 24, 13–25. [Google Scholar] [CrossRef]
  46. Graham, F.K. The More or Less Startling Effects of Weak Prestimulation. Psychophysiology 1975, 12, 238–248. [Google Scholar] [CrossRef]
  47. Lehning, J.R. Technological innovation, commercialization, and regional development: Computer graphics in Utah, 1965–1978. Inf. Cult. 2016, 51, 479–499. [Google Scholar]
  48. van Weelden, E.; Alimardani, M.; Wiltshire, T.J.; Louwerse, M.M. Aviation and neurophysiology: A systematic review. Appl. Ergon. 2022, 105, 103838. [Google Scholar] [CrossRef] [PubMed]
  49. Feltman, K.A.; Bernhardt, K.A.; Kelley, A.M. Measuring the Domain Specificity of Workload Using EEG: Auditory and Visual Domains in Rotary-Wing Simulated Flight. Hum. Factors 2021, 63, 1271–1283. [Google Scholar] [CrossRef] [PubMed]
  50. Xu, F.; Zhu, Q.; Li, S.; Song, Z.; Du, J. VR-Based Haptic Simulator for Subsea Robot Teleoperations. In Proceedings of the ASCE International Conference on Computing in Civil Engineering 2021, Orlando, FL, USA, 12–14 September 2021. [Google Scholar]
  51. Xia, P.; McSweeney, K.; Wen, F.; Song, Z.; Krieg, M.; Li, S.; Yu, X.; Crippen, K.; Adams, J.; Du, E.J. Virtual Telepresence for the Future of Rov Teleoperations: Opportunities and Challenges. In Proceedings of the SNAME 27th Offshore Symposium, Houston, TX, USA, 22 February 2022. [Google Scholar]
  52. Azadi, S.; Green, I.C.; Arnold, A.; Truong, M.; Potts, J.; Martino, M.A. Robotic Surgery: The Impact of Simulation and Other Innovative Platforms on Performance and Training. J. Minim. Invasive Gynecol. 2021, 28, 490–495. [Google Scholar] [CrossRef]
  53. Slater, M.; Usoh, M. Representations systems, perceptual position, and presence in immersive virtual environments. Presence Teleoperators Virtual Environ. 1993, 2, 221–233. [Google Scholar] [CrossRef]
  54. Richard, E.; Tijou, A.; Richard, P.; Ferrier, J.L. Multi-modal virtual environments for education with haptic and olfactory feedback. Virtual Real. 2006, 10, 207–225. [Google Scholar] [CrossRef]
  55. Dionisio, J. Virtual hell: A trip through the flames. IEEE Comput. Graph. Appl. 1997, 17, 11–14. [Google Scholar] [CrossRef]
  56. Ranasinghe, N.; Jain, P.; Karwita, S.; Tolley, D.; Do, E.Y.-L. Ambiotherm: Enhancing Sense of Presence in Virtual Reality by Simulating Real-World Environmental Conditions. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6–11 May 2017; pp. 1731–1742. [Google Scholar]
  57. Moon, T.; Kim, G. Design and Evaluation of a Wind Display for Virtual Reality. In Proceedings of the ACM Symposium on Virtual Reality Software and Technology, Hong Kong, 10–12 November 2004. [Google Scholar] [CrossRef]
  58. Nunez, D. How is presence in non-immersive, non-realistic virtual environments possible? In Proceedings of the 3rd International Conference on Computer Graphics, Virtual Reality, Visualisation and Interaction in Africa, Stellenbosch, South Africa, 3–5 November 2004; pp. 83–86. [Google Scholar]
  59. Deligiannidis, L.; Jacob, R.J.K. The VR Scooter: Wind and Tactile Feedback Improve User Performance. In Proceedings of the 3D User Interfaces (3DUI’06), Alexandria, VA, USA, 25–26 March 2006; pp. 143–150. [Google Scholar]
  60. Heilig, M.L. Sensorama Simulator. U.S. Patent 3,050,870, 10 January 1961. [Google Scholar]
  61. Motek Medical. The World’s Most Advanced Biomechanics Lab. Available online: https://www.motekmedical.com/solution/caren/ (accessed on 19 August 2022).
  62. Ronchi, E.; Mayorga, D.; Lovreglio, R.; Wahlqvist, J.; Nilsson, D. Mobile-powered head-mounted displays versus cave automatic virtual environment experiments for evacuation research. Comput. Animat. Virtual Worlds 2019, 30, e1873. [Google Scholar] [CrossRef]
  63. Cellini, R.; Paladina, G.; Mascaro, G.; Lembo, M.A.; Lombardo Facciale, A.; Ferrera, M.C.; Fonti, B.; Pergolizzi, L.; Buonasera, P.; Bramanti, P.; et al. Effect of Immersive Virtual Reality by a Computer Assisted Rehabilitation Environment (CAREN) in Juvenile Huntington’s Disease: A Case Report. Medicina 2022, 58, 919. [Google Scholar] [CrossRef]
  64. MacDonald, M.E.; Siragy, T.; Hill, A.; Nantel, J. Walking on Mild Slopes and Altering Arm Swing Each Induce Specific Strategies in Healthy Young Adults. Front. Sports Act. Living 2022, 3, 805147. [Google Scholar] [CrossRef]
  65. Parker, C.R.; Carrier, D.R.; Hollerbach, J.M. Validation of torso force feedback slope simulation through an energy cost comparison. In Proceedings of the 1st Joint Eurohaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems; World Haptics Conference, WHC 2005, Pisa, Italy, 18–20 March 2005; IEEE: New York, NY, USA, 2005. [Google Scholar]
  66. Hollerbach, J.M.; Mills, R.; Tristano, D.; Christensen, R.R.; Thompson, W.B.; Xu, Y. Torso force feedback realistically simulates slope on treadmill-style locomotion interfaces. Int. J. Robot. Res. 2001, 20, 939–952. [Google Scholar] [CrossRef]
  67. Tristano, D.; Hollerbach, J.; Christensen, R. Slope display on a locomotion interface. In Experimental Robotics VI; Springer: London, UK, 2000. [Google Scholar]
  68. Hejrati, B.; Crandall, K.L.; Hollerbach, J.M.; Abbott, J.J. Kinesthetic force feedback and belt control for the treadport locomotion interface. IEEE Trans. Haptics 2015, 8, 176–187. [Google Scholar] [CrossRef]
  69. Tan, G.R.; Raitor, M.; Collins, S.H. Bump’em: An Open-Source, Bump-Emulation System for Studying Human Balance and Gait. In Proceedings of the IEEE International Conference on Robotics and Automation, Paris, France, 31 May–31 August 2020; IEEE: New York, NY, USA, 2020. [Google Scholar]
  70. Chesebrough, S.; Hejrati, B.; Hollerbach, J. The Treadport: Natural Gait on a Treadmill. Hum. Factors 2019, 61, 736–748. [Google Scholar] [CrossRef]
  71. Lefler, P. Olfactory Display for the Treadport Active Wind Tunnel; The University of Utah: Salt Lake City, UT, USA, 2012. [Google Scholar]
  72. Sokal, R.R.; Rohlf, F.J. Biometry: The Principles and Practice of Statistics in Biological Research, 3rd ed.; W.H. Freeman: New York, NY, USA, 1995. [Google Scholar]
  73. Betancourt, J.; Wojtkowski, B.; Castillo, P.; Thouvenin, I. Exocentric control scheme for robot applications: An immersive virtual reality approach. IEEE Trans. Vis. Comput. Graph. 2022; early access. [Google Scholar] [CrossRef]
  74. Mine, M. Towards Virtual Reality for the masses: 10 years of research at Disney’s VR Studio. In Proceedings of the Workshop on Virtual Environments, EGVE’03, Zurich, Switzerland, 22–23 May 2003. [Google Scholar] [CrossRef]
Figure 1. Scent, mist, and heat displays (left). Overview of MS.TPAWT in the VR game (right).
Figure 2. Illustration of the bird startle.
Figure 3. Illustration of the beam startle.
Figure 4. Thunder startle. Before thunder (left), and during thunder (right).
Figure 5. Bird startle transformed EMG and survey data plots showing means and confidence intervals.
Figure 6. Beam startle plots showing average values and 95% confidence intervals for transformed EMG and survey data.
Figure 7. Thunder startle plots showing average values and 95% confidence intervals for transformed EMG and survey data.
Figure 8. Graphic plots showing average values and 95% confidence intervals for eye EMG and survey data.
Figure 9. Premonition plots showing average values and 95% confidence intervals.
Table 1. Factorial design of the study showing combinations of auditory and environmental stimulation.

Auditory Level | Environment Off | Environment On: Heat, Wind, Scent, Moisture
High Auditory Startle (90 dB) | Aud: High, Env: Off | Aud: High, Env: On
Medium Auditory Startle (80 dB) | Aud: Med., Env: Off | Aud: Med., Env: On
Background Noise Only (70 dB) | Aud: Off, Env: Off | Aud: Off, Env: On
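To make the 3 × 2 layout of Table 1 concrete, the sketch below enumerates the six audio–environment cells and splits a 120-participant pool evenly across them. This is an illustrative reconstruction, not the authors' assignment procedure; the cell labels and the group size of 20 follow from the table and the participant count, while the function name and randomization scheme are assumptions.

```python
# Minimal sketch of the 3 x 2 factorial assignment implied by Table 1.
# Not the study's code; the balanced, seeded shuffle is an assumption.
import itertools
import random

AUDIO_LEVELS = ["High (90 dB)", "Medium (80 dB)", "Background only (70 dB)"]
ENVIRONMENT = ["Off", "On (heat, wind, scent, moisture)"]

def assign_conditions(n_participants=120, seed=42):
    """Return a shuffled list of (audio, environment) cells, balanced across the design."""
    cells = list(itertools.product(AUDIO_LEVELS, ENVIRONMENT))  # 6 cells
    per_cell = n_participants // len(cells)                     # 20 participants per cell
    assignments = [cell for cell in cells for _ in range(per_cell)]
    random.Random(seed).shuffle(assignments)
    return assignments

if __name__ == "__main__":
    schedule = assign_conditions()
    print(len(schedule), "participants; first condition:", schedule[0])
```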
Table 2. Bird startle ANOVA results for EMG [Box–Cox-transformed] and survey data [VAS]. * = statistically significant results (p < 0.05).

EMG and Survey (Bird) | Neck EMG | Back EMG | Eye EMG | Survey
Env: On, Aud: High | −1.77 ± 0.94 | −1.98 ± 0.47 | −1.66 ± 0.71 | 3.30 ± 1.30
Env: On, Aud: Med | −2.12 ± 0.85 | −1.87 ± 0.45 | −1.74 ± 0.57 | 3.00 ± 1.26
Env: On, Aud: Off | −2.31 ± 0.84 | −2.01 ± 0.33 | −2.27 ± 0.72 | 1.80 ± 1.15
Env: Off, Aud: High | −1.98 ± 1.13 | −1.80 ± 0.50 | −1.70 ± 0.56 | 3.85 ± 0.99
Env: Off, Aud: Med | −1.89 ± 0.82 | −1.74 ± 0.51 | −1.87 ± 0.75 | 3.35 ± 1.23
Env: Off, Aud: Off | −2.51 ± 0.68 | −1.90 ± 0.39 | −2.08 ± 0.66 | 1.60 ± 0.88
Box–Cox Transform: λ | 0.1484 | 0.2466 | 0.1958 | n/a
Degrees of Freedom: df | 112 | 108 | 112 | 119
Audio: F(2,df) | 3.48 | 1.05 | 5.45 | 29.79
Environment: F(1,df) | 0.12 | 2.69 | 0.00 | 1.25
Aud*Env: F(2,df) | 0.76 | 0.07 | 0.55 | 1.15
Audio: p | 0.034 * | 0.355 | 0.006 * | <0.001 *
Environment: p | 0.725 | 0.104 | 0.951 | 0.266
Aud*Env: p | 0.472 | 0.929 | 0.580 | 0.319
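The rows of Tables 2–4 follow a common pipeline: EMG responses are Box–Cox-transformed (the λ rows) and then entered into a two-way audio × environment ANOVA with an interaction term (the F and p rows). The snippet below is a minimal sketch of that pipeline, assuming SciPy and statsmodels; the column names and data are synthetic placeholders, not the study's recordings.

```python
# Sketch of the Box-Cox + two-way ANOVA pipeline summarized in Tables 2-4.
# Data, column names, and factor labels are hypothetical.
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(0)

# Hypothetical long-format data: one normalized EMG peak per participant.
df = pd.DataFrame({
    "audio": rng.choice(["off", "medium", "high"], size=120),
    "environment": rng.choice(["off", "on"], size=120),
    "emg_peak": rng.lognormal(mean=-2.0, sigma=0.5, size=120),  # strictly positive values
})

# Box-Cox requires positive data; SciPy estimates lambda by maximum likelihood.
df["emg_bc"], lam = stats.boxcox(df["emg_peak"])
print(f"Box-Cox lambda = {lam:.4f}")

# Two-way ANOVA with interaction, matching the Audio, Environment, and Aud*Env rows.
model = ols("emg_bc ~ C(audio) * C(environment)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```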
Table 3. Beam startle ANOVA results for EMG [Box–Cox-transformed] and survey data [VAS]. * = statistically significant results (p < 0.05).

EMG and Survey (Beam) | Neck EMG | Back EMG | Eye EMG | Survey
Env: On, Aud: High | −0.91 ± 1.10 | −1.68 ± 0.89 | −1.34 ± 0.67 | 4.35 ± 0.88
Env: On, Aud: Med | −1.07 ± 1.18 | −1.87 ± 1.08 | −1.22 ± 0.67 | 4.15 ± 1.23
Env: On, Aud: Off | −2.27 ± 0.88 | −2.40 ± 0.56 | −1.75 ± 0.65 | 2.95 ± 1.50
Env: Off, Aud: High | −1.82 ± 1.03 | −1.71 ± 0.96 | −1.38 ± 0.69 | 4.50 ± 0.83
Env: Off, Aud: Med | −1.61 ± 0.91 | −1.90 ± 0.84 | −1.48 ± 0.67 | 4.10 ± 0.97
Env: Off, Aud: Off | −2.34 ± 0.68 | −2.35 ± 0.70 | −1.93 ± 0.67 | 2.65 ± 1.27
Box–Cox Transform: λ | 0.1530 | 0.0993 | 0.2422 | n/a
Degrees of Freedom: df | 104 | 112 | 111 | 119
Audio: F(2,df) | 11.10 | 5.96 | 6.19 | 23.13
Environment: F(1,df) | 7.11 | <0.001 | 1.49 | 0.10
Aud*Env: F(2,df) | 1.61 | 0.02 | 0.27 | 0.39
Audio: p | <0.001 * | 0.004 * | 0.003 * | <0.001 *
Environment: p | 0.009 * | 0.992 | 0.224 | 0.748
Aud*Env: p | 0.205 | 0.977 | 0.764 | 0.676
Table 4. Thunder startle ANOVA results with category mean and std for EMG [Box–Cox-transformed] and survey data [VAS]. * = statistically significant results (p < 0.05), ^ = notable results (0.05 ≤ p < 0.09).

EMG and Survey (Thunder) | Neck EMG | Back EMG | Eye EMG | Survey
Env: On, Aud: High | −1.81 ± 1.20 | −2.37 ± 0.90 | −1.28 ± 0.50 | 2.70 ± 1.34
Env: On, Aud: Med | −1.68 ± 1.43 | −2.01 ± 0.98 | −1.03 ± 0.61 | 2.60 ± 1.39
Env: On, Aud: Off | −2.56 ± 0.77 | −2.51 ± 0.68 | −1.46 ± 0.81 | 2.50 ± 1.15
Env: Off, Aud: High | −1.80 ± 1.60 | −1.91 ± 0.89 | −1.32 ± 0.51 | 3.70 ± 1.34
Env: Off, Aud: Med | −1.68 ± 1.19 | −2.00 ± 0.86 | −1.52 ± 0.67 | 3.35 ± 1.39
Env: Off, Aud: Off | −2.30 ± 1.11 | −2.09 ± 0.96 | −1.60 ± 0.60 | 2.80 ± 1.20
Box–Cox Transform: λ | 0.0617 | 0.0571 | 0.2675 | n/a
Degrees of Freedom: df | 111 | 113 | 112 | 119
Audio: F(2,df) | 3.67 | 1.03 | 1.83 | 1.80
Environment: F(1,df) | 0.15 | 3.25 | 3.68 | 8.23
Aud*Env: F(2,df) | 0.13 | 0.74 | 1.37 | 0.74
Audio: p | 0.029 * | 0.362 | 0.165 | 0.170
Environment: p | 0.704 | 0.074 ^ | 0.058 ^ | 0.004 *
Aud*Env: p | 0.878 | 0.477 | 0.257 | 0.480
Table 5. One-way ANOVA of the effects of graphical startle level with category mean and std for EMG [Box–Cox-transformed] and survey data [VAS]. * = statistically significant results (p < 0.05).

Visual Levels | Eye EMG | Survey
Vis: Small (Bird) | −1.88 ± 0.69 | 3.28 ± 1.34
Vis: Large (Beam) | −1.51 ± 0.70 | 3.75 ± 1.29
Vis: Large w/Prem (Thunder) | −1.37 ± 0.63 | 2.93 ± 1.41
Degrees of Freedom: df | 186 | 179
Visual: F(2,df) | 6.08 | 5.52
Visual: p | 0.003 * | 0.005 *
Table 6. Two-way ANOVA of the effects of premonition and environment with category mean and std for EMG [Box–Cox-transformed] and survey data [VAS]. * = statistically significant results (p < 0.05), ^ = notable results (0.05 ≤ p < 0.09).

Premonition and Environment | Neck EMG | Back EMG | Survey
Premonition: On (Thunder), Env: On | −2.00 ± 1.22 | −2.29 ± 0.87 | 2.60 ± 1.28
Premonition: On (Thunder), Env: Off | −1.92 ± 1.34 | −1.99 ± 0.89 | 3.28 ± 1.34
Premonition: Off (Beam), Env: On | −1.37 ± 1.21 | −1.97 ± 0.91 | 3.82 ± 1.36
Premonition: Off (Beam), Env: Off | −1.94 ± 0.91 | −1.98 ± 0.88 | 3.75 ± 1.30
Degrees of Freedom: df | 216 | 226 | 239
Premonition: F(1,df) | 3.58 | 2.01 | 24.42
Environment: F(1,df) | 2.42 | 1.54 | 3.28
Prem*Env: F(1,df) | 4.12 | 1.60 | 4.85
Premonition: p | 0.059 ^ | 0.157 | <0.001 *
Environment: p | 0.121 | 0.216 | 0.072 ^
Prem*Env: p | 0.043 * | 0.207 | 0.028 *
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
