Article

Cloud Based Virtual Reality Exposure Therapy Service for Public Speaking Anxiety †

by Justas Šalkevičius 1,*, Audronė Miškinytė 2 and Lukas Navickas 1
1 Department of Software Engineering, Kaunas University of Technology, Kaunas 44249, Lithuania
2 Department of Clinical Psychology, University of Bergen, 5020 Bergen, Norway
* Author to whom correspondence should be addressed.
† This paper is an extended version of our paper presented at the 2018 6th International Conference on Future Internet of Things and Cloud Workshops (FiCloud), Barcelona, Spain, 6–8 August 2018.
Information 2019, 10(2), 62; https://doi.org/10.3390/info10020062
Submission received: 16 January 2019 / Revised: 8 February 2019 / Accepted: 13 February 2019 / Published: 15 February 2019
(This article belongs to the Special Issue Cloud Gamification)

Abstract

Public speaking anxiety is commonly treated using cognitive behavioral therapy. During the therapy session, the patient is either asked to vividly imagine and describe the feared stimulus or is confronted with it in the real world. Sometimes, however, it can be hard to imagine the object of fear or to create a controllable environment that contains this stimulus. Virtual reality exposure therapy (VRET) can help solve these problems by placing the patient in a simulated 3D environment. While standalone VRET applications have been investigated for more than 25 years, we analyze the viability of a cloud-based VRET system. In this paper, we discuss the architectural and technical choices made in order to create a mobile and lightweight solution that can be easily adopted by any psychology clinic. Moreover, we analyze data gathered from 30 participants who underwent a VRET session for public speaking anxiety. Finally, the collected psychophysiological signals, including galvanic skin response (GSR) and skin temperature, are processed and investigated in order to evaluate our cloud-based VRET system.

1. Introduction

According to World Health Organization statistics, anxiety disorders reach rates as high as 28.8% [1]. Public speaking anxiety is usually treated with cognitive behavioral therapy (CBT) combined with exposure therapy (ET). These therapy sessions are designed to let the patient experience their anxiety in a managed environment, using either in vivo exposure (real-world confrontation with the fear stimulus) or imaginal exposure (the patient is asked to vividly imagine and describe the fear stimulus) [2]. However, providing such manageable environments can be a complex and expensive task, whereas virtual reality exposure therapy (VRET) can achieve the same desired effect by placing the patient in a simulated 3D environment [3].
More than 24 VRET studies conducted between 2012 and 2015 have been reviewed [4]. The authors of these trials demonstrated that using virtual reality systems in a therapeutic setting can be even more effective than a standard CBT session. However, as more virtual reality headsets reach the market, it is increasingly important to analyze the impact of new devices on the quality of VRET sessions. Even simple hardware limitations, like poor screen density or a weak VR sensor signal, can reduce the effectiveness of the therapy [5]. Therefore, more effort should be applied to investigating and measuring the impact of newly developed VRET applications that enable the psychologist to interact with and adapt the 3D environment for the patient in real time. Public speaking anxiety has been successfully treated using various VRET implementations [6,7,8], but none of them offers a mobile VRET solution that can be easily accessed by wider audiences and that gives the psychologist more control over the VR scenario.
Nowadays, software as a service (SaaS) solutions are widely used to help various professionals improve their results. Cloud solutions, however, are still rarely used for VRET applications. We thus attempt to demonstrate the viability of a cloud-based VRET system.

2. Related Works

An extensive review of multiple VRET studies was conducted by Botella et al. [8]. The authors noted that simulated VR scenarios can increase the effect of CBT sessions and can provide a faster result for the patient. The paper also states that VRET adoption is limited by poor uptake among psychologists. This poor uptake is usually caused by the difficulty of setting up VR hardware, the lack of simple instructions for the user, and the cost of VRET systems.
Another systematic review of the VRET applications was written by Carl et al. The authors noted that this kind of therapy is more effective than standard CBT treatments [9]. Moreover, this effectiveness can be seen across different phobias.
An online VRET application was developed and reviewed by Levy et al. They used the online VR scenario for the treatment of acrophobia [10]. However, the authors stated that the extensive requirements of the VR setup and the need for powerful PCs are stopping the adoption of VRET solutions.
Lindner et al. compared therapist-led and self-led VRET sessions conducted using mobile devices [11]. Both groups in their study experienced a significant reduction of public speaking anxiety. The advantages of mobile VRET solutions were also reviewed by Wiederhold et al. [12]. In their paper, the authors emphasized that while the mobile solutions cannot offer high definition displays, they can still help to significantly reduce pain and anxiety.
Automated VRET sessions for acrophobia were conducted by Freeman et al. [13]. Over a period of four weeks, these sessions managed to significantly reduce the fear of heights for the study group. These results demonstrated the possibility of reducing therapy costs using the VRET solution.
Manju et al. used a VR system to reduce social anxiety within a study group of children with autism spectrum disorder [14]. The virtual environments provided the ability to repeat social interactions multiple times, which helped these patients to concentrate on a single action.
The benefits of converting traditional one-session exposure therapy (OST) to a VRET system are discussed by Miloff et al. During OST, the patient is gradually exposed to the stimuli for up to 3 h, which can use up significant financial resources, while a VR system makes it possible to reuse the same 3D environment for multiple patients [15]. Various authors have reviewed the possibility of combining VRET systems with other stress management practices. Diaphragmatic breathing during the VRET session is suggested as a way to reduce stress [16]. A visual feedback system that represents the current level of anxiety in the 3D environment was also investigated as an impactful way to control anxiety [17]. A VRET solution with a haptic feedback device that emulates a heartbeat was described by Seol et al. [18].
Suyanto et al. suggested that a mobile VRET solution could be improved by the Kinect sensor. Moreover, they used the State-Trait Anxiety Inventory (STAI) scale in order to evaluate the reduction of self-observed anxiety levels before and after the VRET session [19]. Similarly, Shunnaq et al. used the Hamilton scale to measure the efficiency of their VRET solution [20]. The Subjective Units of Distress (SUD) scale was used by Rauch et al. in an extensive VRET study for PTSD (post-traumatic stress disorder) [21].
A VR driving simulator was used by Zinzow et al. They investigated the possibility of using a VRET system as a tool for the treatment of PTSD [22]. While some participants complained about the virtual reality environments not being realistic enough, the study showed a significant reduction in risky and aggressive driving.
Sight impairments can be a major factor in the usefulness of VRET. Bun et al. separated the participants of their experiment into four groups: with and without the fear of heights, and with and without sight impairments [23]. The authors observed that higher anxiety levels were induced by the VR stimuli in the group without sight impairments.
Marquardt et al. proposed that VRET applications could be improved using multisensory cues to increase engagement and to trigger stronger emotional response [24]. Their system includes an olfactory device, which can trigger a variety of smells aimed towards the user and which uses large fans to create a wind effect for acrophobia environments.
Gareth Walkom used biofeedback devices to measure psychophysiological signals [25]. Electrodermal activity and skin temperature were observed using an Affectiva Q Sensor wristband. The results showed a significant increase in the psychophysiological signals.
The effects of 3D scene characteristics (space openness, tidiness, and color) in a virtual environment were analyzed by Christofi et al. Their research showed no significant increase of anxiety in messy virtual environments [26], which contradicts the participants' own reports that messy rooms increased their stress.
Poeschl et al. investigated the effects of a virtual crowd during the VRET session [27]. The facial expressions and gestures of the virtual audience were highlighted as important factors for increased stimuli.

3. System Development

3.1. Hardware Selection

For any VR solution, it is important to pick the correct virtual reality headset, as it is essential to the VR experience. We reviewed widely used consumer-grade VR devices; Oculus Rift, HTC Vive, and Samsung Gear VR were considered as they are the most popular options. We emphasized that the headset should be easy to set up and should require only minimal changes to existing IT hardware. This was important as our target audience for the cloud system was psychology clinics. While HTC Vive and Oculus Rift can support more realistic 3D environments, they require a high-performance PC and an additional sensor setup in order to work.
The main goal of our cloud solution was to give easy access to the VRET scenarios through the internet. This requirement led to our decision to use Gear VR. The chosen headset can work without any external sensors and with any mobile phone that supports VR applications. The use of mobile devices also sets low requirements on the psychologist's system: the therapist-facing service, with embedded WebGL VR content, can run on a wide variety of PCs and laptops that are usually already owned by the psychologist.

3.2. System Architecture

The system architecture was designed as a software as a service (SaaS) solution. Our goal was to make a VRET system that could be accessible to psychology clinics anywhere in the world. It was also important for us to be able to scale the system up and to support multiple VRET sessions at the same time. The main components of our cloud solution are the web service, implemented using the standard REST (representational state transfer) architecture, and the VRET dashboard application hosted on Azure.
The therapist has access to their account on our system and can accept new patients, track historical session data, and start a new session using VRET spectator. Each therapist is assigned a unique identifier.
This unique identifier is used in the Unity multiplayer service as the hosted game key and helps to synchronize the VR simulation between the Gear VR application used by the patient and the WebGL application used by the psychologist. This enables the therapist to view patient behavior during the therapy session and to control the VRET session intensity by changing the 3D environment. The core parts of our system architecture can be seen in Figure 1.
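As an illustration of this flow, the following minimal sketch (in Python, with hypothetical endpoint and field names, not the actual service code) shows how a session-start REST endpoint could return the therapist's unique identifier as the room key that both the WebGL spectator and the Gear VR client later use to join the same Unity Multiplayer room.

import uuid
from flask import Flask, jsonify, request

app = Flask(__name__)
sessions = {}  # in-memory store; a real service would persist sessions to a database

@app.route("/api/sessions", methods=["POST"])
def start_session():
    payload = request.get_json()
    session = {
        "session_id": str(uuid.uuid4()),
        "patient_id": payload["patient_id"],
        "scenario": payload["scenario"],          # e.g. "auditorium"
        "room_key": payload["therapist_id"],      # therapist identifier reused as the hosted game key
    }
    sessions[session["session_id"]] = session
    return jsonify(session), 201

if __name__ == "__main__":
    app.run()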

3.3. Therapist Dashboard

The therapist dashboard is the entry point for managing all the patients treated by the therapist. A VRET session can be started through the browser after a new patient has been registered. Patient height is a key setting for providing a good VR experience, as Gear VR does not have a height sensor; the VR camera has to be configured to use this value as a reference point in order to avoid vertigo. The system's lightweight setup can be seen in Figure 2.
The psychologist can start a VRET session by picking the type of fear it will focus on and the 3D scene scenario. The system then loads a WebGL-based application that hosts a room in the Unity Multiplayer service, which is picked up and joined by the Gear VR application used by the patient. The patient's page, with a session history, is illustrated in Figure 3.

3.4. VRET Spectator Application

The Unity3D game engine was used to develop the spectator application. This application relies on WebGL technology and is served on the web browser. It communicates with the main components of the cloud system using the REST API. The spectator application downloads the 3D data and loads the scene.
The unique identifier is used to create a game session with the same name in a Unity Cloud Multiplayer service. This hosted game includes a 3D scene with the simulation scenario. Only the Gear VR application that is connected with the same identifier can join this VRET session. After a successful connection to the hosted game, the WebGL application starts to show the 3D environment, seen through the VR headset in the web browser.
The psychologist can interact with the loaded 3D scene using the special control buttons of the WebGL application. When an interaction is initiated, the WebGL module fires a synchronization command through the Unity Multiplayer service, and the changes are transferred to the Gear VR application. Moreover, all interactions made by the therapist are logged with calls to the REST service, so all collected records can be accessed and reviewed after the session is over.
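The interaction log can be thought of as a single REST call per spectator action; the sketch below assumes a hypothetical /events endpoint and field names, used here only to illustrate the idea.

import datetime
import requests

API_URL = "https://vret-cloud.example.com/api"  # placeholder base URL

def log_interaction(session_id, action, value):
    # Record one spectator action (e.g. a crowd emotion change) for post-session review.
    event = {
        "session_id": session_id,
        "action": action,        # e.g. "crowd_emotion"
        "value": value,          # e.g. "disapproval"
        "timestamp": datetime.datetime.utcnow().isoformat() + "Z",
    }
    response = requests.post(f"{API_URL}/sessions/{session_id}/events", json=event, timeout=5)
    response.raise_for_status()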

3.5. Virtual Reality Application

The patient uses the Gear VR application created with the Unity game engine. At the initial stage of the setup, the user is prompted to enter the identifier provided by the therapist. This identifier is used by the Gear VR application to look up, in the database, the patient currently taking part in the VRET session. Using this value with our REST API, all the collected data are assigned to the correct patient. Additionally, the unique identifier is used to join the correct game room in the Unity multiplayer service hosted by the psychologist's WebGL application.
The selected VRET scenario scene is downloaded by the Gear VR application from the asset repository by mapping the unique identifier to the VR scene. All downloaded assets are cached on the mobile device and updated only when they change. Finally, the VR application synchronizes patient behavior and movement in the virtual scene with the WebGL module running in the web browser.
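The caching behavior can be summarized by the sketch below, which checks a version tag before re-downloading a scene bundle; the repository URL and metadata format are assumptions made for illustration, not the actual asset pipeline of our system.

import json
import pathlib
import requests

ASSET_REPO = "https://vret-cloud.example.com/assets"  # placeholder asset repository
CACHE_DIR = pathlib.Path("asset_cache")

def get_scene_bundle(scenario_id: str) -> pathlib.Path:
    # Download the scene bundle only if the cached version differs from the repository version.
    CACHE_DIR.mkdir(exist_ok=True)
    meta_path = CACHE_DIR / f"{scenario_id}.json"
    bundle_path = CACHE_DIR / f"{scenario_id}.bundle"

    remote_meta = requests.get(f"{ASSET_REPO}/{scenario_id}/meta", timeout=10).json()
    local_meta = json.loads(meta_path.read_text()) if meta_path.exists() else {}

    if local_meta.get("version") != remote_meta["version"] or not bundle_path.exists():
        data = requests.get(f"{ASSET_REPO}/{scenario_id}/bundle", timeout=60).content
        bundle_path.write_bytes(data)
        meta_path.write_text(json.dumps(remote_meta))
    return bundle_path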

3.6. System Usage Workflow

Our cloud-based VRET service is focused on the interactions between the psychologist and the patient. However, the session must be started by the psychologist. After logging into the system, the psychologist can enter information about his or her patient; previously conducted sessions are presented on the dashboard. Alternatively, the psychologist can register a new patient by filling out the required information: name, surname, and height. The psychologist and the patient can then start a VRET session by selecting an anxiety disorder and then a specific VR scene. During the session, the psychologist can monitor the patient's performance and see additional information. The psychologist can also trigger specific events inside the environment, creating situations that require the patient to use their social skills to handle them. After the session is over, the psychologist can review all previous sessions.

4. VRET Module for Public Speaking Anxiety

Three VRET environments for the treatment of public speaking anxiety were designed and made available as part of our cloud service. During all the scenarios, the patients are put into a situation in which they must deliver an oral presentation. The scenarios have different audience sizes and differing levels of social importance for the oral presentation.

4.1. Interaction and Session Settings

The therapist can interact with the VRET scenario using the controls presented in the WebGL spectator application. For public speaking scenarios, we implemented an emotional crowd control system that enables the 3D avatars to react to the speaker in different ways; the virtual listeners can demonstrate approval, disapproval, happiness, and other emotions.
Moreover, the therapist can enable heartbeat or heavy breathing sound effects, which are often associated with panic attacks. Finally, an unexpected situation, like a computer crash in the middle of the presentation, can be simulated. This forces the patient to learn to adapt and stay calm.
All available control options are illustrated in Figure 4.
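To make the interaction mechanism concrete, the sketch below shows one way such a control command could be represented and serialized before being synchronized to the VR client; the command and field names are illustrative assumptions, not the actual message format of our system.

import json
from dataclasses import dataclass
from enum import Enum

class ControlAction(Enum):
    CROWD_EMOTION = "crowd_emotion"        # approval, disapproval, happiness, ...
    SOUND_EFFECT = "sound_effect"          # heartbeat, heavy breathing
    UNEXPECTED_EVENT = "unexpected_event"  # e.g. presentation PC crash

@dataclass
class ControlCommand:
    action: ControlAction
    value: str

    def to_message(self) -> str:
        # Serialize the command so it can be pushed through the multiplayer channel.
        return json.dumps({"action": self.action.value, "value": self.value})

# Example: the therapist switches the virtual audience to a disapproving mood.
print(ControlCommand(ControlAction.CROWD_EMOTION, "disapproval").to_message())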

4.2. VR Environments for Treatment of Public Speaking Anxiety

Three different VR scenes were developed, from a presentation in a small office to a big meeting in the corporate environment and, finally, a speech on the virtual stage in front of a big audience, with time pressure (Figure 5).
For the VRET sessions, the patient has to prepare a unique speech to deliver with a slideshow presentation. These slides are uploaded to our service and are then integrated into the VR environments. The patient can move backward or forward through the slides using the Gear VR controller like a presentation clicker.

4.2.1. Meeting Room

The office meeting room is the smallest and least demanding environment. This scene is created for those who are getting started with therapy and with virtual reality in general. The patient is standing in front of two colleagues and has to give them a presentation. In this scene, the psychologist can use different animations for the reactions of the avatars; for example, the therapist can initiate applause or disapproving reactions. The virtual PC that is displaying the presentation can also be turned off or restarted. Finally, the psychologist can trigger sound effects like a heartbeat or the sound of the patient's own breathing.

4.2.2. Conference Room

The conference room is a much bigger scene than the office meeting room and contains a lot more complexity (lights, projector, white screen, etc.). The room contains an audience of virtual colleagues (3D avatars). The scenario flow of this environment is the same as in the previous meeting room scene. As in the first environment, the psychologist can interact with the virtual crowd in the room, set their behaviors, and add additional sound effects.

4.2.3. Auditorium

The auditorium is the biggest and most crowded of the three environments. In this scene, there are more than 20 virtual avatars, a projector, and an additional screen. There is also a timer that shows how much time the patient has left before the allotted presentation time is over. This scene is unique because it does not start in the auditorium; the patient must walk through the lobby and backstage before entering the stage.

4.3. Biofeedback Tracking

Psychophysiological signals can help to measure the impact of the VRET session. We thus incorporated the capacity to set up biofeedback sensors using a Bluetooth connection. All collected signals are sent back to the cloud service.
In its current state, our system supports signals of galvanic skin response (GSR) and measurements of skin temperature. This was achieved by implementing an additional application for the Empatica E4 wristband sensor.
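As a rough illustration, the sketch below shows how batched GSR and skin temperature samples could be relayed to the cloud service; it does not use the real Empatica SDK, and the endpoint, field names, and read_sample callback are placeholders.

import time
import requests

API_URL = "https://vret-cloud.example.com/api"  # placeholder base URL

def upload_samples(session_id, samples):
    # Send a batch of (timestamp, gsr_microsiemens, skin_temp_celsius) tuples.
    payload = [{"t": t, "gsr": gsr, "temp": temp} for t, gsr, temp in samples]
    requests.post(f"{API_URL}/sessions/{session_id}/biofeedback", json=payload, timeout=5)

def stream(session_id, read_sample, batch_size=20):
    # read_sample is a placeholder callback returning one 4 Hz wristband sample.
    batch = []
    while True:
        batch.append(read_sample())
        if len(batch) >= batch_size:   # roughly 5 s of data at 4 Hz
            upload_samples(session_id, batch)
            batch = []
        time.sleep(0.25)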

5. Experiment

5.1. Participants Demographics

A group of 30 people participated in our experiment. All subjects gave their signed informed consent for inclusion before they participated in the study. The study and its protocol were approved by the Kaunas regional biomedical research ethics committee (reference number BE-2-60).
The group consisted of 13 females and 17 males, aged 21–34. Most of the participants (76%) had a job and were thus familiar with workplace public speaking scenarios (presentations during meetings in the office). Participants' previous experience with virtual reality systems and video games was also recorded. All demographic data are shown in Table 1.

5.2. Flow of the Experiment

Each participant took part in a single therapy session. At the start of this session, the subject had to complete the initial demographic forms and consent to participation in the experiment. Multiple self-report psychological scales were presented to the patient, including the Public Speaking Anxiety Scale (PSAS), the Personal Report of Communication Apprehension (PRCA-24), and the Speech Anxiety Thoughts Inventory (SATI). Galvanic skin response and skin temperature samples were also collected from each subject and later used as a baseline.
During the next stage of the session, the participant had a conversation with the psychologist and had 10 min to prepare a short speech on a suggested topic. The subject then delivered this prepared speech in front of the therapist, while biofeedback signals were collected. The speech lasted up to 5 min. Finally, the participant discussed the speech with the psychologist and evaluated it. The participant also reported a Subjective Units of Distress Scale (SUDS) rating and self-observed signs of discomfort (vertigo, muscle tenseness, etc.).
The experimental session continued immediately with the VRET stage. The subject was given up to 10 min to rehearse a prepared speech. They then had to deliver the speech in front of the virtual auditorium while the psychologist controlled the 3D crowd and environment. As in previous steps, a biofeedback device was used to measure GSR and skin temperature following the VRET stimuli. This speech was limited to 5 min.
When the VRET session was finished, the participant evaluated the speech and filled out multiple scales, including the Public Speaking Anxiety Scale (PSAS), the Personal Report of Communication Apprehension (PRCA-24), the Speech Anxiety Thoughts Inventory (SATI), the Subjective Units of Distress Scale (SUDS), and the Igroup Presence Questionnaire (IPQ). This was done in order to compare the values before and after the VRET session.
Finally, the subject had a discussion with the psychologist about his or her experiences during the session. The flow of the experiment conducted is demonstrated in Figure 6. The total length of the session ranged from 45 to 90 min.

5.3. Subjective Units of Distress Scale (SUDS) and Subjective Symptoms of Discomfort

The Subjective Units of Distress Scale is commonly used to measure the patient's anxiety. The scale starts at 10 (alert and awake) and ends at 100 (highest anxiety/distress ever felt). All SUDS levels are displayed in Table 2.
Subjects rated their anxiety levels during the VRET session much higher than at baseline: a mean of 37.9 SUD against a baseline of 28.3 SUD. This indicates that the proposed cloud-based VRET system and virtual environments provided effective anxiety stimuli. All reported SUD values are displayed in Figure 7.
We also compared the SUDS values of the subjects who agreed or strongly agreed with the PSAS statement, “While preparing for giving a speech, I feel tense and nervous” (a mean of 24.2 SUD at baseline and 45.7 SUD during the VRET session, higher than the overall mean SUD). Comparison graphs can be seen in Figure 8.
Additionally, we compared the differences in the reported SUDS values based on the participants’ previous experience with VR applications and video games. We noticed that gamers and those who had tried VR before the therapy session reported higher SUDS values after the VRET session (Figure 9). This could indicate deeper immersion within the virtual environments because of more experience with virtual worlds (games). However, a dedicated study is required to investigate this effect further.
Some negative effects of virtual reality systems were also noticed by analyzing the subjective symptoms of discomfort reported by the participants. As expected, one of the negative reactions to the VR environments was a vertigo effect, indicated by nausea (one subject) and loss of balance (six subjects). Moreover, wearing a VR headset increased sweating (six subjects reported it during the speech exercise and nine subjects were affected during the VRET session). The rest of the observed symptoms associated with increased anxiety were similar in both cases. The self-observed symptoms of all the subjects are shown in Table 3.

5.4. Biofeedback Data Collection

Our VRET system was developed with the capacity to support wearable biofeedback devices. In this experiment, we used an Empatica E4 wristband. The skin temperature (ST) signal was expressed in degrees Celsius (°C) and the galvanic skin response in μS; both were sampled at 4 Hz.
The collected GSR and skin temperature signals contained considerable amounts of noise, as the biofeedback device was worn on the participants' wrists and the signal was affected by their hand gestures. We applied a low-pass 1 Hz Butterworth filter in order to clean the collected data and remove outliers. Moreover, a moving average filter was used to smooth the signal, as sketched below. A comparison of the noisy raw signal and the processed signal can be seen in Figure 10.
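The cleaning step can be sketched as follows; the filter order and moving average window length are assumptions, as the text only specifies the 1 Hz cutoff and the 4 Hz sampling rate.

import numpy as np
from scipy.signal import butter, filtfilt

FS = 4.0  # Empatica E4 GSR/skin temperature sampling rate, Hz

def clean_signal(raw, cutoff_hz=1.0, order=2, window=8):
    b, a = butter(order, cutoff_hz, btype="low", fs=FS)  # 1 Hz low-pass Butterworth filter
    low_passed = filtfilt(b, a, raw)                     # zero-phase filtering of the raw signal
    kernel = np.ones(window) / window                    # moving average over `window` samples
    return np.convolve(low_passed, kernel, mode="same")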
GSR and the skin temperature signals were normalized using Equation (1):
$X_{norm}^{gsr,st} = \frac{X_{orig}^{gsr,st} - X_{min}^{gsr,st}}{X_{max}^{gsr,st} - X_{min}^{gsr,st}}$, (1)
where $X_{norm}$ is the normalized value, $X_{min}$ is the minimum value, $X_{max}$ is the maximum value, and $X_{orig}$ is the original value of the corresponding GSR or skin temperature signal. This step is important to avoid subject dependency, as the baseline signal values are individual for each subject.
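In code, the per-subject min-max normalization of Equation (1) amounts to the following short function, applied separately to each subject's GSR and skin temperature signals.

import numpy as np

def normalize(signal):
    # Min-max normalization of Equation (1): maps the signal to the [0, 1] range.
    signal = np.asarray(signal, dtype=float)
    s_min, s_max = signal.min(), signal.max()
    return (signal - s_min) / (s_max - s_min)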
By analyzing the normalized values, we can observe that both the public speaking exercise in front of the therapist and the virtual reality therapy session had a psychophysiological impact on the majority of the participants. Elevated GSR and skin temperature values usually imply increased anxiety. Additionally, as our participant group had various levels of public speaking anxiety, we can spot some outliers in Figure 11 who were not affected either by the live speech exercise or by the VRET session.

5.5. Galvanic Skin Response and Skin Temperature Feature Analysis

During the next stage of the analysis, we extracted and analyzed psychophysiological features from the smoothed and normalized GSR and skin temperature signals. We chose the signal features reviewed by Picard et al. because they are commonly analyzed in the emotion physiology literature [28]. These features included the mean absolute value of the raw signal, the standard deviation of the raw signal, the mean absolute value of the first difference for the raw and normalized signals, and the mean absolute value of the second difference for the raw and normalized signals. The whole feature set is shown in Table 4.
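A minimal sketch of this feature extraction, computed on a cleaned signal and its normalized counterpart, is given below; the function name and implementation details are our own illustrative assumptions.

import numpy as np

def extract_features(raw):
    x = np.asarray(raw, dtype=float)
    x_norm = (x - x.min()) / (x.max() - x.min())  # per-subject normalization (Equation (1))
    return {
        "mean_abs_raw": np.mean(np.abs(x)),
        "std_raw": np.std(x),
        "mean_abs_first_diff_raw": np.mean(np.abs(np.diff(x))),
        "mean_abs_first_diff_norm": np.mean(np.abs(np.diff(x_norm))),
        "mean_abs_second_diff_raw": np.mean(np.abs(x[2:] - x[:-2])),
        "mean_abs_second_diff_norm": np.mean(np.abs(x_norm[2:] - x_norm[:-2])),
    }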
Three series of the collected data were analyzed in order to evaluate the effects of the VRET system. The first series included the values recorded during the beginning of the experiment (the baseline), the second series included the values recorded throughout the speech exercise in front of the therapist, and the third series contained the values recorded during the VRET session. By comparing raw mean values of the signals from all participants, we can see a significant increase of GSR and skin temperature measurements. GSR signals were significantly more variable during the VRET session, which indicates an increased reaction to the stronger anxiety stimuli (Figure 12). This increase demonstrates higher participant involvement in the public speaking exercise during the VR simulation than during the live speech.
Finally, by looking at the values of the extracted features, we can state that the VRET speaking scenarios created a stimulus that was equivalent to or even stronger than a speech delivered in front of the therapist (Table 5). Compared to the baseline, the feature values increased during both the live speech and the VRET session. Therefore, a cloud-based VRET service could be used as a helpful exposure therapy tool for treating, or even self-treating, public speaking anxiety disorders. By providing patients with the ability to emulate and prepare for their daily life challenges, we could enable them to adapt their reactions with or without the help of a therapist.

6. Conclusions

The analysis of the psychophysiological features and the values from the Subjective Units of Distress Scale indicated that patients reacted to the stimuli induced by a VRET session. It can thus be stated that our VRET scenes can create controllable anxiety during the treatment session.
Moreover, VRET delivered as a cloud-based service with minimal hardware requirements could become a more acceptable VR option for psychology clinics. Our work replicated the findings of multiple previous studies showing that VRET is a viable tool for simulating public speaking scenarios. Additionally, a cloud-based VRET system makes it easier to conduct these sessions and can reach a wider audience than a standard standalone (offline) VR solution.
While the psychophysiological signals collected from wearable devices may contain extensive noise, they can still be analyzed using filtering and smoothing techniques. The gathered and analyzed biofeedback data can provide valuable insights for the psychologist. Furthermore, they can be compared and tracked during the whole treatment cycle and can help to evaluate patient progress, serving as an additional tool for the therapist that enriches the standard self-report psychological anxiety scales and tests.
Further work on our cloud system will focus on extracting and delivering biofeedback signal features in real time on the therapist dashboard. This will help psychologists to evaluate the patient's psychological state during the VRET session and to increase or decrease the intensity of the simulated virtual environment. Moreover, a self-treatment online module will be developed, so patients will be able to use the system without interaction with a therapist. Finally, the system will be prepared for a full randomized clinical trial, which will include a control group.

Author Contributions

Conceptualization, L.N., J.S. and A.M.; Formal Analysis, A.M. and J.S.; Methodology, A.M.; Writing-Original Draft Preparation, J.S.; Writing-Review and Editing, A.M.; Supervision, A.M.

Funding

This research was partially financed by the European Regional Development Fund within the “Intelektas. Bendri mokslo-verslo projektai” program (project reference number: J05-LVPA-K-01-0232).

Acknowledgments

We would like to give special thanks to psychologist Ilona Laukiene for her consultations, which helped us to conduct the VRET sessions described in this paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Kessler, R.C.; Berglund, P.; Demler, O.; Jin, R.; Merikangas, K.R.; Walters, E.E. Lifetime Prevalence and Age-of-Onset Distributions of DSM-IV Disorders in the National Comorbidity Survey Replication. Arch. Gen. Psychiatry 2005, 62, 593. [Google Scholar] [CrossRef] [PubMed]
  2. McCann, R.A.; Armstrong, C.M.; Skopp, N.A.; Edwards-Stewart, A.; Smolenski, D.J.; June, J.D.; Metzger-Abamukong, M.; Reger, G.M. Virtual reality exposure therapy for the treatment of anxiety disorders: An evaluation of research quality. J. Anxiety Disord. 2014, 28, 625–631. [Google Scholar] [CrossRef] [PubMed]
  3. Owens, M.E.; Beidel, D.C. Can Virtual Reality Effectively Elicit Distress Associated with Social Anxiety Disorder? J. Psychopathol. Behav. Assess. 2015, 37, 296–305. [Google Scholar] [CrossRef]
  4. Valmaggia, L.R.; Latif, L.; Kempton, M.J.; Rus-Calafell, M. Virtual reality in the psychological treatment for mental health problems: A systematic review of recent evidence. Psychiatry Res. 2016, 236, 189–195. [Google Scholar] [CrossRef] [PubMed]
  5. Diemer, J.; Alpers, G.W.; Peperkorn, H.M.; Shiban, Y.; Mühlberger, A. The impact of perception and presence on emotional reactions: A review of research in virtual reality. Front. Psychol. 2015, 6, 26. [Google Scholar] [CrossRef] [PubMed]
  6. Raghav, K.; Wijk, A.J.V.; Abdullah, F.; Islam, M.N.; Bernatchez, M.; Jongh, A.D. Efficacy of virtual reality exposure therapy for treatment of dental phobia: A randomized control trial. BMC Oral Health 2016, 16, 25. [Google Scholar] [CrossRef] [PubMed]
  7. Tudor, A.D.; Poeschl, S.; Doering, N. Virtual audience customization for public speaking training procedures. In Proceedings of the 2013 IEEE Virtual Reality (VR), Lake Buena Vista, FL, USA, 18–20 March 2013; pp. 61–62. [Google Scholar]
  8. Botella, C.; Fernández-Álvarez, J.; Guillén, V.; García-Palacios, A.; Baños, R. Recent Progress in Virtual Reality Exposure Therapy for Phobias: A Systematic Review. Curr. Psychiatry Rep. 2017, 19, 42. [Google Scholar] [CrossRef] [PubMed]
  9. Carl, E.; Stein, A.T.; Levihn-Coon, A.; Pogue, J.R.; Rothbaum, B.; Emmelkamp, P.; Asmundson, G.J.; Carlbring, P.; Powers, M.B. Virtual reality exposure therapy for anxiety and related disorders: A meta-analysis of randomized controlled trials. J. Anxiety Disord. 2019, 61, 27–36. [Google Scholar] [CrossRef] [PubMed]
  10. Levy, F.; Leboucher, P.; Rautureau, G.; Jouvent, R. E-virtual reality exposure therapy in acrophobia: A pilot study. J. Telemed. Telecare 2016, 22, 215–220. [Google Scholar] [CrossRef] [PubMed]
  11. Lindner, P.; Miloff, A.; Fagernäs, S.; Andersen, J.; Sigeman, M.; Andersson, G.; Furmark, T.; Carlbring, P. Therapist-led and self-led one-session virtual reality exposure therapy for public speaking anxiety with consumer hardware and software: A randomized controlled trial. J. Anxiety Disord. 2019, 61, 45–54. [Google Scholar] [CrossRef] [PubMed]
  12. Wiederhold, B.K.; Miller, I.T.; Wiederhold, M.D. Using Virtual Reality to Mobilize Health Care: Mobile Virtual Reality Technology for Attenuation of Anxiety and Pain. IEEE Consum. Electron. Mag. 2018, 7, 106–109. [Google Scholar] [CrossRef]
  13. Freeman, D.; Haselton, P.; Freeman, J.; Spanlang, B.; Kishore, S.; Albery, E.; Denne, M.; Brown, P.; Slater, M.; Nickless, A. Automated psychological therapy using immersive virtual reality for treatment of fear of heights: A single-blind, parallel-group, randomised controlled trial. Lancet Psychiatry 2018, 5, 625–632. [Google Scholar] [CrossRef]
  14. Manju, T.; Padmavathi, S.; Tamilselvi, D. A Rehabilitation Therapy for Autism Spectrum Disorder Using Virtual Reality. In Proceedings of the International Conference on Intelligent Information Technologies, Chennai, India, 20–22 December 2017. [Google Scholar]
  15. Miloff, A.; Lindner, P.; Hamilton, W.; Reuterskiöld, L.; Andersson, G.; Carlbring, P. Single-session gamified virtual reality exposure therapy for spider phobia vs. traditional exposure therapy: Study protocol for a randomized controlled non-inferiority trial. Trials 2016, 17, 60. [Google Scholar] [CrossRef] [PubMed]
  16. Shiban, Y.; Diemer, J.; Müller, J.; Brütting-Schick, J.; Pauli, P.; Mühlberger, A. Diaphragmatic breathing during virtual reality exposure therapy for aviophobia: Functional coping strategy or avoidance behavior? A pilot study. BMC Psychiatry 2017, 17, 29. [Google Scholar] [CrossRef] [PubMed]
  17. Skulimowski, S.; Badurowicz, M. Wearable sensors as feedback method in virtual reality anti-stress therapy. In Proceedings of the 2017 International Conference on Electromagnetic Devices and Processes in Environment Protection with Seminar Applications of Superconductors (ELMECO & AoS), Lublin, Poland, 3–6 December 2017; pp. 1–4. [Google Scholar]
  18. Seol, E.; Cho, C.H.; Choi, S.; Jung, D.; Min, S.; Seo, S.; Jung, S.; Lee, Y.; Lee, J.; Kim, G.; Cho, C.; Lee, S. “Drop the beat”: Virtual reality based mindfulness and cognitive behavioral therapy for panic disorder—A pilot study. In Proceedings of the 23rd ACM Symposium on Virtual Reality Software and Technology—VRST ’17, Gothenburg, Sweden, 8–10 November 2017; pp. 1–3. [Google Scholar]
  19. Suyanto, E.M.; Angkasa, D.; Turaga, H.; Sutoyo, R. Overcome Acrophobia with the Help of Virtual Reality and Kinect Technology. Procedia Comput. Sci. 2017, 116, 476–483. [Google Scholar] [CrossRef]
  20. Shunnaq, S.; Raeder, M. VirtualPhobia: A Model for Virtual Therapy of Phobias. In Proceedings of the 2016 XVIII Symposium on Virtual and Augmented Reality (SVR), Gramado, Brazil, 21–24 June 2016; pp. 59–63. [Google Scholar]
  21. Rauch, S.A.; Koola, C.; Post, L.; Yasinski, C.; Norrholm, S.D.; Black, K.; Rothbaum, B.O. In session extinction and outcome in Virtual Reality Exposure Therapy for PTSD. Behav. Res. Ther. 2018, 109, 1–9. [Google Scholar] [CrossRef] [PubMed]
  22. Zinzow, H.M.; Brooks, J.O.; Rosopa, P.J.; Jeffirs, S.; Jenkins, C.; Seeanner, J.; McKeeman, A.; Hodges, L.F. Virtual Reality and Cognitive-Behavioral Therapy for Driving Anxiety and Aggression in Veterans: A Pilot Study. Cogn. Behav. Pract. 2018, 25, 296–309. [Google Scholar] [CrossRef]
  23. Bun, P.; Gorski, F.; Grajewski, D.; Wichniarek, R.; Zawadzki, P. Low—Cost Devices Used in Virtual Reality Exposure Therapy. Procedia Comput. Sci. 2017, 104, 445–451. [Google Scholar] [CrossRef]
  24. Marquardt, A.; Trepkowski, C.; Maiero, J.; Kruijff, E.; Hinkenjann, A. Multisensory Virtual Reality Exposure Therapy. In Proceedings of the 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Reutlingen, Germany, 18–22 March 2018; pp. 769–770. [Google Scholar]
  25. Walkom, G. Virtual Reality Exposure Therapy: To Benefit Those Who Stutter and Treat Social Anxiety. In Proceedings of the 2016 International Conference on Interactive Technologies and Games (ITAG), Nottingham, UK, 26–27 October 2016; pp. 36–41. [Google Scholar]
  26. Christofi, M.; Michael-Grigoriou, D. Virtual environments design assessment for the treatment of claustrophobia. In Proceedings of the 22nd International Conference on Virtual System & Multimedia (VSMM), Kuala Lumpur, Malaysia, 17–21 October 2016; pp. 1–8. [Google Scholar]
  27. Poeschl, S.; Doering, N. Virtual training for Fear of Public Speaking—Design of an audience for immersive virtual environments. In Proceedings of the 2012 IEEE Virtual Reality (VR), Costa Mesa, CA, USA, 4–8 March 2012; pp. 101–102. [Google Scholar]
  28. Picard, R.W.; Vyzas, E.; Healey, J. Toward Machine Emotional Intelligence: Analysis of Affective Physiological State. IEEE Trans. Pattern Anal. Mach. Intell. 2001, 23, 1175–1191. [Google Scholar] [CrossRef]
Figure 1. Software as a service (SaaS) architecture for virtual reality exposure therapy. REST—representational state transfer; VRET—virtual reality exposure therapy.
Figure 2. Minimal cloud VRET system setup.
Figure 3. Patient overview page with a session history.
Figure 4. VRET control options.
Figure 5. Multiple VRET environments were created for the treatment of public speaking anxiety.
Figure 6. Flow of the experiment. VRET is conducted in step 6.
Figure 7. Values reported on the Subjective Units of Distress scale.
Figure 8. Subjective Units of Distress (SUD) evaluation: (a) subjects who agreed or strongly agreed with the statement, “While preparing for giving a speech, I feel tense and nervous”; (b) all the remaining subjects. SUDS—Subjective Units of Distress Scale.
Figure 9. Differences in the SUDS values reported after the VRET session: (a) based on previous gaming experience; (b) based on previous VR experience.
Figure 10. Raw and filtered biofeedback signals: (a) galvanic skin response (GSR) and (b) skin temperature.
Figure 11. Normalized GSR and skin temperature session values by experiment participant.
Figure 12. Normalized signal values: (a) galvanic skin response; (b) skin temperature.
Table 1. Demographic information. VR—virtual reality.

Measure | Value
Age | M = 26.85, SD = 4.189
Sex | Female (13), Male (17)
Employment | Studies (7), Works (23)
VR experience | Never (13), Rare (16), Frequent (4)
Video games experience | Never (10), Rare (9), Frequent (11)
Table 2. Subjective Units of Distress scale.

Measure | Rating
Highest anxiety/distress ever felt | 100
Extremely anxious/distressed | 90
Very anxious/distressed; can’t concentrate | 80
Quite anxious/distressed; interference with functioning | 70
Moderate to strong anxiety or distress | 60
Moderate anxiety or distress, but can continue to function | 50
Mild to moderate anxiety or distress | 40
Mild anxiety or distress; no interference with functioning | 30
Minimal anxiety or distress | 20
Alert and awake | 10
Table 3. Subjective signs of discomfort. VRET—virtual reality exposure therapy.

Measure | During Public Speaking | Public Speaking during VRET Session
Increased sweating | 26.09% | 39.13%
Dizziness | 8.70% | 13.04%
Increased heart rate | 78.26% | 69.57%
Shivers | 21.74% | 21.74%
Uneasiness, fear | 69.57% | 60.87%
Heavy chest | 8.70% | 8.70%
Muscle tension | 30.43% | 43.48%
Loss of balance | 8.70% | 26.09%
Nausea | 0.00% | 4.35%
Table 4. Extracted galvanic skin response (GSR) features.

Feature | Formula
Mean absolute value of the raw signal | $\frac{1}{N}\sum_{n=1}^{N}|X_n|$
Standard deviation of the raw signal | $\sqrt{\frac{1}{N}\sum_{n=1}^{N}(X_n - X_{mean})^2}$
Mean absolute value of the first difference (raw signal) | $\frac{1}{N-1}\sum_{n=1}^{N-1}|X_{n+1} - X_n|$
Mean absolute value of the first difference (normalized signal) | $\frac{1}{N-1}\sum_{n=1}^{N-1}|\tilde{X}_{n+1} - \tilde{X}_n|$
Mean absolute value of the second difference (raw signal) | $\frac{1}{N-2}\sum_{n=1}^{N-2}|X_{n+2} - X_n|$
Mean absolute value of the second difference (normalized signal) | $\frac{1}{N-2}\sum_{n=1}^{N-2}|\tilde{X}_{n+2} - \tilde{X}_n|$
Table 5. Elevated GSR and skin temperature feature values.

Feature | Baseline GSR/Skin temp. | Session GSR/Skin temp. | VRET GSR/Skin temp.
Raw signal mean absolute | 1.152/29.74 | 1.257/30.72 | 1.372/31.21
Raw signal standard deviation | 0.07/0.04 | 0.22/0.023 | 0.106/0.019
Raw signal absolute mean of first difference | 0.0044/0.0015 | 0.0121/0.011 | 0.0065/0.0011
Normalized signal absolute mean of first difference | 0.056/0.058 | 0.036/0.018 | 0.021/0.019
Raw signal absolute mean of second difference | 0.0008/0.0003 | 0.0019/0.00028 | 0.0014/0.0003
Normalized signal absolute mean of second difference | 0.0116/0.0149 | 0.0077/0.0054 | 0.0049/0.0057
