December 27, 2022.
Influencing brain activity through virtual reality environments
Virtual reality and electroencephalography for the modulation of emotional states through visual and auditory stimuli
A virtual reality application can influence a user's mental activity to increase relaxation or concentration. Iker López de Suso Sánchez, at the Universitat de les Illes Balears (UIB), developed a software application motivated by the goal of providing technological support for mental health care, combining games built with Unity and virtual reality.
For this, an auditory stimulus known as binaural waves is used, configured in different frequency ranges depending on the type of brain waves to be increased, and neurofeedback is applied as a brain activity training technique. Likewise, the use of virtual reality favors both relaxation and individual attention thanks to the elimination of unwanted external stimuli, complete control of the user's experience, and the resulting feeling of immersion.
The objective evaluation of the user's mental state after using the application is possible thanks to the integration of a non-invasive EEG device that monitors brain activity throughout the session.
Combining virtual reality and EEG technologies, and adding visual and auditory stimuli, Iker López has created a software application for the modulation of emotional states with binaural waves and neurofeedback techniques.
Iker worked at Escola Politècnica Superior under the supervision of Dr. Francisco José Perales and Dr. José María Buades Rubio in the academic program 2020/2021 in the city of Palma, Majorca, Spain. This work led Universitat de les Illes Balears and Naxon Labs to agree on further academic, scientific and cultural collaboration.
“Virtual reality is a technology on which many expectations have been placed in recent years. The high costs of the devices in the market and the risk that companies run by investing large amounts of money in the development of content and devices have been the main brake on an expansion that was predicted to be exponential. However, virtual reality experiences have always been well received by developers and users, and the growth of the technology has been constant. The appearance of devices with great value for money such as the Meta Quest 2 (formerly Oculus Quest 2), and the interest shown by technological giants such as Meta (formerly Facebook), Google, Playstation, Valve, Microsoft, HTC, Sony and many others, show that this technology is in full growth and still has a lot to offer. In addition, it is not only revolutionizing the entertainment sector; its application in the form of Serious Games in fields such as health, education, sports, the aeronautical industry, the media or the corporate sector reflects the possibility of a new boom similar to the one we saw with mobile apps.”
In 2014 Facebook acquired Oculus VR, maker of the Rift virtual reality headset. Initially sold as Oculus Quest 2, Meta Quest 2 is a virtual reality headset developed by Meta Platforms (formerly Facebook, Inc.) and released in 2020.
“Mental health is another protagonist in this project. The percentages of people affected by mental disorders are higher than popularly believed: 25% of the population suffers from some mental disorder over the course of their lives, mental illnesses represent 12.5% of all pathologies, and 22% of the population suffers episodes of anxiety and depression at some point in their lives. In addition, in the year in which this work was carried out, a global health crisis caused by the COVID-19 virus was emerging, which has collaterally affected the mental health of millions of people, as warned by the World Health Organization (WHO).
The seriousness of the matter is clearly evidenced in an analysis by a Canadian team, based on data from 55 international studies (with more than 190,000 participants between January and May 2021), in which they emphasize that post-traumatic stress disorders, anxiety and depression were, respectively, five, four and three times more frequent than usual, taking the data reported by the WHO as a reference. The awareness on the part of the population is growing, and the interest in taking care of mental health as well”.
“Neurofeedback and Binaural Waves are techniques by which a person's brain activity can be influenced in the long term. Some studies present them as complementary and even substitute treatments for psychoactive drugs in disorders such as attention deficit hyperactivity disorder (ADHD), epilepsy, anxiety or depression, showing positive and permanent results in the patient. On the other hand, there is also growing interest among people with “normal” brain activity in technologies that help them relax or concentrate. Some examples of commercial applications of this kind are Tripp and Muse.
Taking this context into account, the current project proposes the development of a virtual reality application and the integration of an EEG device that allows the collection of data on the user's brain activity while neurofeedback techniques with binaural waves are applied.
The use of virtual reality makes it easier to obtain favorable results because uncontrolled stimuli from the environment are limited and immersion is greater. The distinctive element of the project compared to other applications such as Tripp, is the monitoring of the user's brain activity through an EEG system, which allows an objective analysis of the changes experienced and the effectiveness of the applied techniques.
For the realization of this work, the author had at his disposal the Unity project from Miguel Sánchez's thesis, in which a virtual reality system for the management of chronic pain in children and young people with rare diseases was developed. It should be noted that said project already included the binaural waves technique, so the same external library (AccelBrainBeat) has been used and the script developed for its configuration and playback has been reused.
Thus, the Binaural Waves configuration screen offers the same possibilities and has only changed visually. The “Space” environment and the variety of music available also originate from the aforementioned project”.
Iker analyzed the brain waves measured by non-invasive EEG devices like the Muse Band 2, which has 4 channels plus one for reference and communicates over Bluetooth 4.2:
● Gamma (32-100Hz)
○ High cognitive processing.
○ Problem solving.
● Beta (13-32Hz)
○ Decision making.
● Alpha (8-13Hz)
○ Relaxation, calm wakefulness.
● Theta (4-8Hz)
○ Internal processing.
○ Dreams (REM), fears.
● Delta (0.5-4Hz)
○ Deep meditation.
○ Deep sleep (without dreaming).
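The band boundaries above can be captured in a small lookup; the Python sketch below (purely illustrative, the project itself is a Unity application) classifies a frequency into its band:

```python
# Brain wave frequency bands as listed above, in Hz
# (lower bound inclusive, upper bound exclusive).
BANDS = {
    "delta": (0.5, 4.0),
    "theta": (4.0, 8.0),
    "alpha": (8.0, 13.0),
    "beta": (13.0, 32.0),
    "gamma": (32.0, 100.0),
}

def classify(freq_hz):
    """Return the name of the band a frequency falls into, or None."""
    for name, (low, high) in BANDS.items():
        if low <= freq_hz < high:
            return name
    return None
```

For example, `classify(10)` returns `"alpha"`, the band targeted by the relaxation sessions described later.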
A binaural beat is an auditory illusion perceived when two different pure-tone sine waves, both with frequencies lower than 1500 Hz, with less than a 40 Hz difference between them, are presented to a listener dichotically (one through each ear).
For example, if a 100 Hz pure tone is presented to a subject's right ear, while a 104 Hz pure tone is presented to the subject's left ear, the listener will perceive the auditory illusion of a third tone, in addition to the two pure tones presented to each ear. The third sound is called a binaural beat, and in this example would have a perceived pitch correlating to a frequency of 4 Hz, that being the difference between the 104 Hz and 100 Hz pure tones presented to each ear.
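The 100 Hz / 104 Hz example can be reproduced with a few lines of numpy. Note the project itself generates its tones through the AccelBrainBeat library; this sketch is purely illustrative:

```python
import numpy as np

def binaural_beat(carrier_hz, beat_hz, duration_s, sample_rate=44100):
    """Stereo signal: the carrier tone in the left ear and a tone offset
    by beat_hz in the right ear; the listener perceives a beat at beat_hz."""
    t = np.arange(int(duration_s * sample_rate)) / sample_rate
    left = np.sin(2 * np.pi * carrier_hz * t)               # e.g. 100 Hz
    right = np.sin(2 * np.pi * (carrier_hz + beat_hz) * t)  # e.g. 104 Hz
    return np.stack([left, right], axis=1)                  # shape (samples, 2)

# The example from the text: 100 Hz left, 104 Hz right -> perceived 4 Hz beat.
signal = binaural_beat(carrier_hz=100, beat_hz=4, duration_s=1.0)
```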
Figure: Binaural Beats
Binaural-beat perception originates in the inferior colliculus of the midbrain and the superior olivary complex of the brainstem, where auditory signals from each ear are integrated and precipitate electrical impulses along neural pathways through the reticular formation up the midbrain to the thalamus, auditory cortex, and other cortical regions.
Neurofeedback (NFB), also called neurotherapy, is a type of biofeedback that presents real-time feedback from brain activity in order to reinforce healthy brain function through operant conditioning. In this case, the brain's electrical activity is collected via sensors placed on the scalp using electroencephalography (here, the Muse Band 2 EEG headband), with feedback presented through video displays or sound.
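The operant-conditioning loop behind neurofeedback can be sketched in a few lines; `feedback` here is a hypothetical stand-in for the visual or auditory reward channel, not part of the original project:

```python
def neurofeedback_step(band_power, threshold, feedback):
    """One iteration of the operant-conditioning loop: present positive
    feedback whenever the measured band power reaches the target."""
    rewarded = band_power >= threshold
    feedback(rewarded)
    return rewarded

# Example: reward the user whenever normalized alpha power crosses 0.6.
events = []
for power in [0.4, 0.55, 0.7, 0.8]:
    neurofeedback_step(power, threshold=0.6, feedback=events.append)
# events -> [False, False, True, True]
```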
“There’s decades of innovations ahead. We’re at the very beginning, where it’s just at the stage where we can bring in consumers but there’s so much further to go from there,” said Brendan Iribe, co-founder and former CEO of Oculus.
Also in his work, Iker cited Mark Zuckerberg, CEO of Facebook (now rebranded as Meta), on bringing such applications into the so-called Metaverse: “The incredible thing about the technology is that you feel like you’re actually present in another place with other people. People who try it say it’s different from anything they’ve ever experienced in their lives.”
Iker considered different reality technologies: Augmented reality (AR), Virtual reality (VR) and Mixed Reality (MR).
Augmented reality (AR) adds digital elements to a live view often by using the camera on a smartphone. Examples of augmented reality experiences include Snapchat lenses and the game Pokemon Go.
Virtual reality (VR) implies a complete immersion experience that shuts out the physical world. Using virtual reality devices such as HTC Vive, Oculus Rift or Google Cardboard, users can be transported into a number of real-world and imagined environments such as the middle of a squawking penguin colony or even the back of a dragon.
In a Mixed Reality (MR) experience, which combines elements of both augmented reality and virtual reality, real-world and digital objects interact. Mixed reality technology is just now starting to take off with Microsoft’s HoloLens one of the most notable early mixed reality apparatuses.
The Meta Quest 2 device has a resolution of 1832 x 1920 pixels per eye, a refresh rate of 90 Hz, and a FOV (Field of View) of about 90°.
Figure: Meta Quest 2
Unity is a cross-platform game engine developed by Unity Technologies, first announced and released in June 2005. The engine has since been gradually extended to support a variety of desktop, mobile, console and virtual reality platforms. The engine can be used to create three-dimensional (3D) and two-dimensional (2D) games, as well as interactive simulations and other experiences.
Meta Quest and Quest 2 deliver the freedom of wireless, standalone virtual reality with industry-leading power and performance to drive your next immersive app. Both devices include spatially tracked controllers, integrated open-ear audio, and support for Oculus Link, which enables users to access their Oculus Rift library of apps from a gaming-compatible PC.
For this application Meta Quest 2 has been integrated with Unity to create the virtual reality environment, scene, game objects, the components defining the game object behavior and the materials that add texture and colors to objects.
The behavior and characteristics of the elements that interact in a Unity scene, known as GameObjects, are defined by the components attached to them. Unity provides a multitude of components that give GameObjects very versatile characteristics; however, as the particularities of a specific project emerge, functions appear that require script programming: units of code that control the behavior of the objects to which they are associated.
For the integration of the EEG device with the software application, the Naxon Explorer API was used to get the data at the right moment. With the Naxon Labs platform and an EEG device like Interaxon’s Muse, you can create a mark derived from an external event in a sequence of brain activity expressed in waves. Naxon Explorer is a useful tool and neurofeedback system for researchers in Neuroscience, Psychology and Medicine. You can record brain data and obtain measurements and session data suitable for machine learning and automatic pattern analysis. With the API, you can analyze brain behavior and its response to an external activity. In this application, Iker exposes the brain to visual and auditory stimuli and at the same time informs Naxon Explorer through the API so that the moment is registered accurately. With this, you can analyze the continuous brain waves and check the impact of the external event on brain activity.
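An event mark of this kind can be sketched as a timestamped JSON message; the field names below are illustrative assumptions, not Naxon Explorer's actual schema:

```python
import json
import time

def build_event_mark(label, session_id):
    """Build a timestamped marker tying an external stimulus to a point
    in the recorded EEG stream. Field names are illustrative only."""
    return json.dumps({
        "session": session_id,
        "label": label,
        "timestamp_ms": int(time.time() * 1000),
    })

# Mark the moment the binaural stimulus starts.
mark = build_event_mark("binaural_start", session_id="demo-session")
```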
With the session started, you can now see the data that the EEG device is sending. The figure shows the signal, in microvolts over time (milliseconds), detected by each of the Muse Band 2 channels. The bar graph on the right shows the average values of each type of wave in real time, obtained by applying the Fourier transform to the “raw” data. In the figure you can see the intensities of each type of wave for each sensor.
Figure: Visualization of raw data and the average intensity of each type of wave in real time.
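The per-band intensities described above come from applying the Fourier transform to the raw signal. A minimal numpy sketch of that computation (the 256 Hz sample rate matches the Muse headband; treat the rest as illustrative, not the application's actual pipeline):

```python
import numpy as np

def band_powers(raw_uv, sample_rate, bands):
    """Average spectral power per EEG band from a raw microvolt trace."""
    freqs = np.fft.rfftfreq(len(raw_uv), d=1.0 / sample_rate)
    spectrum = np.abs(np.fft.rfft(raw_uv)) ** 2
    powers = {}
    for name, (low, high) in bands.items():
        mask = (freqs >= low) & (freqs < high)
        powers[name] = float(spectrum[mask].mean()) if mask.any() else 0.0
    return powers

# One second of synthetic signal dominated by a 10 Hz (Alpha) oscillation,
# sampled at 256 Hz.
fs = 256
t = np.arange(fs) / fs
trace = np.sin(2 * np.pi * 10 * t)
powers = band_powers(trace, fs, {"alpha": (8, 13), "beta": (13, 32)})
```

Here `powers["alpha"]` dominates `powers["beta"]`, as expected for a signal concentrated at 10 Hz.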
There is a series of configurable parameters for calibrating the treatment of the data before it is displayed on screen, which transforms the values and changes how they appear. During the development of this work, the following parameters have been used:
• High-pass filter: passes signals with frequencies above a certain cutoff frequency and attenuates those below it.
• Low-pass filter: passes signals with frequencies below the selected cutoff frequency and attenuates those above it.
• Window (s): how many seconds of signal are displayed on screen.
• uV amplitude: how many microvolts the Y axis spans.
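The first two parameters are standard signal filters. As a minimal illustration (the platform presumably uses its own, more sophisticated implementation), first-order versions can be written as:

```python
import math

def low_pass(samples, cutoff_hz, sample_rate):
    """First-order low-pass: attenuates components above cutoff_hz."""
    dt = 1.0 / sample_rate
    rc = 1.0 / (2 * math.pi * cutoff_hz)
    alpha = dt / (rc + dt)  # smoothing factor in (0, 1)
    out, prev = [], samples[0]
    for x in samples:
        prev = prev + alpha * (x - prev)
        out.append(prev)
    return out

def high_pass(samples, cutoff_hz, sample_rate):
    """First-order high-pass: the input minus its low-passed version,
    which attenuates components below cutoff_hz (e.g. DC drift)."""
    smoothed = low_pass(samples, cutoff_hz, sample_rate)
    return [x - s for x, s in zip(samples, smoothed)]
```

Applied to EEG, the high-pass filter removes slow baseline drift while the low-pass filter suppresses high-frequency noise before display.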
Figure: Data flow
In the application you can work with the session: its configuration, the course of the session, and a summary of results (graphs and information). You can also configure the environment, the binaural waves and the music.
In the lobby, the user is in a virtual environment where they can listen to music and binaural waves through headphones. The default settings are the “Moon Night” environment, the “Buddhist” music, and relaxation binaural waves, whose purpose is to increase the user's Alpha waves. In addition, there are two rays corresponding to the virtual reality controllers, with which the user can point at and interact with the GUI elements.
Before starting the session, you must configure its duration and the type of activity that you want to measure. The different session durations range from a minimum of 5 minutes to a maximum of 45 minutes.
Regarding the type of session, each one has the following particularities:
• Relaxation: in this type of session, the summary graph shown at the end represents the relationship between the intensity of Alpha (α) and Beta (β) waves. The value on the Y axis is directly proportional to the intensity of Alpha waves and inversely proportional to that of Beta waves. The formula used to calculate the relaxation index from these waves is α/β.
During this session, the user must close their eyes and meditate, with the aim of increasing said index.
• Concentration: in the concentration type session, the relationship between the Alpha(α), Beta(β) and Theta(θ) waves is measured, which corresponds to the user's concentration index. This relationship is used in studies such as J. Park, H. Kwon, S. Kang, and Y. Lee, “The effect of binaural beat-based audiovisual stimulation on brain waves and concentration,” in 2018 International Conference on Information and Communication Technology Convergence (ICTC).
Alpha and Beta waves (especially Beta) appear when a person is concentrating on something, whereas Theta waves appear when they are immersed in their imagination, distracted, or sleeping. The relationship is (α+β)/θ.
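Both indices are straightforward ratios of the average band powers; the example values below are arbitrary, chosen only to show the computation:

```python
def relaxation_index(alpha, beta):
    """Relaxation index used in the relaxation session: alpha / beta."""
    return alpha / beta

def concentration_index(alpha, beta, theta):
    """Concentration index from Park et al. (2018): (alpha + beta) / theta."""
    return (alpha + beta) / theta

# Arbitrary example band powers, in the same units produced by the FFT step.
relaxed = relaxation_index(alpha=12.0, beta=4.0)                # 3.0
focused = concentration_index(alpha=6.0, beta=10.0, theta=4.0)  # 4.0
```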
When selecting this type of session, a sphere that levitates based on the concentration index appears on the screen. Therefore, the goal of the user during this type of session is to concentrate solely on the sphere. The greater the concentration of the user, the greater the height the ball will acquire. In this way, the Visual Neurofeedback technique is put into practice, allowing the user to train their mental activity with feedback in real time.
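The sphere's height can be driven by a clamped linear mapping of the concentration index; the constants below are illustrative, not the values used in the Unity project:

```python
def sphere_height(concentration, min_h=0.5, max_h=3.0, index_range=(0.0, 5.0)):
    """Map the concentration index to the levitating sphere's height in
    scene units, clamping the index to the expected range."""
    lo, hi = index_range
    frac = (concentration - lo) / (hi - lo)
    frac = max(0.0, min(1.0, frac))  # clamp to [0, 1]
    return min_h + frac * (max_h - min_h)
```

Calling this every frame with the latest concentration index produces the real-time visual feedback the session relies on.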
Once the duration and type of session have been selected, click "Start Session" to start it. Instantly, the WebSocketManager script sends the “start recording” command to the Naxon Explorer platform, after which the data that the Muse Band 2 monitors begins to arrive.
As future directions of this work, Iker López indicated: a statistical study to validate the effectiveness of the techniques used; increasing the stimuli perceived by the user (shaders, particles); expanding the neurofeedback techniques; developing a backend project, including communication with a server and a database; including alternative input systems (head movement, voice control); and designing and developing an in-game tutorial.