When simulation causes hallucination
Back during the initial wave of virtual reality in the 1990s, it seemed everything was going virtual. In the arcades there were Virtuality pods with Dactyl Nightmare and racing games, and in some places you could play games like Descent or Doom. Those were the early days of virtual reality as a fixture of public consciousness, and it seemed so promising.
Many will say that the reason VR never really took off back then was because the graphics weren’t that great or because the hardware was bulky, but the truth is stranger than fiction.
When we look at the situation honestly, we can't say that Nintendo failed because the NES was 8-bit or the SNES was 16-bit. In the bigger picture, not even the clunky 3D graphics of the N64 made it a failure. In fact, for their time, those systems were cutting edge.
Even with all of the pop culture saturation and buzzword hype, virtual environments took a dive toward the end of the 1990s despite all of the movies, the VFX Headset, and even Vuzix making capable headsets. VFX eventually went out of business (if I recall correctly) but Vuzix is around today even though they began in 1997.
Back then, Jaron Lanier (the guy who popularized the term "virtual reality") was head of VPL Research, which did virtual reality research. I'm pretty sure they invented things like the DataGlove and a lot of other things. Jaron consulted for Johnny Mnemonic and likely The Lawnmower Man. Nintendo even brought him in to help design the Power Glove.
Virtual worlds go through phases... and this has been the pattern for twenty years or longer. The problem is that we are seldom capable of taking the historical, long-tail view of things; instead we focus on the narrow slice that is right now, to the exclusion of empirical evidence to the contrary.
In the early 1990s there was Worlds Inc, and later Active Worlds. They enjoyed marginal success in the early VR hype phase, when it seemed the whole world was expecting a virtual world revolution. During this period there was also VRML (later X3D) as a web standard for embedding virtual worlds on the Internet.
The interesting thing about history is that it repeats itself.
VFX is one of those companies that went out of business making headsets for gamers, despite supporting some of the hottest game titles on the planet at the time and offering backwards compatibility with games that didn't even natively support it. Vuzix, on the other hand, started in 1997, at arguably the end of the first VR phase, when companies were going out of business left and right... and they exist to this day making VR and AR headsets.
The first thing to consider is that VR as a consumer device isn't exactly a good idea. While I understand that the technology has gotten better, faster and smaller, there is one thing that still hasn't been solved in 2014, and it's the underlying reason why HMDs aren't ubiquitous and why they have never historically saved the VR industry.
Quite the opposite, actually... and that is the point of this post.
Somewhere around 1991, Sega was rolling in cash from its Genesis/Mega Drive sales and wanted to make an add-on. What came of this ambition was the Sega VR headset, which never made it out of the prototype phase into release.
They made all sorts of promotional materials for it, debuted it multiple times at CES all the way up to 1993, and even commissioned the Stanford Research Institute in Menlo Park to do a consumer study just to be absolutely sure.
The details of that study were never released, but whatever it had in it was enough to cause Sega to entirely shut down the Sega VR project and discontinue it before it ever was released.
That being said, while neither SRI nor Sega ever released the findings of that research study, a number of people who worked on that research, and at Sega, have since come forward with quite a lot of damning details from it. The video below is a quick overview of what happened -
Stanford Research Institute tested the prototype out rigorously, with their final report warning Sega that prolonged use constituted a 'health hazard'.
This is where it gets interesting... we could point to latency issues, lag, bad graphics and so on, but those were just a fraction of the overall issue. Yes, they contributed to the problem, but they weren't the main factor. This is something the current generation of virtual reality companies has completely overlooked, and it is bound to come back and bite them in the ass.
So what was it that Stanford Research Institute stumbled upon that constituted a ‘health hazard’ based on ‘prolonged use’?
Well, it was essentially something that Sega wanted to bury indefinitely and did an amazing job at it. What SRI discovered were the three stages of Immersion Sickness.
“Wait a minute!” I hear you saying, “What do you mean three stages?”
When we talk about immersion sickness today (cybersickness, simulation sickness), we always assume it's a type of motion sickness or a sensory cue mismatch issue. The trouble is, this is just Stage One: acute cybersickness.
There are two more stages of immersion sickness that are rarely talked about, but recent research is starting to retread this ground and re-establish a connection first glimpsed as early as 1984 and then lost in the 1990s. In 1995, the full effect of immersion sickness was touched upon in a few isolated reports of hallucinations after prolonged VR use, but nobody really thought much about it.
To understand the full scope of immersion sickness, we have to start with Tetris (released in 1984) and the concept of "The Tetris Effect": prolonged exposure to a video game creates altered perceptions of reality as well as audio and visual hallucinations. Even more interestingly, this is a lot like simulation "burn-in", where you are re-wiring your neural pathways for a simulated-reality interaction that contradicts how you perceive the actual real world.
HMDs (Head Mounted Displays) and more recently products like Oculus Rift use the same premise of emissive LCD type displays mere inches from your eyes. Even taking into account stereoscopic views, the focal point of your eyes is no longer natural nor is the way your eyes are taking in the light from the simulated reality. In this “false” reality, you are re-training your eyes to see a simulated environment which is contradictory to how it naturally is wired to see and process things.
The problem is that prolonged usage creates enough new neural pathways to teach your brain the wrong way to see reality while teaching it how to adjust to virtual reality. So whenever I hear that symptoms like headache, nausea and blurred vision subside after time, it is no relief – that means the user has graduated to Stage Two immersion sickness, in which the mind has adjusted to virtual reality (where originally it was rejecting it) but is now trained to see reality as incorrect and reject it.
If you look at comprehensive studies in this field, you will find that blurred vision, persistent nausea and other aftereffects continued in those cases... and when you spend even more time in a simulated environment (because you've become accustomed to it), you become prone to that Tetris Effect on a much grander scale. The more realistic the immersion, the more profound the immersion sickness and the hallucination aftereffects.
That’s Stage Three Immersion Sickness.
Audio and visual hallucinations in real life caused by simulated-environment exposure, combined with an altered perception of reality and, in many cases, intermittent recurrence of the Stage Two rejection of reality (blurred vision/altered perception, nausea, etc.).
Stage One: Initial VR Rejection: Blurred vision, nausea, headache
Stage Two: VR Rejection Subsides, but Manifests as Reality Rejection
Stage Three: Auditory and Visual Hallucinations (Tetris Effect)
Now, a lot of you reading this will say: But that’s just from the bad graphics and poor hardware of the 1990s! Oculus Rift is far better and overcomes this!
The problem with that statement is that it misses the underlying root of what simulation sickness is and how/why it is caused.
With more recent graphics, hardware and games in play, a study was published in December 2013 titled "Altered Visual Perception in Game Transfer Phenomena".
Essentially it rediscovers the Tetris Effect in the modern age, which is to say: prolonged exposure to synthetic environments manifests all sorts of screwed-up aftereffects that may persist for minutes, hours and, in some cases, weeks or months afterward.
What began with Tetris in 1984 and was coined "The Tetris Effect" resurfaced in 1993 with SRI and Sega (with further symptoms due to immersion), again in 1995/1996 with reports of audio and visual hallucinations, and now again in 2014 with another report...
Of course, this afflicts any gamer who plays for prolonged periods of time, whether it's Tetris or Call of Duty. It was being felt as far back as Doom and Descent in the 1990s, which I can tell you (from personal experience) is all sorts of screwed up.
The reason I know about the three stages of immersion sickness is that I've experienced them – particularly with Doom II, Descent and Virtua Racing in the arcade. I used to play a lot of VR games with a headset.
Perception is Everything
This is something I've actually talked about with Ed Tang over at Avegant. He (just like myself) is very concerned with immersion sickness, and his team takes it seriously (or at least tries to).
With their Glyph headset, Avegant uses a VRD (Virtual Retinal Display) that projects the image directly onto your retina using a low-level light source reflected off a micromirror array. This much better simulates how your eyes actually see the real world: through passive, reflected light rather than emissive light.
In the case of Oculus Rift, you have LCD displays mere inches from your eyes blasting emissive light directly at them. This, in conjunction with the shortened focal range of a few inches your eyes must converge on, creates a very bad situation that ultimately increases simulation sickness while lending itself to the prolonged issues of advanced simulation sickness.
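To put rough numbers on that convergence problem, here is a small sketch (with made-up illustrative values: a 63 mm interpupillary distance and a hypothetical HMD focal plane at 1.5 m – neither figure comes from Oculus) that computes how far the eyes must rotate inward to fixate a virtual object, and the mismatch in diopters between where they converge and where the optics force them to focus:

```python
import math

IPD = 0.063          # interpupillary distance in metres (population ballpark, assumed)
FOCAL_PLANE = 1.5    # fixed optical focal distance of a hypothetical HMD, in metres

def vergence_angle_deg(distance_m):
    """Angle the two eyes rotate inward to fixate a point at the given distance."""
    return math.degrees(2 * math.atan((IPD / 2) / distance_m))

def conflict_diopters(distance_m):
    """Mismatch between where the eyes converge and where they must focus."""
    return abs(1 / distance_m - 1 / FOCAL_PLANE)

for d in (0.25, 0.5, 1.0, 1.5, 3.0):
    print(f"virtual object at {d:4.2f} m: "
          f"vergence {vergence_angle_deg(d):5.2f} deg, "
          f"conflict {conflict_diopters(d):4.2f} D")
```

The mismatch grows rapidly for near virtual objects – exactly where stereoscopic games love to put things – while real-world viewing keeps vergence and focus locked together.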
While a lot can be done to alleviate this issue through how the environment itself is presented in the virtual world (no sudden jerky movements, no forced camera movements, etc.), this only masks the underlying issue and graduates the user to Stage Two immersion sickness, whereby they quickly adjust (if at all) to the virtual world but then suffer prolonged aftereffects once they take the headset off.
Overall, the basic premise has been alleviated but not eliminated. The bigger issue wasn’t the graphics fidelity but the perception of the environment and how it was displayed.
If anything, research shows that higher-resolution graphics actually make immersion sickness worse, not better. By increasing the resolution, you shorten the time it takes for immersion sickness to manifest in the end user.
The reason for this is actually very obvious, but as of yet hasn’t really been nailed down in a study.
What it boils down to is that makers of HMDs don't take into account how the human eye and mind actually perceive the environment. You would think this is straightforward – you just show an image and there it is – but there are a number of nuances that are completely left out, or handled in a way that is the total opposite of how we see. One of the reasons 3D TV didn't take off like the industry had hoped, despite a complete technological media blitz attempting to force the issue, was simply simulation sickness – much the same reason the Nintendo 3DS had issues as well.
There are a few things to consider about HMDs and how we perceive the world. One of the most important is that by naturally focusing on things in the real world (convergence), we actively omit the world outside of that view and reduce the information we take in.
Our eyes only see detail from a tiny central spot. We perceive detail by our brains stitching together images as our eyes look around and focus.
Our peripheral vision is highly sensitive to motion, but not at all to detail. To see this, keep your eyes locked on one object. The side of the image will have no detail, just light, dark and shape. For more fun, look ahead and sense the limits of your angle of view, usually about 180 degrees. Now wiggle your fingers and bring your hand forward from behind you. You can see your wiggling fingers beyond this 180 degree field of view. You can see motion a little bit behind you!
This is actually a monumental thing that the industry has completely ignored, and it’s one of the leading factors of immersion sickness.
For instance, did you know that your peripheral view sees only in black and white, and at low resolution, compared to the center of your view, which is high-detail and in color?
Yet when you are wearing a typical HMD, the LCD is a few inches from your eyes, literally blasting a high-resolution 3D view at you in a way your eyes cannot omit or properly focus out. As we just covered, the center of your view is detail- and color-oriented, while your peripheral view is black and white and lower resolution, but highly sensitive to motion.
So what happens when the entire field of view force-feeds you high-resolution, full-color, high-motion imagery at a distance that makes it impossible to focus and omit most of it through a convergence point? You completely overload your visual sense and your ability to process it all.
That's because the convergence point is so close that everything is forced into view at full resolution. When you move your head around in real life, do you know what happens? The entire world drops to low resolution, you mentally omit much of the scene, and then you refocus. You can't do that with an HMD, simply because it doesn't take this into account... you are being force-fed high-resolution graphics at high speed and in full color, with no way to naturally calibrate your focus and alter your perception to compensate.
This is why you get eye strain, fatigue and headaches. It is a completely unnatural force-feeding of visual information at close distances that you cannot compensate for... or rather, the real problem is that over time you actually do compensate (if you're lucky/unlucky).
See, when you continue to force-feed that high resolution so close, in a way your eyes cannot adjust for, with no dynamic adjustment or compensation to simulate how your eyes actually see, you wind up retraining your brain how to see. Except the convergence point is shallow, the information saturation is overkill, and when you take the headset off...
Suddenly your eyes can focus on the proper convergence point and start processing the world normally, omitting information selectively. But you've already trained your mind to see the "incorrect" way in a virtual environment, so it now starts rejecting reality the way it originally tried to reject the virtual reality.
Remember when we were kids and our parents told us not to sit so close to the television because it would ruin our eyes?
At what point did we think it would be alright to strap LCD televisions two inches from our eyes?
The thing is, unless an HMD can dynamically adjust focus and resolution based on eye tracking in order to simulate your natural gaze, while also rendering in a multi-resolution mode that accommodates how the eyes actually see, it's going to cause immersion sickness. The only other way to begin alleviating this properly is to eliminate the actual screen altogether so the eyes can focus naturally and comfortably.
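As a thought experiment, here is a minimal sketch of the kind of eye-tracked, multi-resolution rendering described above (often called foveated rendering). The acuity falloff model and its 2.3-degree constant are a commonly cited ballpark from vision research, not anything from this post, and the resolution tiers are arbitrary assumptions:

```python
# Rough approximation of how visual acuity falls off away from the fovea.
# The 2.3-degree half-resolution constant is a commonly cited ballpark,
# not a measured value from this post -- treat the whole model as illustrative.
HALF_RES_ECCENTRICITY = 2.3  # degrees at which acuity drops to ~50% of peak

def relative_acuity(eccentricity_deg):
    """Fraction of peak (foveal) acuity at a given angle from the gaze point."""
    return HALF_RES_ECCENTRICITY / (HALF_RES_ECCENTRICITY + eccentricity_deg)

def shading_rate(eccentricity_deg):
    """Pick a render-resolution tier from acuity: full, half, or quarter rate."""
    a = relative_acuity(eccentricity_deg)
    if a > 0.5:
        return 1.0   # full resolution near the tracked gaze point
    if a > 0.2:
        return 0.5   # half resolution in the mid-periphery
    return 0.25      # quarter resolution in the far periphery

for ecc in (0, 2, 5, 10, 20, 40):
    print(f"{ecc:2d} deg from gaze: acuity {relative_acuity(ecc):.2f}, "
          f"render at {shading_rate(ecc):.2f}x")
```

The point of the sketch is simply that most of the display area could legitimately be rendered at a fraction of full resolution – matching how the eye sees – instead of blasting uniform detail everywhere.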
This is what Avegant does with their Glyph headset using Virtual Retinal Display technology, though even this isn’t quite enough. It is, however, a really large step in the right direction.
The following is a basic breakdown of what contributes to the multi-stage process of immersion sickness. I went into a lot of detail on the visual process alone, but there is more to it than that... I just wanted to illustrate the point that the basic design and implementation of Oculus Rift hasn't changed much since the 1990s and will be subject to the same issues.
Higher resolution graphics aren’t a savior for it (and likely will make the situation worse).
A) Visual convergence: Causes headaches, blurred vision, nausea
B) Proprioception: your sense of self. It gets skewed badly in a virtual world. Think of the game Mirror's Edge... they had to add a dot on the screen to focus on because people were getting "motion sick". This was actually proprioception and the sensory cue mismatch at work.
C) Neural Configuration: Re-learning bad visual habits. Teaching the brain to accept Virtual Reality (contradictory sensory input) and reject Reality (proper sensory input).
D) Audio & Visual Hallucinations: Prolonged immersion leads to a higher incidence of audio and visual hallucinations among participants, with severity dependent on time immersed.
The obvious answer for D is simply not to use VR for too long and to take off the headset after maybe a half hour or an hour, right?
Well, yes... but let’s think about this logically for a moment:
Oculus Rift is targeted toward hardcore gamers and higher-end virtual reality users. This isn't something the average person is going to buy to play games or log into Second Life. That demographic niche is exactly the type of user/gamer who specifically does not play casually or log off quickly.
The casual game player wouldn’t get an Oculus Rift... which means the niche that would get an Oculus Rift are the same people prone to prolonged usage, and thus Stage Three Immersion Sickness.
Let’s be honest... when was the last time you just logged into Second Life for an hour or played a game for an hour and were done for the night?
You know as well as I do that the answer is "never".
No, we're looking at hardcore players who will get lost in the virtual world or the game for four or more hours at a clip – maybe far longer, because if the headset is comfortable to wear, you are disconnected from reality entirely. You might as well be in Las Vegas: no clocks on the wall, no indication of time, and what happens in Vegas stays there (just like Second Life).
Of course, there are other reasons you may lose track of time in Vegas...
Just like "checking your email", you find yourself staring at random cat pictures nine hours later wondering where all the time went. As for the rest of the issues with total immersion sickness... you'll just have to read the research papers linked in this post.
I’m not against immersive virtual environments or even HMDs in general... I’m actually interested in truly solving the issues. I really like immersive virtual reality... so much that when I was younger I ended up with Stage Three Immersion Sickness from it. I’d like to see a day when wearing an HMD won’t do that to me again so I can enjoy virtual worlds like they were meant to be.
The problem is that when Oculus Rift finally does launch, this is going to get a hell of a lot worse and more widespread. And just as virtual reality jumped the shark in the 90s, when the headsets and hype pushed it over the edge, I fear that Oculus Rift is likely to do the same for this generation.
The irony in all of this is that the solution to Virtual Reality and immersion sickness may actually be answered by Augmented Reality. Just some food for thought.