Unity VR Pt. 2


I decided to write this as a separate post, instead of adding it to the previous one, mainly because the amount of content involved deserves its own entry. This post is about optimization for VR in Unity.

Optimization for VR in Unity

As achieving the target frame rate for your chosen platform is essential to ensuring users have a great, nausea-free VR experience, optimization is a critical part of VR development. With VR it’s best to optimize early and often, rather than leaving it to a later stage in development, and to test regularly throughout.

Being aware of the common issues when creating a VR experience can help guide development and let you design around those issues, saving time and a lot of stressful hard work later in the cycle.

Oculus provides a lot of material on the subject of optimization, and I’m hoping these guidelines apply equally to the HTC Vive, as I could not find much on optimizing Unity projects for the Vive kit. So I will discuss the main optimization points provided by Oculus, which should be helpful for my project. Treehouse has also written a blog post on VR performance guidelines, which I will be referencing.

Oculus Best Practices


Rendering

  • Use distortion shaders. Approximating your own distortion solution, even when it “looks about right,” is often discomforting for users.
  • Get the projection matrix exactly right. Any deviation from the optical flow that accompanies real world head movement creates oculomotor issues and bodily discomfort.
  • Maintain VR immersion from start to finish.
  • The images presented to each eye should differ only in terms of viewpoint; post-processing effects (e.g., light distortion, bloom) must be applied to both eyes consistently as well as rendered in z-depth correctly to create a properly fused image.
  • Consider supersampling and/or anti-aliasing to remedy low apparent resolution, which will appear worst at the center of each eye’s screen.

Minimizing Latency

  • Your code should run at a frame rate equal to or greater than the display refresh rate, v-synced and unbuffered. Lag and dropped frames produce judder which is discomforting in VR.
  • Ideally, target 20ms or less motion-to-photon latency. Organize your code to minimize the time from sensor fusion to rendering.
  • Game loop latency is not a single constant and varies over time. The SDK uses some tricks (e.g., predictive tracking, TimeWarp) to shield the user from the effects of latency, but do everything you can to minimize variability in latency across an experience.
  • Use the SDK’s predictive tracking, making sure you provide an accurate time parameter to the function call. The predictive tracking value varies based on application latency and must be tuned per application.
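
To make the frame-budget point concrete, here is a minimal Unity C# sketch of a frame monitor. It is my own illustration, not Oculus code: it assumes a Unity version where UnityEngine.XR.XRDevice.refreshRate exists (older releases expose the same value under UnityEngine.VR), keeps v-sync on, and logs any frame that exceeds the headset’s budget.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Logs frames that blow the budget so latency spikes are caught early and often.
public class FrameBudgetMonitor : MonoBehaviour
{
    float budget; // seconds per frame at the headset's refresh rate

    void Start()
    {
        QualitySettings.vSyncCount = 1;      // stay v-synced, as the guidelines recommend
        float hz = XRDevice.refreshRate;     // e.g. 90 on the Vive or Rift
        budget = 1f / (hz > 0f ? hz : 90f);  // fall back to 90 Hz if the device reports nothing
    }

    void Update()
    {
        if (Time.unscaledDeltaTime > budget * 1.05f) // 5% tolerance before warning
            Debug.LogWarning($"Dropped frame: {Time.unscaledDeltaTime * 1000f:F1} ms (budget {budget * 1000f:F1} ms)");
    }
}
```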


Optimization

  • Decrease eye-render buffer resolution to save video memory and increase frame rate.
  • Although dropping display resolution can seem like a good method for improving performance, the resulting benefit comes primarily from its effect on eye-render buffer resolution. Dropping the eye-render buffer resolution while maintaining display resolution can improve performance with less of an effect on visual quality than doing both.
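
In Unity this maps onto XRSettings.eyeTextureResolutionScale (available in Unity 2017.2 and later; earlier versions expose a similar VRSettings.renderScale). A minimal sketch, with the component name and default value being my own choices:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Scales the per-eye render target relative to the headset's default, trading
// fill rate and video memory for sharpness without touching the display itself.
public class EyeBufferScaler : MonoBehaviour
{
    [Range(0.5f, 1.5f)]
    public float resolutionScale = 0.8f; // below 1.0 saves performance

    void Start()
    {
        XRSettings.eyeTextureResolutionScale = resolutionScale;
    }
}
```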

Head-tracking and Viewpoint

  • Avoid visuals that upset the user’s sense of stability in their environment. Rotating or moving the horizon line or other large components of the user’s environment in conflict with the user’s real-world self-motion (or lack thereof) can be discomforting.
  • The display should respond to the user’s movements at all times, without exception. Even in menus, when the game is paused, or during cut scenes, users should be able to look around.
  • Use the SDK’s position tracking and head model to ensure the virtual cameras rotate and move in a manner consistent with head and body movements; discrepancies are discomforting.

Positional Tracking

  • The rendered image must correspond directly with the user’s physical movements; do not manipulate the gain of the virtual camera’s movements. A single global scale on the entire head model is fine (e.g. to convert feet to meters, or to shrink or grow the player), but do not scale head motion independent of interpupillary distance (IPD).
  • With positional tracking, users can now move their viewpoint to look places you might have not expected them to, such as under objects, over ledges, and around corners. Consider your approach to culling and backface rendering, and so on.
  • Under certain circumstances, users might be able to use positional tracking to clip through the virtual environment (e.g., put their head through a wall or inside objects). Our observation is that users tend to avoid putting their heads through objects once they realize it is possible, unless they see an opportunity to exploit the game design by doing so. Regardless, developers should plan for how to handle the cameras clipping through geometry (one possible approach is sketched after this list).
  • Provide the user with warnings as they approach (but well before they reach) the edges of the position camera’s tracking volume as well as feedback for how they can re-position themselves to avoid losing tracking.
  • Augmenting or disabling position tracking is discomforting. Avoid doing so whenever possible, and darken the screen or at least retain orientation tracking using the SDK head model when position tracking is lost.
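
As one hypothetical way to handle head clipping, the sketch below fades in a full-screen black overlay whenever the tracked camera overlaps level geometry. The CanvasGroup overlay, layer mask, and head radius are assumptions of this setup, not something the guidelines prescribe:

```csharp
using UnityEngine;

// Fades toward black when the user's head pokes into level geometry,
// rather than letting them see inside walls and objects.
public class HeadClipFade : MonoBehaviour
{
    public Transform head;            // the tracked HMD camera
    public LayerMask geometryMask;    // layers containing solid world geometry
    public CanvasGroup blackOverlay;  // full-screen black UI image (assumed to exist in the scene)
    public float headRadius = 0.12f;  // rough head radius in meters

    void LateUpdate()
    {
        // True when a sphere around the head overlaps any collider on the mask.
        bool clipping = Physics.CheckSphere(head.position, headRadius, geometryMask);

        float target = clipping ? 1f : 0f;
        blackOverlay.alpha = Mathf.MoveTowards(blackOverlay.alpha, target, Time.deltaTime * 4f);
    }
}
```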


Accelerations

  • Acceleration creates a mismatch among your visual, vestibular, and proprioceptive senses. Minimize the duration and frequency of such conflicts. Make accelerations as short (preferably instantaneous) and infrequent as you can.
  • Remember that “acceleration” does not just mean speeding up while going forward; it refers to any change in the motion of the user, whether in direction or speed. Slowing down or stopping, turning while moving or standing still, and stepping or getting pushed sideways are all forms of acceleration.
  • Have accelerations initiated and controlled by the user whenever possible. Shaking, jerking, or bobbing the camera will be uncomfortable for the player.

Movement Speed

  • Viewing the environment from a stationary position is most comfortable in VR; however, when movement through the environment is required, users are most comfortable moving through virtual environments at a constant velocity. Real-world speeds will be comfortable for longer; for reference, humans walk at an average rate of 1.4 m/s (a locomotion sketch along these lines follows this list).
  • Teleporting between two points instead of walking between them is worth experimenting with in some cases, but can also be disorienting. If using teleportation, provide adequate visual cues so users can maintain their bearings, and preserve their original orientation if possible.
  • Movement in one direction while looking in another direction can be disorienting. Minimize the necessity for the user to look away from the direction of travel, particularly when moving faster than a walking pace.
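
Here is that locomotion sketch, assuming a simple rig transform and Unity’s default input axes (real controller bindings will differ). The point is that the rig moves at a constant walking speed or not at all, with no acceleration ramps:

```csharp
using UnityEngine;

// Moves the rig at a constant walking-pace velocity: it is either moving at
// full speed or standing still, so accelerations stay near-instantaneous.
public class ComfortLocomotion : MonoBehaviour
{
    public Transform rig;       // root of the VR camera rig
    public float speed = 1.4f;  // average human walking speed in m/s

    void Update()
    {
        // Placeholder input axes; substitute your controller bindings.
        Vector3 input = new Vector3(Input.GetAxisRaw("Horizontal"), 0f,
                                    Input.GetAxisRaw("Vertical"));
        if (input.sqrMagnitude > 1f) input.Normalize();

        rig.position += rig.TransformDirection(input) * speed * Time.deltaTime;
    }
}
```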


Cameras

  • Zooming in or out with the camera can induce or exacerbate simulator sickness, particularly if doing so causes head and camera movements to fall out of 1-to-1 correspondence with each other. We advise against using “zoom” effects until further research and development finds a comfortable and user-friendly implementation.
  • For third-person content, be aware that the guidelines for accelerations and movements still apply to the camera regardless of what the avatar is doing. Furthermore, users must always have the freedom to look all around the environment, which can add new requirements to the design of your content.
  • Avoid using Euler angles whenever possible; quaternions are preferable (see the sketch after this list). Try looking straight up and straight down to test your camera. It should always be stable and consistent with your head orientation.
  • Do not use “head bobbing” camera effects. They create a series of small but uncomfortable accelerations.
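
And here is the snap-turn sketch referenced above: a hypothetical component that rotates the rig with Quaternion.AngleAxis instead of editing Euler angles. The key bindings are placeholders:

```csharp
using UnityEngine;

// Applies snap turns as quaternion rotations rather than editing Euler angles,
// which avoids gimbal problems when the user looks straight up or down.
public class SnapTurn : MonoBehaviour
{
    public Transform rig;            // rotate the rig root, never the tracked head
    public float snapDegrees = 30f;

    void Update()
    {
        // Placeholder bindings; substitute your controller input.
        if (Input.GetKeyDown(KeyCode.E))
            rig.rotation = Quaternion.AngleAxis(snapDegrees, Vector3.up) * rig.rotation;
        if (Input.GetKeyDown(KeyCode.Q))
            rig.rotation = Quaternion.AngleAxis(-snapDegrees, Vector3.up) * rig.rotation;
    }
}
```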

Managing and Testing Simulator Sickness

  • Test your content with a variety of unbiased users to ensure it is comfortable for a broader audience. As a developer, you are the worst test subject, because you have grown a tolerance to the experience.
  • People’s responses and tolerance to sickness vary, and visually induced motion sickness occurs more readily in virtual reality headsets than with computer or TV screens. Your audience will not “muscle through” an overly intense experience, nor should they be expected to do so.
  • Consider implementing mechanisms that allow users to adjust the intensity of the visual experience. This will be content-specific, but adjustments might include movement speed, the size of accelerations, or the breadth of the displayed FOV. Any such settings should default to the lowest-intensity experience.
  • An independent visual background that matches the player’s real-world inertial reference frame can reduce visual conflict with the vestibular system and increase comfort.
  • High spatial frequency imagery (e.g., stripes, fine textures) can enhance the perception of motion in the virtual environment, leading to discomfort. Use—or offer the option of—flatter textures in the environment (such as solid-colored rather than patterned surfaces) to provide a more comfortable experience to sensitive users.

Degree of Stereoscopic Depth (“3D-ness”)

  • For individualized realism and a correctly scaled world, use the middle-to-eye separation vectors supplied by the SDK from the user’s profile.
  • Be aware that depth perception from stereopsis is sensitive up close, but quickly diminishes with distance. Two mountains miles apart in the distance will provide the same sense of depth as two pens inches apart on your desk.
  • Although increasing the distance between the virtual cameras can enhance the sense of depth from stereopsis, beware of unintended side effects. First, this will force users to converge their eyes more than usual, which could lead to eye strain if you do not move objects farther away from the cameras accordingly. Second, it can give rise to perceptual anomalies and discomfort if you fail to scale head motion equally with eye separation.

User Interface

  • UIs should be a 3D part of the virtual world and sit approximately 2-3 meters away from the viewer—even if that means simply drawing them onto a flat polygon, cylinder, or sphere that floats in front of the user.
  • Don’t require the user to swivel their eyes in their sockets to see the UI. Ideally, your UI should fit inside the middle 1/3rd of the user’s viewing area. Otherwise, they should be able to examine the UI with head movements.
  • Use caution for UI elements that move or scale with head movements (e.g., a long menu that scrolls or moves as you move your head to read it). Ensure they respond accurately to the user’s movements and are easily readable without creating distracting motion or discomfort.
  • Strive to integrate your interface elements as intuitive and immersive parts of the 3D world. For example, ammo count might be visible on the user’s weapon rather than in a floating HUD.
  • Draw any crosshair, reticle, or cursor at the same depth as the object it is targeting; otherwise, it can appear as a doubled image when it is not at the plane of depth on which the eyes are converged.
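
A minimal sketch of such a depth-correct reticle, assuming the reticle is a small world-space quad or sprite. The 2.5 m default distance sits inside the 0.75-3.5 m comfort range mentioned further down:

```csharp
using UnityEngine;

// Keeps a world-space reticle at the depth of whatever the user is looking at,
// so it fuses correctly instead of doubling against nearer or farther geometry.
public class DepthReticle : MonoBehaviour
{
    public Transform head;               // the HMD camera
    public Transform reticle;            // a small world-space quad or sprite
    public float defaultDistance = 2.5f; // within the comfortable viewing range

    void LateUpdate()
    {
        RaycastHit hit;
        float distance = Physics.Raycast(head.position, head.forward, out hit)
            ? hit.distance : defaultDistance;

        reticle.position = head.position + head.forward * distance;
        // Scale with distance so the reticle covers a constant visual angle.
        reticle.localScale = Vector3.one * distance * 0.02f;
    }
}
```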

Controlling the Avatar

  • User input devices can’t be seen while wearing the Rift. Allow the use of familiar controllers as the default input method. If a keyboard is absolutely required, keep in mind that users will have to rely on tactile feedback (or trying keys) to find controls.
  • Consider using head movement itself as a direct control or as a way of introducing context sensitivity into your control scheme.


Sound

  • When designing audio, keep in mind that the output source follows the user’s head movements when they wear headphones, but not when they use speakers. Allow users to choose their output device in game settings, and make sure in-game sounds appear to emanate from the correct locations by accounting for head position relative to the output device.
  • Presenting NPC (non-player character) speech over a central audio channel or left and right channels equally is a common practice, but can break immersion in VR. Spatializing audio, even roughly, can enhance the user’s experience.
  • Keep positional tracking in mind with audio design. For example, sounds should get louder as the user leans towards their source, even if the avatar is otherwise stationary.
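
In Unity, much of this comes down to making sources fully 3D and keeping the AudioListener on the tracked head camera. A minimal sketch (the component name is mine):

```csharp
using UnityEngine;

// Makes a sound source fully 3D so it tracks the listener's head position:
// leaning toward it raises its volume, and it pans as the head turns.
[RequireComponent(typeof(AudioSource))]
public class SpatializedVoice : MonoBehaviour
{
    void Start()
    {
        AudioSource source = GetComponent<AudioSource>();
        source.spatialBlend = 1f;                          // 0 = 2D, 1 = fully 3D
        source.rolloffMode = AudioRolloffMode.Logarithmic; // natural distance falloff
        source.minDistance = 0.5f;                         // full volume within half a meter
    }
}
```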


Content

  • For recommendations related to distance, one meter in the real world corresponds roughly to one unit of distance in Unity.
  • The optics make it most comfortable to view objects that fall within a range of 0.75 to 3.5 meters from the user’s eyes. Although your full environment may occupy any range of depths, objects at which users will look for extended periods of time (such as menus and avatars) should fall in that range.
  • Converging the eyes on objects closer than the comfortable distance range above can cause the lenses of the eyes to misfocus, making clearly rendered objects appear blurry and leading to eyestrain.
  • Bright images, particularly in the periphery, can create noticeable display flicker for sensitive users; if possible, use darker colors to prevent discomfort.
  • Consider the size and texture of your artwork as you would with any system where visual resolution and texture aliasing is an issue (e.g. avoid very thin objects).
  • Unexpected vertical accelerations, like those that accompany traveling over uneven or undulating terrain, can create discomfort. Consider flattening these surfaces or steadying the user’s viewpoint when traversing such terrain.
  • Be aware that your user has an unprecedented level of immersion, and frightening or shocking content can have a profound effect on users (particularly sensitive ones) in a way past media could not. Make sure players receive warning of such content in advance so they can decide whether or not they wish to experience it.
  • Don’t rely entirely on the stereoscopic 3D effect to provide depth to your content. Lighting, texture, parallax (the way objects appear to move in relation to each other when the user moves), and other visual features are equally (if not more) important to conveying depth and space to the user. These depth cues should be consistent with the direction and magnitude of the stereoscopic effect.
  • Design environments and interactions to minimize the need for strafing, back-stepping, or spinning, which can be uncomfortable in VR.
  • People will typically move their heads/bodies if they have to shift their gaze and hold it on a point farther than 15-20° of visual angle away from where they are currently looking. Avoid forcing the user to make such large shifts to prevent muscle fatigue and discomfort.
  • Don’t forget that the user is likely to look in any direction at any time; make sure they will not see anything that breaks their sense of immersion (such as technical cheats in rendering the environment).


Oculus Conclusion

I never expected to learn so much about VR optimization and, truth be told, all the information there was too good not to include. There is so much you can do to make your VR game run smoothly while providing a great experience for users. This is definitely going to be helpful when the final implementation of my demo is completed. I can already see what I can cut from the demo; the information about graphical content alone helped me settle one decision.

I learnt a lot from this and will use it to optimize my demo when the time comes.

Oculus Best Practices

Treehouse – VR Guidelines
Nick Pettit

To give an overview of this blog post: it provides some guidelines that will help developers working with VR in Unity, drawn from the author’s past development experience. It is not as big as I thought it would be; he makes three main points about using Unity for VR. He mentions that most issues with VR projects surface near completion, so his points are aimed at the beginning, the starting approach to a VR project.

The HTC Vive has a 90 Hz refresh rate, so keeping that in mind, the Unity project should match or exceed that frame rate; anything lower will cause latency and judder.

Use Forward Rendering and MSAA – Don’t change the default rendering path in Unity. It’s set to forward already, and that’s what you want. Set anti-aliasing to about 4x in the Quality Settings.

  • A rendering path is a technique for drawing images to a digital display. The most common are the forward path and the deferred path.
  • The deferred path renders in multiple passes that decouple geometry and lighting information.
  • Deferred rendering supports a large number of lights, because the cost of lighting comes from the number of pixels each light covers on screen rather than from the light count.
  • A huge drawback to deferred rendering is that it can only accomplish anti-aliasing through a screen-space shader. This approach comes at a considerable cost to performance.
  • The good news is that forward rendering enables MSAA (multi-sampling anti-aliasing), a great “AA” technique, at a very low cost to performance.
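
As a quick illustration, the MSAA recommendation is a single line against Unity’s QualitySettings API; it can equally be set in the Quality Settings inspector, and it only takes effect on the forward path:

```csharp
using UnityEngine;

// Applies the recommended 4x MSAA at startup. MSAA is only available on the
// forward rendering path, which is Unity's default.
public class ApplyMsaa : MonoBehaviour
{
    void Start()
    {
        QualitySettings.antiAliasing = 4; // valid values are 0, 2, 4 and 8
    }
}
```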

Set the Stereo Rendering Method to Single Pass – In the Player Settings, set the Stereo Rendering Method to Single Pass

In VR, the scene is rendered twice, once for each eye from a slightly different perspective; earlier I mentioned how this can affect performance. Unity has a solution to this that Nick describes on his blog: instead of running the whole render pipeline twice, you can switch to a method called Single Pass, which renders both eye views in a single pass to one shared render texture. I’m pretty sure I have already been doing this without being aware of it.
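
For reference, Single Pass is a Player Setting, so it lives in the editor rather than in runtime code. A hypothetical editor-only sketch, assuming a Unity version that exposes PlayerSettings.stereoRenderingPath:

```csharp
#if UNITY_EDITOR
using UnityEditor;

// Editor-only sketch: Single Pass stereo is a Player Setting, so it can be
// applied from an editor script as well as through the inspector.
public static class StereoSetup
{
    [MenuItem("Tools/Use Single Pass Stereo")]
    static void Apply()
    {
        PlayerSettings.stereoRenderingPath = StereoRenderingPath.SinglePass;
    }
}
#endif
```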

Art Direct for Performance – Design games with performance in mind, rather than trying to optimize at the end.

He doesn’t really provide anything new in this area that I don’t already know. Keeping the frame rate at 90 or higher seems essential to developing a VR project, or at least keeping to the refresh rate of the VR kit. Basically, he says to think through every idea for your project and then decide which are best suited to it. This stage requires a lot of hard decisions and a lot of thinking, but thinking is cheap and optimization is difficult. It’s better to come up with an idea at the start that will perform well than to try to force a complicated idea into a performance box during crunch time.

VR Performance Guidelines for New Unity Projects

