Smartphones Evolve to Meet the Demands of Virtual Reality
By Kerry Cunningham
When you put on a virtual reality (VR) headset, you are transported to another world, a place where reality is just a whim of the imagination. What’s far from imaginary, though, is the impact VR is having on all of us, whether we personally use VR headsets or not.
Most people equate VR with gaming applications, but increasingly business, industry and government users are finding new applications for the evolving technology. The military, for example, has adopted VR for combat and training simulations. Healthcare providers utilize VR images from CAT scans to create 3D models of patients’ anatomy for diagnosis and treatment. Automobile manufacturers use VR in designing cars and to inspect automotive interiors and exteriors before they are manufactured.
These applications, and many more to come, will help drive the adoption of virtual reality and the devices used to access virtual environments. In fact, VR applications are now driving the evolution of smartphones like the one in your pocket or purse. Smartphones have become the single most important device in VR systems, in contrast to VR’s early days when bulky, dedicated headsets tethered to computers or consoles predominated.
So, if you like how much better and larger the display is on your current phone, or how much more memory it has for photos and apps, or how much longer its battery lasts, then get ready for even more improvements! As VR technologies push forward, these and other areas will continue to evolve.
Using a smartphone as the engine of a VR display has a compelling simplicity, and ultimately will allow VR to reach mass adoption because smartphones are so ubiquitous. There are now a number of wearable VR displays on the market that consist of a head-mounted box or case with focusing lenses, into which the user inserts a smartphone (see figure 1).
Figure 1. VR display that uses a smartphone as the engine.
The phone serves as both computer and display, showing a stereo pair of images. How successfully your brain is tricked into believing you are actually in another virtual world depends on how well the phone enables the VR application to create the impression of your presence there.
The display requirements needed to achieve a desirable state of presence include the following: low image “persistence” and high screen-refresh rate; a high frame rate to avoid latency effects; high screen resolution; and a large field of view (FOV). All of these are designed to take advantage of the tricks the mind uses to perceive depth of field and focus, and for true 3D vision.
Image Persistence and Screen Refresh Rate
Image persistence and the screen refresh rate are closely linked, and in both cases, faster is better when it comes to replicating the real world. The need for speed is also why displays built from organic light-emitting diodes (OLEDs) have advantages over liquid crystal displays (LCDs).
Persistence is the term for how long each frame’s image remains illuminated on the screen before it is replaced. The lower the persistence, the sharper moving images will appear. If persistence is high, or “full,” the image will smear and look blurry as your head moves.
The screen refresh rate, on the other hand, is the number of times the image on a display screen can be refreshed per second. The faster the screen can refresh, the lower persistence will be.
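The relationship between refresh rate and persistence can be sketched numerically. The figures below are illustrative only, not the specs of any particular panel; the 20% duty cycle is an assumed value for a low-persistence mode.

```python
# A low-persistence display lights each frame for only a fraction of
# the frame period (its "duty cycle"), instead of holding the image
# until the next refresh (full persistence).

def frame_period_ms(refresh_hz):
    """Time between screen refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

def persistence_ms(refresh_hz, duty_cycle):
    """How long each frame stays illuminated, in milliseconds."""
    return frame_period_ms(refresh_hz) * duty_cycle

print(round(persistence_ms(60, 1.0), 1))  # 16.7 ms: 60 Hz, full persistence
print(round(persistence_ms(90, 0.2), 1))  # 2.2 ms: 90 Hz, low persistence
```

The faster the refresh, the shorter the frame period, and therefore the lower the achievable persistence.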
OLED displays have response rates more than 1,000 times faster than LCDs and are the standard for smartphone VR. Every millisecond is critical when trying to achieve a realistic state of presence in the virtual world.
OLED displays also eliminate motion blur and jitter, which have been linked to simulator sickness. In addition, VR-ready smartphones use OLED screens to meet the requirements for image quality, power efficiency (e.g., cooler and longer operation), and smaller form factors.
Another major issue for delivering truly realistic virtual reality is latency. Latency is defined as the time from when you move your head to when you actually see the correctly rendered view. Low latency is key to creating a believable virtual space.
More specifically, latency refers to the delay between an input and a response. In the real world, your head’s actual movement and the motion you perceive stay in sync because the eyes and the inner ear agree. By contrast, in virtual reality there is a delay between moving your head and the corresponding movement of the image in the VR headset. If the delay is too long, the VR immersion will feel unnatural.
Moreover, the disparity with your brain’s understanding of normal movement can lead to nausea or dizziness. To avoid this, and to achieve a smooth and natural VR experience, the total motion-to-photon latency must be less than 20 ms, which in turn demands a high frame rate.
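To see why the 20 ms budget demands a high frame rate, it helps to work out the rendering time each frame rate allows. A minimal sketch, with common headset refresh rates as illustrative inputs:

```python
def frame_time_ms(fps):
    """Time available to render one frame, in milliseconds."""
    return 1000.0 / fps

# The ~20 ms motion-to-photon budget must also cover tracking and
# display delays, so rendering alone needs to be well under it:
for fps in (60, 90, 120):
    print(fps, "Hz ->", round(frame_time_ms(fps), 1), "ms per frame")
```

At 60 Hz a single frame already consumes 16.7 ms of the budget, leaving little headroom for sensor and display delays, which is why VR systems push toward 90 Hz and beyond.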
Field of View
FOV is the extent of the observable environment at a given moment. It is one of the more important aspects of VR because the wider the FOV, the more likely the user will feel present in the experience. FOV comprises both monocular and binocular vision that work in tandem to determine depth of focus and 3D vision.
Full human eye FOV is 170° horizontal × 130° vertical. But in VR, the limiting factor in achieving full FOV is the lenses in the headset. To get a better FOV, you must either move closer to the lenses or increase their size (see figure 2).
Figure 2. Field of view depends on both monocular and binocular vision. It is achieved in smartphone-based VR headsets by ensuring that the phone lies an optimal distance from the headset’s lenses, and that the lenses are large enough. (Source: vr-lens-lab.com)
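The lens geometry above can be approximated with simple trigonometry. This is a rough sketch that treats the lens as a plain circular window and ignores magnification and distortion; the diameters and distances are hypothetical, not taken from any real headset:

```python
import math

def fov_degrees(lens_diameter_mm, eye_to_lens_mm):
    """Approximate FOV through a circular lens, treated as a simple
    window (magnification and distortion are ignored)."""
    half_angle = math.atan((lens_diameter_mm / 2.0) / eye_to_lens_mm)
    return math.degrees(2.0 * half_angle)

# Either a larger lens or a shorter eye-to-lens distance widens the view:
print(round(fov_degrees(40, 20), 1))  # 90.0 degrees
print(round(fov_degrees(40, 30), 1))  # 67.4 degrees (eye farther away)
print(round(fov_degrees(50, 20), 1))  # 102.7 degrees (bigger lens)
```

The numbers confirm the trade-off in the text: FOV grows when the eye moves closer to the lens or when the lens gets larger.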
Screen resolution, as measured in pixels per inch (ppi), is another important requirement for smartphone-based VR systems. Because smartphone displays are placed fairly close to the eye in VR headsets (see figure 3) and are magnified by the headset’s lenses, individual pixels can sometimes be seen. Higher-density OLED screens eliminate this problem.
Figure 3. In smartphone-based VR systems where the phone is close to the eye and the view of the screen is magnified by optical lenses, screen resolution must be ≥800 ppi to eliminate pixelation.
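The reason pixels become visible can be seen by converting panel resolution into angular resolution. The panel width, per-eye split, and FOV below are hypothetical illustration values; the roughly 60 pixels per degree figure for the human eye is a commonly cited benchmark:

```python
def pixels_per_degree(pixels_across_fov, fov_deg):
    """Angular pixel density seen by one eye."""
    return pixels_across_fov / fov_deg

# Hypothetical panel: 2560 px wide, split into 1280 px per eye,
# magnified to fill a 100-degree FOV. The eye resolves roughly
# 60 pixels per degree, so individual pixels remain visible:
print(pixels_per_degree(1280, 100))  # 12.8
```

Because the lenses stretch a fixed number of pixels across a wide angle, even a sharp-looking phone screen can fall well short of the eye’s resolving power, which is why VR pushes panel densities upward.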
Smartphone-based VR systems must also accommodate the vergence of our eyes. Vergence refers to the simultaneous turning of the pupils toward or away from one another as we focus on something. The closer an object is, the more our eyes will rotate inward, or converge, to keep it in the center of the FOV. When something is farther away, our eyes will rotate outward, or diverge, to keep it centered. At an infinite distance, the eyes are gazing in parallel.
This vergence gives the brain the geometric data it needs to triangulate and calculate the distance from us to the object: two angles (one from each eye) and our position. Most VR displays show a separate image to each eye in order to take full advantage of these depth cues. If these images are properly synced with each other and with the motion of the head, a tolerably convincing illusion of depth can be generated (see figure 4).
Figure 4. Human eyes use both binocular and monocular cues to focus, create depth perception and perceive distance. Near-eye imaging displays in VR systems use software to replicate these cues.
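The triangulation described above can be written out directly: given the interpupillary distance and the vergence angle, the distance to the fixated object follows from basic trigonometry. A minimal sketch, assuming a typical 64 mm interpupillary distance and illustrative vergence angles:

```python
import math

def distance_from_vergence_m(ipd_mm, vergence_deg):
    """Distance (in metres) to a fixated point, triangulated from the
    interpupillary distance and the total vergence angle between the
    two eyes' lines of sight."""
    half_angle = math.radians(vergence_deg / 2.0)
    return (ipd_mm / 2.0) / math.tan(half_angle) / 1000.0

print(round(distance_from_vergence_m(64, 7.3), 2))  # ~0.5 m (near object)
print(round(distance_from_vergence_m(64, 0.9), 2))  # ~4.07 m (farther away)
```

As the object recedes, the vergence angle shrinks toward zero and the eyes approach parallel gaze, exactly as the text describes for objects at infinite distance.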
Applied Materials is a key supplier of the materials engineering solutions and manufacturing equipment required to build high-performance OLED displays. Applied’s technology is used to fabricate the key PECVD and PVD layers needed to build high-performance LTPS and MOx transistors that offer high electron mobility, highly reliable backplane performance, and OLED performance stability.
Applied Materials is also the leader in thin-film encapsulation to protect the highly sensitive organic materials in OLED emitters from exposure to moisture and air (see figure 5).
Figure 5. OLED materials are susceptible to degradation when exposed to environmental factors such as water and air. Thin-film encapsulation of OLED emitters is key to building high-performance, durable and stable OLED displays.
Materials engineering solutions from Applied Materials will become even more important in the future with the growing requirements for higher screen resolution and smaller pixels. For example, the impact of uniformity and particles on yield is significantly magnified as thin-film transistors (TFTs) get smaller and devices get larger (see figure 6).
Figure 6. As TFTs in OLED displays get smaller and devices get larger, particles that were not problematic in manufacturing processes before may become “killer defects” for smaller TFTs. Equipment manufacturers must therefore reduce both the number (density) and size of particles when scaling to higher resolution and smaller pixel sizes.
Displays are not the only smartphone component that VR requirements are driving forward. Given that a linear increase in screen resolution produces a quadratic increase in pixel count, phones will need to incorporate very fast processors built with higher-density processes—and more memory—to drive so much data at high resolution and high frame rates.
Even if you’re not a fan of VR, you will benefit from it. Smartphones will continue to evolve based on VR requirements for better picture quality, OLED displays for stunning color and small form factors, higher DRAM content to support improved resolution, and faster video download capabilities.
If you are a fan of VR, the new features just may trick your brain into believing you have been transported to another time and space.
For additional information, contact firstname.lastname@example.org.