Want to see what exactly is inside the $500 (headset-only price) Valve Index VR headset that was released last summer? Take a look at this teardown by [Ilja Zegars]. Not only does [Ilja] pull the device apart, but he identifies each IC and takes care to point out some of the more unique hardware aspects like the fancy diffuser on the displays, and the unique multilayered lenses (which are much thinner than one might expect).
[Ilja] is no stranger to headset hardware design, and in addition to all the eye candy of high-res photographs, provides some insightful commentary to help make sense of them. The “tracking webs” pulled from the headset are an interesting bit: each is a long run of flexible PCB that connects four tracking sensors for each side of the head-mounted display back to the main PCB. These sensors are basically IR photodiodes, and detect the regular laser sweeps emitted by the base stations of Valve’s lighthouse tracking technology. [Ilja] also gives us a good look at the rod and spring mechanisms seen above that adjust the distance between the two screens.
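The lighthouse scheme those photodiodes serve is elegantly simple in principle. A minimal sketch of the idea, not Valve’s actual firmware: the base station flashes a sync pulse, then a laser plane sweeps the room at a fixed rotor speed (the 60 Hz figure below is an assumption for illustration), so the delay between the sync pulse and a photodiode’s hit encodes that sensor’s angle from the base station.

```python
import math

# Assumed rotor speed for illustration; one full revolution per period.
ROTATION_HZ = 60.0
PERIOD_S = 1.0 / ROTATION_HZ

def sweep_angle(t_sync_s: float, t_hit_s: float) -> float:
    """Angle (radians) the laser plane swept between the sync pulse
    and the moment it crossed the photodiode."""
    return 2 * math.pi * (t_hit_s - t_sync_s) / PERIOD_S
```

With two such angles per station (horizontal and vertical sweeps) and several sensors at known positions on the headset shell, solving for the full 6-DoF pose becomes a geometry problem.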
VR headsets are more and more common, but they aren’t perfect devices. That meant [Douglas Lanman] had a choice of problems to address when he joined Facebook Reality Labs several years ago. Right from the start, he perceived an issue no one seemed to be working on: the fact that the closer an object in VR is to one’s face, the less “real” it seems. There are several reasons for this, but the general way it presents is that the closer a virtual object is to the viewer, the more blurred and out of focus it appears to be. [Douglas] talks all about it and related issues in a great presentation from earlier this year (YouTube video) at the Electronic Imaging Symposium that sums up the state of the art for VR display technology while giving a peek at the kind of hard scientific work that goes into identifying and solving new problems.
[Douglas] chose to address seemingly-minor aspects of how the human eye and brain perceive objects and infer depth, and did so for two reasons: one was that no good solutions existed for it, and the other was that it was important because these cues play a large role in close-range VR interactions. Things within touching or throwing distance are a sweet spot for interactive VR content, and the state of the art wasn’t really delivering what human eyes and brains were expecting to see. This led to years of work on designing and testing varifocal and multi-focal displays that, among other things, were capable of presenting images in a variety of realistic focal planes instead of a single flat one. Not only that, but since the human eye expects things that are not in the correct focal plane to appear blurred (which is itself a depth cue), simulating that accurately was part of things, too.
The entire talk is packed full of interesting details and prototypes. If you have any interest in VR imaging and headset design and have a spare hour, watch it in the video embedded below.
With lockdown regulations sweeping the globe, many have found themselves spending altogether too much time inside with not a lot to do. [Peter Hall] is one such individual, with a penchant for flying quadcopters. With the great outdoors all but denied, he instead endeavoured to find a way to make flying inside a more exciting experience. We’d say he’s succeeded.
The setup involves using a SteamVR virtual reality tracker to monitor the position of a quadcopter inside a room. This data is then passed back to the quadcopter at a high rate, giving the autopilot fast, accurate data upon which to execute manoeuvres. PyOpenVR is used to do the motion tracking, and in combination with MAVProxy, sends the information over MAVLink back to the copter’s ArduPilot.
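A minimal sketch (not [Peter]’s actual code) of the coordinate handling that sits between the two systems: SteamVR reports poses as 3×4 row-major matrices in a right-handed, y-up frame (+x right, +y up, −z forward), while ArduPilot works in NED (x forward/north, y right/east, z down). The remap below is one common choice; the correct one depends on how the tracker is mounted on the craft.

```python
def steamvr_to_ned(x: float, y: float, z: float) -> tuple:
    """Remap a SteamVR-frame position to NED: forward is -z,
    right is +x, and down is -y in SteamVR's convention."""
    return (-z, x, -y)

def pose_to_ned_position(m) -> tuple:
    """Pull the translation (the last column) out of a SteamVR 3x4
    pose matrix and convert it to NED, ready to stuff into a MAVLink
    position message such as ATT_POS_MOCAP."""
    return steamvr_to_ned(m[0][3], m[1][3], m[2][3])
```

In the real pipeline this runs on every tracker update, so the conversion has to be this cheap; the heavy lifting of fusing it with the IMU happens on the flight controller.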
VR has been developing rapidly over the past decade, but headsets and associated equipment remain expensive. Without a killer app, the technology has yet to become ubiquitous in homes around the world. Wanting to experiment without a huge investment, [jamesvdberg] whipped up a low-cost headset for under $100 USD.
The build relies on Google-Cardboard-style optics, which are typically designed to work with a smartphone as the display. Instead, an 800×480 display intended for use with the Raspberry Pi is installed, hooked up over HDMI. An MPU6050 IMU monitors the headset’s movements, connected to an Arduino Micro that passes this information to the attached PC. The rest of the build simply consists of cable management and power supply to all the hardware. It’s important to get this right, so that one doesn’t get tangled up by the umbilical when playing.
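The classic low-cost way to turn an MPU6050’s raw gyro rates and accelerometer angles into a stable head orientation is a complementary filter: trust the integrated gyro in the short term and the gravity vector in the long term. The sketch below is illustrative only, assuming degree units and a typical blend factor; [jamesvdberg]’s firmware may fuse the sensor differently.

```python
def complementary_pitch(pitch_deg: float, gyro_rate_dps: float,
                        accel_pitch_deg: float, dt_s: float,
                        alpha: float = 0.98) -> float:
    """One filter step: integrate the gyro rate into the previous
    pitch estimate, then nudge it toward the accelerometer-derived
    angle to cancel gyro drift over time."""
    return alpha * (pitch_deg + gyro_rate_dps * dt_s) \
        + (1 - alpha) * accel_pitch_deg
```

Run at a few hundred hertz on the Arduino, this keeps latency low enough that head movement feels connected to the view, which matters far more in VR than absolute accuracy.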
While it won’t outperform a commercial unit, the device nevertheless offers stereoscopic VR at a low cost. For a very cheap and accessible VR experience that’s compatible with the PC, it’s hard to beat. Others have done similar work too. Video after the break.
The Valve Index VR headset incorporates a number of innovations, one of which is the distinctive off-ear speakers instead of headphones or earbuds. [Emily Ridgway] of Valve shared the design and evolution of this unusual system in a deep dive into the elements of the Index headset. [Emily] explains exactly what they were trying to achieve, how they determined what was and wasn’t important to deliver good sound in a VR environment, and what they were able to accomplish.
Early research showed that audio was extremely important to providing a person with a good sense of immersion in a VR environment, but delivering a VR-optimized audio experience involved quite a few interesting problems that were not addressed by the usual approach of headphones or earbuds. Headphones and earbuds are optimized to deliver music and entertainment sounds, and it turns out that these aren’t quite up to delivering on everything Valve determined was important in VR.
The human brain is extremely good at using subtle cues to determine whether sounds are “real” or not, and all kinds of details come into play. For example, one’s ear shape, head shape, and facial geometry all add a specific tonal signature to incoming sounds that the brain expects to encounter. This not only helps to localize sounds, but the brain uses their presence (or absence) in deciding how “real” sounds are. Using earbuds to deliver sound directly into ear canals bypasses much of this, and the brain more readily treats such sounds as “not real” or as coming from within one’s head, even if the sound itself — such as footsteps behind one’s back — is physically simulated with a high degree of accuracy. This and other issues were the focus of multiple prototypes and plenty of testing. Interestingly, good audio for VR is not all about being as natural as possible. For example, low frequencies do not occur very often in nature, but good bass is critical to delivering a sense of scale and impact, and plucking emotional strings.
The first prototype demonstrated the value of testing a concept as early as possible, and it wasn’t anything fancy. Two small speakers mounted on a skateboard helmet validated the idea of off-ear audio delivery. It wasn’t perfect: the speakers were too heavy, too big, too sensitive to variation in placement, and had poor bass response. But the results were positive enough to warrant more work.
In the end, what ended up in the Index headset is a system that leans heavily on Balanced Mode Radiator (BMR) speaker design. Cambridge Audio has a short and sweet description of how BMR works; it can be thought of as a hybrid between traditional pistonic speaker drivers and flat-panel speakers, and the final design was able to deliver on all the truly important aspects of immersive VR audio in a room-scale environment.
As anyone familiar with engineering and design knows, everything is a tradeoff, and that fact is probably most apparent in cutting-edge technologies. For example, when Valve did a deep dive into field of view (FOV) in head-mounted displays, we saw just how complex balancing different features and tradeoffs could be.
Persistence is what a hacker needs to make it to their goal. That’s exactly what it took for [Erik] to make an untethered VR backpack system.
Starting way back in the spring of 2019, [Erik] began working on an untethered VR system. Sure, the Oculus Quest was coming out, but it wouldn’t be compatible with the game library of PC-based systems. [Erik] decided he wanted the best of both worlds, so he set out to build a backpack that carries a computer powerful enough to drive the Rift S.
The initial system was to use a cut-up backpack, an HP mini PC with an external Nvidia 1060 GPU, and a basic DC-DC converter. The result? Just about nothing worked. The HP’s boot process didn’t play well with an external GPU.
[Erik] went through several iterations of this project. He switched over to a standard PC motherboard and tried a few different DC-DC converters. He settled on a device from HDPLEX rated at 200 watts continuous. The converter plugs directly into a standard 24-pin ATX motherboard power connector and isn’t much larger than the connector itself.
The old backpack with its added padding and wood frame gave way to a Zotac VR go backpack. Only the straps and frame of the Zotac are used, with [Erik’s] custom parts mounted using plywood and 3D printed parts. The outer frame is aluminum, with acrylic panels.
Power comes from 7000 mAh LiFe batteries, with each pack providing an hour of runtime. The backpack can hold two packs though, so wiring them up in parallel should double that runtime.
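The runtime claim is easy to sanity-check with back-of-envelope math. The sketch below assumes a 4S LiFe pack (12.8 V nominal) and an average draw well under the converter’s 200 W continuous rating; both figures are our assumptions, not stated by [Erik] here.

```python
def runtime_hours(capacity_ah: float, packs: int,
                  nominal_v: float, avg_draw_w: float) -> float:
    """Total pack energy (Wh) divided by average system draw (W)."""
    return capacity_ah * packs * nominal_v / avg_draw_w

# 7 Ah x 12.8 V is about 89.6 Wh per pack: roughly an hour at a ~90 W
# average draw, consistent with the quoted runtime, and two packs in
# parallel double it.
```

The same arithmetic explains the undervolted RAM: every watt shaved off the average draw buys runtime in direct proportion.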
We have to say this is an extremely well-documented build. [Erik] explains how he chose each component and the advantages (and pitfalls) of the choices he made. An example would be the RAM he picked. He chose DDR4 with a higher spec than he needed, just so he could undervolt the parts for longer run-times.
There’s a scene in Blade Runner where Deckard puts a photograph in a magical machine that lets him zoom and enhance without limit, and even see around obstacles. In today’s climate, this is starting to seem more plausible, what with all the cameras everywhere. [Jasper van Loenen] explores this concept in Esper, a technological art installation he created in Seoul, Korea during an artist residency.
Esper is a two-part piece that turns virtual reality on its head by showing actual reality in VR. It covers two adjoining rooms, one to record reality, and the other for real-time virtual viewing on headsets. The first is outfitted with 60 ESP32 cameras on custom mounts, all pointing in different directions from various perches and ceiling drops. [Jasper] used an Android app based on openFrameworks to map the cameras’ locations in 3D space. The room next door is so empty, it’s even devoid of FOMO. You don’t want to miss this one, so check it out after the break.