A common complaint we’ve seen on many of the recent cyberdeck builds is that they don’t offer any display technology more advanced than a tablet-sized IPS panel. The argument goes that to be a true deck in the Gibsonian sense, it’s got to have some kind of virtual reality interface or at least a head mounted display. Unfortunately such technology is expensive, and often not particularly hacker friendly.
But assuming you can settle for a somewhat low-tech alternative, the simple head mounted display that [Jordan Brandes] has been fiddling with is certainly a viable option. By mounting a five inch 800×480 TFT LCD to the front of a pair of goggles designed for first person view (FPV) flying, you can throw together a workable rig for around $30 USD. Add in some headphones, and you’ve got a fairly immersive experience for not a lot of money.
Naturally the display will show whatever HDMI signal you give it, but in his case, [Jordan] has mounted a Raspberry Pi to the back of it to make it a complete wearable computer. With a Bluetooth travel keyboard in the mix, he’s even able to get some legitimate work done with this setup. If he ends up combining this with the ultrasonic keyboard he was working on earlier in the year, he’ll be getting pretty close to jacking into cyberspace for real.
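Getting a bare 800×480 panel talking to the Pi over HDMI usually comes down to a few lines in /boot/config.txt. Something along these lines is a reasonable starting point, though the exact timings depend on the specific panel, so treat this as a sketch rather than gospel:

```
# /boot/config.txt -- force a custom 800x480 @ 60 Hz HDMI mode
hdmi_force_hotplug=1          # output HDMI even if no display is detected
hdmi_cvt=800 480 60 6 0 0 0   # custom CVT timing: width height framerate aspect
hdmi_group=2                  # DMT (computer monitor) timing group
hdmi_mode=87                  # select the custom hdmi_cvt mode defined above
```

Note these are the legacy firmware video options; panels that misbehave at these settings will generally have their preferred timings listed in the vendor's documentation.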
Want to see exactly what is inside the $500 (headset-only price) Valve Index VR headset that was released last summer? Take a look at this teardown by [Ilja Zegars]. Not only does [Ilja] pull the device apart, but he identifies each IC and takes care to point out some of the more unique hardware aspects, like the fancy diffuser on the displays and the unique multilayered lenses (which are much thinner than one might expect).
[Ilja] is no stranger to headset hardware design, and in addition to all the eye candy of high-res photographs, provides some insightful commentary to help make sense of them. The “tracking webs” pulled from the headset are an interesting bit, each is a long run of flexible PCB that connects four tracking sensors for each side of the head-mounted display back to the main PCB. These sensors are basically IR photodiodes, and detect the regular laser sweeps emitted by the base stations of Valve’s lighthouse tracking technology. [Ilja] also gives us a good look at the rod and spring mechanisms seen above that adjust distance between the two screens.
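The timing-to-angle math at the heart of lighthouse tracking is simple enough to sketch. Assuming a rotor sweeping at 60 Hz and a sync flash marking the start of each sweep (a simplification of Valve's actual protocol, which involves two axes and embedded data), the angle to a photodiode follows directly from the time between the sync flash and the laser hit:

```python
# Convert lighthouse sweep timing to an angle (simplified model).
# Assumes a single rotor at 60 Hz with a sync pulse at angle zero;
# the real Valve protocol is more involved than this sketch.

ROTOR_HZ = 60.0
PERIOD_S = 1.0 / ROTOR_HZ  # duration of one full rotation

def sweep_angle_deg(t_sync: float, t_hit: float) -> float:
    """Angle of the photodiode, in degrees, from sweep timing."""
    dt = (t_hit - t_sync) % PERIOD_S
    return 360.0 * dt / PERIOD_S

# A hit one quarter-period after sync means the laser swept 90 degrees.
print(sweep_angle_deg(0.0, PERIOD_S / 4))  # 90.0
```

With two sweep axes per base station and many photodiodes at known positions on the headset, the pose solver has enough angular constraints to recover full position and orientation.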
VR headsets are more and more common, but they aren’t perfect devices. That meant [Douglas Lanman] had a choice of problems to address when he joined Facebook Reality Labs several years ago. Right from the start, he perceived an issue no one seemed to be working on: the fact that the closer an object in VR is to one’s face, the less “real” it seems. There are several reasons for this, but the general way it presents is that the closer a virtual object is to the viewer, the more blurred and out of focus it appears to be. [Douglas] talks all about it and related issues in a great presentation from earlier this year (YouTube video) at the Electronic Imaging Symposium that sums up the state of the art for VR display technology while giving a peek at the kind of hard scientific work that goes into identifying and solving new problems.
[Douglas] chose to address seemingly-minor aspects of how the human eye and brain perceive objects and infer depth, and did so for two reasons: one was that no good solutions existed for it, and the other was that it was important because these cues play a large role in close-range VR interactions. Things within touching or throwing distance are a sweet spot for interactive VR content, and the state of the art wasn’t really delivering what human eyes and brain were expecting to see. This led to years of work on designing and testing varifocal and multi-focal displays which, among other things, were capable of presenting images in a variety of realistic focal planes instead of a single flat one. Not only that, but since the human eye expects things that are not in the correct focal plane to appear blurred (which is itself a depth cue), simulating that accurately was part of things, too.
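The blur cue involved here can be approximated with basic optics: expressed in diopters, defocus is the difference between where the eye is focused and where the object sits, and the resulting blur circle grows with pupil diameter. A rough sketch of that relationship (a textbook thin-lens approximation, not anything taken from the talk itself):

```python
# Approximate retinal defocus for an eye focused at one distance while
# viewing an object at another. Textbook thin-lens model; distances in
# meters, pupil diameter in millimeters.

def defocus_diopters(focus_dist_m: float, object_dist_m: float) -> float:
    """Defocus magnitude in diopters (1/m)."""
    return abs(1.0 / focus_dist_m - 1.0 / object_dist_m)

def blur_circle_mrad(focus_dist_m: float, object_dist_m: float,
                     pupil_mm: float = 4.0) -> float:
    """Angular blur-circle diameter in milliradians (small-angle approx.)."""
    return defocus_diopters(focus_dist_m, object_dist_m) * pupil_mm

# Eye focused at 2 m while a virtual object is rendered at 0.5 m:
# 1/0.5 - 1/2 = 1.5 diopters of defocus -- noticeably blurry.
print(defocus_diopters(2.0, 0.5))  # 1.5
```

A fixed-focus headset pins the whole scene at one optical distance, so this defocus term is wrong for everything not at that distance, which is exactly the mismatch varifocal and multifocal displays try to remove.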
The entire talk is packed full of interesting details and prototypes. If you have any interest in VR imaging and headset design and have a spare hour, watch it in the video embedded below.
VR headsets have been seeing new life for a few years now, and when it comes to head-mounted displays, the field of view (FOV) is one of the specs everyone’s keen to discover. Valve Software have published a highly technical yet accessibly-presented document that explains why FOV is a complex thing when it pertains to head-mounted displays. FOV is relatively simple when it comes to things such as cameras, but it gets much more complicated and hard to define or measure easily when it comes to using lenses to put images right up next to eyeballs.
The document goes into some useful detail about head-mounted displays in general, the design trade-offs, and naturally talks about the brand-new Valve Index VR headset in particular. The Index uses proprietary lenses combined with a slight outward cant to each eye’s display, and they explain precisely what benefits are gained from each design point. Eye relief (distance from eye to lens), lens shape and mounting (limiting how close the eye can physically get), and adjustability (because faces and eyes come in different configurations) all have a role to play. It’s a situation where every millimeter matters.
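The contrast between camera FOV and HMD FOV shows up clearly in the geometry. For a camera, FOV follows directly from sensor width and focal length; for an HMD, one hard ceiling is the angle the lens aperture subtends at the pupil, which shrinks as eye relief grows. A back-of-the-envelope sketch (illustrative geometry only, not Valve's actual figures):

```python
import math

def camera_fov_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Horizontal FOV of a simple camera from the thin-lens relation."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

def hmd_fov_ceiling_deg(lens_radius_mm: float, eye_relief_mm: float) -> float:
    """Upper bound on monocular FOV: the angle the lens subtends at the pupil."""
    return math.degrees(2 * math.atan(lens_radius_mm / eye_relief_mm))

# A full-frame sensor (36 mm wide) behind an 18 mm lens: 90 degrees.
print(camera_fov_deg(36, 18))

# For the headset, every extra millimeter of eye relief costs FOV,
# which is why lens shape, mounting, and adjustability all matter.
print(hmd_fov_ceiling_deg(25, 12))
print(hmd_fov_ceiling_deg(25, 18))
```

The camera number is a single clean figure; the HMD number moves with the wearer's face geometry and settings, which previews Valve's larger point about single-number FOV specs.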
If there’s one main point Valve is trying to make with this document, it’s summed up as “it’s really hard to use a single number to effectively describe the field of view of an HMD.” They plan to publish additional information on the topics of modding as well as optics, so keep an eye on their Valve Index Deep Dive publication list.
What does it smell like when the wheels heat up on that Formula 1 car you drive at night and on the weekends? You have no idea because the Virtual Reality experience that lets you do so doesn’t come with a nasal component. Yet.
Shown here is an olfactory device that works with the Oculus Rift and other head-mounted displays. The proof of concept is the work of [Kazuki Hashimoto], [Yosuke Maruno], and [Takamichi Nakamoto] and was shown off at last year’s IEEE VR conference. It lets the wearer smell the oranges when approaching a tree in a virtual environment. In other words, it makes your immersive experience smelly.
As it stands, this is a pretty cool little device which atomizes odor droplets while a tiny fan wafts them to the wearer’s nose. There is a paper which presumably has more detail, but it’s behind a paywall, so for now check out the brief demo video below. Traditionally an issue with scent systems is the substance left stuck in the lines, which this prototype overcomes by applying the odorant directly from the reservoir. Yet to be solved is how to support numerous different scents.
This build came to our attention via an UploadVR article that does a good job of covering some of the scent-based experiments over the years. They see some of the same hurdles we do: odors linger and there is a limited palette that can be produced. We assume the massive revenue of the gaming industry is going to drive research in this field, but we won’t be lining up to smell gunpowder and dead bodies (or rotting zombies) anytime soon.
The more noble effort is in VR applications like taking the elderly and immobile back for another tour of places they’ll never again be able to visit in their lives. Adding the sense of smell, which has the power to unlock so many memories, makes that use case so much more powerful. We think that’s something everyone can be hopeful about!
[Harris Shallcross] decided to build a pair of smart glasses and recently completed a first prototype of his project ‘Ochi’ – an STM32 based, BLE-connected, OLED eyeglass display. There are of course several homebrew smart glasses projects out there; many are more polished-looking and nearly all of them also display information from a smartphone over Bluetooth. This one is interesting partly because it highlights many of the design challenges that smart glasses and other near-eye displays face. It also demonstrates the iterative development process: begin by getting something working to learn what does and doesn’t cut it at a basic level, and don’t optimize prematurely; let the process bring problems to the surface.
For his project, [Harris Shallcross] used a small 0.95″ diagonal 96×64 color OLED as the display. The lens is from a knockoff Google Cardboard headset, and is held in a 3D printed piece that slides along a wire rail to adjust focus. The display uses a custom font and is driven by an STM32 microcontroller on a small custom PCB, with an HM11 BLE module to receive data wirelessly. Power is provided by a rechargeable lithium-ion battery with a boost converter. An Android app handles sending small packets of data over Bluetooth for display. The prototype software handles display of time and date, calendar, BBC news feed, or weather information.
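The wire format between the Android app and the HM11 isn’t spelled out in the writeup, but a packet scheme for this kind of low-bandwidth link is easy to sketch. The framing below (a one-byte type tag plus a length-prefixed UTF-8 payload) is purely hypothetical, not [Harris Shallcross]’s actual protocol:

```python
# Hypothetical packet framing for a phone -> HM11 -> STM32 link.
# One byte of message type, one byte of payload length, then the
# UTF-8 payload. Illustrative only; not the project's real protocol.

MSG_TIME, MSG_NEWS, MSG_WEATHER = 0x01, 0x02, 0x03

def encode(msg_type: int, text: str) -> bytes:
    payload = text.encode("utf-8")
    if len(payload) > 255:
        raise ValueError("payload too long for a one-byte length field")
    return bytes([msg_type, len(payload)]) + payload

def decode(packet: bytes) -> tuple[int, str]:
    msg_type, length = packet[0], packet[1]
    return msg_type, packet[2:2 + length].decode("utf-8")

pkt = encode(MSG_WEATHER, "12C, light rain")
print(decode(pkt))  # (3, '12C, light rain')
```

Keeping packets short and self-describing like this suits a BLE UART bridge such as the HM11, which forwards bytes in small chunks, and lets the STM32 parse each message with almost no buffering.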
Devices like these have a lot to deal with. Weight, and how that weight is distributed, is a concern; the size and comfort of the optics are important; and displaying data on a small OLED is only part of the battle. Choosing what information to display, and when, is vital to the device actually being useful; otherwise it’s just a tech demo.
This project set out to show whether it was possible to use the parts listed to make a glasses mounted smart display that was at least somewhat functional, and the software to support it. Clearly, [Harris Shallcross] succeeded at that, but what really showcases the development process is his list of improvements – what he decided needs to go into a second version, and why. One of those goals is to improve the optics; perhaps there’s something to learn from The $60 Bluetooth Head Mounted Display project, which used a similar OLED and a prism to locate the display off to the side instead of in front.
For [Tony]’s entry for The Hackaday Prize, he’s doing something we’ve all seen before – a head mounted display, connected to a Bluetooth module, displaying information from a smartphone. What we haven’t seen before is a cheap version of this tech, and a version of Google Glass that folds – you know, like every other pair of glasses on the planet – which edges this project over from ‘interesting’ to ‘nearly practical’.
For the display, [Tony] is using a 0.96″ OLED connected to an Arduino Nano. This screen is directed into the wearer’s eye with a series of optics that, along with every other part of the frame, was 3D printed on a Solidoodle 2. The frame itself not only folds along the temples, but also along the bridge, making this HMD surprisingly compact when folded up.
Everything displayed on this head mounted display is controlled by either an Android phone or a Bluetooth connection to a desktop. Using a relatively simple display means [Tony] is limited to text and extremely simple graphics, but this is more than enough for some very interesting applications; reading SMS messages and checking email is easy, and doesn’t overpower the ‘duino.
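A display that small forces the text budget to be worked out up front. A 0.96″ OLED at the common 128×64 resolution fits about 21 columns by 8 rows with a typical 6×8 pixel font, so incoming messages need wrapping and paging. A quick sketch, assuming that resolution and font size (the post doesn’t confirm either):

```python
# Wrap an incoming SMS to fit a small OLED. Assumes a 128x64 panel and
# a 6x8 pixel font -- common defaults, but not confirmed by the post.
import textwrap

COLS = 128 // 6   # 21 characters per line
ROWS = 64 // 8    # 8 lines per screen

def paginate(message: str) -> list[list[str]]:
    """Split a message into screenfuls of wrapped lines."""
    lines = textwrap.wrap(message, width=COLS)
    return [lines[i:i + ROWS] for i in range(0, len(lines), ROWS)]

for screen in paginate("Meet at the hackerspace at 7, bring the spare "
                       "OLED and the Nano."):
    for line in screen:
        print(line)
```

On the Arduino side the same budget applies: the sketch only needs to shuttle pre-wrapped lines to the display buffer, which keeps the Nano’s modest RAM comfortably within limits.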
The project featured in this post is an entry in The Hackaday Prize. Build something awesome and win a trip to space or hundreds of other prizes.