DIY Eye And Face Tracking For The Valve Index VR Headset

The Valve Index VR headset has been around for a few years now. It doesn’t come with eye or face tracking, but that didn’t stop inspired folks like [Physics-Dude] from adding DIY solutions in elegant and effective ways using a combination of hardware, open software, and 3D printable parts.

The whole assembly integrates tightly, thanks in part to the “frunk” designed into the Index for exactly this kind of thing.

This project leverages the EyeTrackVR project (and optionally, Project Babble for mouth tracking), both of which have great applications, particularly in social VR spaces.

These are open-source, self-contained and modular solutions intended for a variety of hardware platforms. Of course, every millimeter and gram tends to count when it’s something that gets worn on one’s head, so [Physics-Dude] tailored a solution specifically for the Valve Index. His project makes great use of the platform’s hacker-friendly hardware design.

[Physics-Dude] also makes excellent use of a certain widely-available “gumstick” style USB hub as an important part of his build. Combined with the front-mounted USB port on the Index, it results in an extremely compact and tightly integrated solution that looks great. While it can be risky to rely on a particular off-the-shelf item in a build, doing so absolutely has its place here.

The documentation is fantastic, including welcome guidance on cable routing and step-by-step instructions. If you’ve been interested in adding eye tracking to a project, be sure to give it a look. Already have eye tracking in a project of your own? Tell us all about it!

Here’s How That Disney 360° Treadmill Works

One thing going slightly viral lately is footage of Disney’s “HoloTile” infinite floor, an experimental sort of 360° treadmill developed by [Lanny Smoot]. But how exactly does it work? Details about that are less common, but [Marques Brownlee] got first-hand experience with HoloTile and has a video all about the details.

HoloTile is a walking surface that looks like it’s made up of blueish bumps or knobs of some kind. When one walks upon the surface, it constantly works to move its occupant back to the center.


Each of these bumps is in fact a disk that has the ability to spin one way or the other and pivot in different directions. Each disk therefore becomes a sort of tilted wheel whose edge is in contact with whatever is on its surface. By exerting fine control over each of these actuators, the control system is able to create a conveyor-belt-like effect in any arbitrary direction. This can be leveraged in several different ways, including acting as a sort of infinite virtual floor.
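To get a feel for the control problem, here is a minimal sketch of the recentering idea (ours, not Disney’s): treat each disk as a steerable tilted wheel and command a heading and surface speed that push the occupant back toward the center of the pad. The function names, the simple proportional-control scheme, and the numbers are all assumptions for illustration only.

```typescript
// Hypothetical sketch of the recentering logic described above: each disk is
// treated as a tilted wheel that can pivot to a heading and spin at a speed.
// The proportional-control scheme and constants are assumptions, not Disney's design.

interface DiskCommand {
  headingRad: number; // direction the disk's contact edge pushes toward
  speed: number;      // surface speed at the contact point, in m/s
}

// Proportional controller: push the occupant back toward the pad's center.
function recenterCommands(
  occupantX: number,
  occupantY: number,
  gain = 0.8,
  maxSpeed = 1.5
): DiskCommand {
  // Desired floor velocity is opposite the occupant's offset from center.
  const vx = -gain * occupantX;
  const vy = -gain * occupantY;
  const speed = Math.min(Math.hypot(vx, vy), maxSpeed);
  return { headingRad: Math.atan2(vy, vx), speed };
}

// Every disk under the occupant gets the same command, producing the
// conveyor-belt effect in an arbitrary direction.
const cmd = recenterCommands(0.4, -0.2);
console.log(`heading ${cmd.headingRad.toFixed(2)} rad, speed ${cmd.speed.toFixed(2)} m/s`);
```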

[Marques] found the system highly responsive and capable of faster movement than many would find comfortable. When walking on it, there was a feeling of his body moving in an unexpected direction, but he found himself getting used to it. He also found that it wasn’t exactly quiet, but we suppose one can’t have everything.

How this device works has a rugged sort of elegant brute force vibe to it that we find appealing. It is also quite different in principle from other motorized approaches to simulate the feeling of walking while keeping the user in one place.

The whole video is embedded just below the page break, but if you’d like to jump directly to [Marques] explaining and showing exactly how the device works, you can skip to the 2:22 mark.

Continue reading “Here’s How That Disney 360° Treadmill Works”

Make 3D Scenes With A Holodeck-Like Voice Interface

The voice interface for the holodeck in Star Trek had users create objects by saying things like “create a table” and “now make it a metal table” and so forth, all with immediate feedback. This kind of interface may have been pure fantasy at the time of airing, but with the advent of AI and LLMs (large language models) this kind of natural language interface is coming together almost by itself.

A fun demonstration of that is [Dominic Pajak]’s demo project called VoxelAstra. This is a WebXR demo that works both in the Meta Quest 3 VR headset (just go to the demo page in the headset’s web browser) and on the desktop.

The catch is that since the program uses OpenAI APIs on the back end, one must provide a working OpenAI API key. Otherwise, the demo won’t be able to do anything. Providing one’s API key to someone’s web page isn’t terribly good security practice, but there’s also the option of running the demo locally.

Either way, once the demo is up and running, the user simply tells the system what to create. Just keep it simple: it’s a fun and educational demo more than anything, and it will try to do its work with primitive shapes like spheres, cubes, and cylinders. “Build a snowman” is suggested as a good starting point.
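To make the idea concrete, here is a hedged sketch (not [Dominic]’s actual code) of how a spoken request could be turned into a handful of primitives by way of the OpenAI chat completions API: ask the model to answer in a small JSON schema, then let the WebXR scene spawn the corresponding meshes. The prompt wording, the shape schema, and the model name are our assumptions.

```typescript
// Hedged sketch (not VoxelAstra's actual implementation): ask an LLM to turn a
// spoken request into a small JSON list of primitives a WebXR scene could build.
// The prompt, shape schema, and model name are assumptions for illustration.

interface Primitive {
  kind: "sphere" | "box" | "cylinder";
  position: [number, number, number]; // meters, relative to the scene origin
  size: number;
  color: string;
}

async function primitivesFromSpeech(request: string, apiKey: string): Promise<Primitive[]> {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini",
      messages: [
        {
          role: "system",
          content:
            "Reply only with a JSON array of objects with fields kind " +
            "(sphere|box|cylinder), position [x,y,z] in meters, size, and color.",
        },
        { role: "user", content: request },
      ],
    }),
  });
  const data = await res.json();
  // The scene-building code would then loop over these and create the meshes.
  return JSON.parse(data.choices[0].message.content) as Primitive[];
}

// Example: primitivesFromSpeech("build a snowman", myKey).then(console.log);
```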

Intrigued by what you see and getting ideas of your own? WebXR can be a great way to give those ideas some life and looking at how someone else did something similar is a fine way to begin. Check out another of [Dominic]’s WebXR projects: a simulated BBC Micro, in VR.

The BBC Micro, Lovingly Simulated In VR

The BBC Micro was many people’s first exposure to home computing, and thanks to [Dominic Pajak], you can fire up this beloved hardware in WebXR. Is it an emulator? Yes, but it’s also much more than that.

The machine, the CRT, the keycaps, and even the sounds of the original keypresses are all brought to life as accurately as possible. The result is not just an emulator. It’s a lovingly-made BBC Micro simulator you can use with a VR headset. Or just use your browser and type on your real keyboard if you like.

Continue reading “The BBC Micro, Lovingly Simulated In VR”

Experiencing Visual Deficits And Their Impact On Daily Life, With VR

Researchers presented an interesting project at the 2024 IEEE Conference on Virtual Reality and 3D User Interfaces: it uses VR and eye tracking to simulate visual deficits such as macular degeneration, diabetic retinopathy, and other visual diseases and impairments.

Typical labels and pill bottles can be shockingly inaccessible to a variety of common visual deficits.

VR offers a unique method of allowing people to experience the impact of living with such conditions, a point driven home particularly well by having the user see for themselves the effect on simple real-world tasks such as choosing a pill bottle, or picking up a mug. Conditions like macular degeneration (which causes loss of central vision) are more accurately simulated by using eye tracking, a technology much more mature nowadays than it was even just a few years ago.
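As a rough illustration of how eye tracking makes this possible, here is a minimal sketch (ours, not the researchers’ implementation) that draws a “scotoma” mask over the tracked gaze point, approximating the central vision loss of macular degeneration. A real simulator would work in the headset’s stereo render pipeline; the 2D canvas approach and the radius value are assumptions.

```typescript
// Minimal sketch of the idea described above: use the tracked gaze point to
// occlude the center of vision, roughly approximating central vision loss.
// The canvas approach and radius are assumptions, not the researchers' method.

function drawScotoma(
  ctx: CanvasRenderingContext2D,
  gazeX: number,      // gaze position in canvas pixels, from the eye tracker
  gazeY: number,
  radiusPx = 120      // size of the simulated central blind spot
): void {
  const g = ctx.createRadialGradient(gazeX, gazeY, 0, gazeX, gazeY, radiusPx);
  g.addColorStop(0, "rgba(40, 40, 40, 1)");    // fully occluded at the gaze point
  g.addColorStop(0.8, "rgba(40, 40, 40, 0.9)");
  g.addColorStop(1, "rgba(40, 40, 40, 0)");    // fade out toward the periphery
  ctx.fillStyle = g;
  ctx.fillRect(gazeX - radiusPx, gazeY - radiusPx, 2 * radiusPx, 2 * radiusPx);
}

// Called every frame after the scene is drawn, with the latest gaze sample:
// drawScotoma(canvas.getContext("2d")!, gaze.x, gaze.y);
```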

The abstract for the presentation is available here, and if you have some time, be sure to check out the main index of VR research demos, because there are some neat ones there, including a method of manipulating a user’s perception of the shape of the ground under their feet by electrically stimulating the tendons of the ankle.

Eye tracking is in a few consumer VR products nowadays, but it’s also perfectly feasible to roll your own in a surprisingly slick way. It’s even been used on jumping spiders to gain insights into the fascinating and surprisingly deep perceptual reality these creatures inhabit.

Apple Vision Pro’s Secret To Smooth Visuals? Subtly Substandard Optics

The displays inside the Apple Vision Pro have 3660 × 3200 pixels per eye, but veteran engineer [Karl Guttag]’s analysis of its subtly blurred optics reminds us that a spec-sheet pixel count doesn’t always translate to perceived resolution, and that this is especially true for things like near-eye displays.

The Apple Vision Pro lacks the usual visual artifacts (like the screen door effect) that result from viewing magnified, pixelated screens through optics. But [Karl] shows that those artifacts are in fact hiding in plain sight: Apple seems to have simply made everything just a wee bit blurry thanks to subtly out-of-focus lenses.

The thing is, this approach of intentional de-focusing actually works very well for consuming visual content like movies or looking at pictures, where detail and pixel-to-pixel contrast are limited anyway.
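To see why that works, here is a toy sketch (our own, not from [Karl]’s analysis) that renders a simulated pixel grid on a canvas and then draws it with a slight blur on the order of the pixel pitch: the dark gaps responsible for the screen-door pattern disappear, while the underlying image content barely changes. The pitch, gap, and blur radius are made-up values for illustration.

```typescript
// Rough software stand-in for the optical effect: draw a pixel grid with dark
// gaps (the screen-door pattern), optionally through a slight blur comparable
// to the pixel pitch. Pitch, gap, and blur values here are made up.

function drawSimulatedPanel(ctx: CanvasRenderingContext2D, blurPx: number): void {
  const pitch = 4;   // simulated pixel pitch in canvas pixels
  const gap = 1;     // dark gap between lit areas that causes the screen door
  ctx.filter = blurPx > 0 ? `blur(${blurPx}px)` : "none";
  for (let y = 0; y < ctx.canvas.height; y += pitch) {
    for (let x = 0; x < ctx.canvas.width; x += pitch) {
      ctx.fillStyle = "rgb(180, 180, 180)";          // stand-in for image content
      ctx.fillRect(x, y, pitch - gap, pitch - gap);  // lit area, leaving a dark gap
    }
  }
}

// drawSimulatedPanel(ctx, 0);    // sharp: visible screen-door pattern
// drawSimulatedPanel(ctx, 1.5);  // slightly defocused: the pattern disappears
```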

Clever loophole, or specification shenanigans? You be the judge of that, but this really is evidence that, especially when it comes to things like VR headsets, everything is a trade-off: improving one thing typically worsens others. In fact, it’s one of the reasons why VR monitor replacements are actually a nontrivial challenge.


Hackaday Links: February 18, 2024

So it turns out that walking around with $4,000 worth of hardware on your head isn’t quite the peak technology experience that some people thought it would be. We’re talking about the recently released Apple Vision Pro headset, which early adopters are lining up in droves to return. Complaints run the gamut from totally foreseeable episodes of motion sickness to neck pain from supporting the heavy headset. Any eyeglass wearer can certainly attest to even lightweight frames and lenses becoming a burden by the end of the day. We can’t imagine what it would be like to wear a headset like that all day. Ergonomic woes aside, some people are feeling buyer’s remorse thanks to a lack of apps that do anything to justify the hefty price tag. The evidence for a wave of returns is mostly gleaned from social media posts, so it has to be taken with a grain of salt. We wouldn’t expect Apple to be too forthcoming with official return figures, though, so the ultimate proof of uptake will probably be how often you spot one in the wild. Apart from a few cities and only for the next few weeks, we suspect sightings will be few and far between.

Continue reading “Hackaday Links: February 18, 2024”