Bats Can No Longer Haunt Apple VR Headsets Via Web Exploit

Bug reporting doesn’t usually have a lot of visuals. Not so with the visionOS bug [Ryan Pickren] found, which fills a user’s space with screeching bats after they visit a malicious website. Even better, closing the browser doesn’t get rid of them! Better still? It doesn’t need to be bats; it could be spiders. Fun!

The bug has been fixed, but here’s how it worked: the Safari browser build for visionOS allowed a malicious website to fill the user’s 3D space with animated objects without interaction or permission. The code to trigger it was remarkably succinct, and was actually a new twist on an old feature: Apple AR Quick Look, an HTML-based feature for rendering 3D augmented reality content in Safari.

How about spiders, instead?

Leveraging this old feature is what let an untrusted website launch an arbitrary number of animated 3D objects — complete with sound — into a user’s virtual space without any interaction from the user whatsoever. The icing on the cake is that Quick Look runs as a separate process, so closing Safari didn’t get rid of the pests.
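
For the curious, Apple’s documented Quick Look markup is just an anchor with rel="ar" wrapping an image, with its href pointing at a USDZ model; a script can build and click one with no help from the user. Here’s a minimal sketch of the idea in TypeScript (the file name is hypothetical, and this isn’t [Ryan Pickren]’s exact proof of concept):

```typescript
// Apple's documented AR Quick Look trigger: an anchor with rel="ar"
// wrapping an <img>, with its href pointing at a USDZ model. Pre-patch,
// a script could build and "click" one without any user gesture.
// "bat.usdz" is a hypothetical asset name, not the actual PoC file.
function spawnModel(url: string): void {
  const anchor = document.createElement("a");
  anchor.setAttribute("rel", "ar");                  // mark as AR Quick Look
  anchor.appendChild(document.createElement("img")); // Quick Look expects an <img> child
  anchor.setAttribute("href", url);                  // the 3D model to render
  anchor.click();                                    // launched with zero interaction
}

// Repeat to fill the room with "pests".
for (let i = 0; i < 10; i++) {
  spawnModel("bat.usdz");
}
```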

Providing immersive 3D via a web browser is a valuable way to deliver interactive content on both desktops and VR headsets; a good example is the fantastic virtual BBC Micro, which uses WebXR. But on the Apple Vision Pro, the user is supposed to stay in control, and there are privacy boundaries that corral such content. Launching things into a user’s space without any interaction is certainly not intended behavior.
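
As a point of contrast, the sanctioned WebXR route is gated behind a user gesture and a permission prompt. A minimal sketch (type annotations assume the @types/webxr definitions are installed):

```typescript
// The sanctioned route into immersive content: WebXR only grants a session
// from inside an explicit user gesture, with the browser mediating consent.
const enterButton = document.querySelector<HTMLButtonElement>("#enter-vr")!;

enterButton.addEventListener("click", async () => {
  if (!navigator.xr || !(await navigator.xr.isSessionSupported("immersive-vr"))) {
    console.log("No immersive-vr support available");
    return;
  }
  // requestSession() rejects outside of a user activation; this is exactly
  // the guardrail the Quick Look trick sidestepped.
  const session = await navigator.xr.requestSession("immersive-vr");
  session.addEventListener("end", () => console.log("Session ended"));
});
```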

The final interesting bit about this bug (or loophole) is that, in a way, it defies easy classification and highlights a new sort of issue. While it seems obvious from a user experience and interface perspective that a random website spawning screeching crawlies into one’s personal space is not ideal, is this a denial-of-service issue? A privilege escalation that technically isn’t? It’s certainly unexpected behavior, but that doesn’t really capture the potential psychological impact such bugs can have. Perhaps the invasion of personal space and user boundaries will become a quantifiable aspect of bugs on these new platforms. What fun.

DIY Eye And Face Tracking For The Valve Index VR Headset

The Valve Index VR headset has been around for a few years now. It doesn’t come with eye or face tracking, but that didn’t stop inspired folks like [Physics-Dude] from adding a DIY solution in an elegant and effective way, using a combination of hardware, open-source software, and 3D-printable parts.

The whole assembly integrates tightly, thanks in part to the “frunk” designed into the Index for exactly this kind of thing.

This project leverages the EyeTrackVR project (and optionally Project Babble for mouth tracking), both of which have great applications, particularly in social VR spaces.

These are open-source, self-contained and modular solutions intended for a variety of hardware platforms. Of course, every millimeter and gram tends to count when it’s something that gets worn on one’s head, so [Physics-Dude] tailored a solution specifically for the Valve Index. His project makes great use of the platform’s hacker-friendly hardware design.
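
On the software side, EyeTrackVR hands its tracking output to other applications over OSC in the VRChat style, which makes it pleasant to consume yourself. Below is a minimal sketch of a Node listener; the port and single-float message format are assumptions based on that convention, so check your own app’s output settings:

```typescript
// Minimal Node listener for EyeTrackVR's OSC output. Parses only the
// simple case of one float argument per message ("/address" + ",f" + value).
import * as dgram from "node:dgram";

function decodeOsc(buf: Buffer): { address: string; value: number } | null {
  const addressEnd = buf.indexOf(0);                     // null-terminated address
  const address = buf.toString("ascii", 0, addressEnd);
  const tagOffset = Math.ceil((addressEnd + 1) / 4) * 4; // pad to 4-byte boundary
  if (buf.toString("ascii", tagOffset, tagOffset + 2) !== ",f") return null;
  return { address, value: buf.readFloatBE(tagOffset + 4) }; // ",f\0\0" is 4 bytes
}

const sock = dgram.createSocket("udp4");
sock.on("message", (msg) => {
  const decoded = decodeOsc(msg);
  if (decoded) console.log(decoded.address, decoded.value.toFixed(3));
});
sock.bind(9000); // VRChat's default OSC port; adjust to match your setup
```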

[Physics-Dude] also makes excellent use of a certain widely-available “gumstick” style USB hub as an important part of his build. Combined with the front-mounted USB port on the Index, it results in an extremely compact and tightly integrated solution that looks great. While it can be risky to rely on a particular off-the-shelf item in a build, doing so absolutely has its place here.

The documentation is fantastic, including welcome guidance on cable routing and step-by-step instructions. If you’ve been interested in adding eye tracking to a project, be sure to give it a look. Already have eye tracking in a project of your own? Tell us all about it!

A Master-Class On Reverse-Engineering Six AR Glasses

Two pictures of the same black dog, wearing two separate pairs of the AR glasses reviewed in these two articles

Augmented reality (AR) tech is getting more and more powerful, the glasses themselves are getting sleeker and prettier, and at some point, hackers have to conquer this frontier and extract as much from the hardware as possible. [Void Computing] is writing an open source SDK for making use of AR glasses, and along the way, they’ve brought us two wonderful blog posts filled with technical information laid out in a fun-to-read way. The first article is titled “AR glasses USB protocols: the Good, the Bad and the Ugly”, and the second one follows as “the Worse, the Better and the Prettier”.

Have you ever wanted to learn how AR glasses and similar devices work, what their internal structure looks like, and which ones are designed well (and which maybe not so much)? These two posts have concise explanations, plenty of diagrams, and six case studies of different pairs of AR glasses on the market, each pair demonstrated by our hacker’s canine assistant.

[Void Computing] goes in-depth on this tech — you will witness MCU firmware reverse-engineering, HID packet captures, a quick refresher on the USB-C DisplayPort alt mode, hexdumps aplenty, and a reminder of often-forgotten tools of the trade like Cunningham’s Law.
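
If you want a taste of the HID side of this kind of spelunking, the node-hid package gets you surprisingly far. The vendor and product IDs below are placeholders rather than any real pair of glasses:

```typescript
// Dump raw HID input reports from a pair of AR glasses using node-hid.
import HID from "node-hid";

// Enumerate first; these devices often expose several HID interfaces.
for (const info of HID.devices()) {
  console.log(info.vendorId.toString(16), info.productId.toString(16), info.product);
}

const VENDOR_ID = 0x1234;  // placeholder: substitute your device's VID
const PRODUCT_ID = 0x5678; // placeholder: substitute your device's PID
const glasses = new HID.HID(VENDOR_ID, PRODUCT_ID);

// Print each input report as spaced hex, the raw material of a good write-up.
glasses.on("data", (report: Buffer) => {
  console.log(report.toString("hex").match(/../g)?.join(" "));
});
```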

If reverse-engineering lights your fire, these high-level retrospectives will teach you viable ways to reverse-engineer devices in your own life, and they certainly set a high bar as far as write-ups go. Having read through these posts, one can’t help but think that some sort of AR glasses protocol standard is called for here, but fortunately, it appears that [Void Computing]’s SDK is the next best thing, and their mission to seize the good aspects of a tentative cyberpunk future is looking to be a success. We started talking about AR glasses over a decade ago, and it’s reassuring to see hackers catching up with this technology’s advancements.

We thank [adistuder] for sharing this with us on the Hackaday Discord server!

Here’s How That Disney 360° Treadmill Works

One thing going slightly viral lately is footage of Disney’s “HoloTile” infinite floor, an experimental sort of 360° treadmill developed by [Lanny Smoot]. But how exactly does it work? Details have been scarce, but [Marques Brownlee] got first-hand experience with HoloTile and has a video covering all the details.

HoloTile is a walking surface that looks like it’s made up of blueish bumps or knobs of some kind. When one walks upon the surface, it constantly works to move its occupant back to the center.

Whenever one moves, the surface works to move the user back to the center.

Each of these bumps is in fact a disk that has the ability to spin one way or the other and pivot in different directions. Each disk therefore becomes a sort of tilted wheel whose edge is in contact with whatever is on its surface. By exerting fine control over each of these actuators, the control system is able to create a conveyor-belt-like effect in any arbitrary direction. This can be leveraged in several different ways, including acting as a sort of infinite virtual floor.
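
To make the principle concrete, here’s an illustrative sketch (our own construction, not Disney’s actual control code) of how steering every disk toward a common direction yields a conveyor belt that recenters the rider:

```typescript
// Each disk acts as a tilted wheel: point its contact edge in a chosen
// direction and spin it, and it drives whatever rests on it that way.
interface DiskCommand {
  pivotAngle: number; // direction the disk drives its load, radians
  spinSpeed: number;  // drive speed, arbitrary units
}

// Proportional recentering: desired floor velocity points from the rider
// back toward the origin, scaled by how far off-center they are.
function recenter(userX: number, userY: number, gain = 0.8): DiskCommand {
  const vx = -gain * userX;
  const vy = -gain * userY;
  return { pivotAngle: Math.atan2(vy, vx), spinSpeed: Math.hypot(vx, vy) };
}

// Command every disk under the rider identically and the surface behaves
// like one big conveyor belt aimed at the center.
console.log(recenter(0.5, -0.25)); // rider is half a meter off-center
```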

[Marques] found the system highly responsive and capable of faster movement than many would find comfortable. When walking on it, there is a feeling of one’s body moving in an unexpected direction, but that was something he found himself getting used to. He also found that it wasn’t exactly quiet, but we suppose one can’t have everything.

The way this device works has a ruggedly elegant, brute-force vibe to it that we find appealing. It is also quite different in principle from other motorized approaches to simulating the feeling of walking while keeping the user in one place.

The whole video is embedded just below the page break, but if you’d like to jump directly to [Marques] explaining and showing exactly how the device works, you can skip to the 2:22 mark.


Make 3D Scenes With A Holodeck-Like Voice Interface

The voice interface for the holodeck in Star Trek had users create objects by saying things like “create a table” and “now make it a metal table” and so forth, all with immediate feedback. That kind of interface may have been pure fantasy at the time of airing, but with the advent of AI and LLMs (large language models), natural language interfaces like this are coming together almost by themselves.

A fun demonstration of that is [Dominic Pajak]’s demo project called VoxelAstra. This is a WebXR demo that works both in the Meta Quest 3 VR headset (just go to the demo page in the headset’s web browser) and on desktop.

The catch is that since the program uses OpenAI APIs on the back end, one must provide a working OpenAI API key. Otherwise, the demo won’t be able to do anything. Providing one’s API key to someone’s web page isn’t terribly good security practice, but there’s also the option of running the demo locally.

Either way, once the demo is up and running, the user simply tells the system what to create. Just keep it simple. It’s a fun and educational demo more than anything, and it will try to do its work with primitive shapes like spheres, cubes, and cylinders. “Build a snowman” is suggested as a good starting point.
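
The general pattern behind demos like this is to have the LLM emit structured output and map it onto scene primitives. Here’s a sketch of that idea using three.js; the JSON contract and field names are our own invention for illustration, not necessarily how VoxelAstra does it:

```typescript
// Map structured LLM output onto three.js primitives.
import * as THREE from "three";

interface Primitive {
  shape: "sphere" | "cube" | "cylinder";
  position: [number, number, number];
  size: number;
  color: string;
}

function buildScene(scene: THREE.Scene, primitives: Primitive[]): void {
  for (const p of primitives) {
    const geometry =
      p.shape === "sphere" ? new THREE.SphereGeometry(p.size / 2)
      : p.shape === "cube" ? new THREE.BoxGeometry(p.size, p.size, p.size)
      : new THREE.CylinderGeometry(p.size / 2, p.size / 2, p.size);
    const mesh = new THREE.Mesh(geometry, new THREE.MeshStandardMaterial({ color: p.color }));
    mesh.position.set(...p.position);
    scene.add(mesh);
  }
}

// "Build a snowman" might come back from the model as three stacked spheres:
const snowman: Primitive[] = [
  { shape: "sphere", position: [0, 0.5, -2], size: 1.0, color: "white" },
  { shape: "sphere", position: [0, 1.3, -2], size: 0.7, color: "white" },
  { shape: "sphere", position: [0, 1.9, -2], size: 0.5, color: "white" },
];
buildScene(new THREE.Scene(), snowman);
```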

Intrigued by what you see and getting ideas of your own? WebXR can be a great way to give those ideas some life, and looking at how someone else did something similar is a fine way to begin. Check out another of [Dominic]’s WebXR projects: a simulated BBC Micro, in VR.

The BBC Micro, Lovingly Simulated In VR

The BBC Micro was many people’s first exposure to home computing, and thanks to [Dominic Pajak], you can fire up this beloved hardware in WebXR. Is it an emulator? Yes, but it’s also much more than that.

The machine, the CRT, the keycaps, and even the sounds of the original keypresses are all brought to life as accurately as possible. The result is not just an emulator. It’s a lovingly made BBC Micro simulator you can use with a VR headset. Or just use your browser and type on your real keyboard if you like.


Experiencing Visual Deficits And Their Impact On Daily Life, With VR

Researchers presented an interesting project at the 2024 IEEE Conference on Virtual Reality and 3D User Interfaces: it uses VR and eye tracking to simulate visual deficits such as macular degeneration, diabetic retinopathy, and other visual diseases and impairments.

Typical labels and pill bottles can be shockingly inaccessible to those with a variety of common visual deficits.

VR offers a unique method of allowing people to experience the impact of living with such conditions, a point driven home particularly well by having the user see for themselves the effect on simple real-world tasks such as choosing a pill bottle or picking up a mug. Conditions like macular degeneration (which causes loss of central vision) are more accurately simulated by using eye tracking, a technology much more mature nowadays than it was even just a few years ago.
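
The core trick is conceptually simple: park an occluder wherever the eye tracker says the user is looking, so the blind spot follows the gaze. Here’s an illustrative three.js sketch (our own construction, not the researchers’ code; the gaze-input signature is an assumption):

```typescript
// Simulate a central scotoma: a camera-attached occluder that sits wherever
// the eye tracker reports the user's gaze, so the blind spot follows it.
import * as THREE from "three";

function makeScotoma(camera: THREE.Camera): THREE.Mesh {
  const occluder = new THREE.Mesh(
    new THREE.CircleGeometry(0.08, 32),
    new THREE.MeshBasicMaterial({ color: 0x000000, transparent: true, opacity: 0.95 })
  );
  camera.add(occluder); // camera-relative; remember to scene.add(camera)
  return occluder;
}

// Call once per frame. gazeDir is a unit vector in camera space from the
// eye tracker (the exact API varies by headset; this signature is assumed).
function updateScotoma(occluder: THREE.Mesh, gazeDir: THREE.Vector3): void {
  // Half a meter down the gaze ray; for near-forward gaze the flat circle
  // stays roughly screen-facing, blotting out central vision only.
  occluder.position.copy(gazeDir).multiplyScalar(0.5);
}
```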

The abstract for the presentation is available here, and if you have some time, be sure to check out the main index for all of the VR research demos, because there are some neat ones there, including a method of manipulating a user’s perception of the shape of the ground under their feet by electrically stimulating the tendons of the ankle.

Eye tracking is in a few consumer VR products nowadays, but it’s also perfectly feasible to roll your own in a surprisingly slick way. It’s even been used on jumping spiders to gain insight into the fascinating and unexpectedly deep perceptual reality these creatures inhabit.