DIY Robotic Platform Aims To Solve Walking In VR

[Mark Dufour]’s TACO VR project is a sort of robotic platform that mimics an omnidirectional treadmill, and aims to provide a compact, easily transportable way for a user to walk naturally in VR.

Unenthusiastic about most existing solutions for walking in VR, [Mark] took a completely different approach. The result is a compact robotic platform whose sides fold up for transport; when packed up, it resembles a taco. When deployed, the idea is for two disc-like platforms to always stay under the user’s feet, keeping them in one place while they otherwise walk normally.
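The GitHub docs have the real control scheme; conceptually, though, keeping the discs under the feet is a centering problem. A minimal Python sketch of that idea (all names and gains here are hypothetical, not from the TACO VR code):

```python
# Hypothetical sketch of the "keep the feet centered" idea: each foot
# platform cancels the user's stride by moving back toward the rig's
# center at a speed proportional to the foot's displacement.
# Gains and limits are illustrative, not from the TACO VR firmware.

KP = 2.0          # proportional gain (1/s): how aggressively to recenter
MAX_SPEED = 1.5   # platform speed limit, m/s

def platform_velocity(foot_x: float, foot_y: float) -> tuple[float, float]:
    """Return an (x, y) velocity command that drags the platform (and the
    foot on it) back toward the rig's center point at (0, 0)."""
    vx = -KP * foot_x
    vy = -KP * foot_y
    # Clamp so a fast stride doesn't command an unsafe platform speed.
    speed = (vx**2 + vy**2) ** 0.5
    if speed > MAX_SPEED:
        vx, vy = vx * MAX_SPEED / speed, vy * MAX_SPEED / speed
    return vx, vy
```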

It’s an ambitious project, but [Mark] is up to the task, and the project’s GitHub repository has everything needed to stay up to date or get involved yourself. The hardware is mainly focused on functionality right now; a fall or stumble while using the prototype certainly looks like it would be uncomfortable at best, but the idea is innovative. Continue reading “DIY Robotic Platform Aims To Solve Walking In VR”

Simulating Temperature In VR Apps With Trigeminal Nerve Stimulation

Virtual reality systems are getting better and better all the time, but they remain largely ocular and auditory devices, with perhaps a little haptic feedback added in for good measure. That still leaves 40% of the five canonical senses out of the mix, unless of course this trigeminal nerve-stimulating VR accessory catches on.

While you may be tempted to dismiss this as simple “Smellovision”-style olfactory feedback, the work by [Jas Brooks], [Steven Nagels], and [Pedro Lopes] at the University of Chicago’s Human-Computer Integration Lab is intended to simulate the different thermal regimes a VR user might experience in a simulation. True, the addition to an off-the-shelf Vive headset does waft chemicals into the wearer’s nose using three microfluidic pumps with vibrating mesh atomizers, but it’s the choice of chemicals and their target that makes this work. The stimulants used are odorless, so instead of triggering the olfactory bulb, they target the trigeminal nerve, which also innervates the lining of the nose and causes more systemic sensations: think of the generalized hot feeling of chili peppers or the cooling power of mint. The headset leverages these sensations to change the perceived thermal regime in a simulation.
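The actual mapping and concentrations come from the paper, not from us, but the simulation-side logic boils down to steering each atomizer channel from the scene’s temperature. A hedged Python sketch, with assumed constants:

```python
# Illustrative only: map a simulated scene temperature to duty cycles for
# the "hot" (capsaicin) and "cool" (eucalyptol) atomizer channels. The
# real system's mapping and concentrations are detailed in the paper.

NEUTRAL_C = 20.0   # assumed neutral scene temperature, deg C
SPAN_C = 15.0      # assumed temperature span to full stimulation

def atomizer_duty(scene_temp_c: float) -> tuple[float, float]:
    """Return (capsaicin_duty, eucalyptol_duty), each in 0..1."""
    delta = (scene_temp_c - NEUTRAL_C) / SPAN_C
    hot = min(max(delta, 0.0), 1.0)    # above neutral: drive the hot channel
    cool = min(max(-delta, 0.0), 1.0)  # below neutral: drive the cool channel
    return hot, cool
```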

The video below shows the custom simulation developed for this experiment. In addition to capsaicin’s heat and eucalyptol’s cooling, the team added a third channel with 8-mercapto-p-menthan-3-one, an organic compound that’s intended to simulate the smoke from a generator that gets started in-game. The paper goes into great detail on the various receptors that can be stimulated and the different concoctions needed, and full build information is available in the GitHub repo. We’ll be watching this one with interest.

Continue reading “Simulating Temperature In VR Apps With Trigeminal Nerve Stimulation”

A person sits on a couch in the background wearing a VR headset. A keyboard is on their lap and a backpack studded with antennas and cables sits in the foreground.

2022 Cyberdeck Contest: Cyberpack VR

Feeling confined by the “traditional” cyberdeck form factor, [adam] decided to build something a little bigger with his Cyberpack VR. If you’ve ever dreamed of being a WiFi-equipped porcupine, then this is the cyberdeck you’ve been waiting for.

Craving the upgradability and utility of a desktop in a more portable format, [adam] took an old commuter backpack and squeezed in a Windows 11 PC, a Raspberry Pi, multiple WiFi networks, an ergonomic keyboard, a Quest VR headset, and enough antennas to attract the attention of the FCC. The abundance of network hardware is due to [adam]’s “new interest: a deeper understanding of wifi, and control of my own home network even if my teenage kids become hackers.”

The Quest is set up to run multiple virtual displays via Immersed, and thanks to the extra-long umbilical, you can relax on the couch while leaving the bag on the floor nearby. One of the neat details of this build is repurposing the bag’s external helmet mount to attach the terminal unit when not in use. Other details we love are the toggle switches and the cleanly integrated antenna connectors and USB ports. The way these elements are worked into the bag makes it feel borderline organic, all the better for your cyborg chic.

For more WiFi backpacking goodness you may be interested in the Pwnton Pack. We’ve also covered other non-traditional cyberdecks including the Steampunk Cyberdeck and the Galdeano. If you have your own cyberdeck, you have until September 30th to submit it to our 2022 Cyberdeck Contest!

DIY Haptic-Enabled VR Gun Hits All The Targets

This VR Haptic Gun by [Robert Enriquez] is the result of hacking together different off-the-shelf products and tying it all together with an ESP32 development board. The result? A gun frame that integrates a VR controller (meaning it can be tracked and used in VR) and provides mild force feedback thanks to a motor that moves with each shot.

But that’s not all! Using the WiFi capabilities of the ESP32 board, the gun also responds to signals sent by a piece of software intended to drive commercial haptics hardware. That software hooks into the VR game and sends signals over the network telling the gun what’s happening, and [Robert]’s firmware acts on those signals. In short, every time [Robert] fires the gun in VR, the one in his hand recoils in synchronization with the game events. The effect is mild, but when it comes to tactile feedback, a little can go a long way.
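The exact packet format is defined by [Robert]’s firmware and the PC-side haptics software, so treat this as a minimal sketch of the idea only: an ESP32 (here in MicroPython, with an assumed port, opcode, and motor pin) listening for a UDP “fire” event and pulsing the recoil motor.

```python
# Minimal MicroPython sketch of the idea: listen for a UDP "fire" event on
# the ESP32 and pulse the recoil motor. The packet format, pin, port, and
# timing are assumptions for illustration; the real firmware is in the repo.
import socket, time
import network
from machine import Pin

MOTOR = Pin(25, Pin.OUT)          # assumed GPIO driving the motor's MOSFET

wlan = network.WLAN(network.STA_IF)
wlan.active(True)
wlan.connect("ssid", "password")
while not wlan.isconnected():
    time.sleep_ms(100)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 9000))      # assumed port the PC-side software targets

while True:
    data, _addr = sock.recvfrom(16)
    if data and data[0] == 0x01:  # assumed opcode: 0x01 = shot fired
        MOTOR.on()                # kick the recoil motor...
        time.sleep_ms(60)         # ...for a brief pulse
        MOTOR.off()
```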

The fact that this kind of experimentation is easily and affordably within the reach of hobbyists is wonderful, and VR certainly has plenty of room for amateurs to break new ground, as we’ve seen with projects like low-cost haptic VR gloves.

[Robert] walks through every phase of his gun’s design, explaining how he made various square pegs fit into round holes, and provides links to parts and resources in the project’s GitHub repository. There’s a video tour embedded below the page break, but if you want to jump straight to a demonstration in Valve’s Half-Life: Alyx, the test firing starts at 10:19.

There are a number of improvements waiting to be done, but [Robert] definitely understands the value of getting something working, even if it’s a bit rough. After all, nothing fills out a to-do list or surfaces hidden problems like a prototype. Watch everything in detail in the video tour, embedded below.

Continue reading “DIY Haptic-Enabled VR Gun Hits All The Targets”

Svelte VR Headsets Coming?

According to Stanford and NVIDIA researchers, VR adoption is slowed by the bulky headsets required, and they want to offer a slim alternative. A SIGGRAPH paper from earlier this year lays out their plan, or you can watch the video below. A second video, also below, covers some technical questions and answers.

The traditional headset has a display right in front of your eyes. Special lenses can make such headsets skinnier, but this new method allows for displays only a few millimeters thick. The technology seems pretty intense: it appears to create a hologram at different apparent depths using a laser, a geometric phase lens, and a pupil-replicating waveguide.

Continue reading “Svelte VR Headsets Coming?”

VR Prototypes Reveal Facebook’s Surprisingly Critical Research Directions

A short while ago, Tested posted a video all about hands-on time with virtual reality (VR) headset prototypes from Meta (which is to say, Facebook) and there are some genuinely interesting bits in there. The video itself is over an hour long, but if you’re primarily interested in the technical angles and why they matter for VR, read on because we’ll highlight each of the main points of research.

As absurd as it may seem to many of us to have a social network spearheading meaningful VR development, one can’t say they aren’t taking it seriously. It’s also refreshing to see each of the prototypes showcased by a researcher who is clearly thrilled to talk about their work. The big dream is to figure out what it takes to pass the “visual Turing test”: delivering visuals that are on par with physical reality. Some of the critical elements may come as a bit of a surprise, because they go in directions beyond resolution and field of view.

Solid-state varifocal lens demo, capable of 32 discrete focal steps.

At 9:35 in the video, [Douglas Lanman] shows [Norman Chan] how important variable focus is to delivering a good visual experience, followed by a walk-through of all the different prototypes they have used to get that done. Currently, VR headsets display visuals at only one focal plane, which means, among other things, that bringing a virtual object close to one’s eyes leaves it blurry. (Incidentally, older people don’t find that part very strange, because it is a common side effect of aging.)

The solution is to change focus based on where the user is looking, and [Douglas] shows off all the different ways this has been explored: from motors and actuators that mechanically change the focal length of the display, to a solid-state solution composed of stacked elements that selectively converge or diverge light based on its polarization. [Doug]’s pride and excitement are palpable, and he really goes into detail on everything.
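The caption’s “32 discrete focal steps” hints at how such a stack multiplies states: five binary-switchable elements whose focal power flips sign with polarization give 2^5 = 32 combinations. A back-of-the-envelope Python sketch (the element powers are our assumption, not Meta’s design values):

```python
# Back-of-the-envelope: a stack of geometric phase lenses whose focal power
# flips sign with polarization. Five binary-switchable elements yield
# 2**5 = 32 distinct focal states. Powers below are illustrative only.
from itertools import product

element_powers = [1.6, 0.8, 0.4, 0.2, 0.1]  # diopters, assumed values

states = sorted(
    sum(sign * p for sign, p in zip(signs, element_powers))
    for signs in product((+1, -1), repeat=len(element_powers))
)
print(len(states), "focal states")       # -> 32 focal states
print(states[0], "to", states[-1], "D")  # evenly spaced, -3.1 D to +3.1 D
```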

At the 30:21 mark, [Yang Zhao] explains the importance of higher resolution displays, and talks about lenses and optics as well. Interestingly, the ultra-clear text rendering made possible by a high-resolution display isn’t what ended up capturing [Norman]’s attention the most. When high resolution was combined with variable focus, it was the textures on cushions, the vividness of wall art, and the patterns on walls that [Norman] found he just couldn’t stop exploring.

Continue reading “VR Prototypes Reveal Facebook’s Surprisingly Critical Research Directions”

Someone setting down an ArUco tag

Make Your Own Virtual Set

An old adage says that out of cheap, fast, and good, you can choose only two. So if, like [Philip Moss], you’re trying to rapidly make a comedy series on a limited budget, you’ll have to take some shortcuts for it to still be good. One shortcut [Philip] took was to do away with the set and make it all virtual.

If you’ve heard about the production of a certain western-style space cowboy show that uses a virtual set, you probably know what [Philip] did. For those who haven’t been following along, the idea is to have a massive LED wall and to track where the camera is. By creating a 3D set, you can render it to the LED wall so that the perspective is correct for the camera. While a giant LED wall was a little out of budget for [Philip], good old green screen fabric wasn’t. The plan was to set up a large green screen backdrop, put in some props, grab some assets online, and film the different shots needed. The camera keeps track of where it is in the virtual room, so things like calculating perspective are easy, and large ArUco tags help Unreal know where physical objects are (see the sketch below). A virtual wall can go right where the actors think there’s a wall, and a virtual table exactly where you’ve put a real one covered in green cloth.
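We don’t know the exact pipeline [Philip] used, but the tag-detection half of the trick is well-trodden OpenCV territory. A minimal Python sketch (dictionary and camera index are assumptions) that recovers each tag’s ID and on-screen position:

```python
# Minimal sketch of the ArUco side of this trick, using OpenCV's aruco
# module: detect tags in each camera frame and recover each tag's id and
# corner positions, which a tool like Unreal can map to virtual objects.
# Dictionary choice and camera index are assumptions.
import cv2

detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50))

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    corners, ids, _rejected = detector.detectMarkers(frame)
    if ids is not None:
        # Each id pairs with a 4x2 array of pixel corners; solvePnP against
        # the tag's known physical size would give its camera-space pose.
        for tag_id, quad in zip(ids.flatten(), corners):
            print(tag_id, quad.reshape(4, 2).mean(axis=0))  # tag center, px
    cv2.imshow("tags", frame)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break
```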

Initially, the camera was tracked using a Vive tracker and LiveLink, though the tracking wasn’t smooth enough while moving to be used for anything but static shots. This wasn’t a huge setback, as they could move the camera, start a new shot, and not have to change the set in Unreal or fiddle with compositing. Later on, they switched from the Vive to a RealSense camera and found it much smoother, though it did tend to drift.
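The article doesn’t say how the pose data was filtered, if at all, but jitter like this is commonly tamed with a one-pole low-pass (exponential moving average) on the tracked transform, trading a little latency for stability. A hypothetical Python sketch:

```python
# Hypothetical pose smoothing: a one-pole low-pass (exponential moving
# average) on the tracked camera position trades a little latency for much
# less jitter. Alpha is a tuning assumption; orientation quaternions would
# need slerp rather than this per-axis blend.

class PoseSmoother:
    def __init__(self, alpha: float = 0.2):
        self.alpha = alpha   # 0..1: lower = smoother but laggier
        self.state = None

    def update(self, position: tuple[float, float, float]):
        """Feed in the raw tracked position; get back the smoothed one."""
        if self.state is None:
            self.state = position
        else:
            self.state = tuple(
                self.alpha * new + (1.0 - self.alpha) * old
                for new, old in zip(position, self.state))
        return self.state
```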

The end result, called ‘Age of Outrage’, was pretty darn good. Sure, it’s not perfect, but it doesn’t jump out and scream “rendered set!” the way CGI TV shows did in the ’90s. Not too shabby considering the hardware and software used to create it!