Showing the finished car, with mecanum wheels and a green chassis with what seems to be a camera window on top

2022 FPV Contest: ESP32-Powered FPV Car Uses JavaScript For VR Magic

You don’t always need much to build an FPV rig – especially if you’re willing to take advantage of the power of modern smartphones. [joe57005] is showing off his VR FPV build – a fully printable chassis for a small mecanum-wheeled car, equipped with an ESP32-CAM board serving a 720×720 video stream over WiFi. The car uses regular 9g servos to drive each wheel, giving you omnidirectional movement wherever you want to go. An ESP32 CPU and a single low-res camera might not sound like much if you’re aiming for a VR view, and all the ESP32 does is stream the video feed over WebSockets – however, the simplicity is well-compensated for on the frontend. Continue reading “2022 FPV Contest: ESP32-Powered FPV Car Uses JavaScript For VR Magic”
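The mecanum wheels are what make the omnidirectional movement work: each wheel’s rollers sit at 45°, so mixing forward, strafe, and rotation commands per wheel lets the chassis translate in any direction. Purely as an illustration – this is the standard mecanum mixing formula, not [joe57005]’s actual code – a minimal sketch in JavaScript:

```javascript
// Standard mecanum-wheel mixing: given desired forward (vx), strafe (vy),
// and rotation (w) commands in [-1, 1], compute the four wheel speeds.
function mecanumMix(vx, vy, w) {
  const speeds = {
    frontLeft:  vx + vy + w,
    frontRight: vx - vy - w,
    rearLeft:   vx - vy + w,
    rearRight:  vx + vy - w,
  };
  // Normalize so no wheel command exceeds full scale.
  const max = Math.max(1, ...Object.values(speeds).map(Math.abs));
  for (const k in speeds) speeds[k] /= max;
  return speeds;
}

// Pure strafe to the right: the wheels spin in an X pattern.
console.log(mecanumMix(0, 1, 0));
```

On the car itself, each of the four values would then be scaled into a pulse width for the corresponding continuous-rotation 9g servo.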

VR Sickness: A New, Old Problem

Have you ever experienced dizziness, vertigo, or nausea while in a virtual reality experience? That’s VR sickness, and it’s a form of motion sickness. It is not a completely solved problem, and it affects people differently, but it all comes from the same root cause, and there are better and worse ways of dealing with it.

If you’ve experienced a sudden onset of VR sickness, it was most likely triggered by flying, sliding, or some other kind of movement in VR that caused a strong and sudden feeling of vertigo or dizziness. Or perhaps it was not sudden, and was more like a vague unease that crept up, leaving you nauseated and unwell.

Just like car sickness or sea sickness, people are differently sensitive. But the reason it happens is not a mystery; it all comes down to how the human body interprets and reacts to a particular kind of sensory mismatch.

Why Does It Happen?

The human body’s vestibular system is responsible for our sense of balance. It is in turn responsible for many boring, but important, tasks such as not falling over. To fulfill this responsibility, the brain interprets a mix of sensory information and uses it to build a sense of the body, its movements, and how it fits into the world around it.

These sensory inputs come from the inner ear, the body, and the eyes. Usually these inputs are in agreement, or they disagree so politely that the brain can confidently make a ruling and carry on without bothering anyone. But what if there is a nontrivial conflict between those inputs, and the brain cannot make sense of whether it is moving or not? For example, if the eyes say the body is moving, but the joints and muscles and inner ear disagree? The result of that kind of conflict is to feel sick.

Common symptoms are dizziness, nausea, sweating, headache, and vomiting. These messy symptoms are purposeful, for the human body’s response to this particular kind of sensory mismatch is to assume it has ingested something poisonous, and go into a failure mode of “throw up, go lie down”. This is what is happening — to a greater or lesser degree — to those experiencing VR sickness.

Continue reading “VR Sickness: A New, Old Problem”

DIY Robotic Platform Aims To Solve Walking In VR

[Mark Dufour]’s TACO VR project is a sort of robotic platform that mimics an omnidirectional treadmill, and aims to provide a compact and easily transportable way to allow a user to walk naturally in VR.

Unenthusiastic about most solutions for allowing a user to walk in VR, [Mark] took a completely different approach. The result is a robotic platform that fits inside a small area whose sides fold up for transport; when packed up, it resembles a taco. When deployed, the idea is to have two disc-like platforms always stay under a user’s feet, keeping the user in one place while they otherwise walk normally.

It’s an ambitious project, but [Mark] is up to the task, and the project’s GitHub repository has everything needed to stay up to date or get involved yourself. The hardware is mainly focused on functionality right now; a fall or stumble while using the prototype certainly looks like it would be uncomfortable at best, but the idea is innovative. Continue reading “DIY Robotic Platform Aims To Solve Walking In VR”


Hackaday Links: November 13, 2022

Talk about playing on hard mode! The news this week was rife with stories about Palmer Luckey’s murder-modified VR headset, which ostensibly kills the wearer if their character dies in-game. The headset appears to have three shaped charges in the visor pointing right at the wearer’s frontal lobe, and would certainly do a dandy job of executing someone. In a blog post that we suspect was written with tongue planted firmly in cheek, Luckey, the co-founder of Oculus, explains that the helmet interfaces with the game via optical sensors that watch the proceedings on the screen and fire when a certain frequency of flashing red light is detected. He’s also talking about ways to prevent the removal of the headset once donned, in case someone wants to tickle the dragon’s tail and try to quickly rip off the headset as in-game death approaches. We’re pretty sure this isn’t serious, as Luckey himself suggested that it was more of an office art thing, but you never know what extremes a “three commas” net worth can push someone to.

There’s light at the end of the Raspberry Pi supply chain tunnel, as CEO Eben Upton announced that he foresees the Pi problems resolving completely by this time next year. Upton explains his position in the video embedded in the linked article, which is basically that the lingering effects of the pandemic should resolve themselves over the next few months, leading to normalization of inventory across all Pi models. That obviously has to be viewed with some skepticism; after all, nobody saw the supply chain issues coming in the first place, and there certainly could be another black swan event waiting for us that might cause a repeat performance. But it’s good to hear his optimism, as well as his vision for the future now that we’re at the ten-year anniversary of the first Pi’s release.

Continue reading “Hackaday Links: November 13, 2022”

Simulating Temperature In VR Apps With Trigeminal Nerve Stimulation

Virtual reality systems are getting better and better all the time, but they remain largely ocular and auditory devices, with perhaps a little haptic feedback added in for good measure. That still leaves 40% of the five canonical senses out of the mix, unless of course this trigeminal nerve-stimulating VR accessory catches on.

While you may be tempted to look at this as simple “Smellovision”-style olfactory feedback, the work by [Jas Brooks], [Steven Nagels], and [Pedro Lopes] at the University of Chicago’s Human-Computer Integration Lab is intended to simulate the different thermal regimes a user might encounter in a virtual environment. True, the addition to an off-the-shelf Vive headset does waft chemicals into the wearer’s nose using three microfluidics pumps with vibrating mesh atomizers, but it’s the choice of chemicals and their target that makes this work. The stimulants used are odorless, so instead of triggering the olfactory bulb in the nose, they target the trigeminal nerve, which also innervates the lining of the nose and causes more systemic sensations, like the generalized hot feeling of chili peppers and the cooling power of mint. The headset leverages these sensations to change the thermal regime in a simulation.

The video below shows the custom simulation developed for this experiment. In addition to capsaicin’s heat and eucalyptol’s cooling, the team added a third channel with 8-mercapto-p-menthan-3-one, an organic compound that’s intended to simulate the smoke from a generator that gets started in-game. The paper goes into great detail on the various receptors that can be stimulated and the different concoctions needed, and full build information is available in the GitHub repo. We’ll be watching this one with interest.

Continue reading “Simulating Temperature In VR Apps With Trigeminal Nerve Stimulation”

DIY Haptic-Enabled VR Gun Hits All The Targets

This VR Haptic Gun by [Robert Enriquez] is the result of hacking together different off-the-shelf products and tying it all together with an ESP32 development board. The result? A gun frame that integrates a VR controller (meaning it can be tracked and used in VR) and provides mild force feedback thanks to a motor that moves with each shot.

But that’s not all! Using the WiFi capabilities of the ESP32 board, the gun also responds to signals sent by a piece of software intended to drive commercial haptics hardware. That software hooks into the VR game and sends signals over the network telling the gun what’s happening, and [Robert]’s firmware acts on those signals. In short, every time [Robert] fires the gun in VR, the one in his hand recoils in synchronization with the game events. The effect is mild, but when it comes to tactile feedback, a little can go a long way.
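To make the idea concrete, here is a hedged sketch of that event-to-recoil mapping – the message format, the `motor.pulse` helper, and the intensity values are all hypothetical, not [Robert]’s actual firmware or the haptics software’s real protocol:

```javascript
// Hypothetical dispatcher: the haptics bridge sends short text messages
// over the network, and the gun maps each event type to a motor action.
// Assumed message format: "event:intensity", e.g. "shot:80".
function handleHapticEvent(message, motor) {
  const [event, value] = message.split(":");
  const intensity = Math.min(100, Math.max(0, Number(value) || 0));
  switch (event) {
    case "shot":   // one sharp kick per trigger pull
      motor.pulse(intensity, 40);      // strength, duration in ms
      return "recoil";
    case "reload": // gentler, longer rumble
      motor.pulse(intensity / 2, 200);
      return "rumble";
    default:       // unknown events are silently dropped
      return "ignored";
  }
}
```

On the ESP32 the same logic would live in the firmware’s network receive loop, with `motor.pulse` driving the recoil motor through a transistor or motor driver.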

The fact that this kind of experimentation is easily and affordably within the reach of hobbyists is wonderful, and VR certainly has plenty of room for amateurs to break new ground, as we’ve seen with projects like low-cost haptic VR gloves.

[Robert] walks through every phase of his gun’s design, explaining how he made various square pegs fit into round holes, and provides links to parts and resources in the project’s GitHub repository. There’s a video tour embedded below the page break, but if you want to jump straight to a demonstration in Valve’s Half-Life: Alyx, here’s a link to test firing at 10:19 in.

There are a number of improvements waiting to be done, but [Robert] definitely understands the value of getting something working, even if it’s a bit rough. After all, nothing fills out a to-do list or surfaces hidden problems like a prototype. Watch everything in detail in the video tour, embedded below.

Continue reading “DIY Haptic-Enabled VR Gun Hits All The Targets”

Svelte VR Headsets Coming?

According to Stanford and NVIDIA researchers, VR adoption is slowed by the bulky headsets required, and they want to offer a slimmer solution. A SIGGRAPH paper from earlier this year lays out their plan, or you can watch the video below. A second video, also below, covers some technical questions and answers.

The traditional headset has a display right in front of your eyes. Special lenses can make them skinnier, but this new method provides displays that can be a few millimeters thick. The technology seems pretty intense: it appears to create a hologram at different apparent depths using a laser, a geometric phase lens, and a pupil-replicating waveguide.

Continue reading “Svelte VR Headsets Coming?”