Mice Play In VR

Virtual Reality always seemed like a technology just out of reach, much like nuclear fusion, the flying car, or Linux on the desktop. It has been gaining steam over the last five years or so, though, with successful video games from a number of companies, and plenty of adjacent technology, like augmented reality, coming along for the ride. Another sign that this technology might be here to stay is this virtual reality headset made for mice. Continue reading “Mice Play In VR”

Inspect The RF Realm With Augmented Reality

Intellectually, we all know that we exist in a complex soup of RF energy. Cellular, WiFi, TV, public service radio, radar, ISM-band transmissions from everything from thermometers to garage door openers — it’s all around us. It would be great to see these transmissions, but alas, most of us don’t come from the factory with the correct equipment.

Luckily, aftermarket accessories like RadioFieldAR by [Manahiyo] make it possible to visualize RF signals. As the name suggests, this is an augmented reality system that lets you inspect the RF world around you. The core of the system is a tinySA, a pocket-sized spectrum analyzer that acts as a broadband receiver. A special antenna is connected to the tinySA; unfortunately, there are no specifics on the antenna other than that it needs a label with an image of the Earth attached, which the app uses to track the antenna’s position. The tinySA is connected to an Android phone — one that supports Google’s ARCore — by a USB OTG cable, and a special app on the phone runs the show.

As you slowly move the antenna around in the field of view of the phone’s camera, the app gradually builds up a heat map of signal strength at a particular frequency. The video below shows it in action, and the results are pretty cool. If you don’t have a tinySA, fear not — [Manahiyo] has a version of the app that supports a plain old RTL-SDR dongle too. That should make it easy for just about anyone to try this out.
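For a sense of what the accumulation step could look like, here is a minimal, hypothetical Python sketch, assuming an RTL-SDR dongle read through the pyrtlsdr library (as in the alternate version of the app) and normalized antenna coordinates handed in from the AR side; it is an illustration of the idea, not [Manahiyo]’s actual code.

```python
# Hypothetical sketch: fold spectrum readings into a 2D heat map, indexed by
# where the AR tracker says the labeled antenna currently is. Assumes an
# RTL-SDR dongle via pyrtlsdr; the (x, y) antenna coordinates are simply
# passed in as normalized values here.
import numpy as np
from rtlsdr import RtlSdr  # pip install pyrtlsdr

GRID = 64                            # heat-map resolution (cells per side)
power_sum = np.zeros((GRID, GRID))   # running sum of dB readings per cell
visits = np.zeros((GRID, GRID))      # number of readings per cell

sdr = RtlSdr()
sdr.sample_rate = 2.4e6
sdr.center_freq = 433.92e6           # frequency of interest, e.g. ISM-band remotes
sdr.gain = 'auto'

def measure_power_db() -> float:
    """Average power of one short capture, in (relative) dB."""
    iq = sdr.read_samples(16 * 1024)
    return 10 * np.log10(np.mean(np.abs(iq) ** 2))

def accumulate(x: float, y: float) -> None:
    """Fold one reading into the cell under the antenna.

    x and y are normalized [0, 1) coordinates of the tracked antenna label,
    as reported by the AR side (hypothetical in this sketch).
    """
    i, j = int(y * GRID), int(x * GRID)
    power_sum[i, j] += measure_power_db()
    visits[i, j] += 1

# The heat map is the mean reading per visited cell; unvisited cells stay NaN,
# so the overlay only shows places the antenna has actually been waved over.
heatmap = np.where(visits > 0, power_sum / np.maximum(visits, 1), np.nan)
```

The real app presumably bins readings against the ARCore-tracked antenna pose rather than simple screen coordinates, but the accumulate-and-average idea is the same.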

And if you’re feeling déjà vu about this, you’re probably remembering [Manahiyo]’s VR spectrum analyzer, upon which this project is based.

Continue reading “Inspect The RF Realm With Augmented Reality”

Watch Sony Engineers Tear Down Sony’s VR Hardware

Teardowns are great because they let us not only peek at a product’s components, but also gain insight into the design decisions and implementation of the hardware. We’re used to waiting for enthusiasts and enterprising hackers to create them, so it came as a bit of a surprise to see Sony themselves share detailed teardowns of the new PlayStation VR2 hardware. (If you prefer the direct video links, engineer [Takamasa Araki] shows off the headset, and [Takeshi Igarashi] does the same for the controllers.)

The “adaptive trigger” module responsible for the unique feedback.

One particularly intriguing detail is the custom tool [Araki] uses to hold the headset at various stages of the disassembly, which is visible in the picture above. It looks 3D-printed and carefully designed, and while we’re not sure what it’s made from, it does have a strong resemblance to certain high-temperature SLA resins. Those cure into hard, glassy, off-yellow translucent prints like what we see here.

As for the controller, we get a good look at a deeply interesting assembly Sony calls their “adaptive trigger”. What’s so clever about it? Not only can it cause the user to feel a variable amount of resistance when pulling the trigger, it can even actively push back against one’s finger, and the way it works is simple and effective. It is pretty much the same as what is in the PS5 controller, so to find out all about how it works, check out our PS5 controller teardown coverage.

The headset and controller teardown videos are embedded just below. Did anything in them catch your interest? Know of any other companies doing their own teardowns? Let us know in the comments!

Continue reading “Watch Sony Engineers Tear Down Sony’s VR Hardware”

The finished car, with Mecanum wheels and a green chassis with what seems to be a camera window on top

2022 FPV Contest: ESP32-Powered FPV Car Uses Javascript For VR Magic

You don’t always need much to build an FPV rig – especially if you’re willing to take advantage of the power of modern smartphones. [joe57005] is showing off his VR FPV build – a small, fully printable car chassis on Mecanum wheels, equipped with an ESP32-CAM board serving a 720×720 stream over WiFi. The car uses regular 9g servos to drive each wheel, giving you omnidirectional movement wherever you want to go (the usual wheel-mixing math is sketched just below). An ESP32 and a single low-res camera might not sound like much if you’re aiming for a VR view, and all the ESP32 does is stream the video feed over WebSockets – however, the simplicity is well compensated for on the frontend. Continue reading “2022 FPV Contest: ESP32-Powered FPV Car Uses Javascript For VR Magic”
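For the curious, that omnidirectional movement comes down to standard Mecanum wheel mixing. Here is a rough, hypothetical Python sketch of the math, mapping a drive command onto the pulse widths a continuous-rotation 9g servo typically expects; it illustrates the technique and is not taken from [joe57005]’s firmware.

```python
# Hypothetical sketch of Mecanum wheel mixing: a forward/strafe/rotate command
# becomes four wheel speeds, which are then scaled into the 1000-2000 us pulse
# range a continuous-rotation servo typically expects (1500 us = stop).
def mecanum_mix(vx: float, vy: float, omega: float) -> dict:
    """vx = strafe right, vy = drive forward, omega = rotate clockwise; all in [-1, 1]."""
    wheels = {
        "front_left":  vy + vx + omega,
        "front_right": vy - vx - omega,
        "rear_left":   vy - vx + omega,
        "rear_right":  vy + vx - omega,
    }
    # Normalize so that no wheel command exceeds full scale.
    peak = max(1.0, max(abs(w) for w in wheels.values()))
    return {name: w / peak for name, w in wheels.items()}

def to_servo_us(speed: float) -> int:
    """Map a [-1, 1] wheel speed onto a servo pulse width in microseconds."""
    speed = max(-1.0, min(1.0, speed))
    return int(1500 + 500 * speed)

# Example: strafe diagonally forward-right while turning slightly.
for wheel, speed in mecanum_mix(vx=0.5, vy=0.5, omega=0.1).items():
    print(wheel, to_servo_us(speed))
```

On the actual car, this sort of mixing presumably lives in the browser-side JavaScript, with the ESP32 just applying the resulting servo commands.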

VR Sickness: A New, Old Problem

Have you ever experienced dizziness, vertigo, or nausea while in a virtual reality experience? That’s VR sickness, and it’s a form of motion sickness. It is not a completely solved problem, and it affects people differently, but it all comes from the same root cause, and there are better and worse ways of dealing with it.

If you’ve experienced a sudden onset of VR sickness, it was most likely triggered by flying, sliding, or some other kind of movement in VR that caused a strong and sudden feeling of vertigo or dizziness. Or perhaps it was not sudden, and was more like a vague unease that crept up, leaving you nauseated and unwell.

Just like car sickness or sea sickness, people are differently sensitive. But the reason it happens is not a mystery; it all comes down to how the human body interprets and reacts to a particular kind of sensory mismatch.

Why Does It Happen?

The human body’s vestibular system is responsible for our sense of balance. It is in turn responsible for many boring but important tasks, such as not falling over. To fulfill this responsibility, the brain interprets a mix of sensory information and uses it to build a sense of the body, its movements, and how it fits into the world around it.

These sensory inputs come from the inner ear, the body, and the eyes. Usually these inputs are in agreement, or they disagree so politely that the brain can confidently make a ruling and carry on without bothering anyone. But what if there is a nontrivial conflict between those inputs, and the brain cannot make sense of whether it is moving or not? For example, what if the eyes say the body is moving, but the joints and muscles and inner ear disagree? That kind of conflict is what makes us feel sick.

Common symptoms are dizziness, nausea, sweating, headache, and vomiting. These messy symptoms are purposeful, for the human body’s response to this particular kind of sensory mismatch is to assume it has ingested something poisonous, and go into a failure mode of “throw up, go lie down”. This is what is happening — to a greater or lesser degree — to those experiencing VR sickness.

Continue reading “VR Sickness: A New, Old Problem”

DIY Robotic Platform Aims To Solve Walking In VR

[Mark Dufour]’s TACO VR project is a sort of robotic platform that mimics an omnidirectional treadmill, and aims to provide a compact and easily transportable way to allow a user to walk naturally in VR.

Unenthusiastic about most solutions for allowing a user to walk in VR, [Mark] took a completely different approach. The result is a robotic platform that fits into a small footprint and whose sides fold up for transport; when packed up, it resembles a taco. When deployed, the idea is to have two disc-like platforms always stay under the user’s feet, keeping the user in one place while they otherwise walk normally.

It’s an ambitious project, but [Mark] is up to the task, and the project’s GitHub repository has everything needed to stay up to date or get involved yourself. The hardware is mainly focused on functionality right now; a fall or stumble while using the prototype certainly looks like it would be uncomfortable at best, but the idea is innovative. Continue reading “DIY Robotic Platform Aims To Solve Walking In VR”


Hackaday Links: November 13, 2022

Talk about playing on hard mode! The news this week was rife with stories about Palmer Luckey’s murder-modified VR headset, which ostensibly kills the wearer if their character dies in-game. The headset appears to have three shaped charges in the visor pointing right at the wearer’s frontal lobe, and would certainly do a dandy job of executing someone. In a blog post that we suspect was written with tongue planted firmly in cheek, Luckey, the co-founder of Oculus, explains that the interface from the helmet to the game is via optical sensors that watch the proceedings on the screen and fire when a certain frequency of flashing red light is detected. He’s also talking about ways to prevent the removal of the headset once donned, in case someone wants to tickle the dragon’s tail and try to quickly rip off the headset as in-game death approaches. We’re pretty sure this isn’t serious, as Luckey himself suggested that it was more of an office art thing, but you never know what extremes a “three commas” net worth can push someone to.

There’s light at the end of the Raspberry Pi supply chain tunnel, as CEO Eben Upton announced that he foresees the Pi problems resolving completely by this time next year. Upton explains his position in the video embedded in the linked article, which is basically that the lingering effects of the pandemic should resolve themselves over the next few months, leading to normalization of inventory across all Pi models. That obviously has to be viewed with some skepticism; after all, nobody saw the supply chain issues coming in the first place, and there certainly could be another black swan event waiting for us that might cause a repeat performance. But it’s good to hear his optimism, as well as his vision for the future now that we’re at the ten-year anniversary of the first Pi’s release.

Continue reading “Hackaday Links: November 13, 2022”