Inspect The RF Realm With Augmented Reality

Intellectually, we all know that we exist in a complex soup of RF energy. Cellular, WiFi, TV, public service radio, radar, ISM-band transmissions from everything from thermometers to garage door openers — it’s all around us. It would be great to see these transmissions, but alas, most of us don’t come from the factory with the correct equipment.

Luckily, aftermarket accessories like RadioFieldAR by [Manahiyo] make it possible to visualize RF signals. As the name suggests, this is an augmented reality system that lets you inspect the RF world around you. The core of the system is a tinySA, a pocket-sized spectrum analyzer that acts as a broadband receiver. A special antenna connects to the tinySA; unfortunately, there are no specifics on the antenna other than that it needs a label with an image of the Earth attached, which the app uses to track the antenna’s position. The tinySA is connected to an Android phone — one that supports Google’s ARCore — by a USB OTG cable, and a special app on the phone runs the show.

By slowly moving the antenna around in the field of view of the phone’s camera, a heat map of signal strength at a particular frequency is gradually built up. The video below shows it in action, and the results are pretty cool. If you don’t have a tinySA, fear not — [Manahiyo] has a version of the app that supports a plain old RTL-SDR dongle too. That should make it easy for just about anyone to try this out.
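We haven’t dug through [Manahiyo]’s source, but the measurement side is easy to picture. Here’s a minimal Python sketch using the pyrtlsdr library: it reads the average power around one frequency and drops it into a heat-map grid, with the grid coordinates standing in for the antenna position the real app gets from ARCore:

```python
# Rough sketch: measure RF power at one frequency with an RTL-SDR
# (pip install pyrtlsdr) and accumulate readings into a heat-map grid.
import numpy as np
from rtlsdr import RtlSdr

sdr = RtlSdr()
sdr.sample_rate = 2.4e6     # Hz
sdr.center_freq = 433.92e6  # Hz, e.g. an ISM-band garage door opener
sdr.gain = 'auto'

heatmap = np.full((32, 32), np.nan)  # coarse grid over the camera's view

def power_dbfs(n_samples=65536):
    """Average power of one capture, in dB relative to full scale."""
    samples = sdr.read_samples(n_samples)
    return 10 * np.log10(np.mean(np.abs(samples) ** 2))

def record(x, y):
    """Log a reading at grid cell (x, y); in the real app the cell would
    come from ARCore tracking the Earth label on the antenna."""
    heatmap[y, x] = power_dbfs()
```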

And if you’re feeling deja vu about this, you’re probably remembering [Manahiyo]’s VR spectrum analyzer, upon which this project is based.

Continue reading “Inspect The RF Realm With Augmented Reality”

Watch Sony Engineers Tear Down Sony’s VR Hardware

Teardowns are great because they not only let us peek at a product’s components, but also give us insight into the design decisions and implementation of the hardware. We’re used to waiting for enthusiasts and enterprising hackers to do the honors, so it came as a bit of a surprise to see Sony themselves share detailed teardowns of the new PlayStation VR2 hardware. (If you prefer the direct video links, engineer [Takamasa Araki] shows off the headset, and [Takeshi Igarashi] does the same for the controllers.)

The “adaptive trigger” module responsible for the unique feedback.

One particularly intriguing detail is the custom tool [Araki] uses to hold the headset at various stages of the disassembly, visible in the picture above. It looks 3D-printed and carefully designed, and while we’re not sure what it’s made from, it bears a strong resemblance to certain high-temperature SLA resins, which cure into hard, glassy, off-yellow translucent prints like the ones seen here.

As for the controller, we get a good look at a deeply interesting assembly Sony calls the “adaptive trigger”. What’s so clever about it? Not only can it present the user with a variable amount of resistance when pulling the trigger, it can even actively push back against one’s finger. The mechanism is simple and effective, and pretty much the same as what’s in the PS5 controller, so to find out all about how it works, check out our PS5 controller teardown coverage.

The headset and controller teardown videos are embedded just below. Did anything in them catch your interest? Know of any other companies doing their own teardowns? Let us know in the comments!

Continue reading “Watch Sony Engineers Tear Down Sony’s VR Hardware”

Figure from the paper: a) schematic of the electrochromic navigation system, consisting of a GPS receiver module, an Arduino UNO as the processor, and the Prussian Blue display; b) the contact lens on a 3D-printed replica eyeball; c) the camera setup on a car’s dashboard; d) the drive scheme, in which the common pin sits at 0.2 V while each of the five working electrodes is switched between 0 V (off) and 0.7 V (on), for relative voltages of −0.2 V and 0.5 V. Scale bar is 2 mm.

Smart Contact Lenses Tell You Where To Go

Augmented Reality (AR) promises to relieve us from the boredom of mundane reality and can also help us navigate unfamiliar environments. Current AR tech leaves something to be desired, but researchers at the Korea Electrotechnology Research Institute have brought AR contact lenses closer to actual reality.

The researchers micro-printed FeFe(CN)6 ink onto the contact substrate and thermally reduced it at 120˚C for nine seconds to form Prussian Blue, an electrochromic pigment. Confining the material within the meniscus of the ink gave better resolution than previous techniques for displaying data on a contact lens. While the ability to reversibly switch from clear to blue faded after 200 cycles, the researchers were targeting a disposable type of smart contact lens, so degradation of the display wasn’t considered a deal breaker.

Since the applied voltages were constant, it seems this isn’t a true bi-stable display like e-ink, where power is only required to change states. Turning a segment on required 0.5 V, while off was −0.2 V. The researchers printed a contact with straight, left, and right arrows as well as STOP and GO indicators, and with the lens wired to a GPS-equipped Arduino Uno, they navigated between ten different checkpoints as a demonstration. Only a 3D-printed eyeball was brave enough (or had IRB approval) to wear the contact lens, so watching the state change through a macro lens attached to a smartphone camera had to do.
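The switching logic from the figure caption is simple enough to model in a few lines. Here’s a toy Python sketch of the drive scheme, with segment names taken from the article; the ordering and pin mapping are our own assumptions, not the researchers’ code:

```python
# Toy model of the drive scheme from the paper's figure caption: the common
# electrode sits at 0.2 V, and each of the five working electrodes is
# switched between 0 V (off) and 0.7 V (on). The pigment responds to the
# *relative* voltage: -0.2 V bleaches a segment, +0.5 V colors it.
V_COMMON = 0.2
V_OFF, V_ON = 0.0, 0.7

# Segment names per the article; the ordering is hypothetical.
SEGMENTS = ["straight", "left", "right", "stop", "go"]

def drive_voltages(active):
    """Applied and relative voltage for each segment when one is lit."""
    table = {}
    for seg in SEGMENTS:
        applied = V_ON if seg == active else V_OFF
        table[seg] = (applied, round(applied - V_COMMON, 2))
    return table

print(drive_voltages("left"))
# {'straight': (0.0, -0.2), 'left': (0.7, 0.5), 'right': (0.0, -0.2), ...}
```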

With more AR devices on the way, maybe it’s time to start embedding household objects with invisible QR codes or cleaning your workshop to get ready for your AR workbench.

The finished car, with Mecanum wheels and a green chassis topped by what appears to be a camera window.

2022 FPV Contest: ESP32-Powered FPV Car Uses Javascript For VR Magic

You don’t always need much to build an FPV rig – especially if you’re willing to take advantage of the power of modern smartphones. [joe57005] is showing off his VR FPV build: a small, fully printable car chassis riding on Mecanum wheels, equipped with an ESP32-CAM board serving a 720×720 video stream over WiFi. The car uses regular 9g servos to drive each wheel, giving it omnidirectional movement. An ESP32 and a single low-res camera might not sound like much if you’re aiming for a VR view, and indeed all the ESP32 does is stream the video feed over WebSockets – however, the simplicity is well-compensated for on the frontend. Continue reading “2022 FPV Contest: ESP32-Powered FPV Car Uses Javascript For VR Magic”
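We haven’t examined the project’s exact wire format, so treat this as a hypothetical sketch rather than [joe57005]’s actual API, but the receive side of such a stream can be surprisingly small. In Python with the websockets library, assuming one binary JPEG per message and a made-up endpoint, it might look like this:

```python
# Hypothetical receive-side sketch: the ESP32-CAM pushes JPEG frames over a
# WebSocket, and the client hands each frame to a renderer. The URL and the
# one-JPEG-per-message framing are assumptions, not the project's actual API.
# pip install websockets
import asyncio
import websockets

def handle_frame(jpeg_bytes):
    """Stand-in for decoding the JPEG and drawing it to the display."""
    print(f"got frame: {len(jpeg_bytes)} bytes")

async def stream_frames(url="ws://192.168.4.1/ws"):
    async with websockets.connect(url) as ws:
        while True:
            frame = await ws.recv()  # one binary JPEG per message
            handle_frame(frame)

asyncio.run(stream_frames())
```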

VR Sickness: A New, Old Problem

Have you ever experienced dizziness, vertigo, or nausea while in a virtual reality experience? That’s VR sickness, and it’s a form of motion sickness. It is not a completely solved problem, and it affects people differently, but it all comes from the same root cause, and there are better and worse ways of dealing with it.

If you’ve experienced a sudden onset of VR sickness, it was most likely triggered by flying, sliding, or some other kind of movement in VR that caused a strong and sudden feeling of vertigo or dizziness. Or perhaps it was not sudden, and was more like a vague unease that crept up, leaving you nauseated and unwell.

Just as with car sickness or sea sickness, sensitivity varies from person to person. But the reason it happens is no mystery; it all comes down to how the human body interprets and reacts to a particular kind of sensory mismatch.

Why Does It Happen?

The human body’s vestibular system is responsible for our sense of balance. It is in turn responsible for many boring, but important, tasks such as not falling over. To fulfill this responsibility, the brain interprets a mix of sensory information and uses it to build a sense of the body, its movements, and how it fits into the world around it.

These sensory inputs come from the inner ear, the body, and the eyes. Usually these inputs are in agreement, or they disagree so politely that the brain can confidently make a ruling and carry on without bothering anyone. But what if there is a nontrivial conflict between those inputs, and the brain cannot make sense of whether it is moving or not? For example, if the eyes say the body is moving, but the joints and muscles and inner ear disagree? The result of that kind of conflict is to feel sick.

Common symptoms are dizziness, nausea, sweating, headache, and vomiting. These messy symptoms are purposeful, for the human body’s response to this particular kind of sensory mismatch is to assume it has ingested something poisonous, and to go into a failure mode of “throw up, go lie down”. This is what is happening — to a greater or lesser degree — to those experiencing VR sickness.

Continue reading “VR Sickness: A New, Old Problem”

DIY Robotic Platform Aims To Solve Walking In VR

[Mark Dufour]’s TACO VR project is a robotic platform that mimics an omnidirectional treadmill, and aims to provide a compact, easily transportable way to walk naturally in VR.

Unenthusiastic about most solutions for allowing a user to walk in VR, [Mark] took a completely different approach. The result is a robotic platform that packs into a small footprint, with sides that fold up for transport; when packed up, it resembles a taco. When deployed, the idea is to have two disc-like platforms always stay under the user’s feet, keeping them in one place while they otherwise walk normally.

It’s an ambitious project, but [Mark] is up to the task, and the project’s GitHub repository has everything needed to stay up to date or get involved yourself. The hardware is mainly focused on functionality right now; certainly a fall or stumble while using the prototype looks like it would be at best uncomfortable, but the idea is innovative. Continue reading “DIY Robotic Platform Aims To Solve Walking In VR”

Simulating Temperature In VR Apps With Trigeminal Nerve Stimulation

Virtual reality systems are getting better and better all the time, but they remain largely ocular and auditory devices, with perhaps a little haptic feedback added in for good measure. That still leaves 40% of the five canonical senses out of the mix, unless of course this trigeminal nerve-stimulating VR accessory catches on.

While you may be tempted to dismiss this as simple “Smellovision”-style olfactory feedback, the work by [Jas Brooks], [Steven Nagels], and [Pedro Lopes] at the University of Chicago’s Human-Computer Integration Lab is intended to simulate the different thermal regimes a VR user might encounter in a simulation. True, the addition to an off-the-shelf Vive headset does waft chemicals into the wearer’s nose using three microfluidic pumps with vibrating mesh atomizers, but it’s the choice of chemicals and their target that makes this work. The stimulants used are odorless, so instead of triggering the olfactory bulb in the nose, they target the trigeminal nerve, which also innervates the lining of the nose and produces more systemic sensations, like the generalized hot feeling of chili peppers and the cooling power of mint. The headset leverages these sensations to change the perceived thermal regime in a simulation.

The video below shows the custom simulation developed for this experiment. In addition to capsaicin’s heat and eucalyptol’s cooling, the team added a third channel with 8-mercapto-p-menthan-3-one, an organic compound that’s intended to simulate the smoke from a generator that gets started in-game. The paper goes into great detail on the various receptors that can be stimulated and the different concoctions needed, and full build information is available in the GitHub repo. We’ll be watching this one with interest.
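The control problem, at least in outline, is just mapping simulation state onto three atomizer channels. As a back-of-the-envelope Python sketch (our own simplification, not the lab’s code), with a signed thermal value and an independent smoke channel:

```python
# Back-of-the-envelope sketch: map a signed "thermal" value from the
# simulation onto two atomizer channels (capsaicin for hot, eucalyptol for
# cold), plus an independent smoke channel. The interface is hypothetical.
def atomizer_duty(thermal, smoke=0.0):
    """thermal in [-1, 1]: negative = cold, positive = hot.
    Returns duty cycles (0..1) for the three mesh atomizers."""
    clamp = lambda v: max(0.0, min(1.0, v))
    return {
        "capsaicin": clamp(thermal),    # only drives when thermal > 0
        "eucalyptol": clamp(-thermal),  # only drives when thermal < 0
        "smoke": clamp(smoke),          # e.g. the in-game generator
    }

print(atomizer_duty(0.6))            # mostly capsaicin: feels hot
print(atomizer_duty(-0.4, smoke=1))  # eucalyptol plus full smoke channel
```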

Continue reading “Simulating Temperature In VR Apps With Trigeminal Nerve Stimulation”