Virtual Reality always seemed like a technology just out of reach, much like nuclear fusion, the flying car, or Linux on the desktop. It has been gaining steam in the last five years or so, though, with successful video games from a number of companies as well as plenty of adjacent technologies, like augmented reality, picking up momentum of their own. Another sign that this technology might be here to stay is this virtual reality headset made for mice. Continue reading “Mice Play In VR”
Immersive Virtual Reality From The Humble Webcam
[Russ Maschmeyer] and Spatial Commerce Projects developed WonkaVision to demonstrate how 3D eye tracking from a single webcam can support rendering a graphical virtual reality (VR) display with realistic depth and space. Spatial Commerce Projects is a Shopify lab working to provide concepts, prototypes, and tools to explore the crossroads of spatial computing and commerce.
The graphical output provides a real sense of depth and three-dimensional space using an optical illusion that reacts to the viewer’s eye position: the tracked position is used to render view-dependent images, making the computer screen feel like a window into a realistic 3D virtual space. Objects beyond the window appear to recede into the distance, while objects before the window appear to project out into the space in front of the screen. The downside is that the illusion only works for one viewer at a time.
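The write-up doesn’t spell out the rendering math, but this kind of window illusion is classically achieved with an off-axis (asymmetric) projection frustum recomputed every frame from the tracked eye position. Here’s a minimal sketch of the standard generalized perspective projection in Python with NumPy; the screen dimensions and eye position are placeholder values, not figures from WonkaVision:

```python
import numpy as np

def off_axis_projection(eye, pa, pb, pc, near=0.01, far=100.0):
    """Asymmetric-frustum projection matrix for a screen with corners
    pa (lower-left), pb (lower-right), pc (upper-left), viewed from
    the tracked eye position `eye` (all in meters, screen space)."""
    vr = pb - pa; vr /= np.linalg.norm(vr)            # screen right axis
    vu = pc - pa; vu /= np.linalg.norm(vu)            # screen up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # screen normal

    va, vb, vc = pa - eye, pb - eye, pc - eye         # eye to corners
    d = -np.dot(va, vn)                               # eye-screen distance

    # Frustum extents projected onto the near plane
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard OpenGL-style frustum matrix; the view matrix must also
    # translate the eye to the origin for the illusion to line up.
    return np.array([
        [2*near/(r - l), 0, (r + l)/(r - l), 0],
        [0, 2*near/(t - b), (t + b)/(t - b), 0],
        [0, 0, -(far + near)/(far - near), -2*far*near/(far - near)],
        [0, 0, -1, 0],
    ])

# Example: a 0.60 m x 0.34 m monitor, eye 0.5 m away and a bit to the right
pa = np.array([-0.30, -0.17, 0.0])
pb = np.array([ 0.30, -0.17, 0.0])
pc = np.array([-0.30,  0.17, 0.0])
print(off_axis_projection(np.array([0.10, 0.0, 0.5]), pa, pb, pc))
```

As the eye moves, the frustum skews so that the screen plane stays fixed in the virtual world, which is exactly what sells the window effect.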
Eye tracking is performed using Google’s MediaPipe Iris library, which relies on the fact that the human iris has a remarkably consistent diameter of roughly 11.7 mm across the population. The library’s computer vision algorithms use this geometrical fact to locate and track irises efficiently and with high accuracy.
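To get a feel for the trick, here’s a rough sketch of estimating eye distance from apparent iris size using MediaPipe’s FaceMesh in Python. The iris landmark indices and the focal-length guess are our assumptions for illustration, not details from WonkaVision; a real setup would calibrate the camera properly:

```python
import cv2
import mediapipe as mp

IRIS_DIAMETER_MM = 11.7  # near-constant across the human population

# refine_landmarks=True adds the ten iris landmarks (indices 468-477)
face_mesh = mp.solutions.face_mesh.FaceMesh(
    max_num_faces=1, refine_landmarks=True)

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        # 469 and 471 are roughly the horizontal edges of one iris
        iris_px = abs(lm[469].x - lm[471].x) * w
        focal_px = w  # crude focal length guess, in pixels
        if iris_px > 0:
            depth_mm = focal_px * IRIS_DIAMETER_MM / iris_px
            print(f"eye is roughly {depth_mm / 10:.1f} cm from the camera")
    if cv2.waitKey(1) == 27:  # Esc to quit
        break
cap.release()
```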
Generation of view-dependent images based on tracking a viewer’s eye position was inspired by a classic hack from Johnny Lee to create a VR display using a Wiimote. Hopefully, these eye-tracking approaches will continue to evolve and provide improved motion-responsive views into immersive virtual spaces.
Inspect The RF Realm With Augmented Reality
Intellectually, we all know that we exist in a complex soup of RF energy. Cellular, WiFi, TV, public service radio, radar, ISM-band transmissions from everything from thermometers to garage door openers — it’s all around us. It would be great to see these transmissions, but alas, most of us don’t come from the factory with the correct equipment.
Luckily, aftermarket accessories like RadioFieldAR by [Manahiyo] make it possible to visualize RF signals. As the name suggests, this is an augmented reality system that lets you inspect the RF world around you. The core of the system is a tinySA, a pocket-sized spectrum analyzer that acts as a broadband receiver. A special antenna is connected to the tinySA; unfortunately, there are no specifics on the antenna itself, other than that it needs a label with an image of the Earth attached so the app can track the antenna’s position. The tinySA is connected to an Android phone — one that supports Google’s ARCore — by a USB OTG cable, and a special app on the phone runs the show.
By slowly moving the antenna around in the field of view of the phone’s camera, a heat map of signal strength at a particular frequency is slowly built up. The video below shows it in action, and the results are pretty cool. If you don’t have a tinySA, fear not — [Manahiyo] has a version of the app that supports a plain old RTL-SDR dongle too. That should make it easy for just about anyone to try this out.
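If you want to experiment with the core measurement behind the RTL-SDR version, a minimal sketch using the pyrtlsdr package might look like the following. The frequency is just an example, and the AR pose tracking that RadioFieldAR layers on top is left out entirely:

```python
import numpy as np
from rtlsdr import RtlSdr  # pip install pyrtlsdr

def power_dbfs(sdr, n_samples=65536):
    """Average power of the currently tuned slice, in dB full scale."""
    samples = sdr.read_samples(n_samples)
    return 10 * np.log10(np.mean(np.abs(samples) ** 2))

sdr = RtlSdr()
sdr.sample_rate = 2.048e6   # Hz
sdr.center_freq = 433.92e6  # ISM band; pick your frequency of interest
sdr.gain = 'auto'

# RadioFieldAR keys each reading to the antenna's tracked position to
# build its heat map; here we simply log the readings over time.
for _ in range(10):
    print(f"{power_dbfs(sdr):+.1f} dBFS at {sdr.center_freq / 1e6:.2f} MHz")
sdr.close()
```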
And if you’re feeling deja vu about this, you’re probably remembering [Manahiyo]’s earlier VR spectrum analyzer, upon which this project is based.
Continue reading “Inspect The RF Realm With Augmented Reality”
Watch Sony Engineers Tear Down Sony’s VR Hardware
Teardowns are great because they let us peek not only at a product’s components, but also gain insight into the design decisions and implementations of hardware. We’re used to waiting for enthusiasts and enterprising hackers to create them, so it came as a bit of a surprise to see Sony themselves share detailed teardowns of the new PlayStation VR2 hardware. (If you prefer the direct video links, engineer [Takamasa Araki] shows off the headset, and [Takeshi Igarashi] does the same for the controllers.)
One particularly intriguing detail is the custom tool [Araki] uses to hold the headset at various stages of the disassembly, which is visible in the picture above. It looks 3D-printed and carefully designed, and while we’re not sure what it’s made from, it does have a strong resemblance to certain high-temperature SLA resins. Those cure into hard, glassy, off-yellow translucent prints like what we see here.
As for the controller, we get a good look at a deeply interesting assembly Sony calls their “adaptive trigger”. What’s so clever about it? Not only can it make the user feel a variable amount of resistance when pulling the trigger, it can even actively push back against one’s finger. The mechanism is simple and effective, and it is pretty much the same as the one in the PS5 controller, so to find out all about how it works, check out our PS5 controller teardown coverage.
The headset and controller teardown videos are embedded just below. Did anything in them catch your interest? Know of any other companies doing their own teardowns? Let us know in the comments!
Continue reading “Watch Sony Engineers Tear Down Sony’s VR Hardware”
Smart Contact Lenses Tell You Where To Go
Augmented Reality (AR) promises to relieve us from the boredom of mundane reality and can also help you navigate unfamiliar environments. Current AR tech leaves something to be desired, but researchers at the Korea Electrotechnology Research Institute have brought AR contact lenses closer to actual reality.
The researchers micro-printed FeFe(CN)6 ink onto the contact substrate and thermally reduced it at 120˚C for nine seconds to form Prussian Blue, an electrochromic pigment. Confining the material within the ink’s meniscus yielded better resolution than previous techniques for displaying data on contact lenses. While the ability to reversibly switch between clear and blue faded after 200 cycles, the researchers were targeting a disposable smart contact lens, so degradation of the display wasn’t considered a deal breaker.
Since the applied voltages were held constant, this doesn’t appear to be a true bi-stable display like e-ink, where power is only required to change states: turning a section on required 0.5 V, while turning it off required -0.2 V. The researchers printed a contact lens with straight, left, and right arrows as well as STOP and GO commands, connected it to a GPS-equipped Arduino Uno, and used it to navigate between ten different checkpoints as a demonstration. Only a 3D-printed eyeball was brave enough (or had IRB approval) to wear the contact lens, so watching the state change through a macro lens attached to a smartphone camera had to do.
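The navigation logic itself is simple enough to sketch. The demo ran on an Arduino Uno, but the same idea expressed in Python looks something like this; the heading threshold and coordinates are placeholder assumptions on our part, while the segment drive voltages come from the research:

```python
import math

ON_VOLTS, OFF_VOLTS = 0.5, -0.2  # electrochromic segment drive levels
SEGMENTS = ["STRAIGHT", "LEFT", "RIGHT", "STOP", "GO"]

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

def pick_segment(heading, lat, lon, target):
    """Light the arrow that steers the wearer toward the checkpoint."""
    error = (bearing_deg(lat, lon, *target) - heading + 540) % 360 - 180
    if abs(error) < 15:  # within 15 degrees: close enough to straight
        return "STRAIGHT"
    return "LEFT" if error < 0 else "RIGHT"

def drive_voltages(active):
    """Hold every segment at -0.2 V except the active one at +0.5 V."""
    return {seg: ON_VOLTS if seg == active else OFF_VOLTS
            for seg in SEGMENTS}

# Heading due north with the checkpoint to the east lights the RIGHT arrow
print(drive_voltages(pick_segment(0.0, 35.0, 129.0, (35.0, 129.01))))
```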
With more AR devices on the way, maybe it’s time to start embedding household objects with invisible QR codes or cleaning your workshop to get ready for your AR workbench.
2022 FPV Contest: ESP32-Powered FPV Car Uses Javascript For VR Magic
You don’t always need much to build an FPV rig – especially if you’re willing to take advantage of the power of modern smartphones. [joe57005] is showing off his VR FPV build – a small, fully-printable car chassis with Mecanum wheels, equipped with an ESP32-CAM board serving a 720×720 video stream over WiFi. The car uses regular 9g servos to drive each wheel, giving you omnidirectional movement wherever you want to go. An ESP32 CPU and a single low-res camera might not sound like much for a VR view, and indeed, all the ESP32 does is stream the video feed over WebSockets – the simplicity is well compensated for on the frontend. Continue reading “2022 FPV Contest: ESP32-Powered FPV Car Uses Javascript For VR Magic”
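The real frontend is Javascript running in the phone’s browser, but the receiving side is easy to prototype on a desktop, too. Here’s a minimal Python sketch of a client displaying JPEG frames pulled off a WebSocket; the endpoint URL and the one-frame-per-binary-message format are assumptions on our part:

```python
import asyncio
import cv2
import numpy as np
import websockets  # pip install websockets opencv-python numpy

WS_URL = "ws://192.168.4.1/ws"  # hypothetical ESP32-CAM endpoint

async def view_stream():
    async with websockets.connect(WS_URL, max_size=None) as ws:
        while True:
            message = await ws.recv()
            if isinstance(message, str):
                continue  # skip any text control messages
            img = cv2.imdecode(
                np.frombuffer(message, dtype=np.uint8), cv2.IMREAD_COLOR)
            if img is None:
                continue  # not a decodable JPEG frame
            cv2.imshow("ESP32-CAM", img)
            if cv2.waitKey(1) == 27:  # Esc to quit
                break

asyncio.run(view_stream())
```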
VR Sickness: A New, Old Problem
Have you ever experienced dizziness, vertigo, or nausea while in a virtual reality experience? That’s VR sickness, and it’s a form of motion sickness. It is not a completely solved problem, and it affects people differently, but it all comes from the same root cause, and there are better and worse ways of dealing with it.
If you’ve experienced a sudden onset of VR sickness, it was most likely triggered by flying, sliding, or some other kind of movement in VR that caused a strong and sudden feeling of vertigo or dizziness. Or perhaps it was not sudden, and was more like a vague unease that crept up, leaving you nauseated and unwell.
Just like car sickness or sea sickness, people are differently sensitive. But the reason it happens is not a mystery; it all comes down to how the human body interprets and reacts to a particular kind of sensory mismatch.
Why Does It Happen?
The human body’s vestibular system is responsible for our sense of balance. It is in turn responsible for many boring, but important, tasks such as not falling over. To fulfill this responsibility, the brain interprets a mix of sensory information and uses it to build a sense of the body, its movements, and how it fits into the world around it.
These sensory inputs come from the inner ear, the body, and the eyes. Usually these inputs are in agreement, or they disagree so politely that the brain can confidently make a ruling and carry on without bothering anyone. But what if there is a nontrivial conflict between those inputs, and the brain cannot make sense of whether it is moving or not? For example, if the eyes say the body is moving, but the joints and muscles and inner ear disagree? The result of that kind of conflict is to feel sick.
Common symptoms are dizziness, nausea, sweating, headache, and vomiting. These messy symptoms are purposeful: the human body’s response to this particular kind of sensory mismatch is to assume it has ingested something poisonous, and to go into a failure mode of “throw up, go lie down”. This is what happens, to a greater or lesser degree, to those experiencing VR sickness.