Blinks Are Useful In VR, But Triggering Blinks Is Tricky

In VR, a blink can be a window of opportunity to improve the user’s experience. We’ll explain how in a moment, but blinks are tough to capitalize on because they are unpredictable and don’t last very long. That’s why researchers spent time figuring out how to induce eye blinks on demand in VR (video); the details are available in a full PDF report. Turns out there are some novel, VR-based ways to reliably induce blinks, and if an application can trigger them on demand, it becomes much easier to use them to fudge details in helpful ways.

It turns out that humans experience a form of change blindness during blinks, and this can be used to sneak small changes into a scene in useful ways. Two examples are hand redirection (HR), and redirected walking (RDW). Both are ways to subtly break the implicit one-to-one mapping of physical and virtual motions. Redirected walking can nudge a user to stay inside a physical boundary without realizing it, leading the user to feel the area is larger than it actually is. Hand redirection can be used to improve haptics and ergonomics. For example, VR experiences that use physical controls (like a steering wheel in a driving simulator, or maybe a starship simulator project like this one) rely on physical and virtual controls overlapping each other perfectly. Hand redirection can improve the process by covering up mismatches in a way that is imperceptible to the user.
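Neither technique needs much code at its core. Assuming the headset’s eye tracker exposes a per-frame eyes-closed flag, a blink-gated redirection loop might look something like the rough sketch below; the names and thresholds are ours, purely for illustration, not values from the research.

```python
import numpy as np

# Illustrative limit; real per-blink thresholds come from perceptual studies.
MAX_SHIFT_PER_BLINK = 0.05  # metres of virtual offset hidden in a single blink

class BlinkRedirector:
    """Accumulates a virtual-world offset, but only changes it while the
    user's eyes are closed, exploiting change blindness during blinks."""

    def __init__(self, goal_offset):
        self.goal = np.asarray(goal_offset, dtype=float)  # where we want to end up
        self.applied = np.zeros(3)                        # offset applied so far

    def update(self, eyes_closed: bool) -> np.ndarray:
        if eyes_closed:
            remaining = self.goal - self.applied
            dist = np.linalg.norm(remaining)
            if dist > 1e-6:
                step = min(dist, MAX_SHIFT_PER_BLINK)
                # (A real implementation would step once per blink,
                # not once per eyes-closed frame.)
                self.applied += remaining / dist * step
        # The renderer adds this offset to the camera (redirected walking) or
        # to the virtual hand (hand redirection) every frame; since it only
        # ever changes mid-blink, the user shouldn't notice the jump.
        return self.applied

# Example: sneak the virtual world 30 cm to the right over a few blinks.
redirector = BlinkRedirector(goal_offset=[0.3, 0.0, 0.0])
```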

There are several known ways to induce a blink reflex, but it turns out that one novel method is particularly well suited to VR: triggering the menace reflex by simulating a fast-approaching object. In VR, a small shadow appears in the field of view and rapidly seems to approach one’s eyes. This very brief event is hardly noticeable, yet reliably triggers a blink. There are other approaches as well, such as flashes, sudden noises, or simulating a gradual blurring of vision, but to be useful a method must be both unobtrusive and reliable.
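The stimulus itself is just a dark shape whose angular size balloons over a fraction of a second, as if something were on a collision course with your face. Here is a minimal sketch of that looming timeline; the distance, speed, and duration are our guesses rather than values from the paper.

```python
import math

def looming_angular_size(t, start_dist=2.0, speed=15.0, radius=0.05):
    """Angular diameter (radians) of a disc of the given radius (metres)
    that starts start_dist metres from the eye and closes in at speed m/s."""
    dist = max(start_dist - speed * t, 1e-3)  # clamp so it never passes the eye
    return 2.0 * math.atan(radius / dist)

# Over roughly 130 ms the disc swells from a barely-visible speck to most of
# the visual field: fast enough to trip the menace reflex, brief enough that
# most users never consciously register it.
for ms in range(0, 140, 10):
    theta = math.degrees(looming_angular_size(ms / 1000.0))
    print(f"{ms:3d} ms: {theta:5.1f} deg")
```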

We’ve already seen saccadic movement of the eyes used to implement redirected walking, but it turns out that leveraging eye blinks allows for even larger adjustments and changes to go unnoticed by the user. Who knew blinks could be so useful to exploit?

Continue reading “Blinks Are Useful In VR, But Triggering Blinks Is Tricky”

Mice Play In VR

Virtual Reality always seemed like a technology just out of reach, much like nuclear fusion, the flying car, or Linux on the desktop. It has been gaining steam in the last five years or so, though, with successful video games from a number of companies and plenty of VR-adjacent technology, like augmented reality, coming along for the ride. Another sign that this technology might be here to stay is this virtual reality headset made for mice. Continue reading “Mice Play In VR”

Immersive Virtual Reality From The Humble Webcam

[Russ Maschmeyer] and Spatial Commerce Projects developed WonkaVision to demonstrate how 3D eye tracking from a single webcam can support rendering a graphical virtual reality (VR) display with realistic depth and space. Spatial Commerce Projects is a Shopify lab working to provide concepts, prototypes, and tools to explore the crossroads of spatial computing and commerce.

The graphical output provides a real sense of depth and three-dimensional space using an optical illusion that reacts to the viewer’s eye position: the tracked eye position is used to render view-dependent images. The computer screen is made to feel like a window into a 3D virtual space, where objects beyond the window appear to recede into the distance and objects in front of the window appear to project out into the space in front of the screen. The downside is that the illusion only works for one viewer at a time.
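The write-up doesn’t ship the renderer’s source as far as we know, but the underlying trick is the classic off-axis (asymmetric-frustum) projection: the camera frustum’s apex is pinned to the tracked eye position while its base stays pinned to the physical screen. Here’s a rough sketch of that math, our own and not [Russ Maschmeyer]’s code:

```python
import numpy as np

def window_projection(eye, screen_w, screen_h, near=0.01, far=100.0):
    """Off-axis frustum that treats the physical screen as a window.
    `eye` is the viewer's eye position in metres relative to the screen
    centre, with +z pointing out of the screen toward the viewer (ez > 0)."""
    ex, ey, ez = eye
    # Project the screen edges onto the near plane as seen from the eye.
    left   = (-screen_w / 2 - ex) * near / ez
    right  = ( screen_w / 2 - ex) * near / ez
    bottom = (-screen_h / 2 - ey) * near / ez
    top    = ( screen_h / 2 - ey) * near / ez
    # Standard OpenGL-style frustum matrix built from those bounds.
    proj = np.array([
        [2 * near / (right - left), 0, (right + left) / (right - left), 0],
        [0, 2 * near / (top - bottom), (top + bottom) / (top - bottom), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0],
    ])
    # The scene must also be rendered from the eye's point of view.
    view = np.eye(4)
    view[:3, 3] = [-ex, -ey, -ez]
    return proj @ view

# Example: eye 60 cm from a 60 cm x 34 cm monitor, 5 cm right of centre.
mvp = window_projection(eye=(0.05, 0.0, 0.60), screen_w=0.60, screen_h=0.34)
```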

Eye tracking is performed using Google’s MediaPipe Iris library, which relies on the fact that the iris diameter of the human eye is almost exactly 11.7 mm for nearly all humans. Because that physical size is essentially constant, the library’s computer vision algorithms can locate the irises and estimate how far the eye is from the camera based on the iris’s apparent size in the image, with no depth sensor required.
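That constant is what makes single-camera depth estimation possible: once you know the iris’s apparent size in pixels and the camera’s focal length in pixels, the pinhole model hands you the distance. Here’s a back-of-the-envelope sketch using MediaPipe’s face mesh with iris refinement; the 11.7 mm figure and the landmark indices come from MediaPipe, while the focal length and the rest of the plumbing are assumptions you’d calibrate yourself:

```python
import math
import cv2
import mediapipe as mp

IRIS_DIAMETER_MM = 11.7            # near-constant across adult humans
IRIS_RING = [469, 470, 471, 472]   # boundary landmarks of one iris (refine_landmarks=True)

face_mesh = mp.solutions.face_mesh.FaceMesh(refine_landmarks=True)

def eye_distance_mm(frame_bgr, focal_px):
    """Estimate camera-to-eye distance via the pinhole model:
    real size * focal length (px) / apparent size (px)."""
    h, w = frame_bgr.shape[:2]
    results = face_mesh.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if not results.multi_face_landmarks:
        return None  # no face found in this frame
    lm = results.multi_face_landmarks[0].landmark
    pts = [(lm[i].x * w, lm[i].y * h) for i in IRIS_RING]
    # Apparent iris diameter: the widest spread across the boundary landmarks.
    iris_px = max(math.dist(a, b) for a in pts for b in pts)
    return IRIS_DIAMETER_MM * focal_px / iris_px
```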

Generation of view-dependent images based on tracking a viewer’s eye position was inspired by a classic hack from Johnny Lee to create a VR display using a Wiimote. Hopefully, these eye-tracking approaches will continue to evolve and provide improved motion-responsive views into immersive virtual spaces.

Inspect The RF Realm With Augmented Reality

Intellectually, we all know that we exist in a complex soup of RF energy. Cellular, WiFi, TV, public service radio, radar, ISM-band transmissions from everything from thermometers to garage door openers — it’s all around us. It would be great to see these transmissions, but alas, most of us don’t come from the factory with the correct equipment.

Luckily, aftermarket accessories like RadioFieldAR by [Manahiyo] make it possible to visualize RF signals. As the name suggests, this is an augmented reality system that lets you inspect the RF world around you. The core of the system is a tinySA, a pocket-sized spectrum analyzer that acts as a broadband receiver. A special antenna is connected to the tinySA; unfortunately there are no specifics on the antenna itself, other than that it needs a label with an image of the Earth attached to it, which the app uses as a visual marker to track the antenna’s position. The tinySA is connected to an Android phone — one that supports Google’s ARCore — by a USB OTG cable, and a special app on the phone runs the show.

By slowly moving the antenna around in the field of view of the phone’s camera, a heat map of signal strength at a particular frequency is slowly built up. The video below shows it in action, and the results are pretty cool. If you don’t have a tinySA, fear not — [Manahiyo] has a version of the app that supports a plain old RTL-SDR dongle too. That should make it easy for just about anyone to try this out.
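The clever part is the AR tracking; the heat map itself boils down to pairing the antenna’s tracked position with the latest reading from the analyzer and splatting it into a grid, frame after frame. Here’s a simplified sketch of that accumulation, our own illustration rather than [Manahiyo]’s code:

```python
import numpy as np

class RFHeatmap:
    """Accumulates signal-strength samples into a coarse voxel grid that can
    be colour-mapped and drawn over the camera view."""

    def __init__(self, size_m=2.0, resolution_m=0.05):
        n = int(size_m / resolution_m)
        self.res = resolution_m
        self.total_dbm = np.zeros((n, n, n))
        self.count = np.zeros((n, n, n), dtype=int)

    def add_sample(self, antenna_pos_m, level_dbm):
        """antenna_pos_m: tracked (x, y, z) of the antenna marker, in metres,
        assumed to lie inside the mapped volume's positive octant."""
        idx = tuple(int(c / self.res) for c in antenna_pos_m)
        if any(i < 0 or i >= n for i, n in zip(idx, self.total_dbm.shape)):
            return  # outside the mapped volume
        self.total_dbm[idx] += level_dbm
        self.count[idx] += 1

    def average(self):
        """Per-voxel mean level in dBm; NaN where nothing has been measured yet."""
        with np.errstate(invalid="ignore", divide="ignore"):
            return np.where(self.count > 0, self.total_dbm / self.count, np.nan)
```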

And if you’re feeling deja vu about this, you’re probably remembering [Manahiyo]’s VR spectrum analyzer, upon which this project is based.

Continue reading “Inspect The RF Realm With Augmented Reality”

Watch Sony Engineers Tear Down Sony’s VR Hardware

Teardowns are great because they let us not only peek at a product’s components, but also gain insight into the design decisions and implementations of hardware. We’re used to waiting until enthusiasts and enterprising hackers create teardowns, so it came as a bit of a surprise to see Sony themselves share detailed teardowns of the new PlayStation VR2 hardware. (If you prefer the direct video links, engineer [Takamasa Araki] shows off the headset, and [Takeshi Igarashi] does the same for the controllers.)

The “adaptive trigger” module responsible for the unique feedback.

One particularly intriguing detail is the custom tool [Araki] uses to hold the headset at various stages of the disassembly, which is visible in the picture above. It looks 3D-printed and carefully designed, and while we’re not sure what it’s made from, it does have a strong resemblance to certain high-temperature SLA resins. Those cure into hard, glassy, off-yellow translucent prints like what we see here.

As for the controller, we get a good look at a deeply interesting assembly Sony calls their “adaptive trigger”. What’s so clever about it? Not only can it cause the user to feel a variable amount of resistance when pulling the trigger, it can even actively push back against one’s finger, and the way it works is simple and effective. It is pretty much the same as what is in the PS5 controller, so to find out all about how it works, check out our PS5 controller teardown coverage.

The headset and controller teardown videos are embedded just below. Did anything in them catch your interest? Know of any other companies doing their own teardowns? Let us know in the comments!

Continue reading “Watch Sony Engineers Tear Down Sony’s VR Hardware”

Figure from the paper: a) schematic of the electrochromic navigation system integrated with a smart contact lens, consisting of a GPS receiver module, an Arduino UNO as the processor, and a Prussian Blue (PB) display; b) photograph of the contact lens placed on a 3D-printed replica eyeball; c) camera setup of the navigation system on the dashboard of a car; d) driving schemes for updating the direction signal, with 0.2 V applied to the common pin while the five working electrodes alternate between 0 V (off) and 0.7 V (on), giving relative voltages of −0.2 V and 0.5 V. Scale bar is 2 mm.

Smart Contact Lenses Tell You Where To Go

Augmented Reality (AR) promises to relieve us from the boredom of mundane reality and can also help you navigate unfamiliar environments. Current AR tech leaves something to be desired, but researchers at the Korea Electrotechnology Research Institute have brought AR contact lenses closer to actual reality.

The researchers micro-printed FeFe(CN)6 ink onto the contact lens substrate and thermally reduced it at 120 °C for nine seconds to form Prussian Blue, an electrochromic pigment. Because the pigment stays confined within the meniscus of the printed ink, the resolution is better than previous techniques for displaying data on contact lenses. While the ability to reversibly switch from clear to blue faded after 200 cycles, the researchers were targeting a disposable smart contact lens, so degradation of the display wasn’t considered a deal breaker.

Since the applied voltages were constant, it seems this isn’t a true bi-stable display like e-ink, where power is only required to change states. Switching a segment on required 0.5 V, while switching it off required −0.2 V. The researchers printed a contact lens with straight, left, and right arrows as well as STOP and GO indicators. Connected to a GPS-equipped Arduino Uno, the lens was used to navigate between ten different checkpoints as a demonstration. Only a 3D-printed eyeball was brave enough (or had IRB approval) to wear the contact lens, so watching the state changes through a macro lens attached to a smartphone camera had to do.
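The firmware in the paper runs on the Arduino Uno, but the decision logic is simple enough to sketch out in Python: compare the bearing to the next checkpoint against the current heading and pick one of the five segments to switch on. The thresholds and the GO-versus-straight split below are our own guesses, purely for illustration:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def pick_segment(heading_deg, target_bearing_deg, dist_m):
    """Choose which electrochromic segment to switch on (illustrative logic)."""
    if dist_m < 5:                    # arrived at the checkpoint
        return "STOP"
    err = (target_bearing_deg - heading_deg + 540) % 360 - 180  # signed error, -180..180
    if abs(err) < 20:                 # roughly on course
        return "GO" if dist_m > 100 else "STRAIGHT"
    return "RIGHT" if err > 0 else "LEFT"

# Example: heading due north with the next checkpoint off to the north-east
# lights the RIGHT arrow.
print(pick_segment(0, bearing_deg(37.0000, 127.0000, 37.0010, 127.0010), dist_m=150))
```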

With more AR devices on the way, maybe it’s time to start embedding household objects with invisible QR codes or cleaning your workshop to get ready for your AR workbench.

The end result: a car with Mecanum wheels and a green chassis, with what seems to be a camera window on top.

2022 FPV Contest: ESP32-Powered FPV Car Uses Javascript For VR Magic
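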

You don’t always need much to build an FPV rig – especially if you’re willing to take advantage of the power of modern smartphones. [joe57005] is showing off his VR FPV build – a small, fully-printable car chassis riding on Mecanum wheels, equipped with an ESP32-CAM board serving a 720×720 video stream over WiFi. The car uses regular 9g servos to drive each wheel, giving you omnidirectional movement in whichever direction you want to go. An ESP32 and a single low-res camera might not sound like much if you’re aiming for a VR view, and all the ESP32 does is stream the video feed over WebSockets – however, the simplicity is well-compensated for on the frontend. Continue reading “2022 FPV Contest: ESP32-Powered FPV Car Uses Javascript For VR Magic”