Tactile Feedback In VR, No Cumbersome Gloves Or Motors Required

This clever research from the University of Chicago’s Human Computer Integration Lab demonstrates a fascinating way to let users “feel” objects in VR, without anything getting in the way of using one’s hands and fingers normally. Certainly, the picture here shows hands with a device attached to them, but look closely and you’ll see that it’s on the back of the hand only.

There’s hardware attached to the hands, yes, but only to the backs. Hands and fingers can be used entirely normally while receiving tactile feedback.

The unique device consists of a control box, wires, and electrodes attached to different spots on the back of the hand and fingers. Carefully modulated electrical signals create tactile sensations on the front, despite originating from electrodes on the back. While this has clear applications for VR, the team thinks the concept could also have applications in rehabilitation or prosthetics.
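For a sense of what “carefully modulated” can mean in this context, here is a minimal sketch of the charge-balanced biphasic pulse trains commonly used in electrotactile stimulation. The frequencies, widths, and amplitudes below are illustrative guesses, not values from the paper.

```python
import numpy as np

def biphasic_pulse_train(freq_hz=100, pulse_width_us=200, amplitude_ma=2.0,
                         duration_s=0.05, sample_rate=1_000_000):
    """Charge-balanced biphasic pulse train, the waveform family typically
    used for electrotactile stimulation (all parameter values illustrative)."""
    n = int(duration_s * sample_rate)
    signal = np.zeros(n)
    period = sample_rate // freq_hz                    # samples between pulses
    width = int(pulse_width_us * 1e-6 * sample_rate)   # samples per phase
    for start in range(0, n, period):
        signal[start:start + width] = amplitude_ma               # cathodic phase
        signal[start + width:start + 2 * width] = -amplitude_ma  # anodic phase
    return signal

# Perceived intensity is usually adjusted via pulse amplitude or frequency;
# the balanced positive/negative phases avoid net DC current through the skin.
wave = biphasic_pulse_train()
```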

Continue reading “Tactile Feedback In VR, No Cumbersome Gloves Or Motors Required”

Supercon 2022: Aedan Cullen Is Creating An AR System To Beat The Big Boys

There’s something very tantalizing about an augmented reality (AR) overlay that can provide information in daily life without having to glance at a smartphone display, even if it’s just for that sci-fi vibe. Creating a system that is both practical and useful is, however, far from easy, which is where Aedan Cullen‘s attempt at creating what he terms a ‘practical augmented reality device’ comes in.

In terms of requirements, this device would need to have a visual resolution comparable to that of a smartphone (50 pixels/degree) and a comparable field of view (20 degrees diagonal). User input would need to be as versatile as a touchscreen, but ‘faster’, along with a battery life of at least 8 hours, and all of this in a package weighing less than 50 grams.
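As a quick sanity check of what those numbers imply (the 4:3 aspect ratio below is our assumption, not part of the stated spec):

```python
# 50 pixels/degree across a 20-degree diagonal field of view implies a
# roughly 1000-pixel diagonal; at an assumed 4:3 aspect ratio that is
# about an 800 x 600 pixel microdisplay.
ppd = 50                                   # pixels per degree
fov_diag_deg = 20                          # diagonal field of view
diag_px = ppd * fov_diag_deg               # = 1000 px
w = diag_px * 4 / (4**2 + 3**2) ** 0.5     # ≈ 800 px
h = diag_px * 3 / (4**2 + 3**2) ** 0.5     # ≈ 600 px
print(f"~{w:.0f} x {h:.0f} px")
```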

Continue reading “Supercon 2022: Aedan Cullen Is Creating An AR System To Beat The Big Boys”

Blinks Are Useful In VR, But Triggering Blinks Is Tricky

In VR, a blink can be a window of opportunity to improve the user’s experience. We’ll explain how in a moment, but blinks are tough to capitalize on because they are unpredictable and don’t last very long. That’s why researchers spent time figuring out how to induce eye blinks on demand in VR (video), and the details are available in a full PDF report. Turns out there are some novel, VR-based ways to reliably induce blinks, and if an application can induce them, it becomes much easier to use them to fudge details in helpful ways.

It turns out that humans experience a form of change blindness during blinks, and this can be used to sneak small changes into a scene in useful ways. Two examples are hand redirection (HR), and redirected walking (RDW). Both are ways to subtly break the implicit one-to-one mapping of physical and virtual motions. Redirected walking can nudge a user to stay inside a physical boundary without realizing it, leading the user to feel the area is larger than it actually is. Hand redirection can be used to improve haptics and ergonomics. For example, VR experiences that use physical controls (like a steering wheel in a driving simulator, or maybe a starship simulator project like this one) rely on physical and virtual controls overlapping each other perfectly. Hand redirection can improve the process by covering up mismatches in a way that is imperceptible to the user.
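For the curious, here is a minimal sketch of the body-warping idea behind hand redirection, assuming tracked positions for the shoulder, the real hand, and the real and virtual targets. It illustrates the general technique, not the researchers’ exact formulation.

```python
import numpy as np

def redirect_hand(real_hand, real_target, virtual_target, shoulder):
    """Shift the rendered hand toward the virtual target in proportion to
    how far the real hand has traveled toward the real target, so the
    offset accumulates too gradually for the user to notice."""
    total = np.linalg.norm(real_target - shoulder)
    progress = np.clip(np.linalg.norm(real_hand - shoulder) / total, 0.0, 1.0)
    offset = (virtual_target - real_target) * progress
    return real_hand + offset  # position at which to render the virtual hand
```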

There are several known ways to induce a blink reflex, but it turns out that one novel method is particularly suited to implementation in VR: triggering the menace reflex by simulating a fast-approaching object. In VR, a small shadow appears in the field of view and rapidly seems to approach one’s eyes. This very brief event is hardly noticeable, yet reliably triggers a blink. There are other approaches as well, such as flashes, sudden noises, or simulating the gradual blurring of vision, but to be useful a method must be unobtrusive and reliable.
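A rough sketch of such a looming stimulus, with timing and sizes that are our guesses rather than the paper’s parameters: a dark disc anchored in the view scales up rapidly, as if flying at the eyes, then vanishes.

```python
def shadow_scale(t, duration_s=0.15, start_scale=0.01, end_scale=0.5):
    """Scale factor for the looming shadow at time t seconds into the
    stimulus; cubic easing makes the approach appear to accelerate."""
    if t >= duration_s:
        return None  # stimulus over: remove the disc and await the blink
    progress = t / duration_s
    return start_scale + (end_scale - start_scale) * progress ** 3
```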

We’ve already seen saccadic movement of the eyes used to implement redirected walking, but it turns out that leveraging eye blinks allows for even larger adjustments and changes to go unnoticed by the user. Who knew blinks could be so useful to exploit?

Continue reading “Blinks Are Useful In VR, But Triggering Blinks Is Tricky”

Mice Play In VR

Virtual Reality always seemed like a technology just out of reach, much like nuclear fusion, the flying car, or Linux on the desktop. It has been gaining steam in the last five years or so, though, with successful video games from a number of companies, and adjacent technology like augmented reality is picking up momentum as well. Another sign that this technology might be here to stay is this virtual reality headset made for mice. Continue reading “Mice Play In VR”


Immersive Virtual Reality From The Humble Webcam

[Russ Maschmeyer] and Spatial Commerce Projects developed WonkaVision to demonstrate how 3D eye tracking from a single webcam can support rendering a graphical virtual reality (VR) display with realistic depth and space. Spatial Commerce Projects is a Shopify lab working to provide concepts, prototypes, and tools to explore the crossroads of spatial computing and commerce.

The graphical output provides a real sense of depth and three-dimensional space using an optical illusion that reacts to the viewer’s eye position, which is used to render view-dependent images. The computer screen comes to feel like a window into a realistic 3D virtual space: objects beyond the window appear to have depth, while objects in front of the window appear to project out into the space ahead of the screen. The downside is that the illusion only works for one viewer at a time.
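The window illusion comes from rendering with an asymmetric view frustum anchored to the physical screen, rather than the usual fixed, symmetric one. Here is a sketch of that standard off-axis projection, assuming the screen is centered at the origin of the z = 0 plane and the tracked eye position is expressed in the same coordinates; it illustrates the general technique, not WonkaVision’s actual code.

```python
import numpy as np

def off_axis_projection(eye, screen_w, screen_h, near=0.1, far=100.0):
    """Asymmetric-frustum projection matrix for a head-coupled display.
    Pair it with a view matrix that translates the world by -eye."""
    ex, ey, ez = eye  # ez > 0: the eye is in front of the screen plane
    left   = (-screen_w / 2 - ex) * near / ez
    right  = ( screen_w / 2 - ex) * near / ez
    bottom = (-screen_h / 2 - ey) * near / ez
    top    = ( screen_h / 2 - ey) * near / ez
    return np.array([
        [2 * near / (right - left), 0, (right + left) / (right - left), 0],
        [0, 2 * near / (top - bottom), (top + bottom) / (top - bottom), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0],
    ])
```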

Eye tracking is performed using Google’s MediaPipe Iris library, which relies on the fact that the human iris diameter is almost exactly 11.7 mm in nearly everyone. The library’s computer vision algorithms use this geometric fact to efficiently locate and track irises with high accuracy.
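That known size is what turns a flat iris detection into a depth estimate: with a pinhole camera model, the eye’s distance falls out of the ratio between the real and apparent iris diameters. A minimal sketch, where focal_length_px would come from camera calibration:

```python
IRIS_DIAMETER_MM = 11.7  # near-constant across adults

def eye_distance_mm(iris_diameter_px, focal_length_px):
    """Pinhole-camera depth estimate from apparent iris size, a sketch of
    the geometry the library exploits."""
    return focal_length_px * IRIS_DIAMETER_MM / iris_diameter_px

# e.g. an iris spanning 24 px through a lens with a 1400 px focal length
# puts the eye roughly 680 mm from the webcam.
print(eye_distance_mm(24, 1400))  # ≈ 682.5
```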

Generation of view-dependent images based on tracking a viewer’s eye position was inspired by a classic hack from Johnny Lee to create a VR display using a Wiimote. Hopefully, these eye-tracking approaches will continue to evolve and provide improved motion-responsive views into immersive virtual spaces.

Inspect The RF Realm With Augmented Reality

Intellectually, we all know that we exist in a complex soup of RF energy. Cellular, WiFi, TV, public service radio, radar, ISM-band transmissions from everything from thermometers to garage door openers — it’s all around us. It would be great to see these transmissions, but alas, most of us don’t come from the factory with the correct equipment.

Luckily, aftermarket accessories like RadioFieldAR by [Manahiyo] make it possible to visualize RF signals. As the name suggests, this is an augmented reality system that lets you inspect the RF world around you. The core of the system is a tinySA, a pocket-sized spectrum analyzer that acts as a broadband receiver. A special antenna is connected to the tinySA; unfortunately, there are no specifics on the antenna other than that it needs a label with an image of the Earth attached to it, which the app uses to track the antenna’s position. The tinySA is connected to an Android phone — one that supports Google’s ARCore — by a USB OTG cable, and a special app on the phone runs the show.

By slowly moving the antenna around in the field of view of the phone’s camera, a heat map of signal strength at a particular frequency is slowly built up. The video below shows it in action, and the results are pretty cool. If you don’t have a tinySA, fear not — [Manahiyo] has a version of the app that supports a plain old RTL-SDR dongle too. That should make it easy for just about anyone to try this out.
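Conceptually, each AR frame only has to pair the tracked antenna pose from ARCore with the analyzer’s latest power reading and assign it a color. A toy sketch of that accumulation step, with a dBm-to-color range that is our guess rather than anything from [Manahiyo]’s app:

```python
def dbm_to_rgb(power_dbm, lo=-90.0, hi=-30.0):
    """Map a signal-strength reading onto a blue-to-red heat-map color."""
    t = min(max((power_dbm - lo) / (hi - lo), 0.0), 1.0)
    return (t, 0.0, 1.0 - t)  # red = strong, blue = weak

samples = []  # list of ((x, y, z), (r, g, b)) heat-map points

def add_sample(antenna_pos, power_dbm):
    """Record one heat-map point: tracked antenna position plus color."""
    samples.append((antenna_pos, dbm_to_rgb(power_dbm)))
```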

And if you’re feeling deja vu about this, you’re probably remembering [Manahiyo]’s VR spectrum analyzer, upon which this project is based.

Continue reading “Inspect The RF Realm With Augmented Reality”

Watch Sony Engineers Tear Down Sony’s VR Hardware

Teardowns are great because they let us not only peek at a product’s components, but also gain insight into the design decisions and implementation of the hardware. We’re used to waiting for enthusiasts and enterprising hackers to produce teardowns, so it came as a bit of a surprise to see Sony themselves share detailed teardowns of the new PlayStation VR2 hardware. (If you prefer the direct video links, Engineer [Takamasa Araki] shows off the headset, and [Takeshi Igarashi] does the same for the controllers.)

The “adaptive trigger” module responsible for the unique feedback.

One particularly intriguing detail is the custom tool [Araki] uses to hold the headset at various stages of the disassembly, which is visible in the picture above. It looks 3D-printed and carefully designed, and while we’re not sure what it’s made from, it does have a strong resemblance to certain high-temperature SLA resins. Those cure into hard, glassy, off-yellow translucent prints like what we see here.

As for the controller, we get a good look at a deeply interesting assembly Sony calls their “adaptive trigger”. What’s so clever about it? Not only can it cause the user to feel a variable amount of resistance when pulling the trigger, it can even actively push back against one’s finger, and the way it works is simple and effective. It is pretty much the same as what is in the PS5 controller, so to find out all about how it works, check out our PS5 controller teardown coverage.
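As a toy model of what such an effect might look like from the control side (purely our illustration; Sony’s actual scheme is detailed in that coverage): trigger position maps to an opposing motor torque, so resistance can ramp up past a threshold or even actively push back.

```python
def trigger_motor_torque(trigger_pos, resist_start=0.4, gain=1.0):
    """Hypothetical adaptive-trigger effect: free travel up to resist_start,
    then opposing torque grows with pull depth (trigger_pos in 0.0..1.0)."""
    if trigger_pos <= resist_start:
        return 0.0                              # no feedback in free travel
    return gain * (trigger_pos - resist_start)  # torque opposing the finger
```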

The headset and controller teardown videos are embedded just below. Did anything in them catch your interest? Know of any other companies doing their own teardowns? Let us know in the comments!

Continue reading “Watch Sony Engineers Tear Down Sony’s VR Hardware”