UEVR Project Converts Games To VR, Whether They Like It Or Not

UEVR, or the Universal Unreal Engine VR Mod, by [praydog] is made possible by some pretty neat software tricks. It leverages reverse engineering concepts and advanced game hacking techniques to add VR support, including motion controls, to applicable Unreal Engine games.

The UEVR project is a real-world application of various ideas and concepts, and the results are impressive. Not only can it easily make a game render in VR, it also handles managing the player’s perspective (there are options for attaching the camera view to game objects, for example) and sensibly maps inputs from VR controllers to whatever the game is expecting. This isn’t the first piece of software that attempts to convert flatscreen software to VR, but it’s by far the most impressive.
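As a rough illustration of one half of the problem, a mod rendering a flatscreen game in VR has to split the game’s single camera into two eye views. Here is a minimal sketch of that idea; the function names and the fixed IPD value are our own assumptions for illustration, not UEVR’s actual code:

```python
# Hedged sketch: derive per-eye camera positions from a game's single
# flatscreen camera by offsetting along the camera's local right axis.
# Values and names are illustrative, not taken from UEVR.

DEFAULT_IPD = 0.064  # assumed average interpupillary distance, metres


def per_eye_offsets(camera_pos, right_vector, ipd=DEFAULT_IPD):
    """Return (left_eye, right_eye) positions as 3-tuples.

    Each eye is shifted half the IPD from the game camera along the
    camera's local right vector (assumed to be unit length).
    """
    half = ipd / 2.0
    left = tuple(c - half * r for c, r in zip(camera_pos, right_vector))
    right = tuple(c + half * r for c, r in zip(camera_pos, right_vector))
    return left, right
```

Each eye’s view is then rendered from its own offset position, which is what produces the stereo depth effect; a real mod also has to adjust the projection matrices and do this per frame from the headset’s tracked pose.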

There is an in-depth discussion of the techniques used to sensibly and effectively locate and manipulate game elements, not for nefarious purposes, but to enable impressive on-demand VR mods in a semi-automated manner. (Although naturally, some anti-cheat software considers this to be nefarious.)

Many of the most interesting innovations in VR rely on some form of modding, from magic in Skyrim that depends on your actual state of mind to adding DIY eye tracking to headsets in a surprisingly effective, modular, and low-cost way. As usual, to find cutting-edge experimentation, look to the modding community.

See Some Of The Stranger VR Ideas From SIGGRAPH

[Devin Coldewey] shared his experiences with some of the more unusual VR concepts on display at SIGGRAPH 2023. Some of these ideas are pretty interesting in their own right, and even if they aren’t going to actually become commercial products, they give some insight into the kinds of problems that are being worked on. Read on to see if anything sparks ideas of your own.

In the area of haptics and physical feedback, Sony shared research prototypes that look like short batons with movable weights hidden inside. These weights can shift up or down on demand, altering the baton’s center of gravity. [Devin] states that these units had a mild effect on their own, but when combined with VR visuals the result was impressive. There’s a video demonstration of how they work. Continue reading “See Some Of The Stranger VR Ideas From SIGGRAPH”

Beautifully Rebuilding A VR Headset To Add AR Features

[PyottDesign] recently wrapped up a personal project to create himself a custom AR/VR headset: one that could function as an AR (augmented reality) platform, make it easier to develop new applications, and do everything he needed in a single unit. He succeeded wonderfully, and published a video showcase of the finished project.

Getting a headset with the features he wanted wasn’t possible off the shelf, so he accomplished his goals with a skillful custom repackaging of a Quest 2 VR headset, integrating a Stereolabs Zed Mini stereo camera (aimed at mixed reality applications) and an Ultraleap IR 170 hand tracking module. These hardware modules have tons of software support and are not very big, but when sticking something onto a human face, every millimeter and gram counts.

Continue reading “Beautifully Rebuilding A VR Headset To Add AR Features”

Closing In On A PC Enabled PSVR2

When the PlayStation VR2 headset was released, people wondered whether it would be possible to use it as a PC VR headset. That would mean plugging it into a PC and using it there, instead of it working only on a PS5 as Sony intended.

Enthusiasts were initially skeptical and at times despondent about the prospects, but developer [iVRy]’s efforts recently yielded a breakthrough. A PC-compatible VR2 is looking more likely to happen.

So far [iVRy] is claiming they have 6 DOF SLAM (Simultaneous Localisation and Mapping), proximity sensor, and stereo camera data.

Most of the juicy bits are paywalled behind [iVRy]’s Patreon. We’re hoping the jailbreak process will eventually be open-sourced.

The PS VR2 headset is quite unlike a PC VR headset in a number of ways, and it has not been historically easy to work with Sony’s products from a reverse-engineering perspective, whether it’s an attempt to improve the user experience of an annoying headset, or an attempt to understand the not-even-remotely-sanely-designed protocols behind the Sony Memory Stick. Getting the PS VR2 headset to work in a way it wasn’t intended was expected to be an uphill battle.

It’s not a finished job, but judging by the progress regularly shared on [iVRy]’s Twitter account, it might only be a matter of time.

Behold A Gallery Of Sony’s PS VR2 Prototypes

Every finished product stands at the end of a long line of prototypes, and Sony has recently shared an interview and images of its PlayStation VR2 prototypes.

Many of the prototypes focus on a specific piece of functionality, and readers who are not familiar with building things might find it a bit wild to see just how big and ungainly unoptimized hardware can be.

Finished product (bottom) contrasted with functionally-identical prototype (top).

The images are definitely the best part of that link, but the interview has a few interesting bits. For example, one prototype was optimized for evaluating and testing camera placement with a high degree of accuracy, and it hardly looks like a VR headset at all.

The controllers, on the other hand, seem to have gone through more iterations based on the ergonomics and physical layout of controls. The VR2 controllers integrate the adaptive triggers from the PlayStation 5, which are of a genuinely clever design capable of variable resistance as well as an active force feedback effect that’s not quite like anything that’s come before.

There’s a lot of work that goes into developing something like a VR headset, as we see here and as we’ve seen with Facebook’s (now Meta’s) VR research prototypes. But even when one can leverage pre-made modules as much as possible and doesn’t need to start entirely from scratch, making a VR headset remains a whole heap of work.

Jump Like Mario With This Weighted Wearable

Virtual reality has come a long way in the past decade, with successful commercial offerings for gaming platforms still going strong, as well as a number of semi-virtual, or augmented, reality tools proving their worth outside of a gaming environment. But with all this success, they still haven’t quite figured out methods of locomotion that feel natural, like walking or running. One research group is leaping to solve one of these issues with JumpMod: a wearable device that enhances the sensation of jumping.

The group, led by [Pedro Lopes] at the University of Chicago, uses a two-kilogram weight worn on the back to help provide the feeling of jumping or falling. By interfacing it with the virtual reality environment, the weight can quickly move up or down its rails when it detects that the wearer is about to commit to an action that it thinks it can enhance. Wearers report feeling like they are jumping much higher, or even smashing into the ground harder. The backpack offers a compact and affordable alternative to the bulky and expensive hardware traditionally used for this purpose.
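The control idea behind a device like this can be sketched in a few lines. Everything below (event names, directions, the mapping itself) is our own assumption for illustration, not the group’s actual firmware:

```python
# Hedged sketch of a JumpMod-style control rule: when the VR system
# predicts a jump or landing, slide the back-worn mass along its rail
# so the reaction force amplifies the sensation. Names and logic are
# illustrative assumptions, not the research group's implementation.


def weight_command(event, vertical_velocity):
    """Map a predicted event to a rail direction for the worn mass.

    Accelerating the mass downward pushes back up on the wearer
    (boosting a take-off); accelerating it upward during a landing
    adds to the felt impact.
    """
    if event == "takeoff" and vertical_velocity > 0:
        return "down"  # reaction force adds apparent upward push
    if event == "landing" and vertical_velocity < 0:
        return "up"    # reaction force adds apparent impact
    return "hold"      # no enhanceable action predicted
```

The interesting part, per the paper, is the prediction: the weight has to start moving just before the physical action so the force lands at the right moment.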

With builds like these, we would hope the virtual reality worlds being created become even more immersive and believable. Of course, that means a lot more work on making other methods of movement in the virtual space feel believable (like walking, to start with), but it’s an excellent piece of technology that shows some progress. Augmenting the virtual space doesn’t always need bulky hardware like this, though. Take a “look” at this device which can build a believable virtual reality space using nothing more than a webcam.

Continue reading “Jump Like Mario With This Weighted Wearable”

Blinks Are Useful In VR, But Triggering Blinks Is Tricky

In VR, a blink can be a window of opportunity to improve the user’s experience. We’ll explain how in a moment, but blinks are tough to capitalize on because they are unpredictable and don’t last very long. That’s why researchers spent time figuring out how to induce eye blinks on demand in VR (video), and the details are available in a full PDF report. It turns out there are some novel, VR-based ways to reliably induce blinks. If an application can induce them, it becomes easier to use them to fudge details in helpful ways.

It turns out that humans experience a form of change blindness during blinks, and this can be used to sneak small changes into a scene in useful ways. Two examples are hand redirection (HR), and redirected walking (RDW). Both are ways to subtly break the implicit one-to-one mapping of physical and virtual motions. Redirected walking can nudge a user to stay inside a physical boundary without realizing it, leading the user to feel the area is larger than it actually is. Hand redirection can be used to improve haptics and ergonomics. For example, VR experiences that use physical controls (like a steering wheel in a driving simulator, or maybe a starship simulator project like this one) rely on physical and virtual controls overlapping each other perfectly. Hand redirection can improve the process by covering up mismatches in a way that is imperceptible to the user.
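To make the hand redirection idea concrete, here is a minimal sketch of the usual warping approach: the rendered hand is gradually offset from the physical hand so that reaching the virtual target lands the real hand on the real control. The linear ramp and all names are our own illustration, not taken from the paper:

```python
# Hedged sketch of hand redirection (HR): blend a small warp offset
# into the rendered hand position over the course of a reach, so the
# physical hand arrives at the real control while the virtual hand
# appears to arrive at the (slightly mismatched) virtual one.
# This is a generic illustration, not the paper's implementation.


def redirect_hand(physical, real_target, virtual_target, progress):
    """Return the rendered (warped) hand position as a 3-tuple.

    progress: 0.0 at the start of the reach, 1.0 on arrival, so the
    offset ramps in gradually and stays below perceptual thresholds.
    """
    offset = tuple(v - r for v, r in zip(virtual_target, real_target))
    return tuple(p + progress * o for p, o in zip(physical, offset))
```

The point of the blink research is that during a blink the `progress` ramp (or the whole offset) can be stepped by a larger amount than would otherwise go unnoticed.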

There are several known ways to induce a blink reflex, but it turns out that one novel method is particularly suited to VR: triggering the menace reflex by simulating a fast-approaching object. In VR, a small shadow appears in the field of view and rapidly seems to approach one’s eyes. This very brief event is hardly noticeable, yet reliably triggers a blink. There are other approaches as well, such as flashes, sudden noises, or simulating the gradual blurring of vision, but to be useful a method must be unobtrusive and reliable.

We’ve already seen saccadic movement of the eyes used to implement redirected walking, but it turns out that leveraging eye blinks allows for even larger adjustments and changes to go unnoticed by the user. Who knew blinks could be so useful to exploit?
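A blink-gated redirected walking step might look something like the sketch below: inject a small, imperceptible scene yaw every frame, and a much larger one while the eyes are closed. The per-state rotation rates here are illustrative placeholders, not the thresholds measured in the study:

```python
# Hedged sketch of blink-gated redirected walking (RDW): rotate the
# virtual scene slowly while the user's eyes are open, and much faster
# during a detected blink, when change blindness hides the jump.
# The rates below are illustrative assumptions, not measured values.


def rdw_yaw_step(is_blinking, dt, open_rate=1.0, blink_rate=9.0):
    """Degrees of scene yaw to inject for a frame of duration dt.

    open_rate:  assumed imperceptible rotation rate, deg/s, eyes open
    blink_rate: assumed rate usable during a blink, deg/s
    """
    rate = blink_rate if is_blinking else open_rate
    return rate * dt
```

Accumulating these per-frame rotations steers the user’s physical path while their virtual path appears straight, which is how RDW keeps people inside a play space that feels larger than it is.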

Continue reading “Blinks Are Useful In VR, But Triggering Blinks Is Tricky”