Closing In On A PC-Enabled PSVR2

When the PlayStation VR2 headset was released, people wondered whether it could be made to work as a PC VR headset: plugged into a PC and used like any other, instead of working only on a PS5 as Sony intended.

Enthusiasts were initially skeptical and at times despondent about the prospects, but developer [iVRy]'s efforts have recently yielded a breakthrough. A PC-compatible VR2 is looking increasingly likely to happen.

So far, [iVRy] claims to have working 6 DoF SLAM (Simultaneous Localisation and Mapping), proximity sensor, and stereo camera data.

Most of the juicy bits are paywalled behind [iVRy]’s Patreon.  We’re hoping the jailbreak process will eventually be open-sourced.

The PS VR2 headset is quite unlike a PC VR headset in a number of ways, and Sony's products have historically not been easy to work with from a reverse-engineering perspective, whether one is trying to improve the user experience of an annoying headset or to understand the not-even-remotely-sanely-designed protocols behind the Sony Memory Stick. Getting the PS VR2 headset to do something it wasn't intended to do was always expected to be an uphill battle.

It’s not a finished job, but judging by the progress regularly shared on [iVRy]’s Twitter account, it might only be a matter of time.

Behold A Gallery Of Sony’s PS VR2 Prototypes

Every finished product stands at the end of a long line of prototypes, and Sony has recently shared an interview along with images of its PlayStation VR2 prototypes.

Many of the prototypes focus on a specific functionality, and readers who are not familiar with building things might find it a bit wild to see just how big and ungainly un-optimized hardware can be.

Finished product (bottom) contrasted with functionally identical prototype (top).

The images are definitely the best part of that link, but the interview has a few interesting bits. For example, one prototype was optimized for evaluating and testing camera placement with a high degree of accuracy, and it hardly looks like a VR headset at all.

The controllers, on the other hand, seem to have gone through more iterations driven by ergonomics and the physical layout of the controls. The VR2 controllers integrate the adaptive triggers from the PlayStation 5, a genuinely clever design capable of variable resistance as well as an active force feedback effect that's not quite like anything that's come before.

There’s a lot of work that goes into developing something like a VR headset, as we see here and as we’ve seen with Facebook’s (now Meta’s) VR research prototypes. But even when one can leverage pre-made modules as much as possible and doesn’t need to start entirely from scratch, making a VR headset remains a whole heap of work.

Behold A DIY VR Headset Its Creator Will “Never” Build Again

Unsatisfied with commercial VR headset options, [dragonskyrunner] did what any enterprising hacker would: gathered parts over time and ultimately made their own. Behold the Hades Widebody (HWD), a DIY PC VR headset that aims for a wide field of view and even manages to integrate some face and eye tracking.

The Fresnel elements hugging the primary lenses provide a way of extending the display into the wearer’s peripheral vision.

[dragonskyrunner] is — and we quote — “NEVER building one of these again.” The reason is easily relatable to anyone who has spent a lot of time and effort creating something special: it does the job it was created for, but it also has limitations and is a lot of work. If one were to do it all over again, there would be a host of improvements and changes to consider. But one won’t be doing it all over again any time soon because it’s done now.

The good news is that [dragonskyrunner] made an effort to document things, so there is at least a parts list and enough details for any suitably motivated hacker to replicate the work and perhaps even put their own spin on it.

The Hades Widebody has a dual-lens arrangement and wide displays that aim to provide a wider field of view than most setups allow. There’s a main lens in front of the user’s eyes and a cut Fresnel lens providing a sort of extension to the side. [dragonskyrunner] claims that while there is certainly not a seamless transition between the lens elements, it does a better job than an Ambilight at providing a sense of visuals extending into the wearer’s peripheral vision.

The DIY spirit of making a piece of hardware to suit one’s own needs is exactly the sort of thing that would fit into our 2023 Cyberdeck Contest, and while a headset by itself isn’t quite enough to qualify (devices must have some form of usable input and output), it just might get those creative juices flowing.

Enhance VR Immersion By Shoehorning An Ambilight Into A Headset

Everyone wants a wider field of view in their VR headsets, but that’s not an easy nut to crack. [Statonwest] shows there’s a way to get at least some of the immersion benefits with a bit of simple hardware thanks to the VR Ambilight.

Continue reading “Enhance VR Immersion By Shoehorning An Ambilight Into A Headset”

Simple Cubes Show Off AI-Driven Runtime Changes In VR

AR and VR developer [Skarredghost] got pretty excited about a virtual blue cube, and for a very good reason. It marked a successful prototype of an augmented reality experience in which the logic underlying the cube as a virtual object was changed by AI in response to verbal direction by the user. Saying “make it blue” did indeed turn the cube blue! (After a little thinking time, of course.)

It didn’t stop there, of course, and the blue cube proof-of-concept led to a number of simple demos: a row of cubes changing color from red to green in response to music volume, a bundle of cubes changing size in response to microphone volume, and cubes even starting to move around in space.

The program takes spoken input from the user, converts it to text, and sends it to a natural language AI model, which generates the necessary code modifications; those are then loaded into the environment to make runtime changes in Unity. The workflow is a bit cumbersome and highlights many of the challenges involved, but it works, and that's pretty nifty.
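The actual project does all of this inside Unity (see the GitHub repository linked below), but the shape of the loop can be sketched in a few lines. In this toy Python illustration the "scene" is just a dictionary and the language model is a canned lookup standing in for the speech-to-text and AI steps; everything here is invented for illustration and is not taken from [Skarredghost]'s code.

```python
# Toy sketch of the flow: instruction -> generated code -> runtime change.
# fake_language_model() is a canned stand-in for the real AI step.

scene = {"cube": {"color": "red", "scale": 1.0}}

def fake_language_model(instruction: str) -> str:
    """Stand-in for the AI step: return a snippet that implements the request."""
    canned = {
        "make it blue": "scene['cube']['color'] = 'blue'",
        "make it bigger": "scene['cube']['scale'] *= 2",
    }
    return canned.get(instruction.lower(), "pass")

def apply_runtime_change(instruction: str) -> None:
    code = fake_language_model(instruction)  # real flow: speech -> text -> LLM
    exec(code, {"scene": scene})             # hot-load the generated logic
    print(instruction, "->", scene["cube"])

apply_runtime_change("make it blue")   # make it blue -> {'color': 'blue', 'scale': 1.0}
```

Swapping the canned lookup for real speech recognition and a language model call, and swapping exec() for runtime compilation inside Unity, is exactly where the challenges mentioned above come in.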

The GitHub repository is here and a good demonstration video is embedded just under the page break. There’s also a video with a much more in-depth discussion of what’s going on and a frank exploration of the technical challenges.

If you’re interested in this direction, it seems [Skarredghost] has rounded up the relevant details. And if you have a prototype idea that isn’t necessarily AR or VR but would benefit from AI-assisted speech recognition that can run locally, this project has what you need.

Continue reading “Simple Cubes Show Off AI-Driven Runtime Changes In VR”

Jump Like Mario With This Weighted Wearable

Virtual reality has come a long way in the past decade, with commercial offerings for gaming platforms still going strong and a number of semi-virtual, or augmented, reality tools proving their worth outside of gaming as well. But for all this success, nobody has quite figured out methods of locomotion that feel natural, like walking or running. One research group is leaping to solve one of these issues with JumpMod: a wearable device that enhances the sensation of jumping.

The group, led by [Pedro Lopes] at the University of Chicago, uses a two-kilogram weight worn on the back to help provide the feeling of jumping or falling. By interfacing with the virtual reality environment, the weight can quickly move up or down its rail when the system detects that the wearer is about to commit to an action it thinks it can enhance. Wearers report feeling like they are jumping much higher, or even smashing into the ground harder. The backpack offers a compact and affordable alternative to the bulky and expensive hardware traditionally used for this purpose.
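The published control scheme isn't reproduced here, but the broad idea of a sliding-mass haptic can be sketched as a simple state machine: watch the tracked vertical velocity, and when a takeoff or landing looks imminent, rapidly command the weight along its rail so the wearer feels the reaction force of the accelerating mass. The thresholds, directions, and motor interface below are all assumptions for the sake of illustration, not JumpMod's actual controller.

```python
# Hypothetical sketch of a sliding-mass jump augmenter. All values and the
# slide_weight() interface are invented; this is not the JumpMod firmware.

TAKEOFF_VELOCITY = 0.8    # m/s upward; guessed threshold for "jump detected"
LANDING_VELOCITY = -0.8   # m/s downward; guessed threshold for "landing soon"

def update(vertical_velocity: float, slide_weight) -> None:
    """Run once per tracking frame. slide_weight(target) commands the 2 kg
    mass to a normalised position on its rail (0 = bottom, 1 = top); it is
    the rapid acceleration of the mass that the wearer actually feels."""
    if vertical_velocity > TAKEOFF_VELOCITY:
        slide_weight(0.0)   # throw the mass one way at takeoff...
    elif vertical_velocity < LANDING_VELOCITY:
        slide_weight(1.0)   # ...and the other way just before impact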

With builds like these, we hope the virtual reality worlds being created become even more immersive and believable. Of course, that means a lot more work on making other methods of movement in the virtual space feel natural (walking, to start with), but it’s an excellent piece of technology that shows real progress. Augmenting the virtual space doesn’t always need bulky hardware like this, though. Take a “look” at this device, which can build a believable virtual reality space using nothing more than a webcam.

Continue reading “Jump Like Mario With This Weighted Wearable”

Tactile Feedback In VR, No Cumbersome Gloves Or Motors Required

This clever research from the University of Chicago’s Human Computer Integration Lab demonstrates a fascinating way to let users “feel” objects in VR, without anything getting in the way of using one’s hands and fingers normally. Certainly, the picture here shows hands with a device attached to them, but look closely and you’ll see that it’s on the back of the hand only.

There’s hardware attached to the hands, yes, but only to the backs. Hands and fingers can be used entirely normally while receiving tactile feedback.

The unique device consists of a control box, wires, and some electrodes attached to different spots on the back of the hand and fingers. Carefully modulated electrical signals create tactile sensations on the front of the hand, despite originating from electrodes on the back. While this has clear applications for VR, the team thinks the concept could also be useful in rehabilitation or prosthetics.
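Electrotactile systems of this general kind typically deliver short, charge-balanced current pulses and vary their rate, width, and amplitude to change how the sensation feels. As a rough illustration only (the waveform shape and every parameter value below are generic assumptions, not the lab's stimulator design), such a pulse train might be built like this:

```python
# Generic, illustrative electrotactile pulse-train generator. Parameter values
# are made up; real stimulators also enforce strict safety limits on current.
import numpy as np

def biphasic_pulse_train(duration_s, rate_hz, width_us, amplitude_ma,
                         sample_rate_hz=100_000):
    """Charge-balanced biphasic train: each pulse is a positive phase followed
    by an equal negative phase, so no net charge is delivered to the skin."""
    n = int(duration_s * sample_rate_hz)
    waveform = np.zeros(n)
    period = int(sample_rate_hz / rate_hz)                  # samples between pulses
    width = max(1, int(width_us * 1e-6 * sample_rate_hz))   # samples per phase
    for start in range(0, n, period):
        waveform[start:start + width] = amplitude_ma               # positive phase
        waveform[start + width:start + 2 * width] = -amplitude_ma  # negative phase
    return waveform

# A 50 ms burst: 200 Hz pulse rate, 200 µs phases, 2 mA (all hypothetical numbers)
burst = biphasic_pulse_train(0.05, 200, 200, 2.0)
```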

Continue reading “Tactile Feedback In VR, No Cumbersome Gloves Or Motors Required”