Quest 3 VR Headset Can Capture 3D Video (Some Tampering Required)

The Quest 3 VR headset is an impressive piece of hardware. It is also not open, at least not in the way most of us understand the word. One consequence is that developers and users generally can’t directly access the feed from the two color cameras on the front of the headset. However, [Hugh Hou] shares a method of doing exactly that to capture 3D video on the Quest 3 for later playback on other devices.

The Quest 3 runs Android under the hood, and Developer Mode plus some ADB commands does the trick.

The process takes a few steps: enabling developer mode on the hardware, then using ADB (Android Debug Bridge) commands to unlock the necessary functionality. It’s nothing the average curious hacker can’t handle. The directions are written out in the video’s description, along with a few handy links. (The video is embedded below just under the page break, but view it on YouTube to access the description and all the info in it.)
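
For those who haven’t scripted ADB before, the general shape of the workflow looks something like the sketch below, driven from Python. The property name is a made-up placeholder; the actual commands [Hugh Hou] uses are the ones listed in his video description.

```python
# Minimal sketch of a generic ADB workflow from Python.
# "debug.placeholder.stereo_capture" is NOT a real Quest 3 property name;
# substitute the real commands from the video description.
import subprocess

def adb(*args: str) -> str:
    """Run an adb command against the connected headset and return its output."""
    result = subprocess.run(["adb", *args], capture_output=True, text=True, check=True)
    return result.stdout.strip()

if __name__ == "__main__":
    print(adb("devices"))  # confirm the headset shows up over USB (developer mode enabled)
    # Placeholder property name; the real commands are in the video description.
    adb("shell", "setprop", "debug.placeholder.stereo_capture", "1")
    print(adb("shell", "getprop", "debug.placeholder.stereo_capture"))
```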

He also provides some excellent guidance on practical matters: how to capture stable shots, edit the videos, and inject the metadata needed for optimal playback on different platforms, including hassle-free uploading to a service like YouTube. [Hugh] is no stranger to this kind of video and camera work and really knows his stuff, and it’s great to see someone provide detailed instructions.
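
On the metadata front, one common way to flag a side-by-side stereo file as 3D is the Matroska stereo_mode tag, which ffmpeg can write during a remux. The sketch below shows that general approach; it is not necessarily the tool [Hugh] uses, the filenames are placeholders, and YouTube’s current ingestion requirements are worth checking against his notes.

```python
# Hedged sketch: tag a side-by-side stereo file with Matroska stereo_mode metadata
# by remuxing it with ffmpeg (no re-encode). Filenames are placeholders.
import subprocess

def tag_side_by_side(src: str, dst: str) -> None:
    """Remux src into a Matroska container carrying left_right stereo metadata."""
    subprocess.run(
        ["ffmpeg", "-i", src,
         "-c", "copy",                          # copy streams as-is, just change container
         "-metadata", "stereo_mode=left_right",  # documented Matroska muxer tag
         dst],
        check=True,
    )

if __name__ == "__main__":
    tag_side_by_side("quest3_sbs.mp4", "quest3_sbs_tagged.mkv")
```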

This kind of 3D video comes down to recording two different views, one for each eye. There’s another way to approach 3D video, however: light fields are also within reach of enterprising hackers, and while they need more hardware they yield far more compelling results.

Continue reading “Quest 3 VR Headset Can Capture 3D Video (Some Tampering Required)”

A Look At Sega’s 8-Bit 3D Glasses

From around 2012 onwards, the entertainment industry saw a renaissance in 3D viewing and VR. The hardware has grown in popularity, even if it’s not yet mainstream. However, 3D tech goes back much further, as [Nicole] shows us with a look at Sega’s ancient 8-bit 3D glasses [via Adafruit].

[Nicole]’s pair of Sega shutter glasses are battered and bruised, but she notes more modern versions are available using the same basic idea. The technology is based on liquid-crystal shutters, one for each eye. By showing the left and right eyes different images, it’s possible to create a 3D-vision effect even with very limited display hardware.

The glasses can be plugged directly into a Japanese Sega Master System, which hails from the mid-1980s. The console sends AC signals to trigger the liquid-crystal shutters via a humble 3.5mm TRS jack. Games like Space Harrier 3D, which were written to use the glasses, effectively run at a half-speed refresh rate, because the 60 Hz NTSC or 50 Hz PAL refresh rate is split in half to serve each eye. Unfortunately, the glasses don’t work on modern LCD screens, as their inherent display lag throws off the timing of the pulses the console sends to the glasses.
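
The arithmetic behind that half-speed figure is simple enough to show in a few lines; this is just an illustrative calculation, not anything from [Nicole]’s write-up.

```python
# Alternating fields between the eyes halves the effective refresh rate each eye sees.
for standard, field_rate_hz in {"NTSC": 60.0, "PAL": 50.0}.items():
    per_eye_hz = field_rate_hz / 2               # each eye only gets every other field
    shutter_open_ms = 1000.0 / field_rate_hz     # each shutter stays open for one field
    print(f"{standard}: {per_eye_hz:.0f} Hz per eye, "
          f"shutter open ~{shutter_open_ms:.1f} ms at a time")
```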

It’s a neat look at an ancient bit of display tech that had a small resurgence with 3DTVs in the 2010s. By and large, it seems like humans just aren’t that into 3D, at least short of a full-VR experience. Meanwhile, if you’re wondering what 8-bit 3D looked like, we’ve got a 3D video (!) after the break.

Continue reading “A Look At Sega’s 8-Bit 3D Glasses”

A small actor on a giant table.

NERF – Neural Radiance Fields

Making narrative film just keeps getting easier. What once took a studio is now within reach of the dedicated hobbyist, and Neural Radiance Fields are another dramatic step in that direction. The guys from [Corridor Crew] give an early peek.

Filming and editing have reached the point where a cell phone and a laptop are enough. But sets, costumes, actors, lighting, and so on haven’t gotten substantially cheaper, and making your own short film is still a major project.

Enter 3D graphics. With a good gaming laptop, anybody can make a photorealistic scene in Blender and place live-action actors in it. But it takes a lot of skill and a lot of work. And often the scene you want already exists as a real place, but you can’t get permission to film there, or can’t haul actors, props, crew, and so on to the location.

A new technology, NERF (for “NEural Radiance Fields”), has decreased the headaches a lot. Instead of making a 3D model of the scene and using that to predict what reaches the camera, the software starts with video of the scene and machine-learns a “radiance field”: a model of how light is reflected by the scene.

Continue reading “NERF – Neural Radiance Fields”
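
For the curious, the core trick is compact enough to sketch in code. The toy model below illustrates the general NeRF idea, not [Corridor Crew]’s toolchain: a small network maps a 3D position to a color and a density, and a ray through the scene is rendered by volume integration. Real implementations also condition color on view direction, sample rays hierarchically, and train on many posed images.

```python
# Toy NeRF sketch (PyTorch): positions -> (color, density), rendered along a ray.
import torch
import torch.nn as nn

def positional_encoding(x: torch.Tensor, n_freqs: int = 6) -> torch.Tensor:
    """Map coordinates to sin/cos features so the MLP can represent fine detail."""
    feats = [x]
    for i in range(n_freqs):
        feats.append(torch.sin((2.0 ** i) * x))
        feats.append(torch.cos((2.0 ** i) * x))
    return torch.cat(feats, dim=-1)

class TinyNeRF(nn.Module):
    def __init__(self, n_freqs: int = 6, hidden: int = 128):
        super().__init__()
        self.n_freqs = n_freqs
        in_dim = 3 * (1 + 2 * n_freqs)           # encoded (x, y, z)
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 4),                # RGB + density
        )

    def forward(self, points: torch.Tensor):
        out = self.mlp(positional_encoding(points, self.n_freqs))
        rgb = torch.sigmoid(out[..., :3])        # colors in [0, 1]
        sigma = torch.relu(out[..., 3])          # non-negative density
        return rgb, sigma

def render_ray(model, origin, direction, near=0.0, far=4.0, n_samples=64):
    """Volume-render one ray: accumulate color weighted by per-sample opacity."""
    t = torch.linspace(near, far, n_samples)
    points = origin + t[:, None] * direction           # (n_samples, 3) sample positions
    rgb, sigma = model(points)
    delta = (far - near) / n_samples
    alpha = 1.0 - torch.exp(-sigma * delta)             # opacity of each sample
    survive = torch.cat([torch.ones(1), 1.0 - alpha + 1e-10])[:-1]
    trans = torch.cumprod(survive, dim=0)               # light surviving to each sample
    weights = alpha * trans
    return (weights[:, None] * rgb).sum(dim=0)           # final pixel color

if __name__ == "__main__":
    model = TinyNeRF()
    color = render_ray(model, torch.zeros(3), torch.tensor([0.0, 0.0, 1.0]))
    print(color)  # untrained network, so this is just noise
```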

Light Fields: Missing Ingredient For Immersive 3D Video Gets Improved

46 time-synchronized action cameras make up the guts of the capture device.

3D video content has a significant limitation, one that is not trivial to solve. Video captured by a camera — even one with high resolution and a very wide field of view — still records a scene as a flat plane, from a fixed point of view. The limitation this brings will be familiar to anyone who has watched a 3D video (or “360 video”) in VR and moved their head the wrong way. In these videos one is free to look around, but not to change the position of one’s head in the process. Put another way, pivoting one’s head to look up, down, left, or right is fine; moving one’s head higher, lower, closer, farther, or to the side is not. Natural movements like trying to peek over an object, or shifting slightly to the side for a better view, simply do not work.

Light field video changes that. It is captured using a device like the one in the image above, and Google has a resource page giving an excellent overview of what light field video is, what it can look like, and how they are doing it. That link covers recent improvements to their camera apparatus as well as to video encoding and rendering, but serves as a great show-and-tell of what light fields are and what they can do.
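
To make the idea concrete, here is a toy sketch of why capturing dozens of closely spaced views helps: a renderer can synthesize an image for wherever your eye actually is by blending the nearest captured viewpoints. This is not Google’s pipeline, and the rig geometry and images below are random stand-ins.

```python
# Toy light-field-style view synthesis: blend the captured views nearest the eye.
import numpy as np

rng = np.random.default_rng(0)
camera_positions = rng.uniform(-0.5, 0.5, size=(46, 3))   # stand-in rig geometry (meters)
camera_images = rng.random((46, 120, 160, 3))               # stand-in captured frames

def render_for_eye(eye_pos: np.ndarray, k: int = 3) -> np.ndarray:
    """Blend the k captured views closest to the eye, weighted by inverse distance."""
    dists = np.linalg.norm(camera_positions - eye_pos, axis=1)
    nearest = np.argsort(dists)[:k]
    weights = 1.0 / (dists[nearest] + 1e-6)
    weights /= weights.sum()
    return np.tensordot(weights, camera_images[nearest], axes=1)

if __name__ == "__main__":
    frame = render_for_eye(np.array([0.03, -0.01, 0.0]))    # a slight head shift
    print(frame.shape)  # (120, 160, 3): a synthesized view for that eye position
```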

Continue reading “Light Fields: Missing Ingredient For Immersive 3D Video Gets Improved”

How To Record 3D Video In 3D

The legalities of doing something like this aside, this concept from [MadSci labs] gives some insight into how one would go about recording a 3D movie in 3D. Probably many of you have wondered if this could be done, but they took it one step further and actually built a device capable of doing just that.

[MadSci labs]’ solution involved taking some 3D glasses home from a theater, cutting them to size, and taping them to an HTC EVO 3D phone. Each lens was taped over a different camera lens, separating out the left-eye and right-eye views needed to produce a stereoscopic image. Their experiment was successful, though with some loss of quality. Because of this, we’re not expecting to see a lot of in-theater movies pirated this way, but given a more professional-quality build, you never know what will happen.
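
The same basic idea is easy to play with in software today. The sketch below is a hedged, generic example using OpenCV, with two webcams standing in for the EVO 3D’s stereo pair; the device indices and output settings are assumptions. It grabs two feeds and packs them into a single side-by-side stereoscopic frame.

```python
# Pack two camera feeds into one side-by-side stereo video with OpenCV.
import cv2

left_cam = cv2.VideoCapture(0)    # left-eye camera (assumed device index)
right_cam = cv2.VideoCapture(1)   # right-eye camera (assumed device index)

writer = None
while True:
    ok_l, left = left_cam.read()
    ok_r, right = right_cam.read()
    if not (ok_l and ok_r):
        break
    right = cv2.resize(right, (left.shape[1], left.shape[0]))  # match frame sizes
    sbs = cv2.hconcat([left, right])                            # side-by-side stereo frame
    if writer is None:
        fourcc = cv2.VideoWriter_fourcc(*"mp4v")
        writer = cv2.VideoWriter("stereo_sbs.mp4", fourcc, 30.0,
                                 (sbs.shape[1], sbs.shape[0]))
    writer.write(sbs)
    cv2.imshow("side-by-side preview", sbs)
    if cv2.waitKey(1) & 0xFF == ord("q"):       # press q to stop recording
        break

left_cam.release()
right_cam.release()
if writer is not None:
    writer.release()
cv2.destroyAllWindows()
```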

You can see the “results” of their experiment after the break. While it’s not in 3D, it should still give you an idea of what is going on.

Continue reading “How To Record 3D Video In 3D”