Webcam VR

Immersive Virtual Reality From The Humble Webcam

[Russ Maschmeyer] and Spatial Commerce Projects developed WonkaVision to demonstrate how 3D eye tracking from a single webcam can support rendering a graphical virtual reality (VR) display with realistic depth and space. Spatial Commerce Projects is a Shopify lab working to provide concepts, prototypes, and tools to explore the crossroads of spatial computing and commerce.

The graphical output conveys a convincing sense of depth and three-dimensional space using an optical illusion that reacts to the viewer’s eye position: the tracked eye position is used to render view-dependent images. The computer screen comes to feel like a window into a realistic 3D virtual space, where objects beyond the window appear to recede into the distance and objects in front of the window appear to project out into the space in front of the screen. The downside is that the illusion only works for one viewer at a time.
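The usual way to achieve this effect is an off-axis (asymmetric-frustum) projection: each frame, the frustum is rebuilt so that it passes through the physical screen as seen from the tracked eye. Below is a minimal Python sketch of that general technique; the screen dimensions, units, and names are illustrative assumptions, not WonkaVision’s actual code.

```python
# Hedged sketch: a standard off-axis ("window") projection driven by the
# tracked eye position. This is the generic technique, not WonkaVision's
# actual code; screen size, units, and names are assumptions.
import numpy as np

def window_projection(eye, screen_w=0.60, screen_h=0.34, near=0.05, far=100.0):
    """Build an OpenGL-style projection matrix that treats the physical
    screen (width/height in metres, centred on the z = 0 plane) as a
    window seen from `eye` = (x, y, z), with z > 0 in front of it."""
    ex, ey, ez = eye
    # Project the screen edges onto the near plane as seen from the eye.
    s = near / ez
    left   = (-screen_w / 2 - ex) * s
    right  = ( screen_w / 2 - ex) * s
    bottom = (-screen_h / 2 - ey) * s
    top    = ( screen_h / 2 - ey) * s
    # Standard asymmetric-frustum matrix.
    return np.array([
        [2 * near / (right - left), 0, (right + left) / (right - left), 0],
        [0, 2 * near / (top - bottom), (top + bottom) / (top - bottom), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0],
    ])

# The scene is then rendered with this projection after translating the
# camera to the eye position, so objects "behind" the screen recede and
# objects "in front" of it pop out -- but only for the tracked viewer.
```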

Eye tracking is performed using Google’s MediaPipe Iris library, which relies on the fact that the human iris is almost exactly 11.7 mm in diameter for nearly everyone. The library’s computer vision algorithms use this geometric constant to locate and track irises efficiently and with high accuracy.
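That constant is what makes single-camera depth estimation possible: once you know the iris’s true size and can measure its apparent size in pixels, the eye’s distance falls out of the pinhole-camera relationship Z ≈ f · D / d. Here is a hedged Python sketch using MediaPipe’s Face Mesh with iris refinement; the focal length and landmark indices are assumptions to check against your own camera and MediaPipe version.

```python
# Hedged sketch: estimating eye distance from a webcam using MediaPipe's
# Face Mesh with iris refinement (the iris landmarks the MediaPipe Iris
# work exposes). Focal length and landmark indices are assumptions.
import cv2
import numpy as np
import mediapipe as mp

IRIS_DIAMETER_MM = 11.7          # near-constant across adults
FOCAL_LENGTH_PX = 950.0          # assumption: calibrate for your webcam

face_mesh = mp.solutions.face_mesh.FaceMesh(refine_landmarks=True,
                                             max_num_faces=1)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        # With refine_landmarks=True, landmarks 468-472 and 473-477 are the
        # two irises (centre plus four boundary points each).
        pts = np.array([(lm[i].x * w, lm[i].y * h) for i in range(473, 478)])
        iris_px = pts[:, 0].max() - pts[:, 0].min()   # horizontal diameter
        if iris_px > 1:
            depth_mm = FOCAL_LENGTH_PX * IRIS_DIAMETER_MM / iris_px
            print(f"estimated eye distance: {depth_mm / 10:.1f} cm")
    cv2.imshow("iris depth", frame)
    if cv2.waitKey(1) == 27:      # Esc to quit
        break
```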

Generation of view-dependent images based on a viewer’s tracked eye position was inspired by a classic hack from [Johnny Lee] that used a Wiimote to create a VR display. Hopefully, these eye-tracking approaches will continue to evolve and provide ever more responsive views into immersive virtual spaces.

Cheap And Easy Motion Tracking

[Koppany Horvarth] set out to create a dirt-cheap optical tracking rig for VR that uses only two cameras and a fair amount of math to do its thing. He knew it could be done in theory and wouldn’t cost a lot of money, but it would still take a lot of work and a slightly absurd amount of math.

While playing around with a webcam he’d set up to run an object-tracking Python script, he discovered that his setup tended to render a translucent object with an LED inside it as pure, washed-out white. This gave [Koppany] the idea that he could use such a light as the marker for his object-tracking project. He 3D-printed hollow 50 mm spheres out of transparent PLA, illuminated by an LED and powered by a 5 V supply hacked from an old USB cable. After dealing with some lens flares, he sanded down the PLA a little to diffuse the light, and it worked like a charm.
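In broad strokes, the pipeline is: isolate the glowing sphere as the brightest blob in each camera image, take the blob centroids as 2D observations, and triangulate them into a 3D position using the two cameras’ calibration. Here’s a hedged OpenCV sketch of that general approach; the thresholds and projection matrices are placeholders, not [Koppany]’s actual code.

```python
# Hedged sketch of the general approach: find the glowing sphere as the
# brightest blob in each camera, then triangulate its 3D position.
import cv2
import numpy as np

def find_marker(frame, thresh=240):
    """Return the (x, y) centroid of the brightest blob, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

# P1 and P2 are 3x4 projection matrices from a prior stereo calibration
# (e.g. cv2.stereoCalibrate); the values here are placeholders.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])

cam1, cam2 = cv2.VideoCapture(0), cv2.VideoCapture(1)
while True:                       # Ctrl-C to stop
    ok1, f1 = cam1.read()
    ok2, f2 = cam2.read()
    if not (ok1 and ok2):
        break
    p1, p2 = find_marker(f1), find_marker(f2)
    if p1 and p2:
        pt4d = cv2.triangulatePoints(P1, P2,
                                     np.array(p1).reshape(2, 1),
                                     np.array(p2).reshape(2, 1))
        x, y, z = (pt4d[:3] / pt4d[3]).ravel()   # homogeneous -> 3D
        print(f"marker at ({x:.3f}, {y:.3f}, {z:.3f})")
```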

To learn more, check out his GitHub code repository. You can also take inspiration from some of the other motion-tracking posts we’ve published in the past, like motion tracking on the cheap with a PIC and this OpenCV Airsoft turret.

Stealth Peephole Camera Watches Your Front Door

In this week’s links post we mentioned an over-powered DSLR peephole that purportedly cost $4000. So when we saw this tip regarding a relatively inexpensive digital peephole, we thought some of you might be a bit more interested.

The hardware is quite simple: a decent webcam, a Raspberry Pi, and a powered USB hub. The camera gets stripped down to its PCB and hidden inside the door itself. Even from the inside, all you can see is an unremarkable-looking wire, which wouldn’t make most people think a camera was in use.

On the software side of things, [Alex] set up his Raspberry Pi as a 24/7 webcam server to stream the video online. Unlike a cheap wireless CCTV camera, his video signals are secure. He then runs Motion, a free software motion detector, to let the camera trigger events when someone comes sneaking by. It can be set up to send you a text, call you, play an alarm, take a picture, record a video… the list goes on. His blog has a full DIY guide if you want to replicate this system. We just hope you have a stronger door!
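For the event side of things, Motion can hand off to an arbitrary command when it detects movement via its on_event_start hook. Here’s a minimal sketch of such a hook script, with placeholder paths and mail settings rather than [Alex]’s actual configuration.

```python
# Hedged sketch: a tiny notification hook for Motion. In motion.conf you
# would point on_event_start at this script, e.g.:
#   on_event_start python3 /home/pi/notify.py
# The path and SMTP details below are placeholders, not [Alex]'s setup.
import smtplib
from datetime import datetime
from email.message import EmailMessage

SMTP_HOST = "smtp.example.com"      # placeholder mail server
FROM_ADDR = "peephole@example.com"  # placeholder addresses
TO_ADDR   = "you@example.com"

def notify():
    msg = EmailMessage()
    msg["Subject"] = "Someone is at the front door"
    msg["From"] = FROM_ADDR
    msg["To"] = TO_ADDR
    msg.set_content(f"Motion detected at {datetime.now():%Y-%m-%d %H:%M:%S}")
    with smtplib.SMTP(SMTP_HOST) as smtp:
        smtp.send_message(msg)

if __name__ == "__main__":
    notify()
```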

We covered a similar project back in 2011, but it made use of a full server instead of an inexpensive Raspberry Pi.

[Thanks Alex!]