[PyottDesign] recently wrapped up a personal project to build himself a custom AR/VR headset: one that could serve as an AR (augmented reality) platform and make it easier to develop new applications, all in a single headset that does everything he needs. He succeeded wonderfully, and published a video showcase of the finished project.
A headset with the features he wanted simply wasn't available off the shelf, so he accomplished his goals with a skillful custom repackaging of a Quest 2 VR headset, integrating a Stereolabs Zed Mini stereo camera (aimed at mixed reality applications) and an Ultraleap IR 170 hand tracking module. These hardware modules have tons of software support and are not very big, but when sticking something onto a human face, every millimeter and gram counts.
We have to say, [Murtaza]’s example game in his latest video isn’t very exciting. However, the OpenCV technique he uses to track a hand and determine its distance from a single camera is pretty interesting. The demo shows a button at a random spot on the screen, and you have to use your hand to press it; the button then moves so you can try again. The hand measurement seems accurate to within a few centimeters, which is good enough for many applications.
The Python code is actually quite straightforward. Essentially, the software tracks your hand and estimates how far away it is from the hand’s apparent size in the frame. Of course, your hand might also rotate, and [Murtaza] works through all the cases step-by-step. If we wanted to know a distance, we’d probably turn to ultrasonics or a time-of-flight sensor. The problem is, those sensors can’t tell your hand from anything else that happens to be in front of them. The use of a single camera to track and locate is pretty impressive.
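To give a feel for the approach, here’s a minimal sketch of the same idea using OpenCV and MediaPipe hand landmarks. The landmark pair, calibration table, and polynomial fit below are illustrative assumptions rather than a copy of [Murtaza]’s code; you’d collect your own calibration points at known distances.

```python
import cv2
import mediapipe as mp
import numpy as np

# Illustrative calibration: pixel widths of the palm (landmarks 5 and 17)
# measured at known distances, then fitted with a 2nd-degree polynomial.
# These numbers are placeholders -- collect your own with a ruler.
pixel_widths = [300, 245, 200, 170, 145, 130, 112, 103, 93, 87, 80, 75, 70]
distances_cm = [20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75, 80]
coeffs = np.polyfit(pixel_widths, distances_cm, 2)

hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    h, w, _ = frame.shape
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        lm = results.multi_hand_landmarks[0].landmark
        # Landmarks 5 (index MCP) and 17 (pinky MCP) span the palm, so their
        # separation shrinks predictably as the hand moves away and is fairly
        # insensitive to finger pose.
        x1, y1 = lm[5].x * w, lm[5].y * h
        x2, y2 = lm[17].x * w, lm[17].y * h
        width_px = np.hypot(x2 - x1, y2 - y1)
        dist_cm = np.polyval(coeffs, width_px)
        cv2.putText(frame, f"{dist_cm:.0f} cm", (int(x1), int(y1) - 20),
                    cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    cv2.imshow("hand distance", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```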
If you haven’t used OpenCV before, the channel has a lot of tutorials and they are all worth watching. Computer vision is a powerful technique and can stand in for other sensors in some applications. GPS, for example. Or, try this creepier tracking application next Halloween.
Unimpressed by DIY wearables powered by dinky microcontrollers, [Teemu Laurila] has been working on a 3D printed head-mounted computer that puts a full-fledged Linux desktop in your field of view. It might not be as slim and ergonomic as Google Glass, but it more than makes up for it in terms of raw potential.
Featuring an overclocked Raspberry Pi Zero W, an ST7789VW 240×240 IPS display running at 60 Hz, and a front-mounted camera, the wearable makes a great low-cost platform for augmented reality experiments. [Teemu] has already put together an impressive hand tracking demonstration that can pick out the position of all ten fingers in near real-time. The processing has to be done on his desktop computer, as the Zero isn’t quite up to the task, but as you can see in the video below, the whole thing works pretty well.
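That split (camera on the headset, heavy lifting on a PC) is easy to prototype yourself. Below is a rough sketch that assumes the Pi is serving an MJPEG stream (for example with mjpg-streamer) and uses MediaPipe on the desktop for the tracking; neither of those is necessarily what [Teemu]’s own code does, and the stream URL is just an example.

```python
import cv2
import mediapipe as mp

# Hypothetical setup: the Pi Zero serves an MJPEG stream (e.g. via
# mjpg-streamer) and the desktop pulls frames and runs the hand tracker.
# Substitute your Pi's address and streamer path for the example URL.
STREAM_URL = "http://raspberrypi.local:8080/?action=stream"

hands = mp.solutions.hands.Hands(max_num_hands=2, min_detection_confidence=0.6)
cap = cv2.VideoCapture(STREAM_URL)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        for hand in results.multi_hand_landmarks:
            # 21 landmarks per hand, enough to pick out every finger joint
            mp.solutions.drawing_utils.draw_landmarks(
                frame, hand, mp.solutions.hands.HAND_CONNECTIONS)
    cv2.imshow("offloaded hand tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```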
Structurally, the head-mounted unit is made up of nine 3D printed parts that clip onto a standard pair of glasses. [Teemu] says the parts will probably need to be tweaked to fit your specific frames, but the design is modular enough that it shouldn’t take too much effort. He’s using 0.6 mm PETG plastic for the front reflector, and the main lens was pulled from a cheap pair of VR goggles and manually cut down into a rectangle.
The evolution of the build has been documented in several videos, and it’s interesting to see how far the hardware has progressed in a relatively short time. The original version made [Teemu] look like he was cosplaying as a Borg drone from Star Trek, but the latest build appears to be far more practical. We still wouldn’t try to wear it on an airplane, but it would hardly look out of place at a hacker con.
Virtual reality is a slow-moving field in some respects. While a lot of focus is put on optical technologies and headsets, there’s a lot more involved when it comes to believably placing a human being in a virtual environment. So far, we’ve gotten a good start on the visuals and head tracking, but interaction technology still lags well behind. [Lucas] is working in just that area, iterating heavily on his homebrew VR gloves.
The gloves consist of potentiometers, fitted with spools and attached to the tip of each digit on the wearer’s hand by a string. As the user curls their fingers, the potentiometers turn and the position of each finger can be measured. The potentiometers are all read by an Arduino, which communicates back to a PC via USB. A custom driver is then used to interact with Valve’s SteamVR software, allowing the glove to be used with a wide variety of existing software.
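On the PC side, the bulk of the work is turning raw pot readings into per-finger curl values before they are handed off to the driver. The sketch below is a hypothetical host-side reader, assuming the Arduino prints one comma-separated line of ADC values per update and using made-up calibration extremes; the actual SteamVR driver is a separate component.

```python
import serial  # pyserial

# Hypothetical wire format: the Arduino prints one comma-separated line per
# update, e.g. "512,430,611,580,497" -- one raw ADC reading per finger.
# Port name and baud rate are examples; match them to your board.
PORT = "/dev/ttyUSB0"
BAUD = 115200

# Raw ADC extremes for each pot, captured with the hand fully open and fully
# closed; used to normalize readings to a 0.0-1.0 curl value.
OPEN_RAW = [300, 310, 295, 305, 320]
CLOSED_RAW = [700, 720, 690, 710, 705]

def to_curl(raw, lo, hi):
    """Map a raw pot reading onto 0.0 (finger straight) .. 1.0 (fully curled)."""
    return max(0.0, min(1.0, (raw - lo) / (hi - lo)))

with serial.Serial(PORT, BAUD, timeout=1) as link:
    while True:
        line = link.readline().decode(errors="ignore").strip()
        if not line:
            continue
        try:
            raws = [int(v) for v in line.split(",")]
        except ValueError:
            continue  # skip garbled lines
        curls = [to_curl(r, lo, hi) for r, lo, hi in zip(raws, OPEN_RAW, CLOSED_RAW)]
        print(" ".join(f"{c:.2f}" for c in curls))
```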
Thus far, the system only tracks finger position, but the spool-and-string design is intended to support a motor for each finger down the line to create resistance, so the user can gain the feeling of touching and interacting with virtual objects. The project has the potential to be a cheaper, more accessible alternative to current off-the-shelf solutions. We’ve seen other hand-tracking gloves before, too, though none that track the spread of a wearer’s hand as well as finger extension. If you’re working on precisely that, please do drop us a line. Video after the break.
The folks behind the Atmos Extended Reality (XR) headset want to provide improved accessibility with an open ecosystem, and they aim to do it with a WebVR-capable headset design that is self-contained, 3D-printable, and open-sourced. Their immediate goal is to release a development kit, then refine the design for a wider release.
The front of the headset has a camera-based tracking board to provide all the modern goodies like inside-out head and hand tracking, as well as video pass-through. The design also provides for a variety of interface methods such as eye tracking and 6 DoF controllers.
With all that, the headset gives users maximum flexibility to experiment with and create different applications while working to keep development simple. A short video showing off the modular design of the HMD and optical assembly is embedded below.
Extended Reality (XR) has emerged as a catch-all term to cover broad combinations of real and virtual elements. On one end of the spectrum are completely virtual elements such as in virtual reality (VR), and towards the other end of the spectrum are things like augmented reality (AR) in which virtual elements are integrated with real ones in varying ratios. With the ability to sense the real world and pass through video from the cameras, developers can choose to integrate as much or as little as they wish.
Terms like XR are a sign that the whole scene is still rapidly changing, and it’s fascinating to see how development in this area is still within reach of small developers and individual hackers. The Atmos DK 1 developer kit is slated for release sometime in July, so anyone interested in getting in on the ground floor should read up on how to get involved with the project, which currently points people to their Twitter account (@atmosxr) and invites developers to their Discord server. You can also follow along on their newly published Hackaday.io page.
The game of Anti-Tetris is played by standing in front of a monitor and watching falling Tetris pieces overlaid on a video image of your body. Each hand is used to make pieces disappear so that they don’t stack up to the top of the screen. We don’t see this as the next big indie game. What we do see are some very interesting techniques for hand tracking.
An FPGA drives the game, using a camera as input. To track your hands, the Cornell students figured out that skin tones fall within a specific range of YUV values, which can be coded as a filter to direct cursor placement. But they needed a bit of a hack to get at those values: they patched into the camera circuit before the YUV is converted to RGB for the NTSC output.
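The students do the filtering in FPGA logic straight off the camera’s YUV output, but the same idea is easy to sketch in software. Here’s an OpenCV version using the YCrCb color space (effectively YUV); the threshold values are commonly used skin-tone bounds rather than theirs, and would need tuning for your camera and lighting.

```python
import cv2
import numpy as np

# Typical Cr/Cb bounds for skin tones in YCrCb space; tune for your setup.
LOWER = np.array([0, 133, 77], dtype=np.uint8)
UPPER = np.array([255, 173, 127], dtype=np.uint8)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    ycrcb = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, LOWER, UPPER)
    # Clean up speckle so the centroid doesn't jitter.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    m = cv2.moments(mask)
    if m["m00"] > 0:
        # Centroid of the skin-colored region stands in for the "cursor".
        cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
        cv2.circle(frame, (cx, cy), 10, (0, 255, 0), -1)

    cv2.imshow("skin mask", mask)
    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```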
Registering hand movement perpendicular to the screen was another challenge they faced. Because the hand’s location has already been established, they were able to measure the distance between its upper and lower boundaries. If that distance changes quickly enough, it is treated as an input, making the current block disappear.
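Here’s a quick sketch of that height-change trick, again in software rather than FPGA hardware: given a binary skin mask like the one above, a sudden jump in the blob’s height reads as a “press”. The threshold and the synthetic masks are purely illustrative.

```python
import numpy as np

PUSH_THRESHOLD = 0.25  # fractional height change per frame treated as a press

def hand_height(mask):
    """Pixel distance between the upper and lower boundaries of the skin blob."""
    rows = np.where(mask.any(axis=1))[0]
    return int(rows[-1] - rows[0]) if rows.size else 0

def is_push(prev_height, height):
    """A fast change in blob height means the hand moved toward or away from
    the camera, which the game treats as an input."""
    if prev_height == 0:
        return False
    return abs(height - prev_height) / prev_height > PUSH_THRESHOLD

# Synthetic two-frame demo: the hand blob suddenly doubles in height, as it
# would when thrust toward the camera.
far_mask = np.zeros((240, 320), dtype=bool)
far_mask[90:150, 100:200] = True     # roughly 60 px tall
near_mask = np.zeros((240, 320), dtype=bool)
near_mask[60:180, 100:200] = True    # roughly 120 px tall

h0, h1 = hand_height(far_mask), hand_height(near_mask)
print(is_push(h0, h1))  # True
```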