Tilt Five is an Augmented Reality (AR) system aimed at tabletop gaming, developed by [Jeri Ellsworth] and a group of other engineers, and it is now up on Kickstarter. Though it appears to be a quite capable (and affordable at $299) system based on the Kickstarter campaign, the most remarkable thing about it is probably that it has its roots at Valve. Yes, the company behind the Half-Life games and the Steam games store.
Much of the history of the project has been covered by sites such as this Verge article from 2013. Back then, [Jeri Ellsworth] and [Rick Johnson] were working on project CastAR, which at the time looked like a contraption glued onto the top of a pair of shades. When Valve chose to pursue Virtual Reality instead of AR, CastAR began its life outside of Valve, with Valve’s [Gabe Newell] giving [Jeri] and [Rick] his blessing to do whatever they wanted with the project.
Six years later, Tilt Five is the result of the work put in over those years. Looking more like a pair of protective glasses, and accompanied by a wand controller that bears an uncanny resemblance to a gas lighter for candles and BBQs, it promises a virtual world like one has never seen before, courtesy of integrated HD projectors that are aimed at the retroreflective surface of the game board.
A big limitation of the system is also its primary marketing feature: because it is pitched at tabletop gaming, the requirement that this game board serve as the projection surface means the virtual world cannot exist outside the board. For a tabletop game (like Dungeons and Dragons), that should hardly be an issue. As for the games themselves, they run on an external system, with the signal piped into the AR system. Game support for the Tilt Five is still fairly limited, but more titles have been announced.
It’s one thing to know that your device is leaking electromagnetic interference (EMI), but if you really want to solve the problem, it might be helpful to know where the emissions are coming from. This heat-mapping EMI probe will answer that question, with style. It uses a webcam to record an EMI probe and then overlays a heat map of the interference on the image itself.
Regular readers will note that the hardware end of [Charles Grassin]’s EMI mapper bears a strong resemblance to the EMC probe made from semi-rigid coax we featured recently. Built as a cheap DIY substitute for an expensive off-the-shelf probe set for electromagnetic testing, the probe was super simple: just a semi-rigid coax jumper with one SMA plug lopped off and the raw end looped back and soldered. Connected to an SDR dongle, the probe proved useful for tracking down noisy circuits.
[Charles]’ project takes that a step further by adding a camera that looks down upon the device under test. OpenCV is used to track the probe, which is moved over the DUT manually with the help of an augmented reality display that shows coverage as you go, while a Python script records the probe’s position and the RF power measurements. The video below shows the capture process and what the data looks like when reassembled as an overlay on top of the device.
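To get a feel for the reassembly step, here’s a minimal pure-Python sketch (not [Charles]’s actual code) that bins tracked probe positions and their power readings into a coarse grid and normalizes it into a heat map ready to be color-mapped over the camera frame. The grid resolution and the normalized `(x, y, power)` sample format are assumptions:

```python
GRID_W, GRID_H = 32, 24  # heat-map resolution (assumed, not from the project)

def build_heatmap(samples, grid_w=GRID_W, grid_h=GRID_H):
    """samples: iterable of (x, y, power) tuples, with x and y normalized
    to [0, 1) image coordinates and power a non-negative linear value."""
    acc = [[0.0] * grid_w for _ in range(grid_h)]
    cnt = [[0] * grid_w for _ in range(grid_h)]
    for x, y, power in samples:
        col = min(int(x * grid_w), grid_w - 1)
        row = min(int(y * grid_h), grid_h - 1)
        acc[row][col] += power
        cnt[row][col] += 1
    # Average per cell so lingering over one spot doesn't inflate it,
    # then scale everything into 0..1 for coloring.
    cells = [[acc[r][c] / cnt[r][c] if cnt[r][c] else 0.0
              for c in range(grid_w)] for r in range(grid_h)]
    peak = max(max(row) for row in cells) or 1.0
    return [[v / peak for v in row] for row in cells]
```

Averaging per cell rather than summing matters here: a slow sweep over a quiet area would otherwise look just as “hot” as a fast pass over a noisy trace.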
Even if EMC testing isn’t your thing, this one seems like a lot of fun for the curious. [Charles] has kindly made the sources available on GitHub, so this is a great project to just knock out quickly and start mapping.
By now we’ve all seen the cheap headsets that essentially stick a smartphone a few inches away from your face to function as a low-cost alternative to devices like Oculus Rift. Available for as little as a few dollars, it’s hard to beat these gadgets for experimenting with VR on a budget. But what about if you’re more interested in working with augmented reality, where rendered images are superimposed onto your real-world view rather than replacing it?
As it turns out, there are now cheap headsets to do that with your phone as well. [kvtoet] picked one of these gadgets up for $30 USD on AliExpress, and used it as a base for a more capable augmented reality experience than the headset alone is capable of. The project is in the early stages, but so far the combination of this simple headset and some hardware liberated from inexpensive Chinese smartphones looks to hold considerable promise for delivering a sub-$100 USD development platform for anyone looking to jump into this fascinating field.
On their own, these cheap augmented reality headsets simply show a reflection of your smartphone’s screen on the inside of the lenses. With specially designed applications, this effect can be used to give the wearer the impression that objects shown on the phone’s screen are actually in their field of vision. It’s a neat effect to be sure, but it doesn’t hold much in the way of practical applications. To turn this into a useful system, the phone needs to be able to see what the wearer is seeing.
To that end, [kvtoet] relocated a VKWorld S8 smartphone’s camera module onto the front of the headset. Beyond its relatively low cost, this model of phone was selected because it features a long camera ribbon cable. With the camera on the outside of the headset, an Android application was created which periodically flashes a bright LED and looks for reflections in the camera’s feed. These reflections are then used to locate objects and markers in the real world.
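The core of that flash-and-compare trick can be sketched in a few lines: grab one frame with the LED off, one with it on, and take the centroid of the pixels that brightened sharply as the reflector’s location. This is a hypothetical reconstruction, with frames as plain 2D brightness arrays and an assumed threshold, not [kvtoet]’s Android code:

```python
def find_reflection(frame_off, frame_on, threshold=100):
    """Return the (row, col) centroid of pixels that got much brighter
    with the LED on, or None if nothing reflected the flash back.

    frame_off / frame_on: 2D lists of brightness values (0-255).
    threshold: assumed minimum brightness jump to count as a reflection.
    """
    hits = [(r, c)
            for r, row in enumerate(frame_on)
            for c, bright in enumerate(row)
            if bright - frame_off[r][c] > threshold]
    if not hits:
        return None
    return (sum(r for r, _ in hits) / len(hits),
            sum(c for _, c in hits) / len(hits))
```

Differencing against the LED-off frame is what makes a retroreflector stand out: ordinary bright objects are bright in both frames and cancel out, while the reflector only lights up when the flash fires.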
In the video after the break, [kvtoet] demonstrates how this technique is put to use. The phone is able to track a retroreflector lying on the couch quickly and accurately enough that it can be used to adjust the rendering of a virtual object in real time. As the headset is moved around, it gives the impression that the wearer is actually viewing a real object from different angles and distances. With such a simplistic system the effect isn’t perfect, but it’s exciting to think of the possibilities now that this sort of technology is falling into the tinkerer’s budget.
It’s been more than a year since we first heard about Leap Motion’s new, Open Source augmented reality headset. The first time around, we were surprised: the headset featured dual 1600×1440 LCDs, 120 Hz refresh rate, 100 degree FOV, and the entire thing would cost under $100 (in volume), with everything, from firmware to mechanical design released under Open licenses. Needless to say, that’s easier said than done. Now it seems Leap Motion is releasing files for various components and a full-scale release might be coming sooner than we think.
Leap Motion first made a name for themselves with the Leap Motion sensor, a sort of mini-Kinect that only worked with hands and arms. Yes, we’re perfectly aware that sounds dumb, but the results were impressive: everything turned into a touchscreen display, you could draw with your fingers, and control robots with your hands. If you mount one of these sensors to your forehead, and reflect a few phone screens onto your retinas, you have the makings of a stereoscopic AR headset that tracks the movement of your hands. This is an over-simplified description, but conceptually, that’s what Project North Star is.
The files released now include STLs of parts that can be 3D printed on any filament printer, files for the electronics that drive the backlight and receive video from a laptop, and even software for doing actual Augmented Reality stuff in Unity. It’s not a complete project ready for prime time, but it’s a far cry from the simple spec sheet full of promises we saw in the middle of last year.
Have you ever wished you could see in the RF part of the radio spectrum? While such a skill would probably make it hard to get a good night’s rest, it would at least allow you to instantly see dead spots in your WiFi coverage. Not a bad tradeoff.
Unwilling to go full [Geordi La Forge] to be able to visualize RF, [Ken Kawamoto] built the next best thing – an augmented-reality RF signal strength app for his smartphone. Built to aid in the repositioning of his router in the post-holiday cleanup, the app uses the Android ARCore framework to figure out where in the house the phone is and overlays a color-coded sphere representing sensor data onto the current camera image. The spheres persist in 3D space, leaving a trail of virtual breadcrumbs that map out the sensor data as you warwalk the house. The app also lets you map Bluetooth and LTE coverage, but RF isn’t its only input: if your phone is properly equipped, magnetic fields and barometric pressure can also be AR mapped. We found the Bluetooth demo in the video below particularly interesting; it’s amazing how much the signal is attenuated by a double layer of aluminum foil. [Ken] even came up with an Arduino with a gas sensor that talks to the phone and maps the atmosphere around the kitchen stove.
The app is called AR Sensor and is available on the Play Store, but you’ll need at least Android 8.0 to play. If your phone is behind the times like ours, you might have to settle for mapping your RF world the hard way.
You may remember that earlier this year Leap Motion revealed Project North Star, a kind of open-source reference design for an Augmented Reality (AR) headset. While it’s not destined to make high scores in the fashion department, it aims to be hacker-friendly and boasts a large field of view. There’s also an attractive element of “what you see is what you get” when it comes to the displays and optical design, which is a good thing for hackability. Instead of everything residing in a black box, the system uses two forward-facing displays (one for each eye) whose images are bounced off curved reflective lenses. These are essentially semitransparent mirrors which focus the images properly while also allowing the wearer to see both the displays and the outside world at the same time. This co-existence of virtual and real-world visuals is a hallmark of Augmented Reality.
When Leap Motion first announced their open-source AR headset, we examined the intriguing specifications, and the design has since been published to GitHub. At the time, we did note that the only option for the special lenses seemed to be to CNC them and then spring for a custom reflective coating.
If the lenses become affordable and mass-produced, that would make the design much more accessible. In addition, anyone wanting to do their own experiments with near-eye displays or HUDs would be able to use the frame and lenses as a basis for their own work, and that’s wonderful.
Perhaps it is true that if all you have is a hammer, every problem you see looks like a nail. When you think of augmented reality (AR), you usually think of something like the poorly-received Google Glass, where your phone or computer overlays imagery in your field of vision. Bose isn’t known for video, though; they are known for audio. So perhaps it isn’t surprising that their upcoming (January 2019) AR sunglasses won’t feature video overlays. Instead, the $200 sunglasses will tell you what you are looking at.
The whole thing hinges on your device knowing your approximate location and the glasses knowing their orientation thanks to an inertial measurement system. In other words, the glasses, combined with your smart device, know where you are and what you are looking at. Approximately. So at the museum, if you are looking at a piece of art, the glasses could tell you more information about it. There’s a video showing an early prototype from earlier this year, below.
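Conceptually, the “what are you looking at” check reduces to comparing the compass heading against the bearing from the wearer to each known point of interest. Here’s a rough Python sketch of that idea (not Bose’s actual algorithm), using the standard initial great-circle bearing formula and an assumed angular tolerance:

```python
import math

def bearing_to(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2,
    in degrees clockwise from true north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def is_looking_at(user_lat, user_lon, heading_deg, poi_lat, poi_lon, tol=15.0):
    """True if the wearer's heading points at the POI within tol degrees.

    tol is an assumed tolerance; a real system would tune it to the
    accuracy of the phone's GPS fix and the glasses' orientation sensor.
    """
    diff = abs(bearing_to(user_lat, user_lon, poi_lat, poi_lon) - heading_deg)
    return min(diff, 360.0 - diff) <= tol  # handle wrap-around at 0/360
```

With only coarse GPS and a sensor-fused heading to work from, a generous tolerance like this is also why the system can only ever be “approximately” right, as noted above.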