Leap Motion’s Project North Star Gets Hardware

It’s been more than a year since we first heard about Leap Motion’s new, Open Source augmented reality headset. The first time around, we were surprised: the headset featured dual 1600×1440 LCDs, a 120 Hz refresh rate, a 100 degree FOV, and the entire thing would cost under $100 (in volume), with everything from firmware to mechanical design released under Open licenses. Needless to say, that’s easier said than done. Now Leap Motion is releasing files for various components, and a full-scale release might be coming sooner than we think.

Leap Motion first made a name for themselves with the Leap Motion sensor, a sort of mini-Kinect that only worked with hands and arms. Yes, we’re perfectly aware that sounds dumb, but the results were impressive: everything turned into a touchscreen display, you could draw with your fingers, and you could control robots with your hands. If you mount one of these sensors to your forehead and reflect a few phone screens onto your retinas, you have the makings of a stereoscopic AR headset that tracks the movement of your hands. This is an over-simplified description, but conceptually, that’s what Project North Star is.

The files released now include STLs of parts that can be 3D printed on any filament printer, files for the electronics that drive the backlight and receive video from a laptop, and even software for doing actual Augmented Reality stuff in Unity. It’s not a complete project ready for prime time, but it’s a far cry from the simple spec sheet full of promises we saw in the middle of last year.

Smartphone App Uses AR To Visualize The RF Spectrum

Have you ever wished you could see in the RF part of the radio spectrum? While such a skill would probably make it hard to get a good night’s rest, it would at least allow you to instantly see dead spots in your WiFi coverage. Not a bad tradeoff.

Unwilling to go full [Geordi La Forge] just to visualize RF, [Ken Kawamoto] built the next best thing – an augmented-reality RF signal strength app for his smartphone. Built to aid in repositioning his router during the post-holiday cleanup, the app uses the Android ARCore framework to figure out where in the house the phone is and overlays a color-coded sphere representing the sensor reading onto the current camera image. The spheres persist in 3D space, leaving a trail of virtual breadcrumbs that maps out the sensor data as you warwalk the house. The app also lets you map Bluetooth and LTE coverage, but RF isn’t its only input: if your phone is properly equipped, magnetic fields and barometric pressure can also be AR mapped. We found the Bluetooth demo in the video below particularly interesting; it’s amazing how much the signal is attenuated by a double layer of aluminum foil. [Ken] even paired the phone with an Arduino carrying a gas sensor, mapping the atmosphere around the kitchen stove.
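
The clever part is the spatial anchoring, but the color coding itself is dead simple: each sample just gets mapped from signal strength to a hue before a sphere is dropped at the phone’s current pose. The snippet below (plain C++ purely for illustration; the real app is an Android affair) sketches that kind of mapping — the thresholds are our own guesses, not the scale AR Sensor actually uses.

    // Map a WiFi signal strength reading (dBm) to a sphere color: green when
    // strong, red when weak. Thresholds here are our guesses, not AR Sensor's.
    #include <algorithm>
    #include <cstdint>

    struct Color { uint8_t r, g, b; };

    Color rssiToColor(int rssiDbm) {
      // Roughly -30 dBm (excellent) .. -90 dBm (unusable) mapped to 1.0 .. 0.0.
      float t = std::clamp((rssiDbm + 90.0f) / 60.0f, 0.0f, 1.0f);
      // Fade from red (weak) to green (strong); each breadcrumb sphere would be
      // tinted with this color at the phone's current pose.
      return Color{static_cast<uint8_t>(255 * (1.0f - t)),
                   static_cast<uint8_t>(255 * t),
                   0};
    }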

The app is called AR Sensor and is available on the Play Store, but you’ll need at least Android 8.0 to play. If your phone is behind the times like ours, you might have to settle for mapping your RF world the hard way.


Lenses For DIY Augmented Reality Will Get A Bit Less Unobtainable

You may remember that earlier this year Leap Motion revealed Project North Star, a kind of open-source reference design for an Augmented Reality (AR) headset. While it’s not destined to make high scores in the fashion department, it aims to be hacker-friendly and boasts a large field of view. There’s also an attractive element of “what you see is what you get” when it comes to the displays and optical design, which is a good thing for hackability. Instead of everything residing in a black box, the system uses two forward-facing displays (one for each eye) whose images are bounced off curved reflective lenses. These are essentially semitransparent mirrors which focus the images properly while also allowing the wearer to see both the displays and the outside world at the same time. This coexistence of virtual and real-world visuals is a hallmark of Augmented Reality.

A serious setback for the aspiring AR hacker has been the fact that while the design is open, the lenses are absolutely not off-the-shelf components. [Smart Prototyping] aims to change that, and recently announced in a blog post that they will be offering Project North Star-compatible reflective lenses. They’re in the final stages of approving manufacture, and have listed pre-orders for the lenses in their store along with downloadable 3D models for frames.

When Leap Motion first announced their open-source AR headset, we examined the intriguing specifications, and the design has since been published to GitHub. At the time, we did note that the only option for the special lenses seemed to be to CNC them and then spring for a custom reflective coating.

If the lenses become affordable and mass-produced, that would make the design much more accessible. In addition, anyone wanting to do their own experiments with near-eye displays or HUDs would be able to use the frame and lenses as a basis for their own work, and that’s wonderful.

Bringing Augmented Reality To The Workbench

[Ted Yapo] has big ideas for using Augmented Reality as a tool to enhance an electronics workbench. His concept uses a camera and projector system working together to detect objects on a workbench, and project information onto and around them. [Ted] envisions virtual displays from DMMs, oscilloscopes, logic analyzers, and other instruments projected onto a convenient place on the actual work area, removing the need to glance back and forth between tools and the instrument display. That’s only the beginning, however. A good camera and projector system could read barcodes on component bags to track inventory, guide manual PCB assembly by projecting which components go where, display reference data, and more.
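
To get a feel for the machine-vision half, imagine a bench camera that spots a QR code on a component bag and reports what it says and where it sits in the frame; after a camera-to-projector calibration, the projector could then draw an inventory label right next to the bag. The OpenCV sketch below is purely our illustration of that idea, not [Ted]’s code.

    // Spot a QR code on a component bag with a bench-mounted webcam and report
    // what it says and where it is; a projector could then place an overlay at
    // that spot. Illustration only -- not [Ted]'s actual code.
    #include <opencv2/opencv.hpp>
    #include <iostream>

    int main() {
      cv::VideoCapture cam(0);                  // bench-mounted webcam
      if (!cam.isOpened()) return 1;

      cv::QRCodeDetector qr;
      cv::Mat frame;
      while (cam.read(frame)) {
        std::vector<cv::Point2f> corners;
        std::string payload = qr.detectAndDecode(frame, corners);

        if (!payload.empty() && corners.size() == 4) {
          // The code's centroid is where a projector overlay would go (after
          // mapping camera pixels to projector pixels with a homography).
          cv::Point2f c = (corners[0] + corners[1] + corners[2] + corners[3]) * 0.25f;
          std::cout << payload << " at (" << c.x << ", " << c.y << ")\n";
          cv::circle(frame, c, 8, cv::Scalar(0, 255, 0), 2);
        }

        cv::imshow("bench camera", frame);
        if (cv::waitKey(1) == 27) break;        // Esc quits
      }
      return 0;
    }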

An open-source, accessible machine vision system working in tandem with a projector would open a lot of doors. Fortunately [Ted] has prior experience in this area, having previously written the computer vision code for room-scale dynamic projection environments. That’s solid experience that he can apply to designing a workbench-scale system as his entry for The Hackaday Prize.

Leap Motion Announces Open Source Augmented Reality Headset

Leap Motion just dropped what may be the biggest tease in Augmented and Virtual Reality since Google Cardboard. The North Star is an augmented reality head-mounted display that boasts some impressive specs:

  • Dual 1600×1440 LCDs
  • 120Hz refresh rate
  • 100 degree FOV
  • Cost under $100 (in volume)
  • Open Source Hardware
  • Built-in Leap Motion camera for precise hand tracking

Yes, you read that last line correctly. The North Star will be open source hardware. Leap Motion is planning to drop all the hardware information next week.

Now that we’ve got you excited, let’s mention what the North Star is not — it’s not a consumer device. Leap Motion’s idea here was to create a platform for developing Augmented Reality experiences — the user interface and interaction aspects. To that end, they built the best head-mounted display they could on a budget. The company started with standard 5.5″ cell phone displays, which made for an incredibly high-resolution but low-framerate (50 Hz) device. It was also large and completely impractical.

The current iteration of the North Star uses much smaller displays, which results in a higher frame rate and a better overall experience.  The secret sauce seems to be Leap’s use of ellipsoidal mirrors to achieve a large FOV while maintaining focus.

We’re excited, but also a bit wary of the $100 price point — Leap Motion is quick to note that the price is “in volume”. They also mention using diamond-tipped tooling on a vibration-isolated lathe to grind the mirrors down. If Leap hasn’t invested in some injection molding, those parts are going to make the whole thing expensive. Keep your eyes on the blog here for more information as soon as we have it!

Pavement Projection Provides Better Bicycle Visibility At Night

Few would question the health benefits of ditching the car in favor of a bicycle ride to work — it’s good for the body, and it can be a refreshing relief from rat race commuting. But it’s not without its perils, especially when one works late and returns after dark. Most car versus bicycle accidents occur in the early evening, and most are attributed to drivers just not seeing cyclists in the waning light of day.

To decrease his odds of becoming a statistic and increase his time on two wheels, [Dave Schneider] decided to build a better bike light. Concerned mainly with getting clipped from the rear, and having discounted the commercially available rear-mounted blinkenlights and wheel-mounted persistence-of-vision displays as insufficiently visible, [Dave] looked for ways to give drivers as many cues as possible. Noticing that his POV light cast a nice ground effect, he came up with a pavement-projecting display using four flashlights. The red LED lights are arranged to flash onto the roadway in sequence, using the bike’s motion to sweep out a sort of POV “bumper” to guide motorists around the bike. The flashlight batteries were replaced with wooden plugs wired to the Li-ion battery pack and DC-DC converter in the saddle bag, with an Arduino tasked with the flashing duty.
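
The firmware for something like this is about as simple as Arduino projects get. Here’s a minimal sketch of the sequential-flashing idea, assuming each flashlight is switched through a transistor on its own pin; the pin numbers and timing are our own placeholders, not [Dave]’s actual values.

    // Sweep four rear-facing flashlights in sequence so the bike's forward
    // motion paints a virtual "bumper" on the pavement. Pin numbers and
    // timing below are illustrative, not [Dave]'s actual firmware.
    const uint8_t lightPins[] = {2, 3, 4, 5};  // one transistor-switched flashlight per pin
    const uint8_t numLights = sizeof(lightPins) / sizeof(lightPins[0]);
    const unsigned long onTimeMs = 60;         // how long each light stays lit
    const unsigned long gapMs = 40;            // dark gap before the next light fires

    void setup() {
      for (uint8_t i = 0; i < numLights; i++) {
        pinMode(lightPins[i], OUTPUT);
        digitalWrite(lightPins[i], LOW);
      }
    }

    void loop() {
      // Fire each light in turn, then start the sweep over.
      for (uint8_t i = 0; i < numLights; i++) {
        digitalWrite(lightPins[i], HIGH);
        delay(onTimeMs);
        digitalWrite(lightPins[i], LOW);
        delay(gapMs);
      }
    }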

The picture above shows a long exposure of the lights in action, and it looks very effective. We can’t help but think of ways to improve this: perhaps one flashlight with a servo-controlled mirror? Or variable flashing frequency based on speed? Maybe moving the pavement projection up front for a head-down display would be a nice addition too.

Head-Up Display Augments Bionic Turtle’s Reality

There’s a harsh truth underlying all robotic research: compared to evolution, we suck at making things move. Nature has a couple billion years of practice making things that can slide, hop, fly, swim and run, so why not leverage those platforms? That’s the idea behind this turtle with a navigation robot strapped to its back.

This reminds us somewhat of an alternative universe sci-fi story by S.M. Stirling called The Sky People. In the story, Venus is teeming with dinosaurs that Terran colonists use as beasts of burden with brain implants that stimulate pleasure centers to control them. While the team led by [Phill Seung-Lee] at the Korea Advanced Institute of Science and Technology isn’t likely to get as much work from the red-eared slider turtle as the colonists in the story got from their bionic dinosaurs, there’s still plenty to learn from a setup like this. With what amounts to a head-up display for the turtle in the form of a strip of LEDs, along with a food dispenser for positive reinforcement, the bionic terrapin is trained to associate food with the flashing LEDs. The LEDs are then used as cues as the turtle navigates between waypoints in a tank. Sadly, the full article is behind a paywall, but the video below gives you a taste of the gripping action.

Looking for something between amphibian and fictional dinosaurs to play mind games with? Why not make your best friend bionic?