Leap Motion’s Project North Star Gets Hardware

It’s been more than a year since we first heard about Leap Motion’s new, Open Source augmented reality headset. The first time around, we were surprised: the headset featured dual 1600×1440 LCDs, a 120 Hz refresh rate, a 100-degree FOV, and the entire thing would cost under $100 (in volume), with everything from firmware to mechanical design released under Open licenses. Needless to say, that’s easier said than done. Now it seems Leap Motion is releasing files for various components, and a full-scale release might be coming sooner than we think.

Leap Motion first made a name for themselves with the Leap Motion sensor, a sort of mini-Kinect that only tracked hands and arms. Yes, we’re perfectly aware that sounds dumb, but the results were impressive: any display turned into a touchscreen, you could draw with your fingers, and you could control robots with your hands. Mount one of these sensors on your forehead, reflect a few phone screens onto your retinas, and you have the makings of a stereoscopic AR headset that tracks the movement of your hands. This is an over-simplified description, but conceptually, that’s what Project North Star is.

The files released now include STLs of parts that can be 3D printed on any filament printer, files for the electronics that drive the backlight and receive video from a laptop, and even software for doing actual Augmented Reality stuff in Unity. It’s not a complete project ready for prime time, but it’s a far cry from the simple spec sheet full of promises we saw in the middle of last year.

Screen Shake In VR, Minus The Throwing Up

In first-person games, an effective way to heighten immersion is to give the player a sense of impact and force by shaking the in-game camera. That’s a tried and true practice for FPS games played on a monitor, but to [Zulubo]’s knowledge, no one has implemented traditional screen shake in a VR title because it would be a sure way to trigger motion sickness. Unsatisfied with that limitation, some clever experimentation led [Zulubo] to a method of doing screen shake in VR that doesn’t cause any of the usual problems.

Screen shake doesn’t translate well to VR because the traditional method is to shake the player’s entire view. This works fine when viewed on a monitor, but in VR the brain interprets the visual cue as evidence that one’s head and eyeballs are physically shaking while the vestibular system is reporting nothing of the sort. This kind of sensory mismatch leads to motion sickness in most people.

The key to getting the essence of a screen shake without any of the motion sickness baggage turned out to be a mix of two things. First, the shake is restricted to the player’s peripheral vision only. Second, it is restricted to an “in and out” motion, with no tilting or twisting. The result conveys concussion and impact without shaking the player’s view, at least not in a way that leads to motion sickness. It’s the product of some clever experimentation to solve a problem, and it’s freely downloadable for use by anyone who may be interested.
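
We don’t have [Zulubo]’s source in front of us, but the two ingredients are simple enough to sketch. The Python below is our own toy illustration, not the actual asset: a shake signal that only pushes “in and out” along the forward axis, plus a weight that stays at zero in the central field of view and ramps up toward the periphery. The 30/50 degree cutoffs, amplitude, and decay time are placeholder numbers of our choosing.

```python
# Not [Zulubo]'s code -- just a toy illustration of the two ideas:
# 1) the shake only moves "in and out" along the camera's forward axis,
# 2) it is masked so the central field of view stays perfectly still.
import math

def shake_offset(age, duration=0.3, amplitude=0.05):
    """Forward/backward offset in metres, decaying to zero; no tilt or roll."""
    if age < 0.0 or age > duration:
        return 0.0
    envelope = 1.0 - age / duration            # simple linear decay
    return amplitude * envelope * math.sin(age * 80.0)

def peripheral_weight(angle_from_gaze_deg, inner=30.0, outer=50.0):
    """0 inside the central field, ramping to 1 toward the edge of the FOV."""
    if angle_from_gaze_deg <= inner:
        return 0.0
    if angle_from_gaze_deg >= outer:
        return 1.0
    return (angle_from_gaze_deg - inner) / (outer - inner)

# Per frame, an engine would displace only the peripheral layer:
#   offset = shake_offset(time_since_impact) * peripheral_weight(pixel_angle)
#   layer_position = camera_position + camera_forward * offset
```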

Speaking of fooling one’s senses in VR environments, here is a fascinating method of simulating zero gravity: waterproof the VR headset and go underwater.

[via Reddit]

Three Dimensions: What Does That Really Mean?

The holy grail of display technology is to replicate what you see in the real world. This means video playback in 3D — but when it comes to displays, what is 3D anyway?

You don’t need me to tell you how far away we are from replicating real life in a video display. Despite all the hype, there are only a couple of different approaches to faking those three dimensions. Let’s take a look at what they are, and why they can call it 3D but they’re not fooling us into believing we’re seeing real life… yet.

Continue reading “Three Dimensions: What Does That Really Mean?”

A Low Cost VR Headset

Virtual reality systems have been under development for several decades. While there are commercial offerings now, it’s interesting to go back to a time when the systems were much more limited. [Colin Ord] recently completed his own VR system, modeled on the systems available 20-30 years ago, which gives us a look at what those systems would have been like, while also being built for a very low cost using today’s technology.

The core of this project is a head tracker built around two BBC micro:bits, since they have both the accelerometer and compass needed to achieve the project goals. It is also capable of tracking an item and its position in the virtual space. [Colin] built everything himself, from the electronics to the programming, and the headset itself uses Google Cardboard to hold the screen, lenses, and sensors together. All of this keeps the cost down, unlike similar systems when they were first unveiled years ago.
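
[Colin]’s code hasn’t been published yet, but the basic orientation-tracking half of a micro:bit head tracker is easy to sketch. The MicroPython below is our own guess at what such a tracker might look like, streaming heading, pitch, and roll over the USB serial port; the axis conventions and update rate are assumptions, and none of it comes from [Colin]’s actual firmware.

```python
# A guess at a minimal micro:bit head tracker, not [Colin]'s firmware.
# Streams "heading,pitch,roll" over USB serial roughly 50 times a second.
from microbit import accelerometer, compass, sleep
import math

compass.calibrate()  # runs the on-screen calibration routine once

while True:
    x = accelerometer.get_x()  # readings in milli-g
    y = accelerometer.get_y()
    z = accelerometer.get_z()
    # Tilt angles derived from gravity; only valid while the head is not accelerating
    pitch = math.degrees(math.atan2(x, math.sqrt(y * y + z * z)))
    roll = math.degrees(math.atan2(y, math.sqrt(x * x + z * z)))
    heading = compass.heading()  # 0-359 degrees from magnetic north
    print("{},{:.1f},{:.1f}".format(heading, pitch, roll))
    sleep(20)
```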

The ground-up approach that this project takes is indeed commendable. Hopefully we can see the code released, and others can build upon this excellent work. You could even use it to take a virtual reality cycling tour of the UK.

Continue reading “A Low Cost VR Headset”

This Raspberry Pi Is A Stereo Camera And So Much More

Over the years we have featured a huge array of projects built around the Raspberry Pi, but among them there is something that has been missing in all but a few examples. The Raspberry Pi Compute Module is the essentials of a Pi on a form factor close to that of a SODIMM module, and it is intended as a way to embed a Pi inside a commercial product. It’s refreshing then to see [Eugene]’s StereoPi project, a PCB that accepts a Compute Module and provides interfaces for two Raspberry Pi cameras.

What makes this board a bit special is that as well as the two camera connectors at the required spacing for stereophotography, it also brings out all the interfaces you’d expect on a regular Pi, so there is the familiar 40-pin expansion header as well as USB and Ethernet ports. It has a few extras such as a pin-based power connector and an on-off switch.

Where are they going with this one? So far we’ve seen demonstrations of the rig used to create depth maps with ROS (Robot Operating System). But even more fun is seeing the 3rd-person-view rig shown in the video below. You strap on a backpack that holds the stereo camera above your head, then watch yourself through VR goggles. Essentially you become the video game. We’ve seen this demonstrated before and now it looks like it will be easy to give it a try yourself as StereoPi has announced they’re preparing to crowdfund.
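
The depth-map demo is a good reminder of how accessible stereo vision has become. As a rough illustration only (this is not the StereoPi team’s code, and it assumes the two cameras show up as ordinary video devices and are already rectified), OpenCV’s block matcher can turn two synchronized feeds into a disparity map in a few lines of Python:

```python
# Rough illustration only: compute a disparity map from two camera feeds.
# Assumes rectified cameras exposed as /dev/video0 and /dev/video1.
import cv2

left_cam = cv2.VideoCapture(0)
right_cam = cv2.VideoCapture(1)

# Classic block-matching stereo; numDisparities must be a multiple of 16
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)

while True:
    ok_left, left = left_cam.read()
    ok_right, right = right_cam.read()
    if not (ok_left and ok_right):
        break
    gray_left = cv2.cvtColor(left, cv2.COLOR_BGR2GRAY)
    gray_right = cv2.cvtColor(right, cv2.COLOR_BGR2GRAY)
    disparity = stereo.compute(gray_left, gray_right)   # fixed-point, 16x
    shown = cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX).astype("uint8")
    cv2.imshow("disparity", shown)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

left_cam.release()
right_cam.release()
cv2.destroyAllWindows()
```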

So aside from the stereophotography why is this special? The answer comes in that it is as close as possible to a fresh interpretation of a Raspberry Pi board without being from the Pi Foundation themselves. The Pi processors are not available to third party manufacturers, so aside from the Odroid W (which was made in very limited numbers) we have never seen a significant alternative take on a compatible Raspberry Pi. The idea that this could be achieved through the Compute Module is one that we hope might be taken up by other designers, potentially opening a fresh avenue in the Raspberry Pi story.

The Raspberry Pi Compute Module has passed through two iterations since its launch in 2014, but probably due to the lower cost of a retail Raspberry Pi we haven’t seen it in many projects save for a few game consoles. If the advent of boards like this means we see more of it, that can be no bad thing.

Continue reading “This Raspberry Pi Is A Stereo Camera And So Much More”

Lenses For DIY Augmented Reality Will Get A Bit Less Unobtainable

You may remember that earlier this year Leap Motion revealed Project North Star, a kind of open-source reference design for an Augmented Reality (AR) headset. While it’s not destined to make high scores in the fashion department, it aims to be hacker-friendly and boasts a large field of view. There’s also an attractive element of “what you see is what you get” when it comes to the displays and optical design, which is a good thing for hackability. Instead of everything residing in a black box, the system uses two forward-facing displays (one for each eye) whose images are bounced off curved reflective lenses. These are essentially semitransparent mirrors which focus the images properly while also allowing the wearer to see both the displays and the outside world at the same time. This co-existence of virtual and real-world visuals is a hallmark of Augmented Reality.

A serious setback to the aspiring AR hacker has been the fact that while the design is open, the lenses are absolutely not off-the-shelf components. [Smart Prototyping] aims to change that, and recently announced in a blog post that they will be offering Project North Star-compatible reflective lenses. They’re in the final stages of approving manufacture, and have listed pre-orders for the lenses in their store along with downloadable 3D models for frames.

When Leap Motion first announced their open-source AR headset, we examined the intriguing specifications, and the design has since been published to GitHub. At the time, we did note that the only option for the special lenses seemed to be to CNC them and then spring for a custom reflective coating.

If the lenses become affordable and mass-produced, that would make the design much more accessible. In addition, anyone wanting to do their own experiments with near-eye displays or HUDs would be able to use the frame and lenses as a basis for their own work, and that’s wonderful.

Bose Wants You To Listen Up For Augmented Reality

Perhaps it is true that if all you have is a hammer, every problem you see looks like a nail. When you think of augmented reality (AR), you usually think of something like the poorly received Google Glass, where your phone or computer overlays imagery in your field of vision. Bose isn’t known for video, though; they are known for audio. So perhaps it isn’t surprising that their upcoming (January 2019) AR sunglasses won’t feature video overlays. Instead, the $200 sunglasses will tell you what you are looking at.

The whole thing hinges on your device knowing your approximate location and the glasses knowing their orientation thanks to an inertial measurement unit. In other words, the glasses, combined with your smart device, know where you are and what you are looking at. Approximately. So at the museum, if you are looking at a piece of art, the glasses could tell you more information about it. There’s a video showing an early prototype from earlier this year, below.
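
Bose hasn’t said how the lookup works internally, but the geometry involved is straightforward. As a toy sketch (with made-up exhibit coordinates and thresholds, and definitely not Bose’s code), here is roughly how a GPS fix from the phone plus a compass heading from the glasses could pick out which point of interest the wearer is facing:

```python
# Toy sketch, not Bose's code: pick the point of interest closest to the
# wearer's compass heading, given a GPS fix from the phone.
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing from point 1 to point 2, degrees clockwise from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360

def facing(lat, lon, heading, pois, tolerance=20):
    """Return the name of the POI the wearer appears to be looking at, if any."""
    best, best_err = None, tolerance
    for name, plat, plon in pois:
        # Signed angular difference folded into [-180, 180)
        err = abs((bearing_deg(lat, lon, plat, plon) - heading + 180) % 360 - 180)
        if err < best_err:
            best, best_err = name, err
    return best

# Hypothetical exhibit positions (name, latitude, longitude)
exhibits = [("Mona Lisa", 48.8606, 2.3376), ("Venus de Milo", 48.8599, 2.3374)]
print(facing(48.8602, 2.3380, heading=320, pois=exhibits))  # -> Mona Lisa
```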

Continue reading “Bose Wants You To Listen Up For Augmented Reality”