The ‘All-Seeing Pi’ Aids Low-Vision Adventurer

Adventure travel can be pretty grueling, what with the exotic locations and potential for disaster that the typical tourist destinations don’t offer. One might find oneself dangling over a cliff for that near-death-experience selfie or ziplining through a rainforest canopy. All this is significantly complicated by being blind, of course, so a tool like this Raspberry Pi low-vision system would be a welcome addition to the nearly blind adventurer’s well-worn rucksack.

[Dan] has had vision problems since childhood, but one look at his YouTube channel shows that he doesn’t let that slow him down. When [Dan] met [Ben] in Scotland, [Ben] noticed that he was using his smartphone as a vision aid, looking at the display up close and zooming in to get as much detail as possible from his remaining vision. [Ben] thought he could help, so he whipped up a heads-up display from a Raspberry Pi and a Pi Camera. Mounted to a 3D-printed frame holding a 5″ HDMI display and worn on a GoPro head mount, the rig provides enough detail to help [Dan] navigate, as seen in the video below.
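The build details are light, but the basic idea — camera in, magnified image out on the little screen — is simple enough to sketch. Here’s a minimal Python example of that kind of digital magnifier using the Pi Camera’s built-in preview and digital zoom; the zoom factor and run time are our own assumptions, not details from [Ben]’s build:

```python
# A minimal sketch of a Pi Camera "digital magnifier": stream the camera
# to the attached HDMI display and crop in on the centre of the frame for
# extra magnification. Zoom factor and run time are illustrative guesses.
from picamera import PiCamera
from time import sleep

ZOOM = 3  # assumed magnification factor; tune to the wearer's preference

with PiCamera(resolution=(1280, 720), framerate=30) as camera:
    # Full-screen preview goes straight to the HDMI display.
    camera.start_preview(fullscreen=True)

    # Crop a centred 1/ZOOM-sized window of the sensor; the preview scales
    # it up to fill the 5" screen, acting as a digital zoom.
    w = 1.0 / ZOOM
    camera.zoom = ((1.0 - w) / 2, (1.0 - w) / 2, w, w)

    sleep(3600)  # run for an hour; a real build would loop until shutdown
```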

The rig is a bit unwieldy right now, but as proof of concept (and proof of friendship), it’s a solid start. We think a slimmer profile design might help, in which case [Ben] might want to look into this Google Glass-like display for a multimeter for inspiration on version 2.0.

Continue reading “The ‘All-Seeing Pi’ Aids Low-Vision Adventurer”

Sharing Virtual and Holographic Realities via Vive and Hololens

An experimental project to mix reality and virtual reality by [Drew Gottlieb] uses the Microsoft Hololens and the HTC Vive to show two users successfully sharing a single workspace as well as controllers. While the VR user draws cubes in midair with a simple app, the Hololens user can see the same cubes being created and mapped to a real-world location, and the two headsets can even interact in the same shared space. You really need to check out the video below to fully grasp how crazy-cool this is.

Two or more VR or AR users sharing the same virtual environment isn’t new, but anchoring that virtual environment to the real world in a way that two very different headsets can share is interesting to see. [Drew] says the real challenge wasn’t just getting the different hardware to talk to each other; it was giving them both a shared understanding of a common space. You can see how well he made that work in the video embedded below.
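[Drew] doesn’t spell out the nitty-gritty here, but the usual way to give two trackers a common space is to express everything relative to a shared anchor that both devices can locate. Here’s a rough Python sketch of that coordinate juggling; the pose matrices and positions are purely illustrative placeholders, not data from his project:

```python
# Each headset tracks objects in its own local frame, so a common anchor
# (something both devices can locate) is used to convert between them.
# The 4x4 poses below are made-up placeholders, not real tracking data.
import numpy as np

def make_pose(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = translation
    return pose

# Transforms that map anchor-frame coordinates into each device's frame.
vive_from_anchor = make_pose(np.eye(3), [1.0, 0.0, 2.0])
holo_from_anchor = make_pose(np.eye(3), [-0.5, 0.2, 1.5])

# A cube the Vive user drew, expressed in the Vive's local frame.
cube_in_vive = np.array([0.3, 1.2, 1.0, 1.0])

# Vive frame -> anchor frame -> Hololens frame.
cube_in_anchor = np.linalg.inv(vive_from_anchor) @ cube_in_vive
cube_in_holo = holo_from_anchor @ cube_in_anchor

print("Cube position for the Hololens:", cube_in_holo[:3])
```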

Continue reading “Sharing Virtual and Holographic Realities via Vive and Hololens”

Behold: Valve’s VR and AR Prototypes

Just in case anyone secretly had the idea that Valve Software’s VR and other hardware somehow sprang fully formed from a lab, here are some great photos and video of early prototypes, along with interviews with the people who made them. Some of the hardware is quite raw-looking, some is recognizable, and some came from directions that were explored but went nowhere; it’s all fascinating.

An early AR prototype that worked like looking through a tube into another world.

The accompanying video (embedded below) has some great background and stories about the research process, which began with a mandate to explore the concepts of AR and VR and determine what could be done and what was holding things back.

One good peek into this process is the piece of hardware shown to the left. You look into the lens end like a little telescope. It has a projector that beams an image directly into your eye, and it has camera-based tracking that updates that image extremely quickly.

The result is a device that lets you look through a little window into a completely different world. In the video (2:16) one of the developers says “It really taught us just how important tracking was. No matter [how you moved] it was essentially perfect. It was really the first glimpse we had into what could be achieved if you had very low persistence displays, and very good tracking.” That set the direction for the research that followed.

Continue reading “Behold: Valve’s VR and AR Prototypes”

Get Your Game On: Troy’s TVCoG Hosts VR and Gaming Hackathon

Troy, New York’s Tech Valley Center of Gravity is following up their January IoT Hackathon with another installment. The April 16-17 event promises to be a doozy, and anyone close to the area with even a passing interest in gaming and AR/VR should really make an effort to be there.

Not content to just be a caffeine-fueled creative burst, TVCoG is raising the bar in a couple ways. First, they’re teaming up with some corporate sponsors with a strong presence in the VR and AR fields. Daydream.io, a new company based in the same building as the CoG, is contributing a bunch of its Daydream.VR smartphone headsets to hackathon attendees, as well as mentors to get your project up and running. Other sponsors include 1st Playable Productions and Vicarious Visions, game studios both located in the Troy area. And to draw in the hardcore game programmers, a concurrent Ludum Dare game jam will be run by the Tech Valley Game Space, with interaction and collaboration between the AR/VR hackers and the programmers encouraged. Teams will compete for $1000 in prizes and other giveaways.

This sounds like it’s going to be an amazing chance to hack, to collaborate, and to make connections in the growing AR/VR field. And did we mention the food? There was a ton of it last time, so much they were begging us to take it home on Sunday night. Go, hack, create, mingle, and eat. TVCoG knows how to hackathon, and you won’t be disappointed.

Thanks to [Duncan Crary] for the heads up on this.

Augmented Reality Sandbox Using a Kinect

Want to make all your 5-year-old son’s friends jealous? What if he told them he could make REAL volcanoes in his sandbox? Will this be the future of sandboxes, digitally enhanced with augmented reality?

It’s not actually that hard to set up! The system consists of a good computer running Linux, a Kinect, a projector, a sandbox, and sand. And that’s it! The University of California, Davis (UC Davis) has set up a few of these systems to teach children about geography, which is a really cool demonstration of both 3D scanning and projection mapping. As you can see in the animated gif above, the Kinect tracks the topography of the sand, and the projector maps a matching “reality” onto it. In this case, a mini volcano.
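At its core, the software loop is a depth-to-elevation-to-color pipeline: read the Kinect depth frame, convert it to sand height, and hand the projector a colored image to throw back down. Here’s a toy Python sketch of that mapping; the synthetic depth frame and color bands are our own illustration and leave out the contour lines and water simulation that the real UC Davis software adds:

```python
# Turn a Kinect-style depth image into an elevation colour map for the
# projector. A synthetic depth frame stands in for real Kinect data.
import numpy as np

def elevation_to_color(depth_mm, table_mm=1000, sand_range_mm=200):
    """Map depth (mm from the overhead sensor) to RGB: blue lows, green mid, red peaks."""
    # Closer to the overhead sensor means higher sand, so invert the reading.
    height = np.clip((table_mm - depth_mm) / sand_range_mm, 0.0, 1.0)
    rgb = np.zeros(depth_mm.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = (height * 255).astype(np.uint8)            # red grows with height
    rgb[..., 2] = ((1.0 - height) * 255).astype(np.uint8)    # blue marks the lows
    rgb[..., 1] = (np.minimum(height, 1.0 - height) * 2 * 255).astype(np.uint8)  # green mid-range
    return rgb

# Fake depth frame: a single mound of sand in the middle of the box.
y, x = np.mgrid[0:480, 0:640]
depth = 1000 - 180 * np.exp(-(((x - 320) / 120) ** 2 + ((y - 240) / 120) ** 2))
frame = elevation_to_color(depth)
print(frame.shape, frame.dtype)  # (480, 640, 3) uint8, ready to hand to the projector
```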

Continue reading “Augmented Reality Sandbox Using a Kinect”

[Jeri] spills the beans on her AR glasses

In the last year, [Jeri Ellsworth] has been very busy. She was hired by Valve, started development of an augmented reality system, was fired by Valve, and then started a new company with [Rick Johnson] to bring her augmented reality glasses to market. On the latest Amp Hour podcast she spilled the beans on what went down at Valve, how her glasses work, and what her plans are for the future.

[Jeri] and [Rick]’s castAR glasses aren’t virtual reality glasses like the Oculus Rift or other systems that cut you off from the real world. The castAR glasses preserve your peripheral vision by projecting images and objects onto a gray retro-reflective mat, and they let you interact with the virtual environment using an electronic wand. So far, there are a few demos for the castAR system: a Jenga clone and a game of battle chess called Team For Chess, a wonderful reference to Valve’s hat simulator.

The electronics inside the castAR glasses are fairly impressive: new frames are drawn on the retro-reflective surface at 100 Hz, positioning accuracy is in the sub-millimeter range, and thanks to [Jeri]’s clever engineering, the entire system should be priced at about $200. Not too bad for an awesome device that can be used not only for D&D and Warhammer, but also for some very cool practical applications like visualizing engineering models of 3D prints before they’re printed.

Pool-playing robot + ARpool

Enjoy losing at pool? Well, the folks at Queen’s University just made it a whole lot easier. The Deep Green robot was created with the purpose of playing a flawless game, allowing it to beat even the most skilled human players. More than a couple of research papers have been written on the project. A ceiling-mounted Canon 350D tracks the position of all of the balls, aided by another cue-mounted camera for higher shot accuracy. Using a bunch of calculations, and a computer (probably more advanced than an Arduino), Deep Green is able to strategize and play. Very well.

On a positive note, another team from Queen’s is working on a separate but related project: ARpool (as in augmented reality). It was created to make playing pool easier. The website does not provide much info, but it seems to project suggested shots onto the pool table, allowing an inexperienced player to tell whether a given shot is even possible.
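Neither team spells out its math in the pages linked here, but the core geometry both projects have to solve is the classic “ghost ball” calculation: find where the cue ball must contact the object ball to send it toward a pocket, then check the resulting cut angle to decide whether the shot is makeable. Here’s a rough Python sketch of that calculation; the positions and the 85-degree feasibility threshold are illustrative assumptions, not anything from the Deep Green or ARpool work:

```python
# "Ghost ball" shot planning: where must the cue ball be at contact to send
# the object ball toward the pocket, and is the cut angle shallow enough?
import math

BALL_RADIUS = 0.028575  # standard pool ball radius in metres

def plan_shot(cue, obj, pocket, max_cut_deg=85):
    """Return (ghost-ball position, cut angle in degrees, feasible?)."""
    # Unit vector from the object ball toward the pocket.
    dx, dy = pocket[0] - obj[0], pocket[1] - obj[1]
    dist = math.hypot(dx, dy)
    ux, uy = dx / dist, dy / dist

    # The cue ball must occupy the "ghost" position: one ball diameter
    # behind the object ball along the pocket line at the moment of contact.
    ghost = (obj[0] - 2 * BALL_RADIUS * ux, obj[1] - 2 * BALL_RADIUS * uy)

    # Cut angle = angle between the cue->ghost line and the object->pocket line.
    aim_x, aim_y = ghost[0] - cue[0], ghost[1] - cue[1]
    aim_len = math.hypot(aim_x, aim_y)
    cos_cut = (aim_x * ux + aim_y * uy) / aim_len
    cut_deg = math.degrees(math.acos(max(-1.0, min(1.0, cos_cut))))

    return ghost, cut_deg, cut_deg <= max_cut_deg

ghost, cut, ok = plan_shot(cue=(0.5, 0.4), obj=(1.2, 0.7), pocket=(2.0, 1.0))
print(f"Aim at {ghost}, cut angle {cut:.1f} degrees, feasible: {ok}")
```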