An experimental project to mix reality and virtual reality by [Drew Gottlieb] uses the Microsoft HoloLens and the HTC Vive to show two users successfully sharing a single workspace as well as controllers. While the VR user draws cubes in midair with a simple app, the HoloLens user can see the same cubes being created and mapped to a real-world location, and the two headsets can even interact in the same shared space. You really need to check out the video, below, to fully grasp how crazy-cool this is.
Two or more VR or AR users sharing the same virtual environment isn’t new, but anchoring that virtual environment into the real world in a way that two very different headsets can share is interesting to see. [Drew] says the real challenge wasn’t getting the different hardware to talk to each other; it was giving both devices a shared understanding of a common space. You can see his solution in the video embedded below.
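The core idea behind a shared space is that each headset stops describing objects in its own local coordinates and instead describes them relative to a common real-world anchor. As a rough sketch (my own illustration, not [Drew]’s actual code), here is how a point on the floor plane could be re-expressed in an anchor’s frame, given the anchor’s position and yaw as seen by one headset:

```python
import math

def to_shared_frame(point, anchor_pos, anchor_yaw):
    """Express a 2-D floor-plane point (x, z) from a headset's local
    frame in a shared frame defined by a real-world anchor.

    anchor_pos and anchor_yaw are that headset's estimate of where the
    anchor sits and which way it faces. Once both headsets do this,
    the same drawn cube gets the same coordinates on both devices.
    """
    # Translate so the anchor is the origin...
    x = point[0] - anchor_pos[0]
    z = point[1] - anchor_pos[1]
    # ...then rotate by -yaw so both devices agree on the axes.
    c, s = math.cos(-anchor_yaw), math.sin(-anchor_yaw)
    return (c * x - s * z, s * x + c * z)
```

A real implementation works in full 3-D with quaternions, but the translate-then-rotate step is the same.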
Just in case anyone secretly had the idea that Valve Software’s VR and other hardware somehow sprang fully-formed from a lab, here are some great photos and video of early prototypes, along with interviews with the people who made them. Some of the hardware is quite raw-looking, some of it is recognizable, and some of it is from directions that were explored but went nowhere, but it’s all fascinating.
The accompanying video (embedded below) has some great background and stories about the research process, which began with a mandate to explore the concepts of AR and VR and determine what could be done and what was holding things back.
One good peek into this process is the piece of hardware shown to the left. You look into the lens end like a little telescope. It has a projector that beams an image directly into your eye, and it has camera-based tracking that updates that image extremely quickly.
The result is a device that lets you look through a little window into a completely different world. In the video (2:16) one of the developers says “It really taught us just how important tracking was. No matter [how you moved] it was essentially perfect. It was really the first glimpse we had into what could be achieved if you had very low persistence displays, and very good tracking.” That set the direction for the research that followed.
Troy, New York’s Tech Valley Center of Gravity is following up its January IoT Hackathon with another installment. The April 16-17 event promises to be a doozy, and anyone close to the area with even a passing interest in gaming and AR/VR should really make an effort to be there.
Not content to just be a caffeine-fueled creative burst, TVCoG is raising the bar in a couple ways. First, they’re teaming up with some corporate sponsors with a strong presence in the VR and AR fields. Daydream.io, a new company based in the same building as the CoG, is contributing a bunch of its Daydream.VR smartphone headsets to hackathon attendees, as well as mentors to get your project up and running. Other sponsors include 1st Playable Productions and Vicarious Visions, game studios both located in the Troy area. And to draw in the hardcore game programmers, a concurrent Ludum Dare game jam will be run by the Tech Valley Game Space, with interaction and collaboration between the AR/VR hackers and the programmers encouraged. Teams will compete for $1000 in prizes and other giveaways.
This sounds like it’s going to be an amazing chance to hack, to collaborate, and to make connections in the growing AR/VR field. And did we mention the food? There was a ton of it last time, so much they were begging us to take it home on Sunday night. Go, hack, create, mingle, and eat. TVCoG knows how to hackathon, and you won’t be disappointed.
Thanks to [Duncan Crary] for the heads up on this.
It’s not actually that hard to set up! The system consists of a good computer running Linux, a Kinect, a projector, a sandbox, and sand. And that’s it! The University of California, Davis (UC Davis) has set up a few of these systems now to teach children about geography, which is a really cool demonstration of both 3D scanning and projection mapping. As you can see in the animated gif above, the Kinect can track the topography of the sand, and then project its “reality” onto it. In this case, a mini volcano.
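The software side boils down to turning each Kinect depth sample into a terrain color for the projector. As a minimal sketch, with made-up thresholds and depth range rather than the UC Davis calibration:

```python
def elevation_color(depth_mm, min_mm=800.0, max_mm=1200.0):
    """Map one Kinect depth sample (mm from the sensor) to a terrain color.

    The sensor looks down at the sandbox, so sand closer to it is
    *higher*: smaller depth means higher elevation. The range and
    color thresholds here are illustrative placeholders.
    """
    # Normalize: 0.0 = lowest sand (far from sensor), 1.0 = highest (near)
    h = (max_mm - depth_mm) / (max_mm - min_mm)
    h = max(0.0, min(1.0, h))
    if h < 0.3:
        return (30, 90, 200)    # low areas rendered as water (blue)
    elif h < 0.7:
        return (60, 160, 60)    # mid elevations as land (green)
    return (150, 120, 90)       # peaks as mountain (brown)
```

A real system does this per pixel over the whole depth frame (vectorized, not in a Python loop) and warps the result through a projector calibration so the colors land on the right spots in the sand.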
In the last year, [Jeri Ellsworth] has been very busy. She was hired by Valve, started development of an augmented reality system, fired by Valve, and started a new company with [Rick Johnson] to bring her augmented reality glasses to the market. On the last Amp Hour podcast she spilled the beans on what went down at Valve, how her glasses work, and what her plans for the future are.
[Jeri] and [Rick]’s castAR glasses aren’t virtual reality glasses like the Oculus Rift or other VR headsets that cut you off from the real world. The castAR glasses preserve your peripheral vision by projecting images and objects onto a gray retro-reflective mat, and they allow you to interact with a virtual environment using an electronic wand. So far, there are a few demos for the castAR system: a Jenga clone, and a game of battle chess called Team For Chess, a wonderful reference to Valve’s hat simulator.
The electronics inside the castAR glasses are fairly impressive; new frames are drawn on the retro-reflective surface at 100 Hz, positioning accuracy is in the sub-millimeter range, and thanks to [Jeri]’s clever engineering the entire system should be priced at about $200. Not too bad for an awesome device that can be used not only for D&D and Warhammer, but also for some very cool practical applications like visualizing engineering models of 3D prints before they’re printed.
Enjoy losing at pool? Well, the folks at Queen’s University just made it a whole lot easier. The Deep Green robot was created with the purpose of playing a flawless game, allowing it to beat even the most skilled human players. More than a couple of research papers have been written on the project. A ceiling-mounted Canon 350D tracks the position of all of the balls, along with a cue-mounted camera for higher shot accuracy. Using a bunch of calculations and a computer (probably more advanced than an Arduino), Deep Green is able to strategize and play. Very well.
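The papers cover the full shot-planning physics, but the starting point for any pool robot is the classic “ghost ball” aiming geometry (shown here as a general illustration, not necessarily Deep Green’s planner): the cue ball’s center must arrive exactly one ball diameter behind the object ball, on the line from the pocket through the object ball.

```python
import math

BALL_DIAMETER_MM = 57.15  # standard pool ball

def ghost_ball_target(object_ball, pocket, d=BALL_DIAMETER_MM):
    """Return the point the cue ball's center must reach to send the
    object ball toward the pocket (ghost-ball aiming).

    At contact the two ball centers are one diameter apart, so step
    back one diameter from the object ball, away from the pocket.
    """
    ox, oy = object_ball
    px, py = pocket
    dx, dy = px - ox, py - oy
    dist = math.hypot(dx, dy)
    return (ox - d * dx / dist, oy - d * dy / dist)
```

From there the hard parts begin: driving the cue so the ball actually rolls through that point, and choosing which shot leaves the best position for the next one.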
On a positive note, another team from Queen’s is working on a separate but related project: ARpool (as in augmented reality). It was created to make playing pool easier. The website does not provide much info, but it seems to project suggested shots onto the pool table, allowing an inexperienced player to tell whether a shot is at all possible.
[onezerothrice] has been working hard on creating ARtisan, a Flash library for bringing augmented reality to the browser. His goal in creating the library was to make AR projects quicker and easier to develop. The library can provide the location, size, and rotation of multiple markers on screen with little work from the developer. It is licensed under the GPL and comes bundled with Papervision3D, another Flash library for manipulating objects in three dimensions. He has posted several demos with source and accompanying video.
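ARtisan itself is ActionScript, but the per-marker data it exposes is easy to picture. As a language-neutral toy (Python here, with the corner detection that the library actually performs taken as a given), location, size, and rotation all fall out of a marker’s four detected corner points:

```python
import math

def marker_pose(corners):
    """Estimate the 2-D pose of a square marker from its four corners,
    given in order (top-left, top-right, bottom-right, bottom-left).

    Returns (center_x, center_y, size, rotation_radians).
    """
    xs = [c[0] for c in corners]
    ys = [c[1] for c in corners]
    # Location: centroid of the four corners
    cx, cy = sum(xs) / 4.0, sum(ys) / 4.0
    # Size: average length of the four edges
    edges = [math.hypot(corners[(i + 1) % 4][0] - corners[i][0],
                        corners[(i + 1) % 4][1] - corners[i][1])
             for i in range(4)]
    size = sum(edges) / 4.0
    # Rotation: angle of the top edge relative to the screen x-axis
    tl, tr = corners[0], corners[1]
    rot = math.atan2(tr[1] - tl[1], tr[0] - tl[0])
    return cx, cy, size, rot
```

With those values per marker per frame, attaching a Papervision3D object to a marker is just a matter of feeding them into the object’s transform.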