Augmented reality is all the rage right now, and it’s all because of Pokemon. Of course, this means the entire idea of augmented reality is now wrapped up in taking pictures of Pidgeys in their unnatural setting. There are more useful applications of augmented reality, as [vijayvictory]’s Hackaday Prize entry shows us. He’s built an augmented reality helmet for firefighters that will detect temperature, gases, smoke, and the user’s own vital signs, displaying the readings on a heads-up display.
The core of the build is a Particle Photon, a WiFi-enabled microcontroller that also gives this helmet the ability to relay data back to a base station, ostensibly one that’s not on fire. To this, [vijayvictory] has added an accelerometer, gas sensor, and a beautiful OLED display mounted just behind a prism. This display overlays the relevant data to the firefighter without obstructing their field of vision.
Right now, this system is fairly basic, but [vijayvictory] has a few more tricks up his sleeve. By expanding this system to include a FLIR thermal imaging sensor, this augmented reality helmet will have the ability to see through smoke. By integrating this system into an existing network and adding a few cool WiFi tricks, this system will be able to locate a downed firefighter using signal trilateration. It’s a very cool device, and one that should be very useful, making it a great entry for The Hackaday Prize.
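The math behind that kind of WiFi location-finding is straightforward once you have distance estimates from three fixed access points: each distance defines a circle, and the firefighter sits where the circles meet. Here’s a minimal 2D sketch of that idea; the access point positions and distances are illustrative, and a real system would first have to derive distances from signal strength, which is the hard part.

```python
# 2D trilateration: solve for (x, y) given distances d1..d3 from
# three access points at known positions. Subtracting the circle
# equations pairwise yields two linear equations in x and y.
def trilaterate(p1, p2, p3, d1, d2, d3):
    x1, y1 = p1
    x2, y2 = p2
    x3, y3 = p3
    a = 2 * (x2 - x1)
    b = 2 * (y2 - y1)
    c = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    d = 2 * (x3 - x2)
    e = 2 * (y3 - y2)
    f = d2**2 - d3**2 - x2**2 + x3**2 - y2**2 + y3**2
    # Solve the 2x2 linear system by Cramer's rule.
    det = a * e - b * d
    x = (c * e - b * f) / det
    y = (a * f - c * d) / det
    return x, y

# A point equidistant from (0,0), (10,0), and (0,10) lands at (5, 5).
print(trilaterate((0, 0), (10, 0), (0, 10), 50**0.5, 50**0.5, 50**0.5))
```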
Just in case anyone secretly had the idea that Valve Software’s VR and other hardware somehow sprang fully-formed from a lab, here are some great photos and video of early prototypes, and interviews with the people who made them. Some of the hardware is quite raw-looking, some of it is recognizable, and some are from directions that were explored but went nowhere, but it’s all fascinating.
The accompanying video (embedded below) has some great background and stories about the research process, which began with a mandate to explore the concepts of AR and VR and determine what could be done and what was holding things back.
One good peek into this process is the piece of hardware shown to the left. You look into the lens end like a little telescope. It has a projector that beams an image directly into your eye, and it has camera-based tracking that updates that image extremely quickly.
The result is a device that lets you look through a little window into a completely different world. In the video (2:16) one of the developers says “It really taught us just how important tracking was. No matter [how you moved] it was essentially perfect. It was really the first glimpse we had into what could be achieved if you had very low persistence displays, and very good tracking.” That set the direction for the research that followed.
Troy, New York’s Tech Valley Center of Gravity is following up their January IoT Hackathon with another installment. The April 16-17 event promises to be a doozy, and anyone close to the area with even a passing interest in gaming and AR/VR should really make an effort to be there.
Not content to just be a caffeine-fueled creative burst, TVCoG is raising the bar in a couple ways. First, they’re teaming up with some corporate sponsors with a strong presence in the VR and AR fields. Daydream.io, a new company based in the same building as the CoG, is contributing a bunch of its Daydream.VR smartphone headsets to hackathon attendees, as well as mentors to get your project up and running. Other sponsors include 1st Playable Productions and Vicarious Visions, game studios both located in the Troy area. And to draw in the hardcore game programmers, a concurrent Ludum Dare game jam will be run by the Tech Valley Game Space, with interaction and collaboration between the AR/VR hackers and the programmers encouraged. Teams will compete for $1000 in prizes and other giveaways.
This sounds like it’s going to be an amazing chance to hack, to collaborate, and to make connections in the growing AR/VR field. And did we mention the food? There was a ton of it last time, so much they were begging us to take it home on Sunday night. Go, hack, create, mingle, and eat. TVCoG knows how to hackathon, and you won’t be disappointed.
Thanks to [Duncan Crary] for the heads up on this.
Think of Virtual Reality and it’s mostly fun and games that come to mind. But there are a lot of useful, real-world applications that will soon open up exciting possibilities in areas such as medicine, for example. [Victor] from the Shackspace hacker space in Stuttgart built an Augmented Reality Ultrasound scanning application to demonstrate such possibilities.
But first off, we cannot get over how it’s possible to go dumpster diving and return with a functional ultrasound machine! That’s what member [Alf] turned up with one day. After some initial excitement at its novelty, it was relegated to a corner gathering dust. When [Victor] spotted it, he asked to borrow it for a project. Shackspace were happy to donate it to him and free up some space. Some time later, [Victor] showed off what he did with the ultrasound machine.
As soon as the ultrasound scanner registers with the VR app, possibly using the image taped to the scan sensor, the scanner data is projected virtually under the echo sensor. There isn’t much detail about how he did it, but it was done using the Vuforia SDK, which helps build applications for mobile devices and digital eyewear, in conjunction with the Unity 5 cross-platform game engine. Check out the video to see it in action.
The state of augmented reality is terrible. Despite everyone having handheld, portable computers with high-resolution cameras, no one has yet built ‘Minecraft with digital blocks in real life’, and the most exciting upcoming use for augmented reality is 3D Dungeons and Dragons. There are plenty of interesting things that can be done with augmented reality, the problem is someone needs to figure out what those things are. Lucky for us, the MIT Media Lab knocked it out of the park with the ability to program anything through augmented reality.
The Reality Editor is a simple idea, but one that is extraordinarily interesting. Objects all around you are marked with a design that can be easily read by a smartphone running a computer vision application. In augmented reality, these objects have buttons and dials that can be used to turn on a lamp, open a car’s window, or any other function that can be controlled over the Internet. It’s augmented reality buttons for everything.
This basic idea is simple, but by combining it with another oft-forgotten technology from the ’90s, we get something really, really cool. The buttons on each of the objects can be connected together with a sort of graphical programming language. Scan a button, connect the button to a lamp, and you’re able to program the lamp with augmented reality.
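Strip away the augmented reality layer and that linking step is just a table mapping a virtual button to a device action. Here’s a toy sketch of the concept; the `Lamp` class, the button IDs, and the command strings are all made up for illustration, and the real Reality Editor talks to devices over its own network protocol.

```python
# Toy version of "scan a button, connect it to a lamp": a link table
# maps a virtual button ID to a (device, command) pair.
class Lamp:
    def __init__(self, name):
        self.name = name
        self.is_on = False

    def handle(self, command):
        # Illustrative command handling; a real device would expose
        # a network endpoint instead of a local method call.
        self.is_on = (command == "on")
        return f"{self.name} is now {'on' if self.is_on else 'off'}"

links = {}

def connect(button_id, device, command):
    """The 'graphical programming' step: wire a button to an action."""
    links[button_id] = (device, command)

def press(button_id):
    """Later, pressing the AR button fires the linked action."""
    device, command = links[button_id]
    return device.handle(command)

desk_lamp = Lamp("desk lamp")
connect("button-42", desk_lamp, "on")
print(press("button-42"))  # prints "desk lamp is now on"
```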
The Reality Editor is already available on the Apple app store, and there are a number of examples available for people to start tinkering with this weird yet interesting means of interacting with the world. If you’ve ever wondered how we’re going to interact with the Internet of Things, there you have it. Video below.
Like many engineers of a certain age I learned the resistor color code using a mnemonic device that is so politically incorrect, only Tosh might venture to utter it in public today. When teaching kids, I have to resort to the old Radio Shack standby: Big Boys Race Our Young Girls But Violet Generally Wins. Doesn’t really roll off the tongue or beg to be remembered. Maybe: Bad Beer Rots Our Young Guts But Vodka Goes Well. But again, when teaching kids that’s probably not ideal either.
Maybe you can forget all those old memory crutches. For one thing, the world’s going surface mount, and color-coded resistors are becoming a thing of the past. However, if you really need to read the color code, there are at least three apps on the Google Play Store that try to do the job. The latest one is ScanR, although there is also Resistor Scanner and Resistor Scan. If you use an iPhone, you might try this app, although not being an Apple guy, I can’t give you my feedback on that one.
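Of course, the code those mnemonics encode is simple enough to write down in a few lines: each color maps to a digit in order from black through white, and the third band is a power-of-ten multiplier. A quick sketch for standard three-band values (tolerance bands left out for brevity):

```python
# Standard resistor color code: black=0 through white=9.
COLORS = ["black", "brown", "red", "orange", "yellow",
          "green", "blue", "violet", "grey", "white"]
DIGIT = {color: value for value, color in enumerate(COLORS)}

def resistance(band1, band2, multiplier):
    """Decode the first three bands of a resistor; value in ohms."""
    return (DIGIT[band1] * 10 + DIGIT[band2]) * 10 ** DIGIT[multiplier]

print(resistance("yellow", "violet", "red"))   # 4.7 kΩ -> 4700
print(resistance("brown", "black", "orange"))  # 10 kΩ -> 10000
```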
It’s not actually that hard to set up! The system consists of a good computer running Linux, a Kinect, a projector, a sandbox, and sand. And that’s it! The University of California (UC Davis) has set up a few of these systems now to teach children about geography, which is a really cool demonstration of both 3D scanning and projection mapping. As you can see in the animated gif above, the Kinect can track the topography of the sand, and then project its “reality” onto it. In this case, a mini volcano.
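Conceptually, the core loop is simple: read a depth frame, convert depth to height above the sandbox floor, then map height bands to topographic colors for the projector. Here’s a stripped-down sketch of just that mapping step; the depth values, the floor distance, and the elevation thresholds are all invented for illustration, and a real setup would pull frames from the Kinect (e.g. via libfreenect) and render through the projector.

```python
# Conceptual core of the AR sandbox: depth frame -> color map.
FLOOR_MM = 1000  # assumed distance from the sensor to the sandbox floor

def height_map(depth_frame):
    # Height above the floor: sand closer to the camera is taller.
    return [[FLOOR_MM - d for d in row] for row in depth_frame]

def colorize(heights):
    # Illustrative elevation bands, like a topographic map.
    def color(h):
        if h < 20:
            return "blue"    # "water"
        if h < 60:
            return "green"   # lowlands
        if h < 120:
            return "brown"   # hills
        return "white"       # peaks
    return [[color(h) for h in row] for row in heights]

# A fake 3x3 depth frame in millimeters, standing in for Kinect data.
frame = [[990, 950, 900],
         [960, 880, 840],
         [980, 930, 870]]
print(colorize(height_map(frame)))
```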