Get Your Game On: Troy’s TVCoG Hosts VR And Gaming Hackathon

Troy, New York’s Tech Valley Center of Gravity is following up its January IoT Hackathon with another installment. The April 16-17 event promises to be a doozy, and anyone close to the area with even a passing interest in gaming and AR/VR should really make an effort to be there.

Not content to just be a caffeine-fueled creative burst, TVCoG is raising the bar in a couple ways. First, they’re teaming up with some corporate sponsors with a strong presence in the VR and AR fields. Daydream.io, a new company based in the same building as the CoG, is contributing a bunch of its Daydream.VR smartphone headsets to hackathon attendees, as well as mentors to get your project up and running. Other sponsors include 1st Playable Productions and Vicarious Visions, game studios both located in the Troy area. And to draw in the hardcore game programmers, a concurrent Ludum Dare game jam will be run by the Tech Valley Game Space, with interaction and collaboration between the AR/VR hackers and the programmers encouraged. Teams will compete for $1000 in prizes and other giveaways.

This sounds like it’s going to be an amazing chance to hack, to collaborate, and to make connections in the growing AR/VR field. And did we mention the food? There was a ton of it last time, so much they were begging us to take it home on Sunday night. Go, hack, create, mingle, and eat. TVCoG knows how to hackathon, and you won’t be disappointed.

Thanks to [Duncan Crary] for the heads up on this.

Augmented Reality Ultrasound

Think of Virtual Reality and it’s mostly fun and games that come to mind. But there are plenty of useful, real-world applications that will soon open up exciting possibilities in areas such as medicine. [Victor] from the Shackspace hackerspace in Stuttgart built an Augmented Reality ultrasound scanning application to demonstrate one such possibility.

But first off, we cannot get over how it’s possible to go dumpster diving and return with a functional ultrasound machine! That’s what member [Alf] turned up with one day. After some initial excitement at its novelty, it was relegated to a corner to gather dust. When [Victor] spotted it, he asked to borrow it for a project. Shackspace was happy to donate it to him and free up some space. Some time later, [Victor] showed off what he did with the ultrasound machine.

As soon as the app recognizes the ultrasound scanner, likely via the image target taped to the scan head, the scanner’s output is projected virtually beneath the probe. There isn’t much detail on how he did it, but it was built using the Vuforia SDK, which helps build applications for mobile devices and digital eyewear, in conjunction with the Unity 5 cross-platform game engine. Check out the video to see it in action.
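
Vuforia and Unity handle the tracking and rendering internally, so this isn’t [Victor]’s code, but the general trick — find a known image in the camera frame, then warp your data onto it — can be sketched with plain OpenCV. Everything here is illustrative; the marker corners are assumed to come from whatever detector you use:

```python
import cv2
import numpy as np

# Illustrative only: given the four corners of a tracked image target in the
# camera frame, warp the ultrasound image onto it with a homography and
# composite the result. Both frames are assumed to be 3-channel BGR images.

def overlay_ultrasound(camera_frame, ultrasound_frame, marker_corners):
    """marker_corners: 4x2 array of the target's corners in camera pixels,
    ordered top-left, top-right, bottom-right, bottom-left."""
    h, w = ultrasound_frame.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    H = cv2.getPerspectiveTransform(src, np.float32(marker_corners))

    out_size = (camera_frame.shape[1], camera_frame.shape[0])
    warped = cv2.warpPerspective(ultrasound_frame, H, out_size)

    # Mask of where the warped image landed, used to composite it on top.
    mask = cv2.warpPerspective(np.full((h, w), 255, np.uint8), H, out_size)
    out = camera_frame.copy()
    out[mask > 0] = warped[mask > 0]
    return out
```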

Thanks to [hadez] for sending in this link.

Augmented Reality Becomes Useful, Real

The state of augmented reality is terrible. Despite everyone having handheld, portable computers with high-resolution cameras, no one has yet built ‘Minecraft with digital blocks in real life’, and the most exciting upcoming use for augmented reality is 3D Dungeons and Dragons. There are plenty of interesting things that can be done with augmented reality; the problem is that someone needs to figure out what those things are. Lucky for us, the MIT Media Lab knocked it out of the park with the ability to program anything through augmented reality.

The Reality Editor is a simple idea, but one that is extraordinarily interesting. Objects all around you are marked with a design that can be easily read by a smartphone running a computer vision application. In augmented reality, these objects have buttons and dials that can be used to turn on a lamp, open a car’s window, or control anything else that can be reached over the Internet. It’s augmented reality buttons for everything.

This basic idea is simple, but by combining it with another oft-forgotten technology from the 90s, we get something really, really cool. The buttons on each of the objects can be connected together with a sort of graphical programming language. Scan a button, connect the button to a lamp, and you’re able to program the lamp with augmented reality.
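
We don’t have the details of the Reality Editor’s internals, but the data-flow idea behind “connect a button to a lamp” is easy to picture. Here’s a hypothetical sketch — none of these names come from the actual project — where each object exposes named values and a link copies a value from one object to another whenever it changes:

```python
# Hypothetical data-flow sketch, not the Reality Editor's actual API.

class ARObject:
    def __init__(self, name):
        self.name = name
        self.values = {}          # e.g. {"power": 0.0}
        self.links = []           # (own_key, target_object, target_key)

    def link(self, own_key, target, target_key):
        """Route one of this object's values to another object's input."""
        self.links.append((own_key, target, target_key))

    def write(self, key, value):
        """Called when the user taps or drags the AR control for this value."""
        self.values[key] = value
        print(f"{self.name}.{key} = {value}")
        for own_key, target, target_key in self.links:
            if own_key == key:
                target.write(target_key, value)   # propagate along the link

# "Scan a button, connect the button to a lamp":
switch = ARObject("wall_switch")
lamp = ARObject("desk_lamp")
switch.link("power", lamp, "power")
switch.write("power", 1.0)   # tapping the AR button also turns the lamp on
```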

The Reality Editor is already available on the Apple App Store, and there are a number of examples available for people to start tinkering with this weird yet interesting means of interacting with the world. If you’ve ever wondered how we’re going to interact with the Internet of Things, there you have it. Video below.

Resistance Is… There’s An Augmented Reality App For That!

Like many engineers of a certain age, I learned the resistor color code using a mnemonic device so politically incorrect that only Tosh might venture to utter it in public today. When teaching kids, I have to resort to the old Radio Shack standby: Big Boys Race Our Young Girls But Violet Generally Wins. It doesn’t really roll off the tongue or beg to be remembered. Maybe: Bad Beer Rots Our Young Guts But Vodka Goes Well. But again, when teaching kids that’s probably not ideal either.

Maybe you can forget all those old memory crutches. For one thing, the world’s going surface-mount, and color-coded resistors are becoming a thing of the past. However, if you really need to read the color code, there are at least three apps on the Google Play Store that try to do the job. The latest one is ScanR, although there are also Resistor Scanner and Resistor Scan. If you use an iPhone, you might try this app, although not being an Apple guy, I can’t give you my feedback on that one.
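
The computer vision is what those apps bring to the table; the lookup they automate is the easy part. A minimal sketch of decoding a standard four-band resistor (finding the bands in a camera frame is not shown, and gold/silver multipliers are left out):

```python
# Minimal sketch of the color-code lookup behind these apps.

DIGITS = {"black": 0, "brown": 1, "red": 2, "orange": 3, "yellow": 4,
          "green": 5, "blue": 6, "violet": 7, "grey": 8, "white": 9}
TOLERANCE = {"brown": "1%", "red": "2%", "gold": "5%", "silver": "10%"}

def decode_four_band(band1, band2, multiplier, tolerance="gold"):
    """Two significant digits, a power-of-ten multiplier, and a tolerance band."""
    value = (10 * DIGITS[band1] + DIGITS[band2]) * 10 ** DIGITS[multiplier]
    return value, TOLERANCE[tolerance]

# Big Boys Race Our Young Girls... or just let the code remember it:
ohms, tol = decode_four_band("yellow", "violet", "red")   # 4.7 kilohm part
print(f"{ohms} ohms, {tol}")                              # 4700 ohms, 5%
```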

Augmented Reality Sandbox Using A Kinect

Want to make all your five-year-old son’s friends jealous? What if he told them he could make REAL volcanoes in his sandbox? Will this be the future of sandboxes, digitally enhanced with augmented reality?

It’s not actually that hard to set up! The system consists of a good computer running Linux, a Kinect, a projector, a sandbox, and sand. And that’s it! The University of California, Davis (UC Davis) has set up a few of these systems now to teach children about geography, which is a really cool demonstration of both 3D scanning and projection mapping. As you can see in the animated GIF above, the Kinect tracks the topography of the sand, and the projector then maps a matching “reality” onto it. In this case, a mini volcano.
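
The real UC Davis software does a lot more (calibration between the Kinect and the projector, for a start), but the core mapping is simple: turn depth into height, and height into color. A rough sketch, assuming a depth frame in millimeters and hand-waving the calibration entirely:

```python
import numpy as np
import matplotlib.cm as cm

# Rough sketch: map sand height (closer to the Kinect = higher) onto a
# terrain colormap to produce the image the projector throws back onto
# the sand. Assumes a Kinect depth frame in millimeters.

def depth_to_elevation_colors(depth_mm, table_mm=1100.0, peak_mm=900.0):
    """Normalize height between the empty table and the tallest allowed peak."""
    height = np.clip((table_mm - depth_mm) / (table_mm - peak_mm), 0.0, 1.0)
    rgba = cm.terrain(height)            # low sand = blue/green, peaks = white
    return (rgba[..., :3] * 255).astype(np.uint8)

# Fake a 480x640 depth frame with a bump in the middle to test the mapping:
y, x = np.mgrid[0:480, 0:640]
fake_depth = 1100 - 150 * np.exp(-(((x - 320) / 80.0) ** 2 + ((y - 240) / 80.0) ** 2))
projector_image = depth_to_elevation_colors(fake_depth)
```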

Open Hybrid Gives You The Knobs And Buttons To Your Digital Kingdom

With the sweeping wave of complexity that comes with new appliance tech, it’s easy to start grumbling over having to pull your phone out every time you want to turn the kitchen lights on. [Valentin] realized that our new interfaces aren’t making our lives much simpler, and he and the folks at the MIT Media Lab have developed a solution.

Open Hybrid takes the interface out of the phone app and superimposes it directly onto the items we want to operate in real life. The Open Hybrid interface is viewed through the lens of a tablet or smart mobile device. With a real-time video stream, interactive knobs and buttons superimpose themselves on the objects they control. In one example, holding a tablet up to a light brings up a color palette for color control. In another, sliders superimposed on a Mindstorms tank-drive toy become the control panel for driving the vehicle around the floor. Object behaviors can even be tied together, so that an action applied to one object, such as turning off one light, also applies to other objects, in this case putting all the other lights out.

Beneath the surface, Open Hybrid is built on OpenFrameworks, with the hardware interface handled by an Arduino Yún running custom firmware. Creating a new application, though, has been simplified to the point that it’s achievable with web-friendly languages (HTML, JavaScript, and CSS). The net result is that their toolchain removes the need for extensive graphics programming knowledge when developing a new control panel.

If you can spare a few minutes, check out [Valentin’s] SolidCon talk on the drive to design new digital interfaces that echo those we’ve already been using for hundreds of years.

Last but not least, Open Hybrid may have been born in the Media Lab, but its evolution is up to the community, since the entire project is both platform-independent and open source.

Sure, it’s not mustaches, but it’s definitely more user-friendly.

An Introduction To Valve’s Tracking Hardware

[Alan Yates] brought a demo of Valve’s new VR tech that’s the basis of the HTC Vive system to Maker Faire this year. It’s exceptionally clever, and compared to existing VR headsets it’s probably one of the best headtracking solutions out there.

With VR headsets, the problem isn’t putting two displays in front of the user’s eyes. The problem is quickly and accurately determining where the user is looking. IMUs and image-processing techniques can be used with varying degrees of success, but to do it right, tracking needs to be really fast and really cheap.

[Alan] and Valve’s ‘Lighthouse’ tracking unit does this by placing a dozen or so IR photodiodes on the headset itself. On the tracking base station, IR lasers sweep across the X and Y axes. By timing when these sweeps hit the VR headset, the headset’s angle relative to the base station can be computed in just a few cycles of a microcontroller. With a bunch of one-cent photodiodes, absolute angles and orientation relative to a base station can be determined very easily, something that has some pretty incredible applications for everything from VR to robotics.
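
The math behind that is refreshingly simple. Here’s a rough sketch of a single sweep measurement, assuming a rotor spinning at 60 Hz and a sync flash marking the start of every revolution (the real system interleaves horizontal and vertical sweeps, and can use two base stations):

```python
import math

# Simplified sketch: the base station flashes a sync pulse, then sweeps a
# laser line across the room at a known angular rate. Each photodiode just
# timestamps the sync flash and the moment the sweep hits it; the delay
# between the two is the angle. (60 Hz rotor rate is an assumption here.)

SWEEP_PERIOD_S = 1.0 / 60.0          # one full 360-degree rotor revolution

def sweep_angle(t_sync, t_hit, period=SWEEP_PERIOD_S):
    """Angle of the photodiode from the start of the sweep, in radians."""
    return 2.0 * math.pi * ((t_hit - t_sync) % period) / period

# Example: a diode lit up 2.8 ms after the sync flash
angle = sweep_angle(t_sync=0.0, t_hit=0.0028)
print(f"{math.degrees(angle):.1f} degrees")   # about 60.5 degrees

# A horizontal plus a vertical sweep give two angles per diode, i.e. a ray
# from the base station. With several diodes at known positions on the
# headset, the full pose can be solved, much like a perspective-n-point
# problem in computer vision.
```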

Remember all of the position-tracking hacks that came out as a result of the Nintendo Wii using IR beacons and a tracking camera? This seems like an evolutionary leap forward in the same realm, and we can’t wait to see people hacking on this tech!