Oculus Rift and Wii Balance Board make Hoverboards a (Virtual) Reality

It’s almost 2015 and we still don’t have the futuristic technology promised to us by Back to the Future Part II. Where are the flying cars, the Mr. Fusions, or the 19 Jaws movies? Most importantly, where are our hoverboards?

[cratesmith] got tired of waiting around and decided to take matters into his own hands. He combined the Oculus Rift virtual reality headset with the Wii Balance Board to create a virtual hoverboard experience, building the game in Unity3D (a favorite engine among Rift developers). It’s a very rough demo right now, but it comes complete with a simulated town to float around in and, of course, a model DeLorean.

Before you try to play this demo, you should know that it’s not without its faults. The primary problem [cratesmith] has experienced is with simulation sickness. His virtual reality system has no way to track body motion, which means that leaning back and forth on the Wii Fit board does not get translated to the equivalent virtual movement. The game must assume that the player stands straight up at all times, which is not an intuitive way to control something similar to a skateboard. The result is an off-putting experience that can break immersion and lead to a feeling of nausea.

A possible solution to this problem would be to use a camera-style motion detector like the Microsoft Kinect. In fact, another Reddit user recently posted a teaser video of another hoverboard simulator that uses the Oculus Rift, Wii Balance Board, and Kinect. Not much information is available about this second project, but we look forward to seeing updates in the future.

[cratesmith] has not published the code for his demo because it’s still in the very early stages, but he has stated that he’s been giving it out to anyone who goes out of their way to ask. The hoverboard is probably the most coveted fictional technology from the 1989 adventure film. We know this because we’ve seen multiple projects over the years that were inspired by the movie. We’re excited to see it come to fruition in any form.

[via Reddit]

Virtual Reality Gets Real with 3 Kinect Cameras


No, that isn’t a scene from a horror movie up there, it’s [Oliver Kreylos'] avatar in a 3D office environment. If he looks a bit strange, it’s because he’s wearing an Oculus Rift, and his image is being stitched together from 3 Microsoft Kinect cameras.

[Oliver] has created a 3D environment which is incredibly realistic, at least to the wearer. He believes the secret is in the low latency of the entire system. When coupled with a good 3D environment, like the office shown above, the mind is tricked into believing it is really in the room. [Oliver] mentions that he finds himself subconsciously moving to avoid bumping into a table leg that he knows isn’t there. In [Oliver]’s words, “It circumnavigates the uncanny valley.”

Instead of pulling skeleton data from the 3 Kinect cameras, [Oliver] is using video and depth data. He’s stitching and processing this data on an i7 Linux box with an Nvidia Geforce GTX 770 video card. Powerful hardware for sure, but not the cutting edge monster rig one might expect. [Oliver] also documented his software stack. He’s using Vrui VR Toolkit, the Kinect 3D Video Capture Project, and the Collaboration Infrastructure.
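For a sense of what working from raw video and depth data involves, here’s a minimal Python sketch of the back-projection step such a pipeline needs: each depth pixel becomes a 3D point in its camera’s frame, and each camera’s calibrated pose then moves its points into a shared world frame. The intrinsic values below are generic Kinect-like placeholders, not [Oliver]’s actual calibration:

```python
# Back-project one Kinect depth pixel into 3D, then move it into a
# shared world frame. Intrinsics (fx, fy, cx, cy) are illustrative
# defaults, NOT calibrated values from this project.

def depth_to_point(u, v, depth_mm, fx=580.0, fy=580.0, cx=320.0, cy=240.0):
    """Pinhole-model back-projection of a depth pixel to camera-space metres."""
    z = depth_mm / 1000.0          # Kinect reports depth in millimetres
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return (x, y, z)

def transform_point(point, rotation, translation):
    """Apply a camera's extrinsic pose (3x3 rotation, 3-vector translation)
    to move a camera-space point into the shared world frame."""
    x, y, z = point
    return tuple(
        rotation[i][0] * x + rotation[i][1] * y + rotation[i][2] * z + translation[i]
        for i in range(3)
    )
```

Stitching three Kinects is then a matter of running every depth frame through these two steps with each camera’s own pose and merging the resulting point clouds.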

We can’t wait to see what [Oliver] does when he gets his hands on the Kinect One (and some good Linux drivers for it).


Kinect + Wiper Motor + LEGO = 3D Scanner


[Christopher] from [Backspace], the Bamberg, Germany hackerspace, wrote in to tell us about one of the group’s most recent projects. It’s a Kinect-based 3D scanner (translated) that has been made mostly from parts lying around the shop.

There are two main components to the hardware side of this build: the Kinect Stand and the Rotating Platform. The Kinect sits atop a platform made from LEGO pieces. This platform rides up and down an extruded-aluminum rail, powered by an old windshield wiper motor.

The Rotating Platform went through a couple of iterations. The first was an unpowered platform supported by five rollerblade wheels. The lack of automatic rotation didn’t work out so well for scanning, so out came another windshield wiper motor, which was strapped to an old office chair with the seat replaced by a piece of MDF. This setup may not be the best for the easily dizzied, but the scan results speak for themselves.


Computers Playing Flappy Bird. Skynet Imminent. Humans Flapping Arms.


After viral popularity, developer rage quits, and crazy eBay auctions, the world at large is just about done with Flappy Bird. Here at Hackaday, we can’t let it go without showcasing two more hacks. The first is the one that we’ve all been waiting for: a robot that will play the damn game for us. Your eyes don’t deceive you in that title image. The Flappy Bird bot is up to 147 points and going strong. [Shi Xuekun] and [Liu Yang], two hackers from China, have taken full responsibility for this hack.

They used OpenCV with a webcam on Ubuntu to determine the position of both the bird and the pipes. Once positions are known, the computer calculates the next move. When it’s time to flap, a signal is sent to an Arduino Mega 2560. The genius of this hack is the actuator. Most servos or motors would have been too slow for this application. [Shi] and [Liu] used the Arduino and a motor driver to activate a hard drive voice coil. The voice coil was fast enough to touch the screen at exactly the right time, but not so powerful as to smash their tablet.
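Stripped of the vision plumbing, the decision step boils down to a comparison like the sketch below. The margin constant and the screen-coordinate convention (y grows downward) are our own assumptions for illustration, not values from [Shi] and [Liu]’s code:

```python
# Hypothetical core of the Flappy Bird bot's decision step. Assumes the
# OpenCV stage has already found the bird and the centre of the next
# pipe gap in screen pixels, with y increasing toward the bottom.
# The 15 px margin is an invented tuning constant.

def should_flap(bird_y, gap_center_y, margin=15):
    """Flap when the bird has sunk more than `margin` pixels below the gap centre."""
    return bird_y > gap_center_y + margin
```

When this returns True, the PC would fire the serial signal that makes the Arduino pulse the voice coil against the screen.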

If you would like to make flapping a bit more of a physical affair, [Jérémie] created Flappy Bird with Kinect. He wrote a quick Processing sketch which uses the Microsoft Kinect to look for humans flapping their arms. If flapping is detected, a command is sent to an Android tablet. [Jérémie] initially wanted to use Android Debug Bridge (ADB) to send the touch commands, but found it was too laggy for this sort of hardcore gaming. The workaround is to use a serial-connected Arduino as a mouse. The Processing sketch sends a ‘#’ to the Arduino via serial. The Arduino then sends a mouse click to the computer, which is running hidclient. Hidclient finally sends Bluetooth mouse clicks to the tablet. Admittedly, this is a bit of a Rube Goldberg approach, but it does add an Arduino to a Flappy Bird hack, which we think is a perfect pairing.
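One plausible way to detect a “flap” from Kinect skeleton data, shown here in rough Python rather than Processing, is to watch for both wrists crossing from above the shoulders to below them. The joint names and the transition logic are guesses for illustration, not [Jérémie]’s actual sketch:

```python
# Sketch of arm-flap detection from skeleton joint heights. Assumes
# screen-style coordinates where smaller y means higher up. Joint
# dictionary keys are invented for this example.

def arms_raised(joints):
    """True when both wrists are above (smaller y than) their shoulders."""
    return (joints["wrist_l"] < joints["shoulder_l"]
            and joints["wrist_r"] < joints["shoulder_r"])

def detect_flap(prev_raised, joints):
    """A flap is the down-stroke: the transition from arms-up to arms-down.
    Returns (flap_detected, new_raised_state) so the caller can track state."""
    raised = arms_raised(joints)
    return (prev_raised and not raised, raised)
```

On each detected flap, the sketch would write the ‘#’ byte out the serial port to start the Arduino-to-hidclient-to-tablet relay described above.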


Autonomous Lighting with Intelligence


Getting into home automation usually starts with lighting: hacking your lights to turn on automatically when motion is detected, adding timer controls, or even tying everything into an app on your smartphone. [Ken] took things to a completely different level by giving his lighting intelligence.

The system is called ‘Myra’, and it works by detecting what you’re doing in the room, and based on this, robotic lights will optimally adjust to the activity. For example, if you’re walking through the room, the system will attempt to illuminate your path as you walk. Other activities are detected as well, like reading a book, watching TV, or just standing still.

At the heart of the ‘Myra’ system is an RGBD sensor (Microsoft Kinect/Asus Xtion). The space in the room is processed by a PC running an application to determine the current ‘activity’. Wireless robotic lights are strategically placed around the room, each with a two-servo gimbal and a standalone Arduino. The PC sends out commands to each light with an angle for each of the two axes and the intensity of the light. The lights receive this command wirelessly via a 315MHz receiver, and the Arduino then ‘aims’ the beam according to the command.
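The post doesn’t document the actual packet format on that 315MHz link, but a command carrying two servo angles and an intensity could be packed as simply as this hypothetical Python sketch:

```python
# Invented command packing for a Myra-style wireless light: one byte
# each for light ID, pan angle, tilt angle, and intensity. The layout
# and ranges are assumptions for illustration, not the real protocol.

def pack_light_command(light_id, pan_deg, tilt_deg, intensity):
    """Clamp each field to its range and pack into 4 bytes: id, pan, tilt, level."""
    clamp = lambda v, lo, hi: max(lo, min(hi, int(v)))
    return bytes([
        clamp(light_id, 0, 255),
        clamp(pan_deg, 0, 180),    # typical hobby-servo sweep
        clamp(tilt_deg, 0, 180),
        clamp(intensity, 0, 255),
    ])
```

On the receiving end, the Arduino would unpack the same four bytes and write the two angles straight to its servos.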

This isn’t the first time we’ve seen [Ken’s] work; a couple of years ago we saw his unique ‘real life’ weather display. The ‘Myra’ system is still a work in progress, so we can’t wait to see how it all ends up. Be sure to check out the video after the break for a demo of the system.


Holograms With The New Kinect


The Xbox One is out, along with a new Kinect sensor, and this time around Microsoft didn’t waste any time making this 3D vision sensor available for Windows. [programming4fun] got his hands on the new Kinect v2 sensor and started work on a capture system to import anything into a virtual environment.

We’ve seen [programming4fun]’s work before with an extremely odd and original build that turns any display into a 3D display with the help of a Kinect v1 sensor. This time around, [programming] isn’t just using a Kinect to display a 3D object, he’s also using a Kinect to capture 3D data.

[programming] captured himself playing a few chords on a guitar with the new Kinect v2 sensor. This was saved to a custom file format that can be played back in the Unity engine. With the help of a Kinect v1, [programming4fun] can pan and tilt around this virtual model simply by moving his head.

If that’s not enough, [programming] has also included support for the Oculus Rift, turning the Unity-based virtual copy of himself into something he can interact with in a video game.

As far as we can tell, this is the first build on Hackaday using the new Kinect sensor. We asked what everyone was going to do with this new improved hardware, and from [programming]’s demo, it seems like there’s still a lot of unexplored potential with the new Xbox One spybox.


A New Way to Heat People


[Leigh Christie] is a researcher at MIT, and he’s developed an interesting solution to heating people, not buildings.

His TEDx talk, “Heating Buildings is Stupid,” demonstrates the MIT SENSEable City Laboratory’s efforts to tackle energy issues. Their research focuses on finding an alternative to the staggering waste of energy used to heat large spaces. Although TED talk articles are a rarity at Hackaday, we think this idea is both simple and useful. Also, [Leigh] is the same guy who brought us the Mondo Spider a few years ago for the Burning Man exhibition. He’s a hacker.

Anyway, what is it? The system he’s devised is so simple that it’s brilliant: a person-tracking infrared heat spotlight. Using a Microsoft Kinect, the lamp follows you around and keeps you warm rather than heating the entire space. [Leigh] has grand plans for implementing what he calls “Local Heating” in large buildings to save on energy consumption, but smaller-scale implementations could prove equally beneficial for a big garage or a workshop. How much does your workspace cost to heat during the winter? Hackerspaces seem like the perfect test environment for a cobbled-together “Local Heating” system. If anyone builds one, we want to hear about it.
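The aiming math for a two-axis spotlight tracking a Kinect-reported position is simple trigonometry. Here’s a geometry-only sketch with an assumed coordinate convention (x to the right of the lamp, y up, z toward the person); person tracking and motor control are left out:

```python
# Compute pan/tilt angles to point a two-axis lamp at a tracked person.
# Coordinate convention (x right, y up, z forward, metres) is assumed,
# not taken from the actual SENSEable City Lab implementation.
import math

def aim_angles(x, y, z):
    """Return (pan, tilt) in degrees toward the point (x, y, z)."""
    pan = math.degrees(math.atan2(x, z))                  # left/right swing
    tilt = math.degrees(math.atan2(y, math.hypot(x, z)))  # up/down elevation
    return pan, tilt
```

Feed it the Kinect’s reported torso position each frame and write the resulting angles to the gimbal, and the spotlight stays on its target.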

Check out the full TEDx talk after the break.

