Virtual Reality Gets Real With 3 Kinect Cameras

No, that isn’t a scene from a horror movie up there, it’s [Oliver Kreylos’] avatar in a 3D office environment. If he looks a bit strange, it’s because he’s wearing an Oculus Rift, and his image is being stitched together from 3 Microsoft Kinect cameras.

[Oliver] has created a 3D environment that is incredibly realistic, at least to the wearer. He believes the secret is the low latency of the entire system. When coupled with a good 3D environment, like the office shown above, the mind is tricked into believing it is really in the room. [Oliver] mentions that he finds himself subconsciously moving to avoid bumping into a table leg that he knows isn’t there. In [Oliver’s] words, “It circumnavigates the uncanny valley.”

Instead of pulling skeleton data from the 3 Kinect cameras, [Oliver] is using raw video and depth data. He’s stitching and processing this data on an Intel Core i7 Linux box with an NVIDIA GeForce GTX 770 video card. Powerful hardware for sure, but not the cutting-edge monster rig one might expect. [Oliver] also documented his software stack. He’s using the Vrui VR Toolkit, the Kinect 3D Video Capture Project, and the Collaboration Infrastructure.
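Merging the three cameras’ views into one avatar starts with back-projecting each depth pixel through that camera’s intrinsics into a 3D point, then transforming every camera’s point cloud into a shared world frame using calibrated extrinsics. Here’s a minimal sketch of the back-projection step; the intrinsics are placeholder values in the ballpark of a Kinect v1, not [Oliver’s] actual calibration:

```cpp
#include <cstdint>
#include <cstdio>

// Pinhole intrinsics; real values come from per-camera calibration.
struct Intrinsics { float fx, fy, cx, cy; };

struct Point3 { float x, y, z; };

// Back-project one depth pixel (u, v) with depth in millimeters
// into a 3D point in the camera's own coordinate frame.
Point3 backProject(const Intrinsics& K, int u, int v, uint16_t depth_mm) {
    float z = depth_mm * 0.001f;   // millimeters -> meters
    return { (u - K.cx) * z / K.fx,
             (v - K.cy) * z / K.fy,
             z };
}

int main() {
    Intrinsics K{594.2f, 591.0f, 320.0f, 240.0f};  // placeholder calibration
    Point3 p = backProject(K, 400, 300, 1500);     // pixel at 1.5 m depth
    std::printf("%.3f %.3f %.3f\n", p.x, p.y, p.z);
}
```

Each camera’s points then get transformed by that camera’s calibrated pose into the shared room frame and textured with the matching color frame before rendering.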

We can’t wait to see what [Oliver] does when he gets his hands on the Kinect One (and some good Linux drivers for it).

Continue reading “Virtual Reality Gets Real With 3 Kinect Cameras”

Autonomous Lighting With Intelligence

Getting into home automation usually starts with lighting: hacking your lights to turn on automatically when motion is detected, adding timer controls, or even tying everything into an app on your smartphone. [Ken] took things to a completely different level by giving his lighting intelligence.

The system is called ‘Myra’, and it works by detecting what you’re doing in the room; based on this, robotic lights adjust to best suit the activity. For example, if you’re walking through the room, the system will attempt to illuminate your path as you walk. Other activities are detected as well, like reading a book, watching TV, or just standing still.

At the heart of the ‘Myra’ system is an RGB-D sensor (a Microsoft Kinect or Asus Xtion). A PC processes the sensor data to determine the current ‘activity’ in the room. Wireless robotic lights are strategically placed around the room, each with a two-servo gimbal and a standalone Arduino. The PC sends each light a command containing an angle for each of the two axes and the intensity of the light. The lights receive this command wirelessly via a 315 MHz receiver, and the Arduino then ‘aims’ the beam accordingly.
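The write-up doesn’t include firmware, but the receive-and-aim loop on each light is straightforward to picture. Here’s a hypothetical Arduino sketch, assuming a VirtualWire-driven 315 MHz ASK receiver and a three-byte pan/tilt/intensity packet; the packet layout and pin choices are our inventions, not [Ken’s] design:

```cpp
#include <Servo.h>
#include <VirtualWire.h>   // common library for cheap 315/433 MHz ASK modules

Servo panServo, tiltServo;
const int LED_PIN = 9;     // PWM pin driving the lamp (hypothetical pin choice)

void setup() {
    panServo.attach(5);
    tiltServo.attach(6);
    pinMode(LED_PIN, OUTPUT);
    vw_set_rx_pin(11);     // data pin of the 315 MHz receiver
    vw_setup(2000);        // link speed in bits per second
    vw_rx_start();
}

void loop() {
    uint8_t buf[VW_MAX_MESSAGE_LEN];
    uint8_t len = VW_MAX_MESSAGE_LEN;
    if (vw_get_message(buf, &len) && len >= 3) {
        // Assumed packet: [pan 0-180][tilt 0-180][intensity 0-255]
        panServo.write(constrain(buf[0], 0, 180));
        tiltServo.write(constrain(buf[1], 0, 180));
        analogWrite(LED_PIN, buf[2]);
    }
}
```

One practical caveat if you build something similar: on many AVR boards the stock Servo and VirtualWire libraries both want the same hardware timer, so one of them usually needs a timer-swapped variant.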

This isn’t the first time we’ve seen [Ken’s] work; a couple of years ago we saw his one-of-a-kind ‘real life’ weather display. The ‘Myra’ system is still a work in progress, so we can’t wait to see how it all ends up. Be sure to check out the video after the break for a demo of the system.

Continue reading “Autonomous Lighting With Intelligence”

Holograms With The New Kinect

The Xbox One is out, along with a new Kinect sensor, and this time around Microsoft didn’t waste any time making this 3D vision sensor available for Windows. [programming4fun] got his hands on the new Kinect v2 sensor and started work on a capture system to import anything into a virtual environment.

We’ve seen [programming4fun]’s work before with an extremely odd and original build that turns any display into a 3D display with the help of a Kinect v1 sensor. Now [programming] isn’t just using a Kinect to display a 3D object; he’s also using one to capture 3D data.

[programming] captured himself playing a few chords on a guitar with the new Kinect v2 sensor. This was saved to a custom file format that can be played back in the Unity engine. With the help of a Kinect v1, [programming4fun] can pan and tilt around this virtual model simply by moving his head.
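The capture format isn’t documented, but the underlying idea, streaming timestamped depth frames to disk so a player can rebuild the scene frame by frame, needs very little structure. A minimal sketch of one way such a recorder could look; the layout is entirely our guess, not [programming4fun]’s format:

```cpp
#include <cstdint>
#include <fstream>
#include <vector>

// Kinect v2 depth frames are 512 x 424 pixels, 16 bits per pixel (millimeters).
const int W = 512, H = 424;

// Append one frame to an open recording: a timestamp, then the raw depth data.
// A player reads frames back in order and rebuilds a mesh per frame.
void writeFrame(std::ofstream& out, int64_t timestamp_us,
                const std::vector<uint16_t>& depth) {
    out.write(reinterpret_cast<const char*>(&timestamp_us), sizeof(timestamp_us));
    out.write(reinterpret_cast<const char*>(depth.data()),
              depth.size() * sizeof(uint16_t));
}

int main() {
    std::ofstream out("capture.bin", std::ios::binary);
    std::vector<uint16_t> depth(W * H, 2000);  // stand-in for a real sensor frame
    writeFrame(out, 0, depth);
}
```

A real recorder would also store the registered color frames and probably compress the depth buffers, since raw Kinect v2 depth runs to roughly 13 MB/s at 30 frames per second.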

If that’s not enough, [programming] has also included support for the Oculus Rift, turning the Unity-based virtual copy of himself into something he can interact with in a video game.

As far as we can tell, this is the first build on Hackaday using the new Kinect sensor. We asked what everyone was going to do with this new improved hardware, and from [programming]’s demo, it seems like there’s still a lot of unexplored potential with the new Xbox One spybox.

Continue reading “Holograms With The New Kinect”

A New Way To Heat People

[Leigh Christie] is a researcher at MIT, and he’s developed an interesting solution to heating people, not buildings.

His TEDx talk, “Heating Buildings is Stupid,” demonstrates the MIT SENSEable City Laboratory’s efforts to tackle energy issues. Their research focuses on finding an alternative to the staggering waste of energy used to heat large spaces. Although TED talk articles are a rarity at Hackaday, we think this idea is both simple and useful. Also, [Leigh] is the same guy who brought us the Mondo Spider a few years ago for the Burning Man exhibition. He’s a hacker.

Anyway, what is it? The system he’s devised is so simple that it’s brilliant: a person-tracking infrared heat spotlight. Using a Microsoft Kinect, the lamp follows you around and keeps you warm, rather than heating the entire space. [Leigh] has grand plans for implementing what he calls “Local Heating” in large buildings to save on energy consumption, but smaller-scale implementations could prove equally beneficial for a big garage or workshop. How much does your workspace cost to heat during the winter? Hackerspaces seem like the perfect test environment for a cobbled-together “Local Heating” system. If anyone builds one, we want to hear about it.
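Mechanically, this boils down to the same pan/tilt math as any tracking turret: the Kinect reports the person’s 3D position, and the lamp converts the offset from its own mount into two gimbal angles. A rough sketch, with coordinate conventions of our own choosing rather than anything from the talk:

```cpp
#include <cmath>
#include <cstdio>

const float PI_F = 3.14159265f;

struct Vec3 { float x, y, z; };

// Convert a tracked position (in the lamp's frame: x right, y up, z forward)
// into pan/tilt angles in degrees for a two-axis gimbal.
void aimAt(const Vec3& target, float& panDeg, float& tiltDeg) {
    panDeg = std::atan2(target.x, target.z) * 180.0f / PI_F;
    float ground = std::sqrt(target.x * target.x + target.z * target.z);
    tiltDeg = std::atan2(target.y, ground) * 180.0f / PI_F;
}

int main() {
    Vec3 person{1.0f, -0.5f, 3.0f};  // e.g. a torso position from skeleton tracking
    float pan, tilt;
    aimAt(person, pan, tilt);
    std::printf("pan %.1f deg, tilt %.1f deg\n", pan, tilt);
}
```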

Check out the full TEDx talk after the break.

Continue reading “A New Way To Heat People”

A Kinect Controlled Robotic Hand

It’s that time of year again, when the senior design projects come rolling in. [Ben], along with his partners [Cameron], [Carlton] and [Chris], has been working on something very ambitious since September: a Kinect-controlled robotic arm and hand that copies the user’s movements.

The arm is a Lynxmotion AL5D, but instead of using the included software suite, the guys rolled their own control scheme with the help of an Arduino. The Kinect captures the user’s arm position and turns that into angle data for the arm’s servos.
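Turning skeleton data into servo commands mostly means computing joint angles from triplets of tracked points. The elbow angle, for instance, falls out of a dot product between the upper-arm and forearm vectors; here’s a sketch of that calculation, assuming shoulder, elbow, and wrist positions from the Kinect (illustrative, not the team’s actual code):

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static float len(Vec3 a) { return std::sqrt(dot(a, a)); }

// Angle at the elbow, computed from three tracked joint positions, in degrees.
// This maps directly onto one servo of the arm.
float elbowAngle(Vec3 shoulder, Vec3 elbow, Vec3 wrist) {
    Vec3 upper = sub(shoulder, elbow);
    Vec3 fore  = sub(wrist, elbow);
    float c = dot(upper, fore) / (len(upper) * len(fore));
    return std::acos(c) * 180.0f / 3.14159265f;
}

int main() {
    float a = elbowAngle({0, 0, 0}, {0.3f, 0, 0}, {0.3f, 0.3f, 0});
    std::printf("elbow: %.1f deg\n", a);  // prints 90.0 for this right angle
}
```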

A Kinect’s resolution is limited, of course, so for everything beyond the wrist the team turned to another technology: flex resistors. A glove fitted with these flex resistors and an accelerometer provides all the data on the position of the hand and fingers in space.
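On the glove side, each flex resistor typically sits in a voltage divider feeding an analog pin, and the reading maps to a finger bend angle. A hypothetical Arduino-style sketch; the pins, calibration range, and serial format are all made up:

```cpp
#include <Arduino.h>

const int FINGERS = 5;
const int flexPins[FINGERS] = {A0, A1, A2, A3, A4};  // one voltage divider per finger

void setup() {
    Serial.begin(115200);
}

void loop() {
    for (int i = 0; i < FINGERS; i++) {
        int raw = analogRead(flexPins[i]);   // 0-1023 ADC reading
        // Map the sensor's straight/bent range (measured per glove)
        // onto a 0-90 degree finger servo angle.
        int angle = map(raw, 300, 700, 0, 90);
        Serial.print(constrain(angle, 0, 90));
        Serial.print(i < FINGERS - 1 ? ',' : '\n');
    }
    delay(20);  // ~50 Hz update rate to the arm's Arduino
}
```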

This data is sent to a second Arduino on the build, which orients the wrist and fingers of the robotic arm. As shown in the videos below, the arm performs remarkably well, just like the best Waldos you’ve ever seen.

Continue reading “A Kinect Controlled Robotic Hand”

Crazyflie Control With Leap And Kinect

The gang at Bitcraze is at it again, this time developing Leap Motion control for their Crazyflie quadcopter, as well as releasing a Kinect-driven autopilot proof of concept. If you haven’t seen the Crazyflie before, you may not realize how compact it is: 90 mm motor-to-motor and only 19 grams.

As far as we can tell, the Crazyflie still needs a PC to control it, so the Leap and Kinect are natural followups. Hand control with the Leap Motion is what you’d expect: just imagine your open palm controlling it like a marionette, with the height of your hand dictating thrust. The Kinect setup looks the most promising. The guys strapped a red ball to the Crazyflie to provide a trackable object against a white backdrop. The Kinect then monitors the quadcopter while a user steers via mouse clicks. Separate PID controllers correct the roll, pitch, and thrust to reposition the Crazyflie from its current coordinates to a new setpoint chosen by a click or a drag. Videos of both Leap and Kinect piloting are below.
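Each of those PID loops is the textbook kind: compare the ball’s tracked coordinate against the clicked setpoint and turn the error into a correction. A minimal single-axis sketch; the gains and update rate here are arbitrary placeholders, not Bitcraze’s tuning:

```cpp
#include <cstdio>

// One-axis PID controller: the Kinect supplies the measured position,
// the mouse click supplies the setpoint, the output trims thrust (or roll/pitch).
struct PID {
    float kp, ki, kd;
    float integral = 0, prevError = 0;

    float update(float setpoint, float measured, float dt) {
        float error = setpoint - measured;
        integral += error * dt;
        float derivative = (error - prevError) / dt;
        prevError = error;
        return kp * error + ki * integral + kd * derivative;
    }
};

int main() {
    PID altitude{0.8f, 0.1f, 0.3f};  // placeholder gains
    float z = 0.5f, target = 1.5f;   // tracked height vs. clicked setpoint, meters
    for (int i = 0; i < 5; i++) {
        float thrustTrim = altitude.update(target, z, 0.033f);  // ~30 Hz Kinect rate
        std::printf("trim %.3f\n", thrustTrim);
        z += thrustTrim * 0.033f;    // toy plant model, just for demonstration
    }
}
```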

Tight on cash but still want to take to the skies? We have two rubber-band-powered devices from earlier this week: the Ornithopter and the hilariously brilliant GoPro Slingshot.

Continue reading “Crazyflie Control With Leap And Kinect”

Kinect Full Body Scanner

Why let the TSA have all the fun when it comes to full body scanning? Not only can you get a digital model of yourself, but you can print it out to scale.

[Moheeb Zara] is still developing his Kinect-based full-body scanner, but he took a bit of time to show off the first working prototype. The parts that went into the build were either cut on a bandsaw, laser cut, or 3D printed. The scanning part of the rig uses a free-standing vertical rail that allows the Kinect to move along the Z axis. The sled is held in place by gravity and moved up the rail by a winch, with steel cable looped over a pulley at the top.

The subject stands on a rotating platform which [Moheeb] designed and assembled. Beneath the platform you’ll find a laser-cut hoop with teeth on the inside; a motor mounted in a 3D-printed bracket engages these teeth to rotate the platform. He’s still got some work to do in order to automate the platform; for this demo he moved through each step in the scanning process using manual switches. Captured data is assembled into a virtual model using ReconstructMe.
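Automating the rig would replace those manual switches with a step-capture-repeat loop coordinated with the PC. A hypothetical Arduino-side sketch, assuming the PC acknowledges each capture over serial; every pin and timing here is invented and would need tuning on the real platform:

```cpp
#include <Arduino.h>

const int MOTOR_PIN = 3;             // drives the turntable motor (hypothetical pin)
const int STEPS_PER_REV = 36;        // 10 degrees per stop, a made-up resolution
const unsigned long STEP_MS = 500;   // motor-on time per step, needs tuning

void setup() {
    pinMode(MOTOR_PIN, OUTPUT);
    Serial.begin(9600);
}

void loop() {
    for (int step = 0; step < STEPS_PER_REV; step++) {
        digitalWrite(MOTOR_PIN, HIGH);   // rotate the platform one increment
        delay(STEP_MS);
        digitalWrite(MOTOR_PIN, LOW);
        delay(1000);                     // let the subject settle
        Serial.println("CAPTURE");       // tell the PC to grab a frame
        while (Serial.read() != 'K') {}  // wait for the PC's acknowledgment
    }
    while (true) {}                      // one full revolution, then stop
}
```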

The Kinect has been used as a 3D scanner like this before. But that time it was scanning salable goods rather than people.

Continue reading “Kinect Full Body Scanner”