A New Way to Heat People

[Leigh Christie] is a researcher at MIT, and he’s developed an interesting solution: heating people, not buildings.

His TEDx talk, “Heating Buildings is Stupid,” demonstrates the MIT SENSEable City Laboratory’s efforts to tackle energy issues. Their research focuses on finding an alternative to the staggering waste of energy used to heat large spaces. Although TED talk articles are a rarity at Hackaday, we think this idea is both simple and useful. Also, [Leigh] is the same guy who brought us the Mondo Spider a few years ago for the Burning Man exhibition. He’s a hacker.

Anyway, what is it? The system he’s devised is so simple that it’s brilliant: a person-tracking infrared heat spotlight. Using a Microsoft Kinect, the lamp follows you around and keeps you warm rather than heating the entire space. [Leigh] has grand plans for implementing what he calls “Local Heating” in large buildings to save on energy consumption, but smaller-scale implementations could prove equally beneficial for a big garage or workshop. How much does your workspace cost to heat during the winter? Hackerspaces seem like the perfect test environment for a cobbled-together “Local Heating” system. If anyone builds one, we want to hear about it.
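[Leigh] hasn’t published implementation details, but the geometry of a person-tracking spotlight is simple: take the head position reported by the Kinect’s skeleton tracker and convert it to pan/tilt angles for the lamp. A minimal sketch (the function name and the lamp-mounted-at-the-sensor assumption are ours, not from the talk):

```python
import math

def pan_tilt_angles(x, y, z):
    """Convert a tracked head position (meters, Kinect camera space)
    into pan/tilt angles (degrees) for a lamp mounted at the sensor.
    x: left/right offset, y: up/down offset, z: distance from sensor."""
    pan = math.degrees(math.atan2(x, z))                   # swivel left/right
    tilt = math.degrees(math.atan2(y, math.hypot(x, z)))   # aim up/down
    return pan, tilt

# A person 2 m away, 0.5 m to the right, head level with the lamp:
pan, tilt = pan_tilt_angles(0.5, 0.0, 2.0)
```

Feed those angles to two servos and the spotlight tracks the warm body instead of the whole room.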

Check out the full TEDx talk after the break.

A Kinect Controlled Robotic Hand

It’s that time of year again when the senior design projects come rolling in. [Ben], along with his partners [Cameron], [Carlton] and [Chris] have been working on something very ambitious since September: a robotic arm and hand controlled by a Kinect that copies the user’s movements.

The arm is a Lynxmotion AL5D, but instead of using the included software suite, the guys rolled their own control scheme with the help of an Arduino. The Kinect captures the user’s arm position and turns that into data for the arm’s servos.

A Kinect’s resolution is limited, of course, so for everything beyond the wrist the team turned to another technology: flex resistors. A glove fitted with these flex resistors and an accelerometer provides all the data needed to place the hand and fingers in space.
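The write-up doesn’t include the glove’s firmware, but the usual trick is to put each flex resistor in a voltage divider and linearly map the ADC reading between two calibration points. A hedged sketch of that mapping (the calibration numbers below are made up; a real glove needs per-sensor calibration):

```python
def flex_to_angle(adc, adc_straight=310, adc_bent=610, max_angle=90.0):
    """Map a 10-bit ADC reading from a flex-resistor voltage divider
    to a finger bend angle in degrees. The two ADC endpoints come from
    calibrating each sensor with the finger straight and fully bent."""
    # Clamp to the calibrated range so noise can't produce impossible angles.
    adc = max(min(adc, adc_bent), adc_straight)
    return (adc - adc_straight) / (adc_bent - adc_straight) * max_angle
```

One instance of this per finger, plus the accelerometer for wrist orientation, is enough to drive the hand’s servos.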

This data is sent over to another Arduino on the build for orienting the wrist and fingers of the robotic arm. As shown in the videos below, the arm performs remarkably well, just like the best Waldos you’ve ever seen.

Crazyflie control with Leap and Kinect

The gang at Bitcraze is at it again, this time developing Leap Motion control for their Crazyflie quadcopter, as well as releasing a Kinect-driven autopilot proof of concept. If you haven’t seen the Crazyflie before, you may not realize how compact it is: 90mm motor to motor and only 19 grams.

As far as we can tell, the Crazyflie still needs a PC to control it, so the Leap and Kinect are natural followups. Hand control with the Leap Motion is what you’d expect: just imagine your open palm controlling it like a marionette, with the height of your hand dictating thrust. The Kinect setup looks the most promising. The guys strapped a red ball to the Crazyflie that provides a trackable object against a white backdrop. The Kinect then monitors the quadcopter while a user steers via mouse clicks. Separate PID controllers correct the roll, pitch and thrust to reposition the Crazyflie from its current coordinates to a new setpoint chosen by a click or a drag. Videos of both Leap and Kinect piloting are below.
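Bitcraze’s actual controller code isn’t shown in the post, but a textbook PID loop of the kind described – one instance each for roll, pitch, and thrust – looks something like this (the structure is standard; any gains you pick are placeholders, not Bitcraze’s values):

```python
class PID:
    """Standard PID controller: drives a measured value toward a setpoint,
    e.g. steering the Crazyflie to a clicked position on screen."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measured, dt):
        error = setpoint - measured
        self.integral += error * dt
        # No derivative term on the first sample, when there's no history.
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Each video frame, the Kinect supplies `measured` (the red ball’s tracked coordinates), the mouse click supplies `setpoint`, and the three controller outputs become roll, pitch, and thrust commands.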

Tight on cash but still want to take to the skies? We have two rubber-band-powered devices from earlier this week: the Ornithopter and the hilariously brilliant GoPro Slingshot.

Kinect full body scanner

Why let the TSA have all the fun when it comes to full body scanning? Not only can you get a digital model of yourself, but you can print it out to scale.

[Moheeb Zara] is still developing his Kinect-based full-body scanner, but he took a bit of time to show off the first working prototype. The parts that went into the build were either cut on a bandsaw, laser cut, or 3D printed. The scanning rig uses a free-standing vertical rail that allows the Kinect to move along the Z axis. The sled is held in place by gravity and pulled up the rail by a winch, with steel cable looped over a pulley at the top.

The subject stands on a rotating platform which [Moheeb] designed and assembled. Beneath the platform you’ll find a laser-cut hoop with teeth on the inside; a motor mounted in a 3D-printed bracket engages these teeth to rotate the platform. He’s still got some work to do to automate the platform; for this demo he advanced each step of the scanning process using manual switches. Captured data is assembled into a virtual model using ReconstructMe.

The Kinect has been used as a 3D scanner like this before. But that time it was scanning salable goods rather than people.

3D mapping of rooms, again

Last year we saw what may be the coolest application of a Kinect ever. It was called Kintinuous, and it’s back again, this time as Kintinuous 2.0, with new and improved features.

When we first learned of Kintinuous, we were blown away. The ability of a computer with a Kinect to map large-scale areas has applications as diverse as Google Street View, custom Counter-Strike maps, and archaeological excavations. There was one problem with Kintinuous 1.0, though: scanning a loop would create a disjointed map, with the beginning and end of the loop landing in different places.

In the video for Kintinuous 2.0, you can see a huge scan over 300 meters in length with two loops automatically stitched back into a continuous scan. An amazing feat, especially considering the computer is processing seven million vertices in just a few seconds.

Unfortunately, it doesn’t look like there will be an official distribution of Kintinuous 2.0 anytime soon. The paper for this version of Kintinuous is still under review, and there are ‘issues’ surrounding the software that prevent any answer to the if and when of a release. Once the paper is out, though, anyone is free to reimplement it, and we’ll gladly leave that as an open challenge to our readers.

Human Asteroids makes you a vector triangle ship

In 1979, Atari released Asteroids to the world. Now Atari co-founder [Nolan Bushnell] is playing the game again, only this time with the help of a laser projector and a Kinect that turns anyone sitting on a stool – in this case [Nolan] himself – into everyone’s favorite vector spaceship. It’s a demo for Steam Carnival, a project by [Brent Bushnell] and [Eric Gradman] that hopes to bring a modern electronic carnival to your town.

The reimagined Asteroids game uses a laser projector to display the asteroids and ship on a floor. A Kinect tracks the user sitting and rolling on a stool, while a smartphone serves as the triangular spaceship’s ‘fire’ button. The game is played in a 150-square-foot arena and can put anyone in the cockpit of an asteroid-mining triangle.

[Brent] and [Eric] hope to bring their Steam Carnival to LA and San Francisco next spring, but if they exceed their funding goals, they might be convinced to bring the show east of the Mississippi. We’d love to try it out by hiding behind the score like in the original Asteroids and wasting several hours.

Ask Hackaday: What are we going to do with the new Kinect?

Yesterday Microsoft announced their new cable box, the Xbox One. Included in the announcement is a vastly improved Kinect sensor. It won’t be available until next Christmas, but now the question is what are we going to do with it?

From the initial specs that can be found, the new version of the Kinect will output 1080p RGB video over a USB 3.0 connection to the new Xbox. The IR depth camera of the original Kinect has been replaced with a time-of-flight camera – a camera that sends out a pulse of light and times how long it takes for the photons to be reflected back. While there have been some inroads into making low-cost ToF cameras – namely Intel and Creative’s Interactive Gesture Camera Development Kit and the $250 DepthSense 325 from SoftKinetic – the Kinect 2.0 will be the first time-of-flight camera you’ll be able to buy for a few hundred bucks at any Walmart.
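The time-of-flight principle fits in one line of code: distance is the round-trip time multiplied by the speed of light, halved because the pulse travels out and back. A quick sketch:

```python
C = 299_792_458  # speed of light in m/s

def tof_distance(round_trip_seconds):
    """Distance to a surface given the round-trip time of a light pulse.
    The pulse travels out and back, so halve the total path length."""
    return C * round_trip_seconds / 2

# A surface 2 m away returns the pulse in roughly 13.3 nanoseconds --
# which is why ToF sensors need extremely fine timing for useful depth
# resolution.
```

In practice, cameras in this class typically measure the phase shift of modulated light rather than timing a single pulse directly, but the distance math is the same.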

We’ve seen a ton of awesome Kinect hacks over the years, everything from a ‘holographic display’ that turns any TV into a 3D display to computer vision for robots and a 3D scanner. A new Kinect sensor with better 3D resolution can only improve existing projects, and the time-of-flight sensor – like the one found in Google’s driverless car – opens the door for a whole bunch of new projects.

So, readers of Hackaday, assuming someone can write a driver in a few days as happened with the Kinect 1.0, what are we going to do with it?

While we’re at it, keep in mind we made a call for Wii U controller hacks. If somebody can crack that nut, it’ll be an awesome remote for robots and FPV airplanes and drones.
