Nice Shoes, Wanna Recognize Some Input?

Even though giant multitouch display tables have been around for a few years now, we have yet to see them used much in the wild. The barrier to entry for a Microsoft Surface is very high, but one of the biggest problems in implementing a touch table is interaction: how exactly should the display interpret multiple commands from multiple users? [Stephan], [Christian], and [Patrick] came up with an interesting solution for sorting out who is touching where by having a computer look at shoes.

The system uses a Kinect mounted on the edge of a table to extract users from the depth images. From there, interaction on the display can be pinned to a specific user based on hand and arm orientation. As an added bonus, the computer can also identify users by their shoes: anyone wearing a pair of shoes the system has seen before can simply walk up to the table and be recognized automatically.
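That first step amounts to matching each touch on the table to whichever tracked user's hand is nearest to it. Here's a tiny Python sketch of that idea; the data layout and coordinate handling are our own assumptions, not the team's actual code:

```python
import math

def assign_touch_to_user(touch_xy, users):
    """Pick the tracked user whose hand is closest to a touch point.

    touch_xy: (x, y) of the touch in display coordinates.
    users: dict mapping user id -> list of (x, y) hand positions, already
           projected from the Kinect depth image into the same display
           coordinate system (assumed to be done elsewhere).
    """
    best_user, best_dist = None, float("inf")
    for user_id, hands in users.items():
        for hx, hy in hands:
            d = math.hypot(touch_xy[0] - hx, touch_xy[1] - hy)
            if d < best_dist:
                best_user, best_dist = user_id, d
    return best_user

# Example: two users, one touch lands near "alice"'s right hand.
users = {"alice": [(0.42, 0.30), (0.55, 0.28)], "bob": [(0.90, 0.75)]}
print(assign_touch_to_user((0.44, 0.31), users))  # -> "alice"
```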

Continue reading “Nice Shoes, Wanna Recognize Some Input?”

Web-enabled Kinect

There are Kinect hacks out there for robot vision, 3D scanners, and even pseudo-LIDAR setups. Until now, one limiting factor in these builds has been the need for a full-blown computer on the device to handle the depth maps and do all the necessary processing. That's much less of a problem now that [wizgrav] has published Intrael, an HTTP interface for the Kinect.

[Eleftherios] caught up with [wizgrav] at his local hackerspace, where he gave a short tutorial on Intrael. [wizgrav]'s project serves each frame from the Kinect over HTTP, wrapped up in JSON arrays. Everything a Kinect outputs aside from sound is now easily available over the Internet.
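If you want a feel for what consuming Intrael might look like, here's a rough polling client in Python. The host, port, and JSON layout below are assumptions on our part, so check the Intrael documentation for the real endpoint and field meanings:

```python
import json
import time
import urllib.request

INTRAEL_URL = "http://localhost:6661"  # assumed host and port

def get_frame():
    """Fetch one frame's worth of JSON data from the Intrael server."""
    with urllib.request.urlopen(INTRAEL_URL, timeout=2) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Poll a handful of frames, roughly ten times per second.
for _ in range(50):
    frame = get_frame()                       # e.g. a JSON array of detections
    print(len(frame), "entries in this frame")
    time.sleep(0.1)
```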

The project is meant to take computer vision out of the realm of desktops and robot-mounted laptops and put it on the web. [wizgrav] has a few ideas about what his project could be used for, such as smart security cameras and all kinds of interactive surfaces.

After the break, check out the Intrael primer [wizgrav] demonstrated (it’s Greek to us, but there are subtitles), and a few demos of what Intrael ‘sees.’

Continue reading “Web-enabled Kinect”

Think You Can Take Kinect To The Next Level? Check Out Kinect Accelerator


If you’ve got a crazy ingenious idea for Microsoft’s Kinect peripheral, but don’t have the means to make your dream a reality, the Kinect Accelerator just might be the opportunity you’ve been waiting for.

Microsoft, having performed a complete 180-degree turnaround from their initial stance on Kinect hacking, is embracing developers more than ever with this new program. They are offering $20,000 in funding, along with development space, to ten startup companies in hopes of turning out some incredible Kinect applications. At the end of the three-month program, each group will have the opportunity to present its creation to a group of angel investors, which is a fantastic opportunity.

Obviously competition to gain entry into the program will be pretty fierce, but if you think you have what it takes, get your application in now. Judging by the Kinect Accelerator FAQ section, this looks to be something geared towards small tech startups rather than individuals, but it never hurts to give it a shot.

A Kinect Primer

Yes, the Kinect is over a year old now, and after some initial unhappiness from [Microsoft], it's become a hacker's best friend. [Eric] decided to celebrate with an article all about how it works. If you're new to this piece of hardware and want to start working with it, this is a good hacking introduction. If you've been reading [HAD] lately, you'll have noticed this same information being used to "build a Kinect bot for 500 bones."

Some interesting facts from the article: the Kinect measures 307,200 distance points per frame (one for every pixel of its 640×480 depth image), a data set known as a "point cloud." From this, it can construct a 3D image of the environment around it and allow interaction. Such interesting hardware didn't take long to crack open after Adafruit announced a $3,000 bounty for an open driver. It only took four days, which makes one wonder why [Microsoft], with their incredible resources, didn't either lock it down more effectively or officially open it up for hacking and modification from the start. Our vote would be to officially open it up, but no one consulted us on the decision.
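For the curious, turning those 307,200 depth readings into a point cloud is just a per-pixel application of the pinhole camera model. Here's a quick numpy sketch using ballpark (uncalibrated) Kinect intrinsics and a fake depth frame standing in for real data:

```python
import numpy as np

FX = FY = 594.0          # approximate depth-camera focal length in pixels (assumption)
CX, CY = 320.0, 240.0    # principal point taken as the image center

def depth_to_points(depth_m):
    """Map a (480, 640) array of depths in meters to an Nx3 array of XYZ points."""
    v, u = np.indices(depth_m.shape)              # per-pixel row/column coordinates
    z = depth_m
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.dstack((x, y, z)).reshape(-1, 3)    # one 3D point per depth pixel

depth = np.random.uniform(0.5, 4.0, size=(480, 640))  # stand-in depth frame
cloud = depth_to_points(depth)
print(cloud.shape)   # (307200, 3)
```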

Build A Kinect Bot For 500 Bones

[Eric] sent in his tutorial on building a Kinect-based robot for $500, a low-cost solution for a wife who thinks her husband spends too much on robots.

For the base of his build, [Eric] used an iRobot Create, a derivative of the Roomba built exclusively for hardware hackery. For command and control, an Eee PC netbook takes data from the Kinect and sends commands to the iRobot over a serial connection.

The build itself is remarkably simple: two pieces of angle aluminum are attached to the iRobot, and a plastic milk crate is zip-tied on top of them. The Kinect sits on top of the crate and the netbook fits comfortably inside.

A few weeks ago, [Eric] posted a summary of the history of and open-source software for the Kinect, covering the development of the libfreenect driver. He used that same driver for his robot. Currently, the robot has two modes: the first has it travel to the farthest point it can see, while the second tells it to follow the closest thing it sees – walk in front of the robot and it becomes an ankle biter.
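Both modes boil down to scanning the depth image for its nearest or farthest reading and steering toward it. Here's a stripped-down sketch of that logic in Python, assuming the libfreenect Python bindings are installed; the actual serial traffic to the iRobot Create (which follows the Create's Open Interface protocol) is left out:

```python
import freenect
import numpy as np

def pick_target_column(mode="follow"):
    """Return the image column holding the closest ('follow') or
    farthest ('explore') valid depth reading in the current frame."""
    depth, _ = freenect.sync_get_depth()        # 480x640 raw depth values
    valid = (depth > 0) & (depth < 2047)        # 2047 marks "no reading"
    filled = np.where(valid, depth, 2047 if mode == "follow" else 0)
    per_column = (filled.min(axis=0) if mode == "follow"
                  else filled.max(axis=0))
    return int(per_column.argmin() if mode == "follow"
               else per_column.argmax())

def steer(column, width=640):
    """Turn toward the chosen column: left of center means turn left, etc."""
    if column < width * 0.4:
        return "turn left"
    if column > width * 0.6:
        return "turn right"
    return "drive forward"

print(steer(pick_target_column("follow")))   # ankle-biter mode
```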

There is one limitation of the Kinect that [Eric] is trying to work around: objects closer than 19 inches register as being very far away. This caused a lot of wall bumping, so he plans on adding a few ultrasonic sensors to fill the gap in the sensor data. Not bad for a very inexpensive autonomous robot.

[Vigo’s] Stare Follows You Wherever You Go

To decorate the office for Halloween, [Eric] decided to make [Vigo the Carpathian] stare at passersby. We hope that readers recognize this image, but for those younger hackers who don't, this painting of [Vigo] played an important part in the classic film Ghostbusters II.

In the movie, his eyes appeared to follow anyone looking at the painting. [Eric] grabbed a Kinect and used Processing to recreate the effect in real life. The image is displayed on an LCD screen. A bit of Photoshop work let him cut the eyes out of the image and turn them into sprites that the Processing sketch moves around. The sketch reads data from the Kinect, which you can see perched on top of the cubicle wall, so it knows where to 'look.' The illusion is delightful; see for yourself in the clip after the break. We've already watched it a half-dozen times, and it looks like it was a real hit with guests at the open house.
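The math behind the illusion is simple: take the viewer's horizontal position from the Kinect and nudge the pupil sprites a bounded amount in that direction. Here's a toy Python version of that mapping; the names and ranges are illustrative, not lifted from [Eric]'s Processing sketch:

```python
def pupil_offset(viewer_x, frame_width=640, max_shift_px=12):
    """Map a viewer position (0..frame_width) to a pupil x-offset in pixels."""
    centered = (viewer_x / frame_width) - 0.5        # -0.5 .. 0.5 across the frame
    shift = centered * 2 * max_shift_px              # scale to the allowed travel
    return max(-max_shift_px, min(max_shift_px, shift))

# A viewer walking left to right slides the pupils from -12 px to +12 px.
for x in (0, 160, 320, 480, 640):
    print(x, "->", pupil_offset(x))
```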

Can you believe they threw this together in just one day?

Continue reading “[Vigo’s] Stare Follows You Wherever You Go”

The “Effervo” Kinect Particle Effect Machine

Here’s a new hack for the Xbox Kinect called “Effervo”. It’s a really cool effect built using openFrameworks. The Kinect is set up in front of the user, and a projector displays the output on a screen in front of them. Three-dimensional data about the person and his or her movements is captured by Microsoft’s sensor. As described, the Effervo program “uses simple iterative rules to govern its movement and gives the impression of swarm like behavior.” This may not be a “Halloween Hack”, but we could definitely see something similar used in a haunted house. Maybe it could use blood droplets instead of particles?
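For a sense of what “simple iterative rules” can do, here's a bare-bones Python sketch where particles accelerate toward an attractor (in Effervo, that would come from the Kinect's 3D data) with a little damping and jitter. It's a generic illustration, not [Jayson]'s actual code:

```python
import random

class Particle:
    def __init__(self):
        self.x, self.y = random.uniform(0, 640), random.uniform(0, 480)
        self.vx = self.vy = 0.0

    def step(self, target_x, target_y):
        # Accelerate gently toward the attractor, with a bit of random jitter.
        self.vx += 0.02 * (target_x - self.x) + random.uniform(-0.5, 0.5)
        self.vy += 0.02 * (target_y - self.y) + random.uniform(-0.5, 0.5)
        self.vx *= 0.95                       # damping keeps the swarm loose
        self.vy *= 0.95
        self.x += self.vx
        self.y += self.vy

swarm = [Particle() for _ in range(200)]
for _ in range(100):                          # attractor held at the frame center
    for p in swarm:
        p.step(320, 240)
print(round(swarm[0].x, 1), round(swarm[0].y, 1))
```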

Maybe this hack will inspire other people to follow in [Jayson’s] footsteps. He describes himself as a “programmer turned artist.” We’d like to think that all engineering and programming work is a form of art, but the video of this piece in action after the break is especially eye-catching.

Continue reading “The “Effervo” Kinect Particle Effect Machine”