The Kinect is an interesting beast. On one hand, it’s fantastic for hacking – a purpose for which it was not designed. On the other hand, it’s “just OK” when it comes to gaming – its entire reason for being.
One of the big complaints regarding the Kinect’s control scheme is that it’s no good for games such as first person shooters, where a large majority of the action involves walking, jumping, and aiming. For his Master’s project, [Alex Poolton] put together a fantastic demonstration showing how the Kinect can be paired with a standard Xbox controller to provide hybrid gaming input.
While you might expect a simple tech demo covering the fundamentals of the hybrid control system, he has put together a full-fledged game demo that shows how this control scheme might be implemented in a real game. [Alex] admits that it’s still a bit rough around the edges, but there’s some real potential in his design.
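The core idea is simple to sketch: the gamepad keeps handling what it’s good at (movement), while the Kinect’s hand tracking drives the aiming reticle. The snippet below is an illustrative Python mock-up of one frame of that loop, not [Alex]’s actual code; the stick values and normalized hand coordinates are hypothetical inputs a real input pipeline would supply.

```python
# Illustrative sketch of a hybrid gamepad + Kinect control frame.
# Stick values (-1..1) and hand coordinates (0..1) are hypothetical
# inputs; a real game would read them from the controller and the
# skeleton tracker each frame.

def aim_from_hand(hand_x, hand_y, screen_w=1280, screen_h=720):
    """Map a normalized tracked-hand position to a screen coordinate
    for the aiming reticle, clamping to the screen edges."""
    x = max(0.0, min(1.0, hand_x)) * screen_w
    y = max(0.0, min(1.0, hand_y)) * screen_h
    return x, y

def hybrid_update(stick_x, stick_y, hand_x, hand_y, pos, speed=4.0):
    """One frame of the hybrid scheme: the left stick moves the player,
    the tracked hand aims."""
    px, py = pos
    px += stick_x * speed                    # walking from the gamepad...
    py += stick_y * speed
    reticle = aim_from_hand(hand_x, hand_y)  # ...aiming from the Kinect
    return (px, py), reticle

pos, reticle = hybrid_update(1.0, 0.0, 0.5, 0.5, (100.0, 100.0))
print(pos, reticle)  # → (104.0, 100.0) (640.0, 360.0)
```

Splitting the duties this way sidesteps the Kinect’s weakest point (precise locomotion) while keeping its strength (natural pointing).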
Continue reading to see a video demonstration of [Alex’s] project in action, and be sure to check out his blog for news and updates on the project.
Continue reading “Hybrid control scheme using an Xbox game pad and Kinect”
While we have seen Kinect-based virtual dressing rooms before, the team at Arbuzz is taking a slightly different approach (Translation) to the digital dress up game. Rather than using flat images of clothes superimposed on the subject’s body, their solution uses full 3D models of the clothing to achieve the desired effect. This method allows them to create a more true-to-life experience, where the clothing follows the subject around, flowing naturally with the user’s movements.
Like many other Kinect hacks, they use OpenNI and NITE to obtain skeletal data from the sensor. The application itself was written in C# with Microsoft’s XNA game development tools, and uses a special physics engine to render the simulated cloth in a realistic fashion.
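Their engine is C#/XNA, but the standard trick behind flowing cloth is the same everywhere: treat the garment as a grid of particles, integrate them with Verlet integration, and then repeatedly enforce distance constraints between neighbors. Here’s a minimal, illustrative Python version of those two building blocks (an assumption about their approach, not their actual code):

```python
# Minimal mass-spring cloth building blocks: Verlet integration plus
# distance-constraint relaxation. A real cloth sim runs these over a
# whole grid of points, several constraint passes per frame.

GRAVITY = (0.0, -9.8)   # m/s^2
DT = 1.0 / 60.0         # one 60 fps frame

def verlet_step(curr, prev):
    """Advance one particle: new = curr + (curr - prev) + gravity*dt^2.
    Velocity is implicit in the difference between curr and prev."""
    cx, cy = curr
    px, py = prev
    nx = cx + (cx - px) + GRAVITY[0] * DT * DT
    ny = cy + (cy - py) + GRAVITY[1] * DT * DT
    return (nx, ny), (cx, cy)   # new position, new "previous" position

def satisfy_constraint(a, b, rest_len):
    """Nudge two particles back toward their rest distance, half the
    correction each, so springs between cloth points keep their length."""
    dx, dy = b[0] - a[0], b[1] - a[1]
    dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
    corr = 0.5 * (dist - rest_len) / dist
    return ((a[0] + dx * corr, a[1] + dy * corr),
            (b[0] - dx * corr, b[1] - dy * corr))

# two points twice as far apart as their rest length get pulled together
print(satisfy_constraint((0.0, 0.0), (2.0, 0.0), 1.0))  # → ((0.5, 0.0), (1.5, 0.0))
```

Pin the particles nearest the tracked shoulder and hip joints to the skeleton each frame, and the rest of the grid trails behind naturally.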
[Lukasz] says that the system is still in its infancy, and will require plenty of work before they are completely happy with the results. From where we’re sitting, the demo video embedded below is pretty neat, even if it is a bit rough around the edges. We were particularly pleased to see the Xbox’s native Kinect interface put to work in a DIY project, and we are quite interested to see how things look once they put the final touches on it.
Continue reading “Play dress up with Kinect”
Some of the Kinect hacks we have featured here are quite useful in the realm of assisted living, others showcase what can be done with the clever application of video filters. Some…are just plain fun.
This pair of Kinect hacks isn’t necessarily going to win any awards for usefulness, but both are big on fun. [Tom] put together a neat juggling application that watches for your hands to disappear behind your back, generating a glowing ball once they return to the camera’s field of vision. The balls can be tossed away or juggled as you can see in the video below. It looks like it could be pretty fun and most definitely easier than chasing balls around while learning to juggle.
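The spawn trigger boils down to edge detection on hand visibility: watch the tracker’s per-hand confidence, and fire when a hand flips from hidden back to visible. This is a hypothetical Python sketch of that logic (the confidence stream here is mock data; a real app reads it from the skeleton tracker each frame):

```python
# Sketch of a "hand reappeared" trigger: scan per-frame tracking
# confidence values and report the frames where a hidden hand comes
# back into view -- the moment to spawn a glowing ball at its position.

def spawn_events(confidences, threshold=0.5):
    """Return frame indices where the hand transitions hidden -> visible."""
    events = []
    visible = True                       # assume the hand starts in view
    for i, c in enumerate(confidences):
        now_visible = c >= threshold
        if now_visible and not visible:
            events.append(i)             # hand just came back: spawn here
        visible = now_visible
    return events

# hand visible, hidden behind the back for a few frames, then back
print(spawn_events([0.9, 0.8, 0.1, 0.0, 0.2, 0.9, 0.95]))  # → [5]
```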
[Tom’s] hack was based on code he saw demonstrated in a video by YouTube user [hogehoge335], whose Kinect application replicates the Kamehameha attack from Dragonball Z, flowing hair and all.
Check out the videos below for a demonstration of both Kinect hacks, and swing by the respective Google Code sites if you want to give them a try.
Continue reading “Juggling with Kinect”
[Ryan Lloyd], [Sandeep Dhull], and [Ruben D'Sa] wrote in to share a robotics project they have been keeping busy with lately. The three University of Minnesota students are using a Kinect sensor to remotely control a robotic arm, but it’s not as simple as it sounds.
Using OpenNI alongside PrimeSense’s NITE middleware, the team started out by doing some simple skeleton tracking before working with their robotic arm. The arm has five degrees of freedom, making the task of controlling it a bit tricky. The robot has quite a few joints to play with, so the trio not only tracks shoulder, elbow, and wrist movements, but also monitors the status of the user’s hand to actuate the robot’s gripper.
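Turning three tracked joints into a servo angle is mostly vector math: the angle at the elbow falls out of the shoulder, elbow, and wrist positions via a dot product. The sketch below is an illustrative Python version of that step (not the team’s code; the joint coordinates are hypothetical values a tracker would supply):

```python
# Compute the bend angle at a middle joint (e.g. the elbow) from three
# 3-D skeleton points, the kind of value you'd then map to a servo.
import math

def joint_angle(a, b, c):
    """Angle at b, in degrees, formed by points a-b-c."""
    v1 = [a[i] - b[i] for i in range(3)]          # b -> a
    v2 = [c[i] - b[i] for i in range(3)]          # b -> c
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

# a right-angle bend: upper arm along x, forearm along y
shoulder, elbow, wrist = (0, 0, 0), (1, 0, 0), (1, 1, 0)
print(round(joint_angle(shoulder, elbow, wrist)))  # → 90
```

The same function applied at the shoulder (with a fixed torso reference point) covers the remaining tracked angles, and a simple threshold on hand openness drives the gripper.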
When everything was said and done, the results were pretty impressive as you can see in the video below, but the team definitely sees room for improvement. Using inverse kinematics, they plan on filtering out some of the joint tracking inaccuracies that occur when the shoulders are moved in a certain way. They also plan on using a robotic arm with even more degrees of freedom to see just how well their software can perform.
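The appeal of the IK approach is that instead of copying noisy joint positions directly, you solve for a consistent set of angles that reaches the hand’s target position. A minimal two-link planar solver shows the idea (a generic textbook sketch, not the team’s implementation; their 5-DOF case is considerably messier):

```python
# Analytic inverse kinematics for a two-link planar arm: given a
# target (x, y) and link lengths l1, l2, recover the shoulder and
# elbow angles (elbow-down solution, angles in radians).
import math

def two_link_ik(x, y, l1, l2):
    d2 = x * x + y * y
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    # clamp for numerical safety; out-of-range targets get the nearest pose
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

# fully extended straight ahead: both angles come out zero
s, e = two_link_ik(2.0, 0.0, 1.0, 1.0)
print(round(s, 6), round(e, 6))  # → 0.0 0.0
```

Feeding the tracked hand position into a solver like this, rather than echoing each raw joint, is exactly the kind of filtering that smooths over the Kinect’s flaky shoulder readings.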
Be sure to check out their site to see more details and videos.
Continue reading “Advanced robotic arm control using Kinect”
Now that Kinect has been hacked to work with just about everything from robots to toaster ovens, someone finally got around to tweaking it for use on the PS3.
[Shantanu] has been hard at work writing code and experimenting with some preexisting Kinect software to get the sensor to talk to his PS3. The Kinect is hooked up to a PC, which captures all of his movements with OpenNI. Those movements are mapped to PS3 controls via NITE, middleware that translates gestures into commands. All of the captured button presses are then relayed to the PS3 over a Bluetooth connection using DIYPS3Controller.
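The glue in the middle of that chain is essentially a lookup table from recognized gestures to controller buttons. Here’s an illustrative Python sketch of that relay stage; the gesture names and button labels are hypothetical, and in the real setup the recognition comes from NITE with the presses going out over Bluetooth via DIYPS3Controller:

```python
# Sketch of a gesture-to-button relay: recognized gesture names come in,
# mapped button presses go out. Unmapped gestures are ignored.

GESTURE_TO_BUTTON = {        # hypothetical mapping table
    "push":       "CROSS",
    "swipe_left": "L1",
    "raise_hand": "TRIANGLE",
}

def relay(gestures, send=lambda button: None):
    """Translate a stream of recognized gestures into button presses,
    calling send() for each, and return the list of presses sent."""
    sent = []
    for g in gestures:
        button = GESTURE_TO_BUTTON.get(g)
        if button is not None:
            send(button)     # real setup: push the press over Bluetooth
            sent.append(button)
    return sent

print(relay(["push", "wave", "swipe_left"]))  # → ['CROSS', 'L1']
```

Keeping the mapping in a table like this is also what makes per-game custom gestures cheap to add.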
As you can see in the video below, the solution works pretty well for what should be considered pre-alpha code. He has been able to map several custom gestures to button presses, and the Kinect does an overall decent job tracking his limbs and translating their movements to on-screen actions. The actual in-game use is a bit rough at the moment, but aside from the infancy of the code, you have to remember that these games were never meant to be played with the Kinect.
It’s a job well done, and we can’t wait to see where this project goes.
Looking for more Kinect fun? Look no further than right here.
Continue reading “Clever hack tethers a Kinect sensor to the PS3”