If you’re going to build a giant touch screen, why not use an OS that is designed for touch interfaces, like Android? [Colin] had the same idea, so he connected his phone to a projector and a Kinect.
Video is carried from [Colin]’s Galaxy Nexus to the projector via an MHL connection. Getting the Kinect to work was a little more challenging, though. The Kinect is connected to a PC running Simple Kinect Touch. The PC converts the data from the Kinect into TUIO commands that are received using TUIO for Android.
In order for the TUIO commands to be recognized as user input, [Colin] had to compile his own version of Android. It was a lot of work, but starting from an OS designed for touch interfaces seems much better than all the other touch screen hacks that build everything from the ground up.
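For the curious, TUIO is a thin convention on top of OSC. Here’s a rough Python sketch — our own illustration, not [Colin]’s code — of how a single touch point gets packed into a TUIO-style OSC message:

```python
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC chunks are zero-padded to a multiple of 4 bytes
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    # Encode one OSC message: address, type-tag string, then arguments
    typetags, payload = ",", b""
    for a in args:
        if isinstance(a, float):
            typetags += "f"
            payload += struct.pack(">f", a)
        elif isinstance(a, int):
            typetags += "i"
            payload += struct.pack(">i", a)
        elif isinstance(a, str):
            typetags += "s"
            payload += osc_pad(a.encode() + b"\x00")
    return (osc_pad(address.encode() + b"\x00")
            + osc_pad(typetags.encode() + b"\x00")
            + payload)

# A TUIO 2D-cursor "set" message: session id, x/y position,
# x/y velocity, and acceleration (normalized floats)
packet = osc_message("/tuio/2Dcur", "set", 1, 0.5, 0.5, 0.0, 0.0, 0.0)
```

In a real setup the PC interleaves these with `alive` and `fseq` messages and ships the bundles over UDP to port 3333, the TUIO default, where TUIO for Android picks them up.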
You can check out [Colin]’s demo after the break. Sadly, there are no Angry Birds.
Continue reading “Control Android With A Projector And Kinect”
Even though giant multitouch display tables have been around for a few years now, we have yet to see them being used in the wild. While the barrier to entry for a Microsoft Surface is very high, one of the biggest problems in implementing a touch table is interaction: how exactly should the display interpret multiple commands from multiple users? [Stephan], [Christian], and [Patrick] came up with an interesting solution to sorting out who is touching where by having a computer look at shoes.
The system uses a Kinect mounted on the edge of a table to extract users from the depth images. From there, interaction on the display can be pinned to a specific user based on hand and arm orientation. As an added bonus, the computer can also identify users by their shoes: anyone wearing a pair the system has seen before is recognized the moment they walk up to the table.
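To get a feel for the “whose finger is this” problem, here’s a toy Python sketch — our own simplification, not the team’s algorithm — that credits a touch point to whichever tracked user’s hand is nearest:

```python
import math

def assign_touch(touch, hands):
    # touch: (x, y) on the display surface
    # hands: {user_id: (x, y)} hand positions extracted from
    # the Kinect depth image
    # Return the user whose tracked hand is closest to the touch.
    return min(hands, key=lambda uid: math.hypot(touch[0] - hands[uid][0],
                                                 touch[1] - hands[uid][1]))
```

The real system is smarter — it reasons about arm orientation, not just proximity — but nearest-hand assignment is the baseline everything else improves on.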
Continue reading “Nice Shoes, Wanna Recognize Some Input?”
Finally [Michelle Annett] can talk about the super secret project she worked on at Autodesk Research.
Medusa, as [Michelle]’s project is called, is a Microsoft Surface that has been fitted with 138 proximity sensors. This allows the Surface to sense users walking up to it and to detect users’ hands and arms above the tabletop. Multiple users can be detected at the same time, and each detected hand, left or right, can be mapped to the specific user it belongs to.
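As a back-of-the-napkin illustration — entirely our own, not [Michelle]’s code — detecting bodies around the edge of the table from a ring of outward-facing proximity sensors can be as simple as finding contiguous runs of “something is close” readings:

```python
def detect_users(readings, threshold=30):
    # readings: distance values (e.g. cm) from proximity sensors
    # around the table edge; lower means something is closer.
    # Each contiguous run of near readings is treated as one user;
    # return (first_sensor, last_sensor) index pairs.
    users, run = [], None
    for i, d in enumerate(readings):
        if d < threshold:
            run = [i, i] if run is None else [run[0], i]
        else:
            if run is not None:
                users.append(tuple(run))
                run = None
    if run is not None:
        users.append(tuple(run))
    return users
```

With a user pinned to a stretch of table edge, hands hovering above the nearby upward-facing sensors can then be attributed to that user.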
The proximity sensors [Michelle] used are inexpensive, so we’re wondering when someone with a crazy multitouch setup will add proximity sensors to their build. We’d like to play with Medusa, even if just for a virtual game of Settlers of Catan. It seems like the perfect setup…
[Michelle] built Medusa last January during her internship at Autodesk. Now that UIST 2011 is over, she can finally talk about it. There’s also a video demonstrating the possibilities of Medusa; check it out after the break.
Thanks [Fraser] for sending this one in.
Continue reading “Medusa: A Proximity-aware Tabletop”
When we see artists like Daft Punk or Madeon working their magic in a live setting, we’re always impressed with their controllers. Sample-based artists use controllers like the Monome and Kaoss Pad a lot, but these devices are fairly expensive. Thankfully, we live in an age of multitouch displays, so [Graham Comerford] came up with his own multitouch controller that does just about anything.
The build is based on the Kivy framework and includes a Monome emulator, MIDI drum pads, mixer, and a whole bunch of other sliders and buttons. There’s no word on how [Graham]’s multitouch display was constructed, but if you’re looking to build your own gigantic audio control setup there’s a lot of info on building Microsoft Surface clones, adapting computer monitors, and spherical multitouch rigs.
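The heart of a Monome-style emulator is just a toggle grid that a sequencer scans column by column. Here’s a minimal Python model of that idea — our own sketch, not [Graham]’s code:

```python
class StepGrid:
    # A monome-style toggle grid: columns are sequencer steps,
    # rows are samples/voices.
    def __init__(self, rows=8, cols=8):
        self.cells = [[False] * cols for _ in range(rows)]

    def toggle(self, row, col):
        # A touch on a pad flips its on/off state
        self.cells[row][col] = not self.cells[row][col]

    def column(self, col):
        # Which rows (samples) should fire when the playhead
        # reaches this step?
        return [r for r in range(len(self.cells)) if self.cells[r][col]]
```

A sequencer loop would walk `column(step)` at the current tempo and send a MIDI note-on for each active row; Kivy’s job is just to paint the grid and route touches to `toggle`.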
We’re not sure if [Graham]’s virtual drum kit is velocity sensitive, but even if it’s not, it’s an interesting bit of kit. Check out an earlier version of his setup after the break.
Continue reading “Controlling Samplers And Sequencers With Multitouch”
This is a keyboard alternative that [Sebastian] is building from two Apple Magic Trackpads. The multitouch devices are a good platform for this because they’re designed to pick up several events at the same time. To prototype the locations of the keys he’s using printable transparency sheets. He gives you a sense of where the home row is with a dab of clear fingernail polish that you can feel with your digits.
He may laser etch these pads once the key location is just right. This should give a bit of texture in itself and do away with the need for nail polish, but we still like the ingenuity of that solution. The device is being developed in Linux, with some kernel hacking to handle the devices. We asked about source code, and [Sebastian] is hesitant to post it because he’s been getting a lot of kernel panics. It sounds like once he cleans things up a bit he’ll share his work.
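The mapping itself is conceptually simple: each touch coordinate falls inside a key rectangle. A hypothetical Python sketch — the layout values here are made up, and [Sebastian]’s real work happens down in the kernel, not in Python:

```python
# A made-up fragment of the left-hand home row, in normalized
# trackpad coordinates (0..1): (key, x0, y0, x1, y1)
HOME_ROW = [
    ("a", 0.00, 0.4, 0.25, 0.6),
    ("s", 0.25, 0.4, 0.50, 0.6),
    ("d", 0.50, 0.4, 0.75, 0.6),
    ("f", 0.75, 0.4, 1.00, 0.6),
]

def touch_to_key(x, y, layout):
    # Return the key whose rectangle contains the touch,
    # or None if the finger landed between keys.
    for key, x0, y0, x1, y1 in layout:
        if x0 <= x < x1 and y0 <= y < y1:
            return key
    return None
```

Since the trackpads report several simultaneous contacts, running every reported touch through a lookup like this is what makes chorded, two-handed typing possible.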
Don’t forget, there’s an easy hack to do away with the batteries in these things.
We all love a little bit of multitouch, but we’ve seen so many setups that it is getting a bit less exciting. This one will get your attention with its unique shape: it is a spherical multitouch display built entirely with open source software. Well, since the poles are unusable, it might just be toroidal, or cylindrical, but it is still impressive. A convex mirror mounted at the uppermost point of the frosted sphere reflects a projector mounted at the bottom of the base, while a webcam pointed at that same mirror picks up reflected IR light from a few emitters. You can catch a video of it after the break.
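Mapping a blob seen in the convex mirror back onto the sphere amounts to converting the webcam’s (x, y) into an angle around the mirror’s center plus a radial distance. A rough Python sketch of the idea — our own guess at the geometry, not the builders’ code:

```python
import math

def blob_to_sphere(x, y, cx, cy, r_max):
    # (cx, cy): mirror center in the camera image
    # r_max: blob radius that corresponds to the base of the sphere
    # Angle around the center gives longitude; radial distance gives
    # latitude (pole at the center of the mirror, base at r_max).
    dx, dy = x - cx, y - cy
    lon = math.degrees(math.atan2(dy, dx)) % 360.0
    r = math.hypot(dx, dy)
    lat = 90.0 * (1.0 - min(r / r_max, 1.0))
    return lon, lat
```

This also makes the dead zone at the poles obvious: near the center of the mirror, a tiny pixel error swings the longitude wildly, which is exactly why the builders call the usable area toroidal.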
Continue reading “Spherical Multitouch Rig”
[Taichi Inoue] is back again, this time with a multitouch system that uses water as the touch surface. The setup consists of a tank of water placed atop an LCD, a lamp, and a webcam. The webcam picks up the light that is reflected when something breaks the surface of the water. It is, as far as the computer is concerned, no different than the blob recognition we see in many homemade multitouch systems. Mixed with his Yukikaze, this guy might end up with the most relaxing computer system in the world.
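The blob recognition in question boils down to thresholding the camera image and grouping connected bright pixels. A self-contained Python sketch of that idea — not [Taichi Inoue]’s actual pipeline:

```python
def find_blobs(image, threshold=128):
    # image: 2D list of brightness values (0-255)
    # Flood-fill each connected bright region and return one
    # (row, col) centroid per blob, i.e. per fingertip.
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for r in range(h):
        for c in range(w):
            if image[r][c] >= threshold and not seen[r][c]:
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x),
                                   (y, x - 1), (y, x + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and image[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                blobs.append((cy, cx))
    return blobs
```

Frame-to-frame matching of those centroids is what turns bright splashes into tracked touch points, whether the surface is acrylic or a tank of water.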