The big news circulating this morning is the Leap Motion sensor that will be hitting the market soon. The makers claim it is ~100 times more accurate than anything else on the market right now. Check out the video to see what they mean (there’s another at the link). This is pretty impressive. You can see the accuracy as they use a stylus to write things. If you’ve played with the Kinect, you know that it is nowhere near this tight. Of course, the Kinect scans a massive space compared to this new sensor’s small work area. The response time looks very impressive as well; motions seem to be perfectly in sync with the input. We’re excited to play with one when we get a chance.
So, why do we care as hackers? Well, we always care when a new toy arrives. That alone should be good enough. However, what we really like is the price tag: $69. That is a great thing to hear. At roughly half the cost of a Kinect, this is getting into a new market. As these prices drop, we might start to see motion input used as it really should be: as a supplement to your other input devices. Undoubtedly, someone won’t actually read this article, and one of the comments will be “your arms will get tired doing everything by waving your hands.” Yep, your arms would get tired. While the cost of these devices remains high, people tend to think of them as primary input devices. As the prices (and sizes) drop, we could start adding these things to our laptops and keyboards. Sometimes you actually do want to wave your hand at the screen, when an application can use that naturally; then you go right back to the keyboard/mouse when that fits better. If these got cheap enough, we could see them pop up in vending machines, making those ~100 times more sanitary!
Like everyone else, we really want to know how these work. The videos show several demos of it in action, and we’re familiar with common methods of doing this kind of thing. At one point, there’s a hand visualization that looks like it might be a very tightly packed point cloud (an IR array? those points do jitter!). Then again, that could just be a fun little graphical representation. We can’t wait to find out, so if any of you get your hands on one of the developer models, let us know!
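For the curious: if that visualization really is a point cloud, the usual way depth sensors (Kinect included) produce one is by back-projecting each depth pixel through a pinhole camera model. We have no idea whether Leap Motion works this way internally; the intrinsics below are made-up values purely for illustration.

```python
import numpy as np

# Hypothetical camera intrinsics, for illustration only.
# Real sensors ship with calibrated values.
FX = FY = 525.0        # focal lengths, in pixels
CX, CY = 160.0, 120.0  # principal point (image center)

def depth_to_point_cloud(depth):
    """Back-project a depth image (meters) into an Nx3 point cloud.

    Standard pinhole-camera math: each pixel (u, v) with depth z maps to
    x = (u - cx) * z / fx,  y = (v - cy) * z / fy.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop invalid (zero-depth) pixels

# Toy example: a flat "wall" one meter from the sensor
depth = np.ones((240, 320))
cloud = depth_to_point_cloud(depth)
print(cloud.shape)  # (76800, 3)
```

The jitter you see in such clouds is just per-pixel depth noise passing straight through this math, which is one reason we suspect it is real sensor data and not a canned animation.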