Gesture controls arrived in the public consciousness a little over a decade ago as touchpads and touchscreens became more popular. The main limitation of gesture controls, at least as far as [Norbert] is concerned, is that they can only manipulate objects in a virtual space. He was hoping to use gestures to control a real-world object instead, and created this device which uses gestures to control an actual picture.
In this unique augmented reality device, not only is the object being controlled in the real world, but the gestures are monitored there as well, thanks to an OpenCV-based computer vision system watching his hand. The position data is fed into an algorithm which controls a physical picture mounted on a slender robotic arm. Now, when [Norbert] “pinches to zoom”, the servo attached to the picture physically brings it closer to or farther from his field of view. He can also use other gestures to move the picture around.
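To get a feel for how a pinch gesture might end up as a servo command, here is a minimal sketch of the general idea. It assumes MediaPipe for the hand landmarks and a serial-connected microcontroller driving the servo, neither of which is confirmed by the build itself; the article only mentions OpenCV, so treat the landmark source, the serial protocol, and the distance-to-angle mapping as illustrative assumptions.

```python
# Minimal sketch: map thumb-to-index "pinch" distance to a servo angle.
# Assumptions (not from the original build): MediaPipe for hand landmarks,
# a microcontroller on /dev/ttyUSB0 that accepts "angle\n" lines over serial.
import math

import cv2
import mediapipe as mp
import serial

ser = serial.Serial("/dev/ttyUSB0", 115200, timeout=0.1)  # hypothetical servo controller
hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break

    # MediaPipe expects RGB; OpenCV captures BGR.
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        lm = results.multi_hand_landmarks[0].landmark
        thumb, index = lm[4], lm[8]  # thumb tip and index fingertip
        pinch = math.hypot(thumb.x - index.x, thumb.y - index.y)

        # Map normalized pinch distance (~0.02 closed .. ~0.25 open) to 0-180 degrees.
        angle = int(max(0.0, min(1.0, (pinch - 0.02) / 0.23)) * 180)
        ser.write(f"{angle}\n".encode())

        cv2.putText(frame, f"servo: {angle} deg", (10, 30),
                    cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)

    cv2.imshow("pinch to zoom", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

On the microcontroller side, each received angle would simply be written to the servo, so all the gesture interpretation stays on the computer running the vision code.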
While this gesture-controlled machine is certainly a proof-of-concept, there are plenty of other uses for gesture control of real-world objects. Any robotics platform could benefit from an interface like this, or even something slightly more mundane like an office PowerPoint presentation. Opportunity abounds, but if you need a primer on OpenCV, take a look at this build which tracks a hand in minute detail.