Twitch And Blink Your Way Through Typing With This Facial Keyboard

For those who haven’t experienced it, the early days of parenthood are challenging, to say the least. Trying to get anything accomplished with a raging case of sleep deprivation is hard enough, but the little bundle of joy who always seems to need to be in physical contact with you makes doing things with your hands nigh impossible. What’s the new parent to do when it comes time to be gainfully employed?

Finding himself in such a boat, [Fletcher]’s solution was to build a face-activated keyboard to work around his offspring’s needs. Before you ask: no, voice recognition software wouldn’t work, at least according to the sleepy little boss who protests noisy awakenings. The solution instead was to first try OpenCV and the dlib facial recognition library to watch [Fletcher] blinking out Morse code. While that sorta-kinda worked, one’s blinkers can’t long endure such a workout, so he moved on to an easier set of gestures. Mouthing Morse code covers most of the keyboard, while a combination of eye, eyebrow, and other facial twitches and tics cover the rest, with MediaPipe’s Face Mesh doing the heavy lifting in terms of landmark detection.
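
To give a sense of how little code the landmark-watching part takes, here’s a minimal Python sketch of the idea, emphatically not [Fletcher]’s actual code: MediaPipe’s Face Mesh tracks the inner-lip landmarks (indices 13 and 14), and the duration of each mouth opening is printed as a Morse dot or dash. The gap and timing thresholds are guesses you’d want to tune.

```python
import time

import cv2
import mediapipe as mp

face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1, refine_landmarks=True)
cap = cv2.VideoCapture(0)

DOT_MAX = 0.25     # openings shorter than this (seconds) count as a dot; a guess
GAP = 0.04         # normalized lip separation that counts as "mouth open"; a guess
open_since = None  # timestamp when the mouth opened, or None while closed

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if not results.multi_face_landmarks:
        continue
    lm = results.multi_face_landmarks[0].landmark
    mouth_gap = abs(lm[13].y - lm[14].y)  # 13/14 are the inner upper/lower lip
    if mouth_gap > GAP and open_since is None:         # mouth just opened
        open_since = time.time()
    elif mouth_gap <= GAP and open_since is not None:  # mouth just closed
        print("." if time.time() - open_since < DOT_MAX else "-", end="", flush=True)
        open_since = None
```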

The resulting facial keyboard, aptly dubbed “CheekyKeys,” performed well enough for [Fletcher] to use for a skills test during an interview with a Big Tech Company. Imagining the interviewer on the other end watching him convulse his way through the interview was worth the price of admission, and we don’t even care if it was a put-on. Video after the break.

CheekyKeys is pretty cool, doing something with a webcam and Python that we thought would have needed a dedicated AI depth camera to accomplish. But perhaps the real hack here was how [Fletcher] taught himself Morse in fifteen minutes.

Continue reading “Twitch And Blink Your Way Through Typing With This Facial Keyboard”

OpenCV Brings Pinch To Zoom Into The Real World

Gesture controls arrived in the public consciousness a little over a decade ago as touchpads and touchscreens became more popular. The main limitation of gesture controls, at least as far as [Norbert] is concerned, is that they can only control objects in a virtual space. He was hoping to use gestures to control a real-world object instead, and created this device, which uses gestures to control an actual picture.

In this unique augmented reality device, not only is the object being controlled in the real world, but the gestures are monitored there as well, thanks to an OpenCV-based computer vision system watching [Norbert]’s hand. The position data is fed into an algorithm which controls a physical picture mounted on a slender robotic arm. Now, when [Norbert] “pinches to zoom”, the servo attached to the picture physically brings it closer to or farther from his field of view. He can also use other gestures to move the picture around.
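
For a rough idea of how the vision half of something like this could be wired up, here’s a hedged Python sketch, assuming MediaPipe Hands for fingertip tracking rather than whatever [Norbert] actually wrote. It maps the thumb-to-index distance to a servo angle and ships it over serial; the port name and pinch range are placeholders.

```python
import cv2
import mediapipe as mp
import serial  # pyserial

hands = mp.solutions.hands.Hands(max_num_hands=1)
port = serial.Serial("/dev/ttyUSB0", 115200)  # hypothetical port; adjust for your setup
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if not results.multi_hand_landmarks:
        continue
    lm = results.multi_hand_landmarks[0].landmark
    thumb, index = lm[4], lm[8]  # thumb tip and index fingertip landmarks
    pinch = ((thumb.x - index.x) ** 2 + (thumb.y - index.y) ** 2) ** 0.5
    angle = int(min(pinch, 0.3) / 0.3 * 180)  # map 0..0.3 (normalized) to 0..180 degrees
    port.write(f"{angle}\n".encode())  # firmware on the other end moves the arm
```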

While this gesture-controlled machine is certainly a proof-of-concept, there are plenty of other uses for gesture controls of real-world objects. Any robotics platform could benefit from an interface like this, or even something slightly more mundane like an office PowerPoint presentation. Opportunity abounds, but if you need a primer for OpenCV take a look at this build which tracks a hand in minute detail.

Continue reading “OpenCV Brings Pinch To Zoom Into The Real World”

Hedgehog Gesture Sensor Built With Cheap Time-of-Flight Modules

Time-of-flight sensors used to be expensive obscurities, capable of measuring the travel time of photons themselves and often used for tracking purposes. The technology has gotten much cheaper, though, and [jean.perardel] has used ToF sensors to build a useful and affordable gesture-tracking system.

The system relies on four VL53L1X time-of-flight sensors, which have a 16×16 scanning array and communicate over the I2C bus. Controlling the show is an Arduino MKR1010, though the project should be achievable with a range of other microcontrollers, too.
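
Talking to one of these sensors is refreshingly simple. Here’s a quick sketch using Adafruit’s CircuitPython driver for the VL53L1X rather than the project’s actual Arduino firmware; it reads a single sensor and flags a nearby hand, glossing over the I2C address juggling that running four on one bus would require.

```python
import time

import adafruit_vl53l1x
import board

i2c = board.I2C()
tof = adafruit_vl53l1x.VL53L1X(i2c)  # one sensor; the build chains four of them
tof.start_ranging()

while True:
    if tof.data_ready:
        distance_cm = tof.distance  # None when nothing is in range
        tof.clear_interrupt()
        if distance_cm is not None and distance_cm < 20:
            print("hand detected at", distance_cm, "cm")
    time.sleep(0.05)
```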

The device is built into a cute hedgehog-like form factor, with an LCD screen acting as the face. It displays facial expressions that show how the system is interpreting and responding to gestures, which gives the project lots of personality and makes it more fun to use. Gestures captured by the system can be used to send keystrokes over USB, control relays or servos, or even fire IR signals to control TVs and other hardware.

It actually seems like a practical gesture control interface, one that could become a useful part of a workstation setup. We’ve seen gesture controls put to other uses too, like controlling robot arms. Video after the break.

Continue reading “Hedgehog Gesture Sensor Built With Cheap Time-of-Flight Modules”

Gesture Control The Easy Way

Gesture control is a technology that has floated around for quite a while, but never quite reached mainstream acceptance. Wii Bowling was fun for a while, but we’re not regularly using gestures to open doors or order pizza just yet. Doing it yourself can be quite easy, however, as [RC Lover san] found with a barebones, hacky build.

Typically, when we think of gesture control, we envisage object-tracking cameras or MEMS accelerometers. Instead, this build uses simple tilt switches, as you might find in a pinball machine from days of yore. Four of these are placed on a wrist-mounted device, allowing the user to tilt their arm to move an RC car in different directions. The tilt switches are easy to hack into the car’s stock controller, as they simply replace the existing buttons on the PCB.
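
In code terms, the sensing scheme amounts to reading four digital inputs. Here’s a hypothetical CircuitPython sketch of the idea, assuming tilt switches wired between GPIO pins and ground; the actual build skips the microcontroller entirely and wires the switches straight to the RC transmitter’s button pads.

```python
import board
import digitalio

# Pin choices are hypothetical; wire each tilt switch between a pin and ground.
PINS = {"forward": board.D2, "back": board.D3, "left": board.D4, "right": board.D5}

switches = {}
for name, pin in PINS.items():
    sw = digitalio.DigitalInOut(pin)
    sw.switch_to_input(pull=digitalio.Pull.UP)  # reads False when the switch closes
    switches[name] = sw

while True:
    active = [name for name, sw in switches.items() if not sw.value]
    if active:
        print("tilt:", ", ".join(active))  # stand-in for pressing the RC buttons
```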

It’s a project that goes to show that not everything has to be done with advanced sensors and complex algorithms. Sometimes, it can all be done with a handful of cheap switches and some ingenuity. Plus, using arm movements to scoot BB-8 around on the floor looks like great fun. We’ve seen other attempts to build simple gesture controls with pots, too. Video after the break.

Continue reading “Gesture Control The Easy Way”

Simplified AI On Microcontrollers

Artificial intelligence is taking the world by storm. Rather than a Terminator-style apocalypse, though, it seems to be more of a useful tool for getting computers to solve problems on their own. This isn’t just for supercomputers, either. You can load AI onto some of the smallest microcontrollers as well. TensorFlow Lite is a popular tool for this, but getting it to work on your particular microcontroller can be a pain, unless you’re using an Espruino.

This project adds TensorFlow support to this class of microcontrollers without any fussing around with obtuse build tools. Adding essentially a single line of code creates an instance, all without having to compile anything or even reboot. TensorFlow is a powerful software tool for microcontrollers, and having it this accessible is a great leap forward.
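
The Espruino runtime itself is programmed in JavaScript, but the model typically gets built on a PC first. As a hedged sketch of that desktop half (the layer sizes and gesture count are our own placeholders), here’s how a toy Keras model gets converted into the TensorFlow Lite flatbuffer these tiny runtimes consume.

```python
import tensorflow as tf

# Toy model: 128 accelerometer samples of (x, y, z) in, three gesture classes out.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 3)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# ... model.fit(...) on captured gesture data would go here ...

# Convert to a TensorFlow Lite flatbuffer, the format tiny runtimes consume.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
with open("gestures.tflite", "wb") as f:
    f.write(converter.convert())
```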

So, what can you do with this tool? The team behind this build is using TensorFlow on an open smart watch to detect hand gestures, among other things. They also opened these tools up for use in a browser, which lets you run the AI software against an emulated Espruino without needing a physical device. There’s a lot going on with this one, and it’s a bonus that it’s open source and ready to be turned into anything you might need, like turning yourself into a Street Fighter.

Smartphone Case Doubles As Chording Keyboard, With Gesture Inputs

Smartphones and other modern computing devices are wonderful things, but for those with disabilities, interacting with them isn’t always easy. In trying to improve accessibility, [Dougie Mann] created TypeCase, a combination gestural input device and chording keyboard that exists in a kind of symbiotic relationship with a user’s smartphone.

With TypeCase, a user can control a computer (or the smartphone itself) with gestures, emulate a mouse, or use the device as a one-handed chording keyboard for text input. The latter provides an alternative to voice input, which can be awkward in public areas.
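
At its heart, a chording keyboard is just a lookup from button combinations to characters. Here’s an illustrative Python sketch of that idea; the mapping table is invented for the example and isn’t TypeCase’s actual layout.

```python
# Map each chord (the set of buttons released together) to a character.
# This table is made up for illustration, not TypeCase's real layout.
CHORDS = {
    frozenset({0}): "e",
    frozenset({1}): "t",
    frozenset({0, 1}): "a",
    frozenset({0, 2}): "s",
    frozenset({0, 1, 2, 3, 4}): " ",
}

def decode(pressed):
    """Return the character for a chord, or '' if it isn't mapped."""
    return CHORDS.get(frozenset(pressed), "")

print(decode({0, 1}))  # -> "a"
```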

The buttons and motion sensors allow for one-handed button and gestural input while holding the phone, and the Bluetooth connectivity means that the device acts and works just like a wireless mouse or keyboard. The electronics consist mainly of an Adafruit Feather 32u4 Bluefruit LE, and [Dougie] used 3D Hubs’ on-demand printing service to create the enclosures once the design work was complete. Since TypeCase doubles as a protective smartphone case, users have no need to carry or manage a separate device.

TypeCase’s use cases are probably best expressed by [Dougie]’s demo video, embedded below. Chording keyboards have a steeper learning curve than conventional keyboards, but they can be very compact. One-handed text input does remind us somewhat of a very different approach that had the user make gestures in patterns reminiscent of Palm’s old Graffiti system; perhaps easier to learn but not nearly as discreet.

Continue reading “Smartphone Case Doubles As Chording Keyboard, With Gesture Inputs”

Arduino, Accelerometer, And TensorFlow Make You A Real-World Street Fighter

A question: if you’re controlling the classic video game Street Fighter with gestures, aren’t you just, you know, street fighting?

That’s a question [Charlie Gerard] is going to have to tackle should her AI gesture-recognition controller experiments take off. [Charlie] put together the game controller to learn more about the dark arts of machine learning in a fun and engaging way.

The controller consists of a battery-powered Arduino MKR1000 with WiFi and an MPU6050 accelerometer. Held in the hand, the controller streams accelerometer data to an external PC, capturing the characteristics of the motion. [Charlie] trained three different moves – a punch, an uppercut, and the dreaded Hadouken – and captured hundreds of examples of each. The raw data was massaged, converted to tensors, and used to train a model for the three moves. Initial tests seem to work well. [Charlie] also made an online version that captures motion from your smartphone. The demo is explained in the video below; sadly, we couldn’t get more than three Hadoukens in before crashing it.
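
For the curious, here’s a hedged Python sketch of what the inference step might look like once such a model is trained. It’s not [Charlie]’s code, and the model file name and window shape are assumptions, just the shape of the idea.

```python
import numpy as np
import tensorflow as tf

MOVES = ["punch", "uppercut", "hadouken"]
model = tf.keras.models.load_model("street_fighter.keras")  # hypothetical trained model

def classify(window: np.ndarray) -> str:
    """window: float32 array of shape (128, 3), one captured gesture's readings."""
    probs = model.predict(window[np.newaxis, ...], verbose=0)[0]
    return MOVES[int(np.argmax(probs))]
```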

With most machine learning projects seeming to concentrate on telling cats from dogs, this is a refreshing change. We’re seeing lots of offbeat machine learning projects these days, from cryptocurrency wallet attacks to a semi-creepy workout-monitoring gym camera.

Continue reading “Arduino, Accelerometer, And TensorFlow Make You A Real-World Street Fighter”