Gesture Control The Easy Way

Gesture control is a technology that has floated around for quite a while but never reached mainstream acceptance. Wii Bowling was fun for a time, but we’re not regularly using gestures to open doors or order pizza just yet. Doing it yourself can be surprisingly easy, however, as [RC Lover san] found with a barebones, hacky build.

Typically, when we think of gesture control, we envisage object-tracking cameras or MEMS accelerometers. Instead, this build uses simple tilt switches, as you might find in a pinball machine from days of yore. Four of these are placed on a wrist-mounted device, allowing the user to tilt their arm to move an RC car in different directions. The tilt switches are easy to hack into the controller for a toy RC car, as they simply replace the existing buttons on the PCB.
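
The build itself needs no code at all, since the switches solder straight onto the controller’s button pads. But if you’d rather put a microcontroller in the loop, reading four tilt switches is about as simple as Arduino code gets. A minimal sketch of that idea, with hypothetical pin assignments:

```cpp
// Illustrative sketch: reading four tilt switches as direction inputs.
// Not [RC Lover san]'s build, which wires the switches directly to the
// toy controller's button pads with no microcontroller at all.
const uint8_t TILT_PINS[4] = {2, 3, 4, 5};   // forward, back, left, right (hypothetical)
const char* DIRECTIONS[4] = {"forward", "back", "left", "right"};

void setup() {
  Serial.begin(9600);
  for (uint8_t i = 0; i < 4; i++) {
    pinMode(TILT_PINS[i], INPUT_PULLUP);     // switch closes to ground when tilted
  }
}

void loop() {
  for (uint8_t i = 0; i < 4; i++) {
    if (digitalRead(TILT_PINS[i]) == LOW) {  // LOW = ball bridging the contacts
      Serial.println(DIRECTIONS[i]);         // here you'd drive the car instead
    }
  }
  delay(50);                                 // crude debounce for bouncy tilt balls
}
```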

It’s a project that goes to show that not everything has to be done with advanced sensors and complex algorithms. Sometimes, it can all be done with a handful of cheap switches and some ingenuity. Plus, using arm movements to scoot BB-8 around on the floor looks like great fun. We’ve seen other attempts to build simple gesture controls with pots, too. Video after the break.

Continue reading “Gesture Control The Easy Way”

Simplified AI On Microcontrollers

Artificial intelligence is taking the world by storm. Rather than a Terminator-style apocalypse, though, it seems to be more of a useful tool for getting computers to solve problems on their own. This isn’t just for supercomputers, either. You can load AI onto some of the smallest microcontrollers as well. TensorFlow Lite is a popular tool for this, but getting it to work on your particular microcontroller can be a pain, unless you’re using an Espruino.

This project adds support for TensorFlow to this class of microcontrollers without having to fuss around with obtuse build tools. Adding a single line of code creates an instance, all without compiling anything or even rebooting. TensorFlow is a powerful machine-learning tool for microcontrollers, and having it this accessible is a great leap forward.
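
For a sense of what that one-liner saves you, here is roughly what the same setup looks like in raw TensorFlow Lite for Microcontrollers, the C++ route you’d otherwise have to cross-compile into firmware. This is a sketch only; the TFLM API has shifted between releases, and g_model_data stands in for a model flatbuffer baked into the build:

```cpp
// Rough shape of raw TensorFlow Lite Micro setup in C++ -- the compiled-in
// boilerplate that Espruino's one-line instance creation hides from you.
// API details vary by release; g_model_data is your model flatbuffer.
#include "tensorflow/lite/micro/all_ops_resolver.h"
#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/schema/schema_generated.h"

extern const unsigned char g_model_data[];        // model array, compiled in

constexpr int kArenaSize = 8 * 1024;              // scratch RAM for tensors
static uint8_t tensor_arena[kArenaSize];

float classify(const float* features, int n) {
  const tflite::Model* model = tflite::GetModel(g_model_data);
  static tflite::AllOpsResolver resolver;         // registers every op; costs flash
  static tflite::MicroInterpreter interpreter(
      model, resolver, tensor_arena, kArenaSize);
  static bool ready = interpreter.AllocateTensors() == kTfLiteOk;
  if (!ready) return -1.0f;                       // arena too small, bad model, etc.

  TfLiteTensor* input = interpreter.input(0);
  for (int i = 0; i < n; i++) input->data.f[i] = features[i];
  interpreter.Invoke();                           // run the network
  return interpreter.output(0)->data.f[0];        // first output score
}
```

Espruino collapses all of that into a single require() call at its interactive JavaScript prompt, which is exactly the accessibility win here.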

So, what can you do with this tool? The team behind this build is using TensorFlow on an open smartwatch that can be used to detect hand gestures and many other things. They also opened up these tools for use in a browser, which allows use of the AI software and emulates an Espruino without needing a physical device. There’s a lot going on with this one, and it’s a bonus that it’s open source and ready to be adapted to anything you might need, like turning yourself into a Street Fighter.

Smartphone Case Doubles As Chording Keyboard, With Gesture Inputs

Smartphones and other modern computing devices are wonderful things, but for those with disabilities, interacting with them isn’t always easy. In trying to improve accessibility, [Dougie Mann] created TypeCase, a combination gestural input device and chording keyboard that exists in a kind of symbiotic relationship with a user’s smartphone.

With TypeCase, a user can control a computer (or the smartphone itself) with gestures, emulate a mouse, or use the device as a one-handed chording keyboard for text input. The latter provides an alternative to voice input, which can be awkward in public areas.

The buttons and motion sensors allow for one-handed button and gestural input while holding the phone, and the Bluetooth connectivity means that the device acts and works just like a wireless mouse or keyboard. The electronics consist mainly of an Adafruit Feather 32u4 Bluefruit LE, and [Dougie] used 3D Hubs’ on-demand printing service to create the enclosures once the design work was complete. Since TypeCase doubles as a protective smartphone case, users have no need to carry or manage a separate device.
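
The chording half of the design is conceptually simple: sample all the buttons as one bitmask and look the combination up in a table. Below is a minimal sketch of that idea, assuming five buttons (one per finger) and a made-up chord map; TypeCase’s real layout and its Bluefruit BLE HID plumbing aren’t detailed enough here to reproduce:

```cpp
// Core idea of a chording keyboard: read several buttons as one bitmask and
// look the chord up in a table. Chord values below are invented for
// illustration, not TypeCase's actual mapping.
const uint8_t BUTTON_PINS[5] = {5, 6, 9, 10, 11};  // hypothetical Feather pins

struct Chord { uint8_t mask; char key; };
const Chord CHORDS[] = {
  {0b00001, 'a'}, {0b00010, 'e'}, {0b00011, 't'},  // one- and two-finger chords
  {0b00101, 's'}, {0b10001, ' '},
};

uint8_t readChord() {
  uint8_t mask = 0;
  for (uint8_t i = 0; i < 5; i++) {
    if (digitalRead(BUTTON_PINS[i]) == LOW) mask |= (1 << i);
  }
  return mask;
}

void setup() {
  Serial.begin(9600);
  for (uint8_t i = 0; i < 5; i++) pinMode(BUTTON_PINS[i], INPUT_PULLUP);
}

void loop() {
  static uint8_t last = 0;
  uint8_t chord = readChord();
  if (chord == 0 && last != 0) {                   // emit when all keys release
    for (const Chord& c : CHORDS) {
      if (c.mask == last) Serial.print(c.key);     // real device sends BLE HID here
    }
  }
  last = chord;
  delay(10);
}
```

Emitting on release rather than press is what lets multi-finger chords register as one keystroke instead of several.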

TypeCase’s use cases are probably best expressed by [Dougie]’s demo video, embedded below. Chording keyboards have a higher learning curve, but they can be very compact. One-handed text input does remind us somewhat of a very different approach that had the user make gestures in patterns reminiscent of Palm’s old Graffiti system; perhaps easier to learn but not nearly as discreet.

Continue reading “Smartphone Case Doubles As Chording Keyboard, With Gesture Inputs”

Arduino, Accelerometer, And TensorFlow Make You A Real-World Street Fighter

A question: if you’re controlling the classic video game Street Fighter with gestures, aren’t you just, you know, street fighting?

That’s a question [Charlie Gerard] is going to have to tackle should her AI gesture-recognition controller experiments take off. [Charlie] put together the game controller to learn more about the dark arts of machine learning in a fun and engaging way.

The controller consists of a battery-powered Arduino MKR1000 with WiFi and an MPU6050 accelerometer. Held in the hand, the controller streams accelerometer data to an external PC, capturing the characteristics of the motion. [Charlie] trained three different moves – a punch, an uppercut, and the dreaded Hadouken – and captured hundreds of examples of each. The raw data was massaged, converted to tensors, and used to train a model for the three moves. Initial tests seem to work well. [Charlie] also made an online version that captures motion from your smartphone. The demo is explained in the video below; sadly, we couldn’t get more than three Hadoukens in before crashing it.
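
On the microcontroller side, the job is mostly shoveling IMU samples at the host for logging and training. Here’s a minimal capture sketch in that spirit, using the Adafruit MPU6050 library and plain serial CSV; it’s not [Charlie]’s actual code, which streams over WiFi:

```cpp
// Minimal data-capture sketch: read an MPU6050 and stream CSV samples for a
// PC to log and train on. Plain serial keeps it self-contained; the real
// controller streams over WiFi from the MKR1000.
#include <Adafruit_MPU6050.h>
#include <Adafruit_Sensor.h>
#include <Wire.h>

Adafruit_MPU6050 mpu;

void setup() {
  Serial.begin(115200);
  if (!mpu.begin()) {                  // default I2C address 0x68
    while (true) delay(10);            // sensor not found; halt
  }
  mpu.setAccelerometerRange(MPU6050_RANGE_8_G);  // punches pull serious g
}

void loop() {
  sensors_event_t a, g, temp;
  mpu.getEvent(&a, &g, &temp);         // grab one accel/gyro/temp sample
  Serial.print(a.acceleration.x); Serial.print(',');
  Serial.print(a.acceleration.y); Serial.print(',');
  Serial.println(a.acceleration.z);    // one "x,y,z" line per sample
  delay(20);                           // ~50 Hz, plenty for punch-sized motion
}
```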

With most machine learning projects seeming to concentrate on telling cats from dogs, this is a refreshing change. We’re seeing lots of offbeat machine learning projects these days, from cryptocurrency wallet attacks to a semi-creepy workout-monitoring gym camera.

Continue reading “Arduino, Accelerometer, And TensorFlow Make You A Real-World Street Fighter”

Sensor Lets Gestures And An Arduino Control The Tunes

Every time we watch Minority Report, we want to make wild hand gestures at our computer — most of them polite. [Rootsaid] wanted to do the same and discovered that the PAJ7620 is an easy way to read hand gestures. The little sensor has a serial interface and can recognize quite a bit of hand waving. To be precise, the device can read nine different motions: up, down, left, right, forward, backward, clockwise, anticlockwise, and wave.

There are plenty of libraries for reading it on common platforms. If you have an Arduino that can act as a keyboard for a PC, the code almost writes itself. [Rootsaid] uses a specific library for the PAJ7620 and another, from [NicoHood], for sending media keys.
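
Put together, the whole thing fits in a few dozen lines. Here’s a condensed sketch along the lines of the Seeed PAJ7620 library’s example paired with [NicoHood]’s HID-Project; the register address and flag names come from that library, the gesture-to-key mapping is our own invention, and a native-USB board like a Pro Micro is assumed:

```cpp
// Condensed sketch in the spirit of [Rootsaid]'s setup: poll the PAJ7620 via
// the Seeed Grove gesture library and map motions to media keys using
// [NicoHood]'s HID-Project. Assumes a native-USB board (32u4 or similar).
#include <Wire.h>
#include "paj7620.h"       // Seeed Grove Gesture PAJ7620 library
#include "HID-Project.h"   // NicoHood HID library; Consumer = media keys

void setup() {
  paj7620Init();           // bring up the sensor over I2C
  Consumer.begin();        // enumerate as a USB consumer-control device
}

void loop() {
  uint8_t gesture = 0;
  paj7620ReadReg(0x43, 1, &gesture);     // gesture flags register
  switch (gesture) {
    case GES_UP_FLAG:      Consumer.write(MEDIA_VOL_UP);     break;
    case GES_DOWN_FLAG:    Consumer.write(MEDIA_VOL_DOWN);   break;
    case GES_RIGHT_FLAG:   Consumer.write(MEDIA_NEXT);       break;
    case GES_LEFT_FLAG:    Consumer.write(MEDIA_PREVIOUS);   break;
    case GES_FORWARD_FLAG: Consumer.write(MEDIA_PLAY_PAUSE); break;
  }
  delay(100);              // don't hammer the bus; gestures are slow anyway
}
```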

Continue reading “Sensor Lets Gestures And An Arduino Control The Tunes”

Robot’s Actions And Our Reactions

If you walk into a dog owner’s home, that dog is probably going to make a beeline to see if you are a threat. If you walk into a cat owner’s home, you may see the cat wandering around, if it even chooses to grace you with its presence. For some people, a dog’s direct approach can be nerve-wracking or even scary, depending on their history and the relative size of the dog. Still, these domestic animals are easy to empathize with, especially if you or your family have a pet. They have faces which can convey curiosity or smug indifference, but what if you were asked to judge the intent of something with no analogs to our own physical features, like a face or limbs? That is what researchers at the IDC Herzliya in Israel and Cornell University in the US asked when they made the Greeting Machine to move a moon-like sphere around a planet-like sphere.

Participants were asked to gauge their feelings about the robot after watching the robot move in different patterns. It turns out that something as simple as a sphere tracing across the surface of another sphere can stir consistent and predictable emotions in people even though the shapes do not resemble a human, domestic pet, or anything but a snowman’s abdomen. This makes us think about how our own robots must be perceived by people who are not mired in circuits all day. Certainly, a robot jellyfish lazing about in the Atlantic must feel less threatening than a laser pointer with a taste for human eyeballs.

Continue reading “Robot’s Actions And Our Reactions”

This Machine Teaches Sign Language

Sign language, like any language, can be difficult to learn if you’re not immersed in it, or at least learning from someone who is fluent. It’s not easy to know when you’re making minor mistakes or missing nuances. It’s a medium with its own unique learning challenges, so if you want to learn and don’t have access to someone who knows the language, you might want to reach for the next best thing: a machine that can teach you.

This project comes from three of [Bruce Land]’s senior electrical and computer engineering students, [Alicia], [Raul], and [Kerry], as part of their final design class at Cornell University. Someone who wishes to learn the sign language alphabet slips on a glove outfitted with position sensors for each finger. A computer inside the device shows each letter’s proper sign on a screen, and then checks the sensors from the glove to ensure that the hand is in the proper position. Two letters involve motion as well, and the device tracks this with a gyroscope and compass to ensure that the letter has been properly signed. It appears to only cover the alphabet and not a wider vocabulary, but as a proof of concept it is very effective.
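
The checking step boils down to comparing live finger readings against a stored template pose for the target letter, within some tolerance. A minimal sketch of that idea follows; the sensor counts, pins, and threshold values are hypothetical, not the Cornell team’s calibration:

```cpp
// Heart of a pose checker: compare live finger readings to a stored template
// for the target letter, within a tolerance band. All numbers here are
// hypothetical stand-ins, not the actual project's calibration data.
const uint8_t NUM_FINGERS = 5;
const int TOLERANCE = 40;                       // acceptable ADC error per finger

// One stored pose per letter: expected reading for each finger sensor.
const int LETTER_A[NUM_FINGERS] = {850, 820, 830, 810, 300};  // fist, thumb out

bool poseMatches(const int target[NUM_FINGERS]) {
  for (uint8_t i = 0; i < NUM_FINGERS; i++) {
    int reading = analogRead(A0 + i);           // one analog pin per finger
    if (abs(reading - target[i]) > TOLERANCE) return false;
  }
  return true;                                  // every finger within tolerance
}

void setup() { Serial.begin(9600); }

void loop() {
  if (poseMatches(LETTER_A)) Serial.println("Letter A signed correctly!");
  delay(200);
}
```

Motion-based letters like J and Z would extend this with gyroscope and compass checks, as the students did.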

The students show that it is entirely possible to learn the alphabet reliably using the machine as a teaching tool. This type of technology could be useful for other applications as well, such as gesture recognition for a human interface device. If you want to see more of these interesting and well-referenced senior design builds, we’ve featured quite a few, from polygraph machines to a sonar system for a bicycle.

Continue reading “This Machine Teaches Sign Language”