Arduino, Accelerometer, And TensorFlow Make You A Real-World Street Fighter

A question: if you’re controlling the classic video game Street Fighter with gestures, aren’t you just, you know, street fighting?

That’s a question [Charlie Gerard] is going to have to tackle should her AI gesture-recognition controller experiments take off. [Charlie] put together the game controller to learn more about the dark arts of machine learning in a fun and engaging way.

The controller consists of a battery-powered Arduino MKR1000 with WiFi and an MPU6050 accelerometer. Held in the hand, the controller streams accelerometer data to an external PC, capturing the characteristics of the motion. [Charlie] recorded three different moves – a punch, an uppercut, and the dreaded Hadouken – capturing hundreds of examples of each. The raw data was massaged, converted to tensors, and used to train a model for the three moves. Initial tests seem to work well. [Charlie] also made an online version that captures motion from your smartphone. The demo is explained in the video below; sadly, we couldn’t get more than three Hadoukens in before crashing it.
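For a feel of what’s involved, here’s a rough Python/Keras sketch of that kind of gesture classifier. Note that [Charlie]’s project actually runs TensorFlow.js in the browser, and the window length, layer sizes, and dummy data here are our own illustrative guesses, not her code:

```python
# Rough sketch of a gesture classifier on flattened accelerometer
# windows. All dimensions below are assumptions for illustration.
import numpy as np
import tensorflow as tf

WINDOW = 100   # accelerometer samples per gesture (assumed)
AXES = 3       # x, y, z from the MPU6050
MOVES = 3      # punch, uppercut, Hadouken

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW * AXES,)),  # flattened window
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(MOVES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# x_train: recorded gesture windows; y_train: 0=punch, 1=uppercut,
# 2=Hadouken. Random data stands in for the real recordings here.
x_train = np.random.randn(600, WINDOW * AXES).astype("float32")
y_train = np.random.randint(0, MOVES, 600)
model.fit(x_train, y_train, epochs=20, batch_size=32, verbose=0)
```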

With most machine learning projects seemingly concentrating on telling cats from dogs, this is a refreshing change. We’re seeing lots of offbeat machine learning projects these days, from cryptocurrency wallet attacks to a semi-creepy workout-monitoring gym camera.

Continue reading “Arduino, Accelerometer, And TensorFlow Make You A Real-World Street Fighter”

Name Stone Helps You Greet Coworkers

When starting a new job, learning coworkers’ names can be a daunting task. Getting this right is key to forming strong professional relationships. [Ahad] noted that [Marcos] was struggling with this, and built the Name Stone to help.

The Name Stone consists of some powerful hardware, wrapped up in a 3D printed case reminiscent of the Eye of Agamotto from Doctor Strange. Inside, there’s a Jetson Nano – an excellent platform for any project built around machine learning tasks. This is combined with a microphone and camera to collect data from the environment.

[Ahad] then went about training neural networks to help with basic identification tasks. Video was taken of the coworkers, then the frames used to train a convolutional neural network using PyTorch. Similarly, a series of audio clips were used to again train a network to identify individuals through the sound of their voice, using MFCC techniques. Upon activating the stone, the device will capture an image or a short sound clip, and process the data to identify the target coworker and remind [Marcos] of their name.
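For the curious, here’s a minimal sketch of what the voice-identification half might look like in Python, using librosa for the MFCC extraction and a small PyTorch network. The layer sizes, sample rate, clip paths, and number of coworkers are assumptions on our part, not [Ahad]’s actual code:

```python
# Hedged sketch: extract MFCCs from a voice clip, average them over
# time, and feed the result to a tiny speaker-ID classifier.
import librosa
import torch
import torch.nn as nn

def mfcc_features(path, n_mfcc=20):
    audio, sr = librosa.load(path, sr=16000)   # resample to 16 kHz
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=n_mfcc)
    # Time-average the coefficients into one fixed-size vector.
    return torch.tensor(mfcc.mean(axis=1), dtype=torch.float32)

N_COWORKERS = 10  # placeholder
net = nn.Sequential(
    nn.Linear(20, 64), nn.ReLU(),
    nn.Linear(64, N_COWORKERS),
)
loss_fn = nn.CrossEntropyLoss()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

# One training step over a (clip_path, speaker_id) pair, e.g.
# ("marcos_01.wav", 0) — file names are hypothetical.
def train_step(path, speaker_id):
    x = mfcc_features(path).unsqueeze(0)       # shape (1, 20)
    y = torch.tensor([speaker_id])
    opt.zero_grad()
    loss = loss_fn(net(x), y)
    loss.backward()
    opt.step()
    return loss.item()
```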

It’s a project that could be quite useful, given to new employees to help them transition into the new workplace. Of course, pervasive facial recognition technology does have some drawbacks. Video after the break.

Continue reading “Name Stone Helps You Greet Coworkers”

GymCam Knows Exactly What You’ve Been Doing In The Gym

Getting exact statistics on one’s physical activities at the gym is not an easy feat. While most people these days are familiar with, or even regularly use, one of those motion-based trackers on their wrist, there’s a big question as to their accuracy. After all, it’s all based on the motion of just one wrist, which as we know leads to amusing results in the tracker app when one waves or claps one’s hands, and which cannot track leg exercises at the gym at all.

To get around the issue of limited sensor data, researchers at Carnegie Mellon University (Pittsburgh, USA) developed a system based around a camera and machine vision algorithms. Where other camera-based solutions suffer from occlusion as they try to track individual people as accurately as possible, this new system doesn’t attempt to track people’s joints at all; it merely looks for repetitive motion at specific exercise machines in the scene.

The basic concept is that repetitive motion usually indicates a form of exercise, and that no two people at the same type of machine will ever be fully in sync with their motions, so a mere handful of pixels suffices to track the motion of a single person at that machine. This also sidesteps many privacy issues, as the resolution doesn’t have to be high enough to see faces or track joints with any degree of accuracy.
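Here’s a minimal Python sketch of that principle using OpenCV and NumPy: watch one small patch of pixels over time and test its intensity signal for a dominant frequency. The file name, patch location, and thresholds are placeholders of ours, and the real CMU system is considerably more sophisticated:

```python
# Sketch: look for periodicity in one small pixel patch of a
# pre-recorded gym video. All constants below are assumptions.
import cv2
import numpy as np

cap = cv2.VideoCapture("gym.mp4")          # hypothetical recording
fps = cap.get(cv2.CAP_PROP_FPS)
y, x, size = 240, 320, 8                   # arbitrary 8x8 patch

signal = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    signal.append(gray[y:y + size, x:x + size].mean())
cap.release()

sig = np.asarray(signal) - np.mean(signal)  # remove DC offset
spectrum = np.abs(np.fft.rfft(sig))
freqs = np.fft.rfftfreq(len(sig), d=1.0 / fps)

# Ignore slow drift; look for a dominant peak in a plausible
# exercise cadence band (roughly 0.2 to 2 reps per second).
band = (freqs > 0.2) & (freqs < 2.0)
peak = freqs[band][np.argmax(spectrum[band])]
if spectrum[band].max() > 5 * spectrum[band].mean():
    reps = peak * len(sig) / fps            # cycles over the clip
    print(f"repetitive motion at {peak:.2f} Hz, ~{reps:.0f} reps")
else:
    print("no strong periodicity in this patch")
```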

In experiments at the university’s gym, the researchers evaluated the system over 5 days and 42 hours of video. It detected exercise activity in the scene with 99.6% accuracy and disambiguated between simultaneous activities with 84.6% accuracy, while recognizing exercise types was 93.6% accurate. Ultimately, repetition counts for specific exercises were within 1.7 counts.

Maybe an extended version of this would be a flying drone capturing one’s outdoor activities, finally giving that 100% accurate exercise account while jogging?

Thanks to [Qes] for sending this one in!

AI Makes Hyperbolic Brain Hats A Reality

It isn’t often that the world of Hackaday intersects with the world of crafting, which is perhaps a shame because many of the skills and techniques of the two have significant overlap. Crochet for instance has rarely featured here, but that is about to change with [Janelle Shane]’s HAT3000 neural network trained to produce crochet hat patterns.

Taking the GPT-2 neural network trained on Internet text and further training it with a stack of crochet hat patterns, she was able to generate AI-designed hats, which her friends on the Ravelry yarn forum then set about crocheting into real hats. It’s a follow-up to a previous knitting-based project, and instead of producing the hats you might expect, it goes into flights of fancy. Some are visibly hat-like, while others turn into avant-garde creations that defy any attempt to match them to real heads. A whole genre of hyperbolic progressions of crochet rows produces hats with organic folds that begin to resemble brains, taxing both the stamina of the person doing the crochet and their supply of yarn.
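For anyone wanting to try the same trick, here’s a rough sketch of such a fine-tune using today’s Hugging Face transformers library. [Janelle] used an earlier GPT-2 fine-tuning toolchain, and “hat_patterns.txt” is a stand-in for the pattern corpus, so treat this as one plausible way rather than her method:

```python
# Sketch: fine-tune GPT-2 on a plain-text file of crochet patterns,
# then sample from it. Corpus file and hyperparameters are assumed.
from transformers import (GPT2LMHeadModel, GPT2TokenizerFast,
                          TextDataset, DataCollatorForLanguageModeling,
                          Trainer, TrainingArguments)

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

dataset = TextDataset(tokenizer=tokenizer,
                      file_path="hat_patterns.txt", block_size=128)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="hat3000", num_train_epochs=3),
    data_collator=collator,
    train_dataset=dataset,
)
trainer.train()

# Generate a new "pattern" from a seed row instruction.
prompt = tokenizer("Row 1: ch 4, join with sl st", return_tensors="pt")
out = model.generate(**prompt, max_length=120, do_sample=True, top_k=40)
print(tokenizer.decode(out[0]))
```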

Perhaps most amusingly, the neural network retains the ability to produce text, but when it does so it now inevitably steers the subject back to crochet hats. A Harry Potter sentence spawns a passage of something she aptly describes as “terrible crochet-themed erotica”, and such is the influence of the crochet patterns that this purple prose can even include enough crochet instructions to make it crochetable. It would be fascinating to see whether a similar model trained with G-code from Thingiverse would produce printable designs; what would an AI make of Benchy, for example?

We’ve been entertained by [Janelle]’s AI work before, both naming tomato varieties and creating pie recipes.

Thanks [Laura] for the tip.

Side-Channel Attack Shows Vulnerabilities Of Cryptocurrency Wallets

What’s in your crypto wallet? The simple answer should be fat stacks of Bitcoin or Ethereum and little more. But if you use a hardware cryptocurrency wallet, you may be carrying around a big fat vulnerability, too.

At the 35C3 conference last year, [Thomas Roth], [Josh Datko], and [Dmitry Nedospasov] presented a side-channel attack on a hardware crypto wallet. The wallet in question is a Ledger Blue, a smartphone-sized device which seems to be discontinued by the manufacturer but is still available in the secondary market. The wallet sports a touch-screen interface for managing your crypto empire, and therein lies the weakness that these researchers exploited.

By using a HackRF SDR and a simple whip antenna, they found that the wallet radiated a distinctive and relatively strong signal at 169 MHz every time a virtual key was pressed to enter a PIN. Each burst started with a distinctive 11-bit data pattern; with the help of a logic analyzer, they determined that each packet contained the location of the key icon on the screen.
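The burst-hunting step might look something like this in Python, assuming a raw 8-bit IQ recording from the HackRF; the file name, sample rate, and threshold are our assumptions, not the researchers’ actual pipeline:

```python
# Sketch: find key-press bursts in a recorded IQ capture by
# thresholding signal power. Constants below are assumed.
import numpy as np

SAMPLE_RATE = 2_000_000          # 2 MS/s, a typical HackRF setting

# hackrf_transfer records interleaved 8-bit signed I/Q samples.
raw = np.fromfile("ledger_169mhz.iq", dtype=np.int8)
iq = raw[0::2].astype(np.float32) + 1j * raw[1::2].astype(np.float32)

power = np.abs(iq) ** 2
# Smooth over ~1 ms so individual bits don't split a burst apart.
win = int(SAMPLE_RATE * 1e-3)
envelope = np.convolve(power, np.ones(win) / win, mode="same")

threshold = 10 * np.median(envelope)   # crude noise-floor estimate
active = envelope > threshold

# Each rising edge of `active` marks the start of one burst.
edges = np.flatnonzero(np.diff(active.astype(np.int8)) == 1)
print(f"found {len(edges)} bursts (candidate key presses)")
```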

Next step: put together a training set. They rigged up a simple automatic button-masher using a servo and some 3D-printed parts, and captured signals from the SDR for 100 presses of each key. The raw data was massaged a bit to prepare it for TensorFlow, and the trained network proved accurate enough to give any hardware wallet user pause – especially since they captured the data from two meters away with relatively simple and concealable gear.
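The classification step could then be sketched like so, again under assumptions of our own: each captured burst trimmed or padded to a fixed length and fed to a small 1-D CNN mapping it to one of the ten PIN-pad digits. The architecture here is a guess, not the network from the talk:

```python
# Sketch: classify fixed-length RF bursts into PIN-pad keys.
# Burst length and network shape are illustrative assumptions.
import numpy as np
import tensorflow as tf

BURST_LEN = 4096   # samples per burst after trimming (assumed)
KEYS = 10          # digits 0-9 on the PIN pad

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(BURST_LEN, 1)),
    tf.keras.layers.Conv1D(16, 9, strides=4, activation="relu"),
    tf.keras.layers.Conv1D(32, 9, strides=4, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(KEYS, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# 100 button-masher presses per key, as in the talk; random data
# stands in for the real burst envelopes here.
x = np.random.randn(1000, BURST_LEN, 1).astype("float32")
y = np.repeat(np.arange(KEYS), 100)
model.fit(x, y, epochs=10, batch_size=32, verbose=0)
```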

Every lock contains the information needed to defeat it, requiring only a motivated attacker with the right tools and knowledge. We’ve covered other side-channel attacks before; sadly, they’ll probably only get easier as technologies like SDR and machine learning rapidly advance.

[via RTL-SDR.com]

Machine Learning With Microcontrollers Hack Chat

Join us on Wednesday, September 11 at noon Pacific for the Machine Learning with Microcontrollers Hack Chat with Limor “Ladyada” Fried and Phillip Torrone from Adafruit!

We’ve gotten to the point where a $35 Raspberry Pi can be a reasonable alternative to a traditional desktop or laptop, and microcontrollers in the Arduino ecosystem are getting powerful enough to handle some remarkably demanding computational jobs. But there’s still one area where microcontrollers seem to be lagging a bit: machine learning. Sure, there are purpose-built edge-computing SBCs, but wouldn’t it be great to be able to run AI models on versatile and ubiquitous MCUs that you can pick up for a couple of bucks?

We’re moving in that direction, and our friends at Adafruit Industries want to stop by the Hack Chat and tell us all about what they’re working on. In addition to Ladyada and PT, we’ll be joined by Meghna Natraj, Daniel Situnayake, and Pete Warden, all from the Google TensorFlow team. If you’ve got any interest in edge computing on small form-factor computers, you won’t want to miss this chat. Join us, ask your questions about TensorFlow Lite and TensorFlow Lite for Microcontrollers, and see what’s possible in machine learning way out on the edge.
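To get a taste before the chat, here’s roughly what squeezing a trained Keras model down for a microcontroller looks like with the standard TensorFlow Lite converter; the toy model below is just a stand-in for whatever you’ve trained:

```python
# Sketch: convert a Keras model to a quantized TFLite flatbuffer
# suitable for TensorFlow Lite for Microcontrollers.
import tensorflow as tf

# A toy stand-in for a real trained model.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # quantize weights
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)

# The resulting file can be turned into a C array (e.g. with
# `xxd -i model.tflite`) and compiled into the MCU firmware.
```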

Our Hack Chats are live community events in the Hackaday.io Hack Chat group messaging. This week we’ll be sitting down on Wednesday, September 11 at 12:00 PM Pacific time. If time zones have got you down, we have a handy time zone converter.

Click that speech bubble to the right, and you’ll be taken directly to the Hack Chat group on Hackaday.io. You don’t have to wait until Wednesday; join whenever you want and you can see what the community is talking about.