Arduino, Accelerometer, And TensorFlow Make You A Real-World Street Fighter

A question: if you’re controlling the classic video game Street Fighter with gestures, aren’t you just, you know, street fighting?

That’s a question [Charlie Gerard] is going to have to tackle should her AI gesture-recognition controller experiments take off. [Charlie] put together the game controller to learn more about the dark arts of machine learning in a fun and engaging way.

The controller consists of a battery-powered Arduino MKR1000 with WiFi and an MPU6050 accelerometer. Held in the hand, the controller streams accelerometer data to an external PC, capturing the characteristics of the motion. [Charlie] trained three different moves – a punch, an uppercut, and the dreaded Hadouken – and captured hundreds of examples of each. The raw data was massaged, converted to Tensors, and used to train a model for the three moves. Initial tests seem to work well. [Charlie] also made an online version that captures motion from your smartphone. The demo is explained in the video below; sadly, we couldn’t get more than three Hadoukens in before crashing it.
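
For the curious, the training step boils down to something like the sketch below. [Charlie]'s project actually runs on TensorFlow.js in the browser, so treat this Python/Keras version as a rough illustration of the idea rather than her code; the window length, layer sizes, and placeholder data are all assumptions.

```python
# Minimal sketch (Python/Keras) of the training step described above.
# The 100-sample window, layer sizes, and random placeholder data are
# illustrative assumptions, not [Charlie]'s actual TensorFlow.js code.
import numpy as np
import tensorflow as tf

NUM_SAMPLES = 100          # accelerometer readings captured per gesture
NUM_AXES = 3               # x, y, z from the MPU6050
CLASSES = ["punch", "uppercut", "hadouken"]

# Placeholder training data: one row per recorded gesture, flattened to 1-D.
# In the real project these rows come from the streamed accelerometer captures.
X = np.random.rand(300, NUM_SAMPLES * NUM_AXES).astype("float32")
y = np.random.randint(0, len(CLASSES), size=300)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(NUM_SAMPLES * NUM_AXES,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(len(CLASSES), activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=20, validation_split=0.2)

# At play time, a fresh window of motion gets classified the same way:
window = np.random.rand(1, NUM_SAMPLES * NUM_AXES).astype("float32")
print(CLASSES[int(np.argmax(model.predict(window)))])
```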

With most machine learning projects seeming to concentrate on telling cats from dogs, this is a refreshing change. We’re seeing lots of offbeat machine learning projects these days, from cryptocurrency wallet attacks to a semi-creepy workout-monitoring gym camera.

Continue reading “Arduino, Accelerometer, And TensorFlow Make You A Real-World Street Fighter”

Add Scroll Wheels And Buttons To Smartphones With 3D-Printed Widgets Read By Accelerometer

The first LED digital wristwatches hit the market in the 1970s. They required a button push to turn the display on, prompting one comedian to quip that giving one to a one-armed man would be in poor taste. While the UIs of watches and other wearables have improved since then, smartphones still present some usability challenges. Some of the touch screen gestures needed to operate a phone, like pinching, are nigh impossible when one-handing the phone, and woe unto those with stubby thumbs when trying to take a selfie.

You’d think that the fleet of sensors and the raw computing power on board would afford better ways to control phones. And you’d be right, if the modular mechanical input widgets described in a paper from Columbia University catch on. Dubbed “Vidgets” by [Chang Xiao] et al., the haptic devices are designed to create characteristic acceleration profiles on a phone’s inertial measurement unit (IMU) when actuated. Vidgets take various forms, from push buttons to scroll wheels, each of a similar size and shape and designed to dock into one of eight positions on the back of a 3D-printed phone case. Once trained, the algorithm watches for the acceleration signature caused by actuating a Vidget, and sends commands to the phone to mimic the corresponding gestures. The video below demonstrates a couple of use cases, of which the virtual saxophone is our favorite.
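
The paper’s detection pipeline isn’t reproduced here, but the gist is signature matching: record what each widget’s click or spin looks like on the IMU, then compare incoming windows of motion against those templates. Here’s a hypothetical sketch of that idea; the window length, threshold, and normalized-correlation scoring are our assumptions, not the authors’ algorithm.

```python
# Hypothetical sketch of the idea behind Vidget detection: compare a short
# burst of IMU data against previously recorded "signature" templates and
# report the closest match. Window length, threshold, and scoring are
# assumptions for illustration only.
import numpy as np

WINDOW = 64  # IMU samples per detection window (assumption)

def normalize(sig):
    sig = sig - sig.mean(axis=0)
    norm = np.linalg.norm(sig)
    return sig / norm if norm > 0 else sig

# One recorded template per widget position, e.g. captured in a training pass.
templates = {
    "button_top_left": normalize(np.random.rand(WINDOW, 3)),
    "scroll_wheel_right": normalize(np.random.rand(WINDOW, 3)),
}

def classify(window, threshold=0.6):
    """Return the best-matching widget, or None if nothing correlates well."""
    window = normalize(window)
    scores = {name: float(np.sum(window * tpl)) for name, tpl in templates.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None

print(classify(np.random.rand(WINDOW, 3)))
```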

This is really clever stuff, and ventures deep into “Why didn’t I think of that?” territory. Need to get ahead of the curve on IMUs to capitalize on what they can do? You could start with [Al Williams]’ primer on micro-electromechanical systems, or MEMS.

Continue reading “Add Scroll Wheels And Buttons To Smartphones With 3D-Printed Widgets Read By Accelerometer”

Get Your Tweets Without Looking

Head-mounted displays range from cumbersome to glass-hole-ish. Smart watches have their niche, but they still take your eyes away from whatever you are doing, like driving. Voice assistants can read to you, but they require a speaker that everyone else in the car has to listen to, or a headset that blocks out important sound. Ignoring incoming messages is out of the question, so the answer may be to use a different sense than vision. A joint project between Facebook Inc. and the Massachusetts Institute of Technology has a solution that uses the somatosensory reception of your forearm.

A similar idea came across our desk years ago and seemed promising, but it is hard to sell something that is more difficult than the current technique, even if it is advantageous in the long run. In 2013, a wearer had his or her back covered in vibration motors, and it acted like the haptic version of a spectrum analyzer. Now, the motor count has been reduced enough to fit under a sleeve by relying on patterns. It is being developed for people with hearing or vision impairment, but what drivers aren’t impaired while looking at their phones?

Patterns are what really set this version apart. Rather than relaying a discrete note on a finger, or a range of values across the back, each of the 39 English phonemes is given a unique sequence of vibrations, which is enough to encode any word. A phoneme is the smallest distinct unit of speech. The video below shows how those phonemes are translated to haptic feedback. Hopefully the next upgrade will let us send tweets without using our hands or mouths at all: complete telepathy.
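
To get a feel for the encoding, here’s a toy sketch of the word-to-buzz pipeline: look up a word’s phonemes, then play each phoneme’s vibration pattern in order on the sleeve. The patterns and the two-word dictionary below are invented for illustration; the real system maps all 39 phonemes to patterns tuned for the forearm.

```python
# Toy illustration of the encoding idea: each phoneme gets a fixed vibration
# pattern (which motors pulse, and for how long), and a word is just its
# phonemes played back to back. All values below are made up for illustration.
import time

# (motor_index, duration_s) pairs per phoneme -- entirely hypothetical values.
PHONEME_PATTERNS = {
    "HH": [(0, 0.10), (2, 0.10)],
    "AY": [(1, 0.20)],
    "T":  [(3, 0.05)],
    "IY": [(1, 0.15), (3, 0.10)],
}

WORDS = {  # word -> phoneme sequence (simplified)
    "hi":  ["HH", "AY"],
    "tea": ["T", "IY"],
}

def buzz(motor, duration):
    # Stand-in for driving a real vibration motor on the sleeve.
    print(f"motor {motor} on for {duration:.2f}s")
    time.sleep(duration)

def play_word(word):
    for phoneme in WORDS[word]:
        for motor, duration in PHONEME_PATTERNS[phoneme]:
            buzz(motor, duration)

play_word("hi")
```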

Continue reading “Get Your Tweets Without Looking”

Friday Hack Chat: Training Robots By Touch

When it comes to training robots, you could grab a joystick or carefully program movements in code. The better way, though, is to move the robot yourself, and have the robot play back all those movements ad infinitum. This is training robots by touch, and it’s the subject of this week’s Hack Chat over on hackaday.io.

Our guest for this week’s Hack Chat will be [Kent Gilson], inventor, serial entrepreneur, and pioneer in reconfigurable computing. [Kent] is the creator of Viva, an object-oriented programming language and operating environment that harnesses the power of FPGAs for general-purpose computing. He’s launched eight entrepreneurial ventures, won multiple awards, and created products used in numerous industries across the globe.

[Kent]’s claim to fame on hackaday.io is Dexter, a low-cost robotic arm with 50-micron repeatability and modular end effectors. It achieves that precision with three harmonic drives and optical encoders. The arm is also trainable, meaning that you can manually guide it through a motion and play back the exact path it took. It’s training robots by touch, exactly what this Hack Chat is all about.
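
If you’re wondering what training by touch looks like in software, it’s essentially record-and-replay of the encoder readings. The outline below is a hypothetical sketch, not Dexter’s actual API; read_encoders(), move_to(), and the 50 Hz sample rate are stand-ins.

```python
# Hypothetical sketch of "training by touch": while the arm is pushed around
# by hand, sample the joint encoders at a fixed rate; to replay, command the
# same joint angles back in the same order. These functions are stand-ins,
# not Dexter's real interface.
import time

SAMPLE_PERIOD = 0.02  # 50 Hz sampling (assumption)

def read_encoders():
    """Stand-in: return current joint angles from the optical encoders."""
    return [0.0, 0.0, 0.0, 0.0, 0.0]

def move_to(joint_angles):
    """Stand-in: command the arm to the given joint angles."""
    pass

def record(duration_s):
    path = []
    end = time.time() + duration_s
    while time.time() < end:
        path.append(read_encoders())
        time.sleep(SAMPLE_PERIOD)
    return path

def play_back(path, loops=1):
    for _ in range(loops):
        for pose in path:
            move_to(pose)
            time.sleep(SAMPLE_PERIOD)

taught_path = record(duration_s=5)   # operator moves the arm by hand
play_back(taught_path, loops=10)     # arm repeats the motion
```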

For this Hack Chat, we’re going to be discussing:

  • Building trainable robots
  • Developing robotics haptics
  • Training robots to manufacture
  • Heterogeneous direct digital manufacturing

You are, of course, encouraged to add your own questions to the discussion. You can do that by leaving a comment on the Hack Chat Event Page and we’ll put that in the queue for the Hack Chat discussion.

Our Hack Chats are live community events on the Hackaday.io Hack Chat group messaging. This week is just like any other, and we’ll be gathering ’round our video terminals at noon, Pacific, on Friday, July 27th.  Need a countdown timer? Well, here you go, mango.

Click that speech bubble to the right, and you’ll be taken directly to the Hack Chat group on Hackaday.io.

You don’t have to wait until Friday; join whenever you want and you can see what the community is talking about.

Hackaday Prize Entry: HaptiVision Creates A Net Of Vibration Motors

HaptiVision is a haptic feedback system for the blind that builds on a wide array of vibration belts and haptic vests. It’s a smart concept, giving the wearer a warning when an obstruction comes into sensor view.

The earliest research into haptic feedback wearables used ultrasonic sensors, and more recent developments used a Kinect. The project team for HaptiVision chose the Intel RealSense camera because of its svelte form factor. Part of the goal was to make the HaptiVision as discreet as possible, so fitting the whole rig under a shirt was part of the plan.

In addition to a RealSense camera, the team used an Intel Up board for the brains, mostly because it natively supports the RealSense camera. It takes a 640×480 IR snapshot and selectively triggers the 128 vibration motors to tell you what’s close. The motors are controlled by 8 PCA9685-based PWM expander boards.
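
The mapping from camera frame to motor array is straightforward to sketch: collapse the frame into one cell per motor and turn proximity into PWM duty cycle, with each motor landing on one of the eight PCA9685 boards. The 16×8 grid layout, the depth-style scaling, and the Adafruit-driver comment are our assumptions; the team’s actual firmware may slice things differently.

```python
# Rough sketch of the frame-to-motor mapping: collapse a 640x480 frame into
# a 16x8 grid (128 cells, one per motor) and map "closer" to "stronger
# vibration". Grid shape and scaling are assumptions; we treat the frame as
# a depth image where smaller values mean closer.
import numpy as np

GRID_COLS, GRID_ROWS = 16, 8            # 16 x 8 = 128 motors
frame = np.random.randint(0, 2**16, size=(480, 640), dtype=np.uint16)

# Block-average the frame down to one value per motor cell.
cells = frame.reshape(GRID_ROWS, 480 // GRID_ROWS,
                      GRID_COLS, 640 // GRID_COLS).mean(axis=(1, 3))

# Nearer objects (smaller values) -> higher 16-bit duty cycle.
duty = ((1.0 - cells / cells.max()) * 0xFFFF).astype(np.uint16)

for row in range(GRID_ROWS):
    for col in range(GRID_COLS):
        motor = row * GRID_COLS + col
        board, channel = divmod(motor, 16)   # 8 PCA9685 boards x 16 channels
        # e.g. with the Adafruit CircuitPython driver (assumption):
        # pca[board].channels[channel].duty_cycle = int(duty[row, col])
        print(f"board {board} ch {channel}: {duty[row, col]}")
```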

The project is based on David Antón Sánchez’s OpenVNAVI project, which also featured a 128-motor array. HaptiVision aims to create an easy-to-replicate haptic system. Everything is Open Source, and all of the wiring clips and motor mounts are 3D-printable.

The Hackaday Prize: Exoskeletons For The Masses

While medical facilities continue to improve worldwide, access to expensive treatments still eludes a vast number of people. Especially when it comes to prosthetics, many people can’t afford something so personalized, even though the need for assistive devices is extremely high. With that in mind, [Guillermo Herrera-Arcos] started working on ALICE, a robotic exoskeleton that is low-cost, easy to build, and, as an added bonus, 100% Open Source.

ALICE’s creators envision that the exoskeleton will have applications in rehabilitation, human augmentation, and even gaming. Since it’s Open Source, it could also serve as a platform for STEM students to learn from. Currently, the team is testing the electronics in the legs of the exoskeleton, but they have already come a long way with their control system and have a workable prototype in place. Moving forward, the creators, and anyone else who builds on the platform, can keep improving it thanks to the nature of Open Source hardware.