Use Jedi Mind Tricks To Control Your Next Drone Swarm

Controlling a single drone takes a considerable amount of concentration and normally involves wearing silly goggles. It only gets harder if you want to control a swarm. Researchers at the Skolkovo Institute of Science and Technology decided Jedi mind tricks were the best way, and set up swarm control using hand gestures.

We’ve seen something similar at the Intel booth at the 2016 Maker Faire. In that demo, a single drone was controlled by hand gestures using a hacked Nintendo Power Glove. The Skoltech approach builds on that concept with a lot of innovation. For one, haptics in the fingertips of the glove provide feedback on the current behavior of the drones. Through their research they found that most operators quickly learned to interpret the vibrations subconsciously.
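The paper spells out the actual mapping from swarm state to fingertip vibration; purely as an illustration of the idea, here is a minimal Python sketch that turns one possible swarm metric (how far the drones have drifted from their formation slots) into vibration intensities. The metric, motor layout, and function names are our own assumptions, not the Skoltech implementation.

```python
# Toy sketch: map a swarm-state metric to fingertip vibration levels.
# The metric and motor layout are assumptions for illustration only;
# the Skoltech glove's actual mapping is described in the paper.

import numpy as np

N_FINGERS = 4  # one vibration motor per fingertip (assumed layout)

def formation_error(positions, targets):
    """Mean distance between each drone and its slot in the formation."""
    positions = np.asarray(positions, dtype=float)
    targets = np.asarray(targets, dtype=float)
    return float(np.mean(np.linalg.norm(positions - targets, axis=1)))

def vibration_levels(error, max_error=2.0):
    """Scale the error into a 0..255 PWM duty cycle for every motor.

    A tightly held formation feels still; a badly scattered swarm
    buzzes all fingers at full strength.
    """
    level = int(np.clip(error / max_error, 0.0, 1.0) * 255)
    return [level] * N_FINGERS

# Example: three drones slightly off their target slots
drones  = [[0.1, 0.0, 1.0], [1.2, 0.1, 1.0], [2.0, -0.1, 1.1]]
targets = [[0.0, 0.0, 1.0], [1.0, 0.0, 1.0], [2.0,  0.0, 1.0]]

err = formation_error(drones, targets)
print(err, vibration_levels(err))  # duty cycles to hand off to the glove's motor driver
```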

Haptic feedback also increases the safety of the swarm, which is a prime factor in making these technologies usable outside of the lab. Most of us have at one point frantically typed commands into a terminal or pulled cords to keep a project from destroying itself or behaving dangerously. Having an intuitive control scheme means that an operator can react quickly to changes in the swarm’s behavior.

The biggest advantage, which can be seen in the video after the break, is that the hand control eliminates much of the preprogramming of paths that is currently common in swarm robotics. With tech like this we can imagine a person quickly being trained on drone swarms and then using them to do things like construction surveys with ease. As an added bonus, the researchers were nice enough to post a preprint of their paper to arXiv if any readers would like to get into the specifics.

Continue reading “Use Jedi Mind Tricks To Control Your Next Drone Swarm”

Arduino, Accelerometer, And TensorFlow Make You A Real-World Street Fighter

A question: if you’re controlling the classic video game Street Fighter with gestures, aren’t you just, you know, street fighting?

That’s a question [Charlie Gerard] is going to have to tackle should her AI gesture-recognition controller experiments take off. [Charlie] put together the game controller to learn more about the dark arts of machine learning in a fun and engaging way.

The controller consists of a battery-powered Arduino MKR1000 with WiFi and an MPU6050 accelerometer. Held in the hand, the controller streams accelerometer data to an external PC, capturing the characteristics of the motion. [Charlie] trained three different moves – a punch, an uppercut, and the dreaded Hadouken – and captured hundreds of examples of each. The raw data was massaged, converted to Tensors, and used to train a model for the three moves. Initial tests seem to work well. [Charlie] also made an online version that captures motion from your smartphone. The demo is explained in the video below; sadly, we couldn’t get more than three Hadoukens in before crashing it.
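[Charlie]’s project does its training with TensorFlow.js in the browser; as a rough Python/Keras sketch of the same idea, the snippet below trains a small classifier on fixed-length accelerometer windows for the three moves. The window length, network shape, and the random placeholder data are our assumptions, not her actual pipeline.

```python
# Rough sketch of gesture classification from accelerometer windows.
# The original project uses TensorFlow.js; this Python/Keras version only
# illustrates the idea. Shapes, layer sizes, and data are assumptions.

import numpy as np
import tensorflow as tf

WINDOW = 100        # samples per gesture capture (assumed)
AXES = 3            # x, y, z from the accelerometer
CLASSES = 3         # punch, uppercut, hadouken

# Placeholder data standing in for the hundreds of recorded examples.
X = np.random.randn(600, WINDOW, AXES).astype("float32")
y = np.random.randint(0, CLASSES, size=600)

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(WINDOW, AXES)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=10, validation_split=0.2)

# At play time, grab the latest window from the streaming controller
# and pick the move with the highest probability.
move = int(np.argmax(model.predict(X[:1]), axis=1)[0])
print(["punch", "uppercut", "hadouken"][move])
```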

With most machine learning projects seeming to concentrate on telling cats from dogs, this is a refreshing change. We’re seeing lots of offbeat machine learning projects these days, from cryptocurrency wallet attacks to a semi-creepy workout-monitoring gym camera.

Continue reading “Arduino, Accelerometer, And TensorFlow Make You A Real-World Street Fighter”

Add Scroll Wheels And Buttons To Smartphones With 3D-Printed Widgets Read By Accelerometer

The first LED digital wristwatches hit the market in the 1970s. They required a button push to turn the display on, prompting one comedian to quip that giving one to a one-armed man would be in poor taste. While the UIs of watches and other wearables have improved since then, smartphones still present some usability challenges. Some of the touch screen gestures needed to operate a phone, like pinching, are nigh impossible when one-handing the phone, and woe unto those with stubby thumbs when trying to take a selfie.

You’d think that the fleet of sensors and the raw computing power on board would afford better ways to control phones. And you’d be right, if the modular mechanical input widgets described in a paper from Columbia University catch on. Dubbed “Vidgets” by [Chang Xiao] et al, the haptic devices are designed to create characteristic acceleration profiles on a phone’s inertial measurement unit (IMU) when actuated. Vidgets take various forms, from push buttons to scroll wheels, each of a similar size and shape and designed to dock into one of eight positions on the back of a 3D-printed phone case. Once trained, the algorithm watches for the acceleration signature caused by actuating a Vidget, and sends commands to the phone to mimic the corresponding gestures. The video below demonstrates a couple of use cases, of which the virtual saxophone is our favorite.
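The paper details the actual recognition algorithm; as a much simpler stand-in for the concept, the sketch below matches a short burst of IMU samples against one stored acceleration template per widget slot and only fires when the match is strong enough. Template matching by normalized correlation is our own simplification, not [Chang Xiao]’s method, and the templates here are invented.

```python
# Simplified stand-in for Vidget recognition: compare an incoming burst
# of accelerometer samples against a stored template for each widget slot.
# Normalized correlation is our simplification, not the paper's approach.

import numpy as np

def normalized(sig):
    sig = np.asarray(sig, dtype=float)
    sig = sig - sig.mean()
    norm = np.linalg.norm(sig)
    return sig / norm if norm > 0 else sig

def classify_burst(burst, templates, threshold=0.8):
    """Return the name of the best-matching widget, or None."""
    burst = normalized(burst)
    best_name, best_score = None, threshold
    for name, template in templates.items():
        score = float(np.dot(burst, normalized(template)))
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy templates: characteristic z-axis acceleration of each widget actuation
templates = {
    "button_top_left": np.sin(np.linspace(0, np.pi, 50)),
    "scroll_wheel":    np.sin(np.linspace(0, 4 * np.pi, 50)) * 0.5,
}

burst = np.sin(np.linspace(0, np.pi, 50)) + 0.05 * np.random.randn(50)
print(classify_burst(burst, templates))  # -> "button_top_left"
```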

This is really clever stuff, and ventures deep into “Why didn’t I think of that?” territory. Need to get ahead of the curve on IMUs to capitalize on what they can do? You could start with [Al Williams]’ primer on micro-electromechanical systems, or MEMS.

Continue reading “Add Scroll Wheels And Buttons To Smartphones With 3D-Printed Widgets Read By Accelerometer”

Get Your Tweets Without Looking

Head-mounted displays range from cumbersome to glass-hole-ish. Smart watches have their niche, but they still take your eyes away from whatever you are doing, like driving. Voice assistants can read to you, but they require a speaker that everyone else in the car has to listen to, or a headset that blocks out important sound. Ignoring incoming messages is out of the question, so the answer may be to use a different sense than vision. A joint project between Facebook Inc. and the Massachusetts Institute of Technology has a solution which uses the somatosensory reception of your forearm.

A similar idea came across our desk years ago and seemed promising, but it is hard to sell something that is more difficult than the current technique, even if it is advantageous in the long run. In 2013, a wearer had his or her back covered in vibration motors, and it acted like the haptic version of a spectrum analyzer. Now, the motors have been reduced in number to fit under a sleeve by utilizing patterns. It is being developed for people with hearing or vision impairment, but what drivers aren’t impaired while looking at their phones?

Patterns are what really set this version apart. Rather than relaying a discrete note on a finger, or a range of values across the back, each of the 39 English phonemes is given a unique sequence of vibrations, which is enough to encode any word. A phoneme is the smallest distinct unit of speech. The video below shows how those phonemes are translated to haptic feedback. Hopefully the next upgrade will let us send tweets without using our hands or mouths at all: complete telepathy.
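The real system assigns each of the 39 phonemes its own spatio-temporal vibration pattern across the sleeve; the toy sketch below only shows the general shape of such a lookup, with a made-up motor layout, invented timings, and just a handful of phonemes. None of the specific patterns come from the paper.

```python
# Toy phoneme-to-vibration lookup. The real sleeve encodes all 39 English
# phonemes as distinct spatio-temporal patterns; the motor indices and
# timings below are invented for illustration.

import time

# Each phoneme maps to a list of (motor_index, duration_s) steps.
PATTERNS = {
    "HH": [(0, 0.10), (1, 0.10)],
    "EH": [(2, 0.15)],
    "L":  [(3, 0.10), (3, 0.10)],
    "OW": [(1, 0.20), (2, 0.20)],
}

def buzz(motor, duration):
    # Placeholder for driving the sleeve's motor driver over PWM/I2C.
    print(f"motor {motor} on for {duration:.2f} s")
    time.sleep(duration)

def play_word(phonemes):
    """Play a word given as a phoneme sequence, e.g. 'hello'."""
    for ph in phonemes:
        for motor, duration in PATTERNS[ph]:
            buzz(motor, duration)
        time.sleep(0.05)  # short gap between phonemes

play_word(["HH", "EH", "L", "OW"])
```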

Continue reading “Get Your Tweets Without Looking”

Friday Hack Chat: Training Robots By Touch

When it comes to training robots, you could grab a joystick or carefully program movements in code. The better way, though, is to move the robot yourself, and have the robot play back all those movements ad infinitum. This is training robots by touch, and it’s the subject of this week’s Hack Chat over on hackaday.io.

Our guest for this week’s Hack Chat will be [Kent Gilson], inventor, serial entrepreneur, and pioneer in reconfigurable computing. [Kent] is the creator of Viva, an object-oriented programming language and operating environment that harnesses the power of FPGAs for general-purpose computing. He’s launched eight entrepreneurial ventures, won multiple awards, and created products used in numerous industries across the globe.

[Kent]’s claim to fame on hackaday.io is Dexter, a low-cost robotic arm with 50-micron repeatability and modular end effectors. It does this with three harmonic drives and optical encoders that give it extreme precision. The arm is also trainable, meaning that you can manually control it and play back the exact path it took. It’s training robots by touch, exactly what this Hack Chat is all about.
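Dexter’s firmware handles this with far better timing and precision, but the record-and-playback idea itself is simple. Here is a bare-bones Python sketch of the pattern, with hypothetical read_joint_angles() and command_joint_angles() helpers standing in for whatever API the real arm exposes.

```python
# Bare-bones record-and-playback loop for a trainable arm. The helper
# functions are hypothetical stand-ins for the arm's real interface.

import time

def read_joint_angles():
    # Placeholder: would read the arm's optical encoders here.
    return [0.0, 0.0, 0.0, 0.0, 0.0]

def command_joint_angles(angles):
    # Placeholder: would command the joints to these angles.
    print("move to", angles)

def record(duration_s=5.0, rate_hz=50):
    """Sample joint positions while the operator moves the limp arm."""
    path, dt = [], 1.0 / rate_hz
    t_end = time.time() + duration_s
    while time.time() < t_end:
        path.append((time.time(), read_joint_angles()))
        time.sleep(dt)
    return path

def playback(path):
    """Replay the recorded path with its original timing."""
    start, t0 = time.time(), path[0][0]
    for stamp, angles in path:
        while time.time() - start < stamp - t0:
            time.sleep(0.001)
        command_joint_angles(angles)

taught_path = record(duration_s=1.0)   # operator guides the arm by hand
playback(taught_path)                  # arm repeats the motion
```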

For this Hack Chat, we’re going to be discussing:

  • Building trainable robots
  • Developing robotics haptics
  • Training robots to manufacture
  • Heterogeneous direct digital manufacturing

You are, of course, encouraged to add your own questions to the discussion. You can do that by leaving a comment on the Hack Chat Event Page and we’ll put that in the queue for the Hack Chat discussion.

Our Hack Chats are live community events on the Hackaday.io Hack Chat group messaging. This week is just like any other, and we’ll be gathering ’round our video terminals at noon, Pacific, on Friday, July 27th.  Need a countdown timer? Well, here you go, mango.

Click that speech bubble to the right, and you’ll be taken directly to the Hack Chat group on Hackaday.io.

You don’t have to wait until Friday; join whenever you want and you can see what the community is talking about.