The Word Clock You Can Feel

By this point, pretty much everyone has come across a word clock project, if not built one themselves. There’s just an appeal to looking at a clock and seeing the time in a more human form than mere digits on a face. But there are senses beyond sight. Have you ever heard a word clock? Have you ever felt a word clock? These are questions to which Hackaday’s own [Moritz Sivers] can now answer yes, because he’s gone through the extreme learning process involved in designing and building a haptic word clock driven with the power of magnets.

Individual letters of the display are actuated by a matrix of magnetic coils on custom PCBs. These work in a vaguely similar fashion to LED matrices, except that they generate magnetic fields that push or pull on a magnet instead of emitting light. That swap brings a whole new set of challenges: coil design, handling the increased power consumption, and even accounting for how coils interact with their neighbors. Inspired by research on other haptic displays, [Moritz] used ferrous foil to make the magnets latch into place. This way, each letter stays in its forward or back position without the coil having to stay powered to hold it there, and the letter remains stable while nearby coils are activated.
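
That latching behavior changes how the matrix gets driven: instead of constantly refreshing every position, the firmware only has to pulse the coils whose letters actually need to move. Below is a minimal Python sketch of that idea, not [Moritz]'s actual firmware; the grid size and the `pulse_coil()` driver call are hypothetical stand-ins for whatever H-bridge or shift-register hardware the real boards use.

```python
import time

ROWS, COLS = 10, 11                                   # assumed word-clock letter grid
state = [[False] * COLS for _ in range(ROWS)]         # currently latched positions

def pulse_coil(row, col, push, ms=30):
    """Hypothetical driver call: energize one coil in the chosen polarity for a few ms."""
    # select_coil(row, col, polarity=push)            # hardware-specific, not real API
    time.sleep(ms / 1000)
    # release_coil(row, col)                          # coil off; the foil keeps the magnet latched

def show(target):
    """Pulse only the letters whose desired position differs from the latched one."""
    for r in range(ROWS):
        for c in range(COLS):
            if target[r][c] != state[r][c]:
                pulse_coil(r, c, push=target[r][c])
                state[r][c] = target[r][c]
```

Updating one coil at a time like this also keeps peak current demand down, which matters a lot more for coils than it does for LEDs.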

Part of the fun of “ubiquitous” projects like word clocks is seeing how creative hackers get in making their own builds stand out. Whether it’s a miniaturized version of a classic design or something simple and clean, we love to see them all. Unsurprisingly, [Moritz] himself has impressed us with his unique take on word clocks in the past. (Editor’s note: that’s nothing compared to his cloud chambers!)

Check out the video below to see this display’s actuation in action. We’re absolutely in love with the satisfying *click* the magnets make as they latch into place.

Continue reading “The Word Clock You Can Feel”

Use Jedi Mind Tricks To Control Your Next Drone Swarm

Controlling a single drone takes up a considerable amount of concentration and normally involves wearing silly goggles. It only gets harder if you want to control a swarm. Researchers at the Skolkovo Institute of Science and Technology (Skoltech) decided Jedi mind tricks were the best way, and set up swarm control using hand gestures.

We saw something similar at the Intel booth at the 2016 Maker Faire. In that demo, a single drone was controlled by hand gestures using a hacked Nintendo Power Glove. The Skoltech approach builds on that concept with several innovations. For one, haptics in the fingertips of the glove provide feedback on the current behavior of the drones. Through their research, the team found that most operators quickly learned to interpret the vibrations subconsciously.

The haptic feedback also increases the safety of the swarm, which is a prime factor in making these technologies usable outside of the lab. Most of us have at one point frantically typed commands into a terminal or pulled cords to keep a project from destroying itself or behaving dangerously. Intuitive controls mean an operator can react quickly to changes in the swarm’s behavior.

The biggest advantage, which can be seen in the video after the break, is that hand control eliminates much of the path preprogramming that is currently common in swarm robotics. With tech like this, we can imagine a person quickly being trained on drone swarms and then using them to do things like construction surveys with ease. As an added bonus, the researchers were kind enough to post a preprint of their paper on arXiv for any readers who would like to get into the specifics.

Continue reading “Use Jedi Mind Tricks To Control Your Next Drone Swarm”

Arduino, Accelerometer, And TensorFlow Make You A Real-World Street Fighter

A question: if you’re controlling the classic video game Street Fighter with gestures, aren’t you just, you know, street fighting?

That’s a question [Charlie Gerard] is going to have to tackle should her AI gesture-recognition controller experiments take off. [Charlie] put together the game controller to learn more about the dark arts of machine learning in a fun and engaging way.

The controller consists of a battery-powered Arduino MKR1000 with WiFi and an MPU6050 accelerometer. Held in the hand, the controller streams accelerometer data to an external PC, capturing the characteristics of the motion. [Charlie] trained three different moves – a punch, an uppercut, and the dreaded Hadouken – and captured hundreds of examples of each. The raw data was massaged, converted to Tensors, and used to train a model for the three moves. Initial tests seem to work well. [Charlie] also made an online version that captures motion from your smartphone. The demo is explained in the video below; sadly, we couldn’t get more than three Hadoukens in before crashing it.
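
For readers curious what that training flow looks like, here is a rough sketch of the same idea in Python/Keras (the original project uses TensorFlow.js in JavaScript); the window length, network shape, and the placeholder random data are assumptions for illustration, not [Charlie]’s actual values.

```python
import numpy as np
import tensorflow as tf

WINDOW = 100                                  # assumed accelerometer samples per gesture (x, y, z)
CLASSES = ["punch", "uppercut", "hadouken"]

# Placeholder data standing in for the recorded examples: one flattened
# window of accelerometer readings per sample, with an integer class label.
x_train = np.random.randn(300, WINDOW * 3).astype("float32")
y_train = np.random.randint(len(CLASSES), size=300)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW * 3,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(len(CLASSES), activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=20, batch_size=16)

# At play time, a freshly captured window gets the same preprocessing and the
# highest-probability class becomes the in-game move.
probs = model.predict(x_train[:1])
print(CLASSES[int(np.argmax(probs))])
```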

With most machine learning projects seeming to concentrate on telling cats from dogs, this is a refreshing change. We’re seeing lots of offbeat machine learning projects these days, from cryptocurrency wallet attacks to a semi-creepy workout-monitoring gym camera.

Continue reading “Arduino, Accelerometer, And TensorFlow Make You A Real-World Street Fighter”

Add Scroll Wheels And Buttons To Smartphones With 3D-Printed Widgets Read By Accelerometer

The first LED digital wristwatches hit the market in the 1970s. They required a button push to turn the display on, prompting one comedian to quip that giving one to a one-armed man would be in poor taste. While the UIs of watches and other wearables have improved since then, smartphones still present some usability challenges. Some of the touch screen gestures needed to operate a phone, like pinching, are nigh impossible when one-handing the phone, and woe unto those with stubby thumbs when trying to take a selfie.

You’d think that the fleet of sensors and the raw computing power on board would afford better ways to control phones. And you’d be right, if the modular mechanical input widgets described in a paper from Columbia University catch on. Dubbed “Vidgets” by [Chang Xiao] et al., the haptic devices are designed to create characteristic acceleration profiles on a phone’s inertial measurement unit (IMU) when actuated. Vidgets take various forms, from push buttons to scroll wheels, each of a similar size and shape and designed to dock into one of eight positions on the back of a 3D-printed phone case. Once trained, the algorithm watches for the acceleration signature caused by actuating a Vidget, and sends commands to the phone to mimic the corresponding gestures. The video below demonstrates a couple of use cases, of which the virtual saxophone is our favorite.
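
To make the signature-matching idea concrete, here is a toy Python sketch, emphatically not the paper’s actual algorithm: a short window of IMU samples is compared against stored templates for each widget and docking position, and the closest match above a threshold wins. The window length, threshold, template names, and the random placeholder data are all made up for illustration.

```python
import numpy as np

WINDOW = 50   # assumed number of accelerometer samples captured around the detected impulse

# Placeholder templates standing in for signatures recorded per widget/position;
# in a real setup each would be averaged from many actuations of that Vidget.
rng = np.random.default_rng(0)
templates = {
    "button_slot_1": rng.standard_normal((WINDOW, 3)),
    "wheel_slot_4":  rng.standard_normal((WINDOW, 3)),
}

def normalize(sig):
    sig = sig - sig.mean(axis=0)                      # remove gravity / DC offset
    return sig / (np.linalg.norm(sig) + 1e-9)         # unit energy for fair comparison

def classify(window, threshold=0.7):
    """Return the best-matching Vidget name, or None if nothing correlates well enough."""
    w = normalize(window)
    best_name, best_score = None, threshold
    for name, tmpl in templates.items():
        score = float(np.sum(w * normalize(tmpl)))    # cosine-style similarity
        if score > best_score:
            best_name, best_score = name, score
    return best_name

print(classify(rng.standard_normal((WINDOW, 3))))     # random input will usually return None
```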

This is really clever stuff, and ventures deep into “Why didn’t I think of that?” territory. Need to get ahead of the curve on IMUs to capitalize on what they can do? You could start with [Al Williams]’ primer on micro-electromechanical systems, or MEMS.

Continue reading “Add Scroll Wheels And Buttons To Smartphones With 3D-Printed Widgets Read By Accelerometer”