String Art Robot Is An Autorouter In Reverse

In the depths of Etsy and Pinterest is a fascinating, if tedious, artform. String art is the process of nailing pins into a board and wrapping thread around them to create shapes and shading. The most popular project in this vein is something like the outline of a heart, rendered in string, inside the shape of your home state. Something like that, at least.

While this artform involves about as much effort as pallet wood furniture, there is an interesting computational aspect to it: you can create images with string art, and doing so is a very, very hard problem to solve with an algorithm. Researchers at TU Wien have brought out the best that string art has to offer. They’ve programmed an industrial robot to create portraits out of string.

The experimental setup for this is about as simple as it gets. It’s a circular frame studded with 256 hooks around the perimeter. An industrial robot arm takes a few kilometers of thread, winds it around one of these hooks, then travels to another hook. Repeat that thousands and thousands of times, and you get a portrait of Ada Lovelace or Albert Einstein.

The wire wrapped backplane of a DEC PDP-11. This was assembled by a robot that was programmed with an autorouter. It’s also string art.

The real trick here is the algorithm that takes an image and translates it into the paths the string will take. This is an NP-hard problem, but it is a surprisingly well-studied one. The first autorouters — the things you should never trust to route traces between the packages on your PCB — were created for wire wrapped computers. Here, computers would find the shortest path between whatever pins had to be connected. There were, of course, limitations: a pin could only take so many connections thanks to the nature of wire wrapping, and you couldn’t have one gigantic mass of wires for a parallel bus. The first autorouters were string art algorithms, only in reverse.
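For the curious, the usual computational approach to image-to-string conversion is a greedy one: at each step, pick the next hook whose chord covers the most remaining darkness in the target image, “draw” that string by subtracting its contribution, and repeat. The TU Wien work uses a more sophisticated optimization than this, so treat the following as a toy sketch of the basic idea, with made-up pin counts and a tiny target image:

```python
import math

def pin_positions(n_pins, radius, center):
    """Evenly space hooks around a circular frame."""
    return [(center + radius * math.cos(2 * math.pi * k / n_pins),
             center + radius * math.sin(2 * math.pi * k / n_pins))
            for k in range(n_pins)]

def line_pixels(p0, p1, steps=100):
    """Sample integer pixel coordinates along the chord between two pins."""
    (x0, y0), (x1, y1) = p0, p1
    pts = set()
    for i in range(steps + 1):
        t = i / steps
        pts.add((round(x0 + t * (x1 - x0)), round(y0 + t * (y1 - y0))))
    return pts

def greedy_string_art(target, n_pins=32, n_strings=50, fade=64):
    """target: 2D list of darkness values (0 = white, 255 = black).
    Repeatedly pick the chord that covers the most remaining darkness."""
    size = len(target)
    pins = pin_positions(n_pins, radius=size / 2 - 1, center=size / 2)
    residual = [row[:] for row in target]  # darkness still to be covered
    current, path = 0, [0]
    for _ in range(n_strings):
        best_pin, best_score = None, 0
        for cand in range(n_pins):
            if cand == current:
                continue
            score = sum(residual[y][x]
                        for x, y in line_pixels(pins[current], pins[cand])
                        if 0 <= x < size and 0 <= y < size)
            if score > best_score:
                best_pin, best_score = cand, score
        if best_pin is None:
            break  # no chord covers any remaining darkness
        # "Draw" the string: lighten the residual along the chosen chord.
        for x, y in line_pixels(pins[current], pins[best_pin]):
            if 0 <= x < size and 0 <= y < size:
                residual[y][x] = max(0, residual[y][x] - fade)
        current = best_pin
        path.append(current)
    return path
```

The output is just the hook sequence — exactly what a robot (or a very patient human) needs to wind the piece.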

You can take a look at the complete publication here.

You’ll also find prior art (tee-hee) in our own pages. Here is an artist doing it by hand, and here’s a machine to do it for you if you’re lazy. We’ve even seen further work on the underlying algorithm on Hackaday.io.

Speech Recognition Without A Voice

The biggest change in Human Computer Interaction over the past few years is the rise of voice assistants. The Siris and Alexas are our HAL 9000s, and soon we’ll be using these assistants to open the garage door. They might just do it this time.

What would happen if you could talk to these voice assistants without saying a word? Would that be telepathy? That’s exactly what [Annie Ho] is doing with Cerebro Voice, a project in this year’s Hackaday Prize.

At its core, the idea behind Cerebro Voice is based on subvocal recognition, a technique that detects electrical signals from the vocal cords and other muscles involved in speaking. These electrical signals are collected by surface EMG devices, then sent to a computer for processing and reconstruction into words. It’s a proven technology, and even NASA is calling it ‘synthetic telepathy’.

The team behind this project is just in the early stages of prototyping this device, and so far they’re using EMG hardware and microphones to train a convolutional neural network that will translate electrical signals into a user’s inner monologue. It’s an amazing project, and one of the best we’ve seen in the Human Computer Interface challenge in this year’s Hackaday Prize.
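To give a flavor of what that pipeline looks like, the front end of almost any subvocal system slices the raw EMG stream into overlapping windows and extracts activity features from each one. The project’s actual network isn’t published here, so this is only a toy sketch, with made-up window sizes and a simple RMS threshold standing in for the convolutional network:

```python
import math

def windows(signal, size, hop):
    """Split a 1-D EMG trace into overlapping analysis windows."""
    return [signal[i:i + size] for i in range(0, len(signal) - size + 1, hop)]

def rms(window):
    """Root-mean-square energy: a classic EMG activity feature."""
    return math.sqrt(sum(s * s for s in window) / len(window))

def detect_subvocal_activity(signal, size=50, hop=25, threshold=0.1):
    """Label each window 'active' (muscle engaged) or 'rest'.
    A real system would feed richer features into a neural network
    instead of this single-feature threshold."""
    return ['active' if rms(w) > threshold else 'rest'
            for w in windows(signal, size, hop)]
```

The hard part — which this sketch sidesteps entirely — is mapping “active” muscle patterns to actual words, and that’s where the neural network earns its keep.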

Turning A Tiny FLIR Into An Action Cam With FPGAs

FLIR is making some really great miniature thermal cameras these days, designed for applications such as self-driving cars and tools that help keep firefighters safe. That’s great and all, but these thermal cameras are so cool, you really just want to play with one. That’s what [greg] was thinking when he designed a PCB backpack that captures thermal images from a FLIR Boson and stores them on an SD card. It’s a thermal action cam, and an impressive bit of FPGA development, too.

The FLIR product in question is a Boson 640, an impressive little camera that records at 640×512 resolution with a 60 Hz update rate. This one has the 95° field of view, packing a very good specification into a very small footprint. This is a huge improvement over FLIR’s Tau camera, for which [greg] built a breakout board with Ethernet and DDR memory a few years ago. Once he found out about the Boson, he figured a backpack PCB for this camera would be possible, and a great excuse to teach himself FPGAs with a hands-on project.

With an impressive ability to find the perfect part, [greg] sourced a Lattice iCE40 FPGA in an 8×8 mm package along with an 8 Mbit HyperRAM in a 6×8 mm package. This combination allows all the chips to fit behind the Boson camera. Add in a microSD card slot and a few connectors, and this breakout board is very close to being a commercial product, for whatever forward-looking infrared needs you might have.
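A quick back-of-the-envelope calculation shows why a RAM buffer between the camera and the SD card earns its place on the board. Assuming 16-bit raw pixels (an assumption on our part; the Boson can also output 8-bit processed video), the numbers work out like this:

```python
def boson_data_rate(width=640, height=512, fps=60, bytes_per_pixel=2):
    """Raw video bandwidth in megabytes per second."""
    return width * height * fps * bytes_per_pixel / 1e6

def frames_in_buffer(ram_megabits=8, width=640, height=512, bytes_per_pixel=2):
    """How many full frames fit in the HyperRAM."""
    ram_bytes = ram_megabits * 1_000_000 / 8
    return ram_bytes / (width * height * bytes_per_pixel)
```

That comes out to nearly 40 MB/s of raw data, with the 8 Mbit HyperRAM holding roughly a frame and a half — presumably enough elastic buffering to ride out the bursty write latency of an SD card.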

Using Motors As Encoders

If you have a brushless motor, you have some magnets, a bunch of coils arranged in a circle, and theoretically, all the parts you need to build a rotary encoder. A lot of people have used brushless or stepper motors as rotary encoders, but they all seem to do it by using the motor as a generator and looking at the phases and voltages. For their Hackaday Prize project, [besenyeim] is doing it differently: they’re using motors as coupled inductors, and it looks like this is a viable way to turn a motor into an encoder.

The experimental setup for this project is a Blue Pill microcontroller board based on the STM32F103. This, combined with a set of half-bridges used to drive the motor, is really all that’s needed to both spin the motor and detect where it is. The circuit works by using six digital outputs to drive the high and low sides of the half-bridges, and three analog inputs as feedback. The resulting waveform graph looks like three weird stairsteps that are out of phase with each other, and with the right processing, that’s enough to detect the position of the motor.

Right now, the project is aiming to send a command over serial to a microcontroller and have the motor spin to a specific position. No, it’s not a completely closed-loop control scheme for turning a motor, but it’s actually not that bad. Future work is going to turn these motors into haptic feedback controllers, although we’re sure there are a few Raspberry Pi robots out there that would love odometry in the motor. You can check out a video of this setup in action below.


Video Quick Bit: The Best In Human Computer Interfaces

We’re neck deep in the Hackaday Prize, and we just wrapped up the Human Computer Interface Challenge. This was an incredible contest to go beyond traditional mice and keyboards and find new ways to transfer your desires directly into a computer. Majenta Strongheart is back at it again, giving us a look at some of the coolest Human Computer Interface builds in this year’s Hackaday Prize.

The Hackaday Prize is all about hacking, really, and there’s no better project that demonstrates this than [Curt White]’s hacked fitness tracker. This is a tiny, $35 fitness tracker that’s loaded up with Bluetooth and an ECG front end. With a few slight modifications this cheap bit of consumer electronics can become a prototyping platform for ECG/EMG/EEG projects. Awesome work.

But when it comes to Human Computer Interfaces, what’s really cool is games. Remember the Power Glove? Of course, everyone does. How about the Sega Activator, the first full-body motion controller? Yeah, now we’re getting into the good stuff. [Arcadia Labs] built a Head Tracker for their favorite space flight sims, and the results are remarkable. Take a look at the videos and you can see the promise of this kind of tech.

The biggest advance in Human-Computer Interaction in the last few years is obviously VR. Once the domain of some early-90s not-quite cyberpunk, VR is now showing up in living rooms. The HiveTracker is an ingenious device that reverse engineers the technology behind the Vive Tracker from HTC. This is a tiny little device that allows for sub-millimeter 3D positioning, and also adds a 9DOF IMU to the mix. If you’ve ever wanted to know exactly where you are, this is the project for you.

Right now we’re plowing through the Musical Instrument Challenge where we’re asking you to build something that pushes the boundaries of instrumentation. If you’re building a synth, we want to see it. If you’re making music with vacuum tubes, we want to see it. Got one of those guitars that are like, double guitars? Yes, we want that too. Twenty of the Musical Instrument Challenge submissions will be selected to move on to the finals and win $1000 in the process. The top five entries of the 2018 Hackaday Prize will split $100,000! This is your chance, so enter now!

Friday Hack Chat: Playing With Fire

We’re pretty sure every hacker, tinkerer, and maker out there was a tiny bit of a pyromaniac in their youth. That’s what makes this week’s Hack Chat so exciting: we’re talking about Hacking With Fire.

Our guest for this week’s Hack Chat will be [Brice Farrell], who, like most of us, has been interested in fire his entire life. He’s taken this interest and turned his amateur passion into something semi-professional. He’s a PGI certified pyrotechnician, an electrical engineer, and an ice carver. This year, he appeared on BattleBots where he built the flame system for Battle Royale with Cheese.

Given [Brice]’s extensive expertise, this Hack Chat is going to cover the relevant safety concerns of working with fire, how to keep yourself safe, and how to do everything legally. We’ll be talking about fireball shooters of all sizes, ignition techniques, and the use (and introduction) of fire in combat robotics. That last point is extremely interesting: is fire on a BattleBot actually useful, and what can you do to protect your bot from it?

Points of interest for this Hack Chat will include:

  • Fire safety
  • The difference between generating flames and fireballs
  • Ignition techniques
  • Fire safety
  • Fire in combat robotics
  • Fire safety

You are, of course, encouraged to add your own questions to the discussion. You can do that by leaving a comment on the Hacking with Fire event page and we’ll put that in the queue for the Hack Chat discussion.


Our Hack Chats are live community events in the Hackaday.io Hack Chat group messaging. This week is just like any other, and we’ll be gathering ’round our video terminals at noon, Pacific, on Friday, September 14th. That’s not the same in every time zone, but don’t worry, we have some amazing time conversion technology.

Click that speech bubble to the right, and you’ll be taken directly to the Hack Chat group on Hackaday.io.

You don’t have to wait until Friday; join whenever you want and you can see what the community is talking about.

Disassembling Mouse Sensors For Tracking Tongues

We just wrapped up the Human Computer Interface challenge in this year’s Hackaday Prize, and with that comes a bevy of interesting new designs for mice and keyboards that push the envelope of what you think should be possible, using components that seem improbable. One of the best examples of this is The Bit, a project from [oneohm]. It’s a computer mouse that uses a tiny little trackpad in ways you never thought possible. It’s a mouse that fits on your tongue.

The idea behind The Bit was to create an input device for people with limited use of their extremities. It’s a bit like the Eyedriveomatic, the winner from the 2015 Hackaday Prize, but designed entirely to fit on the tip of your tongue.

The first experiments on a tongue-controlled mouse were done with the optical trackpad/navigation button found on BlackBerry phones. Like all mouse sensors these days, these modules are actually tiny, really crappy cameras. [oneohm] picked up a pair of these modules and found they had completely different internal tracking modules, so the experiment turned to a surface tracking module from PixArt Imaging that’s also used as a filament sensor in the Prusa 3D printer. This module was easily connected to a microcontroller, and with careful application of plastics, was embedded in a pacifier. Yes, it tracks a tongue and turns that into cursor movements. It’s a tongue-tracking mouse, and it works.
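Optical tracking modules like these report relative (dx, dy) motion counts, so turning them into a cursor is mostly a matter of integration, scaling, and clamping on the host side. A minimal sketch of that math — the gain and screen dimensions here are made-up tuning values, not anything from the project:

```python
def integrate_deltas(deltas, screen_w=1920, screen_h=1080, gain=4):
    """Turn relative (dx, dy) motion reports from an optical sensor
    into absolute, clamped cursor positions.
    `gain` scales tiny tongue movements up to useful cursor travel."""
    x, y = screen_w // 2, screen_h // 2  # start the cursor in the center
    positions = []
    for dx, dy in deltas:
        x = max(0, min(screen_w - 1, x + gain * dx))
        y = max(0, min(screen_h - 1, y + gain * dy))
        positions.append((x, y))
    return positions
```

The clamping matters more than it looks: with a sensor riding on a tongue, spurious large deltas are inevitable, and you never want the cursor flying off into the void.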

This is an awesome project for the Hackaday Prize. Not only does it bring new tech to a human-computer interface, it’s doing it in a way that’s accessible to all.