Turn Yourself Into A Cyborg With Neural Nets

If smartwatches and tiny Bluetooth earbuds are any indication, the future is in wearable electronics. This brings up a problem: developing wearable electronics isn't as simple as building a device that's meant to sit on a shelf. Wearable electronics move and stretch, and their wearers jump, kick, punch, and sweat. If you're prototyping wearable electronics, it might be a good idea to build a Smart Internet of Things Wearable development board. That's exactly what [Dave] did for his Hackaday Prize entry, and it's really, really fantastic.

[Dave]’s BodiHub is an outgrowth of his entry into last year’s Hackaday Prize. While the project might not look like much, that’s kind of the point; [Dave]’s previous projects involved shrinking thousands of dollars worth of equipment down to a tiny board that can read muscle signals. This project takes that idea a bit further by creating a board that’s wearable, has support for battery charging, and makes prototyping with wearable electronics easy.

You might be asking what you can do with a board like this. For that, [Dave] suggests a few projects: boxing gloves that talk to each other, or that tell you how much force you're punching with. Alternatively, you could read body movements and synchronize an LED light show to a dance performance. It can go further than that, though, because [Dave] built a mesh network logistics tracking system that uses an augmented reality interface. This was actually demoed at TechCrunch Disrupt NY, and the audience was wowed. You can check out the video of that demo here.
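
To make the boxing glove idea concrete, here's a minimal sketch of how punch force might be estimated from an accelerometer stream. The glove mass and sample format are assumptions for illustration, not details from [Dave]'s project:

```python
def peak_punch_force(accel_samples, mass_kg=0.4):
    """Rough peak punch force (newtons) from 3-axis accelerometer samples.

    accel_samples: iterable of (ax, ay, az) tuples in m/s^2.
    mass_kg: hypothetical effective mass of glove plus hand.
    """
    # Peak magnitude of the acceleration vector across the swing.
    peak = max((ax**2 + ay**2 + az**2) ** 0.5 for ax, ay, az in accel_samples)
    return mass_kg * peak  # F = m * a
```

In practice you'd also want to ignore the constant 1 g of gravity and average over the impact window, but the idea is the same.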

The Tiny, Pocket-Sized Robot Meant For Hacking

The world is full of educational robots for STEAM education, but we haven’t seen one as small or as cute as the Skoobot, an entry in this year’s Hackaday Prize. It’s barely bigger than an inch cubed, but it’s still packed with motors, a battery, sensors, and a microcontroller powerful enough to become a pocket-sized sumo robot.

The hardware inside each Skoobot is small but powerful. The main microcontroller is a Nordic nRF52832, giving this robot an ARM Cortex-M4F brain and Bluetooth. The sensors include a VL6180X time-of-flight sensor with a range of about 100 mm, plus a light sensor for all your robotic photovore needs. Other than that, the Skoobot is just about what you would expect, with a serial port, a buzzer, and some tiny wheels mounted in a plastic frame.

The idea behind the Skoobot is to bring robotics to the classroom, introducing kids to fighting/sumo robots, while still being small, cheap, and cute. To that end, the Skoobot is completely controllable via Bluetooth so anyone with a phone, a Pi, or any other hardware can make this robot move, turn, chase after light, or sync multiple Skoobots together for a choreographed dance.
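
A choreographed dance over Bluetooth boils down to sending every robot the same timed command sequence. Here's a sketch of what that payload packing might look like; the opcodes and three-byte step format are hypothetical, since the Skoobot's actual command protocol isn't documented here:

```python
# Hypothetical one-byte opcodes for a tiny BLE robot like the Skoobot.
COMMANDS = {"stop": 0x00, "forward": 0x01, "back": 0x02, "left": 0x03, "right": 0x04}

def encode_step(command: str, duration_ms: int) -> bytes:
    """Pack one dance step as [opcode, duration_lo, duration_hi]."""
    op = COMMANDS[command]
    return bytes([op, duration_ms & 0xFF, (duration_ms >> 8) & 0xFF])

def choreograph(steps):
    """Concatenate steps into one payload to broadcast to every robot at once."""
    return b"".join(encode_step(cmd, ms) for cmd, ms in steps)
```

Broadcasting the same payload to each Skoobot's command characteristic would keep the whole troupe in step.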

While the Skoobot is an entry in this year's Hackaday Prize, its creator, [Bill Weiler], is also making these available on Crowd Supply.

Speech Recognition Without A Voice

The biggest change in Human Computer Interaction over the past few years is the rise of voice assistants. The Siris and Alexas are our HAL 9000s, and soon we’ll be using these assistants to open the garage door. They might just do it this time.

What would happen if you could talk to these voice assistants without saying a word? Would that be telepathy? That’s exactly what [Annie Ho] is doing with Cerebro Voice, a project in this year’s Hackaday Prize.

At its core, the idea behind Cerebro Voice is based on subvocal recognition, a technique that detects electrical signals from the vocal cords and other muscles involved in speaking. These electrical signals are collected by surface EMG devices, then sent to a computer for processing and reconstruction into words. It's a proven technology, and NASA has even called it 'synthetic telepathy'.

The team behind this project is still in the early stages of prototyping the device, and so far they're using EMG hardware and microphones to train a convolutional neural network that will translate electrical signals into a user's inner monologue. It's an amazing project, and one of the best we've seen in the Human Computer Interface challenge in this year's Hackaday Prize.
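
Before any convolutional network sees the EMG data, the raw stream has to be sliced into fixed-size examples. A minimal sketch of that preprocessing step, with illustrative frame sizes rather than the project's actual parameters:

```python
import numpy as np

def frame_emg(signal, frame_len=256, hop=128):
    """Slice a 1-D EMG stream into overlapping, zero-mean frames.

    Each frame becomes one input example for a 1-D convolutional network.
    frame_len and hop are illustrative tuning values.
    """
    frames = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = np.asarray(signal[start:start + frame_len], dtype=float)
        frames.append(frame - frame.mean())  # remove per-frame DC offset
    return np.stack(frames)
```

The matching microphone recordings would then serve as training labels for whatever the network should output for each frame.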

Using Motors As Encoders

If you have a brushless motor, you have some magnets, a bunch of coils arranged in a circle, and theoretically, all the parts you need to build a rotary encoder. A lot of people have used brushless or stepper motors as rotary encoders, but they all seem to do it by using the motor as a generator and looking at the phases and voltages. For their Hackaday Prize project, [besenyeim] is doing it differently: they’re using motors as coupled inductors, and it looks like this is a viable way to turn a motor into an encoder.

The experimental setup for this project is a Blue Pill board built around the STM32F103 microcontroller. This, combined with a set of half-bridges used to drive the motor, is really all that's needed to both spin the motor and detect where it is. The circuit works by using six digital outputs to drive the high and low sides of the half-bridges, with three analog inputs used as feedback. The resulting waveform graph looks like three weird stairsteps that are out of phase with each other, and with the right processing, that's enough to detect the position of the motor.
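
Turning three out-of-phase feedback signals into a shaft angle is the classic Clarke-transform-plus-arctangent trick. This sketch assumes the three signals are roughly sinusoidal and 120 degrees apart, which is a simplification of the stairstep waveforms [besenyeim] actually measures:

```python
import math

def rotor_angle(va, vb, vc):
    """Estimate electrical rotor angle (radians) from three phase feedback voltages."""
    # Clarke transform: project the three phases onto two orthogonal axes.
    alpha = va - 0.5 * (vb + vc)
    beta = (math.sqrt(3) / 2.0) * (vb - vc)
    # One electrical revolution per waveform cycle.
    return math.atan2(beta, alpha)
```

For a motor with multiple pole pairs, the mechanical angle is the electrical angle divided by the pole-pair count, tracked across cycles.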

Right now, the project is aiming to send a command over serial to a microcontroller and have the motor spin to a specific position. It's not a complete closed-loop motor control scheme yet, but it's actually not bad. Future work will turn these motors into haptic feedback controllers, although we're sure there are a few Raspberry Pi robots out there that would love odometry built into the motor. You can check out a video of this setup in action below.
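
Spinning to a commanded position once you can measure the angle is, at its simplest, a proportional loop. A sketch of one control step, with an illustrative gain rather than [besenyeim]'s actual loop:

```python
import math

def control_step(target, measured, kp=0.8, max_duty=1.0):
    """One proportional step toward a commanded shaft position (radians).

    Returns a signed duty-cycle command for the half-bridges.
    kp and the clamp are illustrative tuning values.
    """
    # Wrap the error to [-pi, pi] so the motor takes the short way around.
    error = math.atan2(math.sin(target - measured), math.cos(target - measured))
    return max(-max_duty, min(max_duty, kp * error))
```

Running this at every feedback sample, with the serial command supplying `target`, gives the spin-to-position behavior described above.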


Video Quick Bit: The Best In Human Computer Interfaces

We’re neck deep in the Hackaday Prize, and we just wrapped up the Human Computer Interface Challenge. This was an incredible contest to go beyond traditional mice and keyboards and find new ways to transfer your desires directly into a computer. Majenta Strongheart is back at it again, giving us a look at some of the coolest Human Computer Interface builds in this year’s Hackaday Prize.

The Hackaday Prize is all about hacking, really, and no project demonstrates this better than [Curt White]’s hacked fitness tracker. This is a tiny, $35 fitness tracker loaded up with Bluetooth and an ECG front end. With a few slight modifications, this cheap bit of consumer electronics can become a prototyping platform for ECG/EMG/EEG projects. Awesome work.
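
Once you have an ECG front end streaming samples, even a naive threshold detector gets you a heart rate. A toy sketch, not [Curt White]'s firmware; real ECG pipelines filter the signal before peak detection:

```python
def heart_rate_bpm(samples, fs, threshold=0.6):
    """Rough beats-per-minute from ECG samples via upward threshold crossings.

    samples: normalized ECG values; fs: sample rate in Hz.
    threshold is an illustrative value for an R-peak crossing level.
    """
    crossings = [i for i in range(1, len(samples))
                 if samples[i - 1] < threshold <= samples[i]]
    if len(crossings) < 2:
        return None  # not enough beats to estimate a rate
    intervals = [(b - a) / fs for a, b in zip(crossings, crossings[1:])]
    return 60.0 / (sum(intervals) / len(intervals))
```

The same crossing-detection idea extends to EMG burst detection, which is part of what makes the board a nice generic biosignal platform.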

But when it comes to Human Computer Interfaces, what’s really cool is games. Remember the Power Glove? Of course, everyone does. How about the Sega Activator, the first full-body motion controller? Yeah, now we’re getting into the good stuff. [Arcadia Labs] built a head tracker for their favorite space flight sims, and the results are remarkable. Take a look at the videos and you can see the promise of this kind of tech.

The biggest advance in Human-Computer Interaction in the last few years is obviously VR. Once the domain of some early-90s not-quite cyberpunk, VR is now showing up in living rooms. The HiveTracker is an ingenious device that reverse engineers the technology behind the Vive Tracker from HTC. This is a tiny little device that allows for sub-millimeter 3D positioning, and also adds a 9DOF IMU to the mix. If you’ve ever wanted to know exactly where you are, this is the project for you.
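
The core trick behind lighthouse-style tracking is simple: a base station sweeps a laser plane at a known rotor speed, and the delay between a sync flash and the photodiode hit maps linearly to an angle. A minimal sketch of that timing-to-angle conversion, assuming the 60 Hz rotor speed of the first-generation base stations:

```python
import math

def sweep_angle(dt_seconds, rotor_hz=60.0):
    """Angle (radians) of a lighthouse laser sweep from sync-to-hit timing.

    dt_seconds: delay between the sync pulse and the photodiode hit.
    rotor_hz: sweep rotor speed; 60 Hz matches first-gen lighthouses.
    """
    return 2.0 * math.pi * dt_seconds * rotor_hz
```

Intersecting the angle measurements from multiple photodiodes and two base stations is what gets the HiveTracker down to sub-millimeter positions.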

Right now we’re plowing through the Musical Instrument Challenge where we’re asking you to build something that pushes the boundaries of instrumentation. If you’re building a synth, we want to see it. If you’re making music with vacuum tubes, we want to see it. Got one of those guitars that are like, double guitars? Yes, we want that too. Twenty of the Musical Instrument Challenge submissions will be selected to move on to the finals and win $1000 in the process. The top five entries of the 2018 Hackaday Prize will split $100,000! This is your chance, so enter now!

Disassembling Mouse Sensors For Tracking Tongues

We just wrapped up the Human Computer Interface challenge in this year’s Hackaday Prize, and with that comes a bevy of interesting new designs for mice and keyboards that push the envelope of what you think should be possible, using components that seem improbable. One of the best examples of this is The Bit, a project from [oneohm]. It’s a computer mouse that uses a tiny little trackpad in ways you never thought possible. It’s a mouse that fits on your tongue.

The idea behind The Bit was to create an input device for people with limited use of their extremities. It’s a bit like the Eyedrivomatic, the winner of the 2015 Hackaday Prize, but designed entirely to fit on the tip of your tongue.

The first experiments on a tongue-controlled mouse were done with an optical trackpad/navigation button found on BlackBerry phones. Like all mouse sensors these days, these modules are actually tiny, really crappy cameras. [oneohm] picked up a pair of these modules and found they had completely different internal tracking modules, so the experiment turned to a surface tracking module from PixArt Imaging that’s also used as a filament sensor in the Prusa 3D printer. This module was easily connected to a microcontroller, and with careful application of plastics, was embedded in a pacifier. Yes, it tracks a tongue and turns that into cursor movements. It’s a tongue-tracking mouse, and it works.
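
A tongue is a jittery pointing device, so the raw (dx, dy) motion reports from a sensor like this usually need smoothing before they become cursor moves. A sketch of one common approach, exponential smoothing, with illustrative tuning values rather than anything from [oneohm]'s firmware:

```python
def smooth_deltas(raw_deltas, alpha=0.3, gain=2.0):
    """Low-pass filter and scale (dx, dy) motion reports into cursor moves.

    alpha: smoothing factor (0..1, higher = more responsive).
    gain: multiplier from sensor counts to screen pixels.
    """
    sx = sy = 0.0
    out = []
    for dx, dy in raw_deltas:
        # Exponential moving average tames single-sample jitter.
        sx = alpha * dx + (1 - alpha) * sx
        sy = alpha * dy + (1 - alpha) * sy
        out.append((gain * sx, gain * sy))
    return out
```

Tuning `alpha` trades pointer lag against stability, which matters a lot when the "finger" is a tongue pressing through a pacifier.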

This is an awesome project for the Hackaday Prize. Not only does it bring new tech to a human-computer interface, it’s doing it in a way that’s accessible to all.

DIY Switches For People Who Can’t Push Switches

A staggering number of things most people take for granted present enormous hurdles for people with physical disabilities, including interaction with computers and other digital resources. Assistive technologies such as adaptive switches allow users who cannot use conventional buttons or other input devices to interact with digital devices, and while there are commercial offerings, there is still plenty of room for projects like [Cassio Batista]’s DIY Low-cost Assistive Technology Switches.

[Cassio]’s project focuses on non-contact switches, such as proximity and puff-based activations. These are economical, DIY options aimed at improving accessibility for people who are unable to physically push even specialized switches. There are existing products in this space, but cost can be a barrier, and DIY options that use familiar interfaces greatly improve accessibility.
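
The firmware side of a puff-based switch is essentially threshold detection with a refractory period, so one long puff doesn't register as several presses. A minimal sketch under assumed values; real sip-and-puff hardware needs per-user calibration:

```python
def detect_puffs(pressures, baseline=101325.0, thresh=500.0, min_gap=3):
    """Turn a pressure-sensor sample stream into discrete 'puff' switch events.

    Fires when pressure rises `thresh` Pa above `baseline`, then waits for
    `min_gap` below-threshold samples before it can fire again.
    All parameter values here are illustrative.
    """
    events, quiet = [], min_gap
    for i, p in enumerate(pressures):
        if p - baseline > thresh and quiet >= min_gap:
            events.append(i)   # switch activation at sample index i
            quiet = 0
        elif p - baseline <= thresh:
            quiet += 1
    return events
```

The same structure works for a proximity switch by swapping the pressure comparison for a distance threshold.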

Assistive technologies that give people the tools they need to have more control over their own lives in a positive, healthy way are among the more vibrant and positive areas of open hardware development, and it’s not always clear where the challenges lie when creating solutions. An example of this is the winner of the 2015 Hackaday Prize, the Eyedrivomatic, which interfaces the steering of an electric wheelchair with a gaze-tracking system while permanently altering neither device, a necessity because users often do not own their hardware.