‘SHE BON’ Is An Artful, Wearable, Sensual, Sensing Platform

SHE BON (that’s the French bon, or “good”) is an ambitious project by [Sarah Petkus] that consists of a series of wearable electronic and mechanical elements which all come together as a system for a single purpose: to sense and indicate female arousal. As a proponent of increased discussion and openness around the topic of sexuality, [Sarah]’s goal is to take something hidden and turn it into something obvious and overt, while giving it a certain artful flair in the process.

The core of the system is a wearable backpack in the shape of a heart, to which all the other sensing and feedback elements connect. A lot of thought has gone into the design of the system, ensuring that the different modules have an artistic angle to their feedback while also being comfortable to actually wear, and [Sarah] seems to have a knack for slick design. Some of the elements are complete and some are still in progress, but the system is well documented with a clear vision for the whole. It’s an unusual and fascinating project, and was one of the finalists selected in the Human Computer Interface portion of the 2018 Hackaday Prize. Speaking of which, the Musical Instrument Challenge is underway, so be sure to check it out!

Man’s Best Robotic Friend

When it comes to robotics, some of the most interesting work — and certainly the most hilarious — has come from Boston Dynamics, and their team of interns kicking robotic dogs over. It’s an impressive feat of engineering, and even if these robotic pack mules are far too loud for their intended use on the battlefield, it’s a great showcase of how cool a bunch of motors can actually be.

It’s not quite up there with the Boston Dynamics robots, but [Dimitris]’ project for the Hackaday Prize is an almost equally impressive assemblage of motors, 3D printed parts, SLAM processing, and inverse kinematics. I suppose you could kick it over and watch it struggle for laughs, too.

This robotic dog was first modeled in Fusion 360 and designed around 22 Dynamixel AX-12A robot actuators: big, beefy, serial-controllable servos. Of course, bolting a bunch of motors to a frame is the easy part. The real challenge here is figuring out the kinematics and teaching this robot dog how to walk. This is still a work in progress, but so far [Dimitris] is able to move the spine, keep the feet level with the ground, and have the robot walk a little bit. There’s still work to do, but an incredible amount of work has already been done.
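
To get a feel for the kind of math involved, here’s a minimal sketch of planar two-joint leg IK, the sort of calculation a quadruped gait has to run for every leg on every step. The link lengths and joint conventions are made up for illustration and aren’t taken from [Dimitris]’ actual design:

```python
import math

def leg_ik(x, y, femur=100.0, tibia=100.0):
    """Planar 2-link inverse kinematics: given a foot target (x, y) in the
    hip frame (millimetres), return hip and knee angles in radians."""
    d = math.hypot(x, y)
    if d > femur + tibia:
        raise ValueError("target out of reach")
    # Law of cosines gives the interior knee angle from the hip-foot distance.
    knee = math.acos((femur**2 + tibia**2 - d**2) / (2 * femur * tibia))
    # Hip angle is the direction to the foot plus the femur's offset inside
    # the hip-knee-foot triangle (law of sines).
    hip = math.atan2(y, x) + math.asin(tibia * math.sin(knee) / d)
    return hip, knee

# A foot target 60 mm forward and 150 mm below the hip.
hip, knee = leg_ik(60.0, -150.0)
print(math.degrees(hip), math.degrees(knee))
```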

The upcoming features for this robot include a RealSense camera mounted on the head for 3D visualization of the surroundings. There are also plans for a tail, loosely based on some of the tentacle robots we’ve seen. It’s going to be a great project when it’s done, and it’s already an excellent entry for the Hackaday Prize.

Continue reading “Man’s Best Robotic Friend”

Turn Yourself Into A Cyborg With Neural Nets

If smartwatches and tiny Bluetooth earbuds are any indication, the future is wearable electronics. This brings up a problem: developing wearable electronics isn’t as simple as building a device that’s meant to sit on a shelf. Wearable electronics move and stretch; people jump, kick, punch, and sweat. If you’re prototyping wearable electronics, it might be a good idea to build a Smart Internet of Things Wearable development board. That’s exactly what [Dave] did for his Hackaday Prize entry, and it’s really, really fantastic.

[Dave]’s BodiHub is an outgrowth of his entry into last year’s Hackaday Prize. While the project might not look like much, that’s kind of the point; [Dave]’s previous projects involved shrinking thousands of dollars’ worth of equipment down to a tiny board that can read muscle signals. This project takes that idea a bit further by creating a board that’s wearable, has support for battery charging, and makes prototyping with wearable electronics easy.

You might be asking what you can do with a board like this. For that, [David] suggests a few projects, like boxing gloves that talk to each other or tell you how much force you’re punching with. Alternatively, you could read body movements and synchronize an LED light show to a dance performance. It can go further than that, though, because [David] built a mesh network logistics tracking system that uses an augmented reality interface. This was actually demoed at TechCrunch Disrupt NY, and the audience was wowed. You can check out the video of that demo here.
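
As a rough sketch of how the boxing-glove idea could work, here’s one way to turn a stream of accelerometer magnitudes into a per-punch peak reading. The threshold and the sample data are placeholders, not anything from the BodiHub firmware:

```python
def peak_g_per_punch(samples, threshold_g=3.0):
    """Given a stream of acceleration magnitudes (in g), yield the peak value
    of each impact, where an impact is any excursion above threshold_g."""
    peak = None
    for g in samples:
        if g >= threshold_g:
            peak = g if peak is None else max(peak, g)
        elif peak is not None:
            yield peak          # impact just ended; report its peak
            peak = None

# Example: a fake sample stream containing two punches.
stream = [1.0, 1.1, 4.2, 7.8, 6.1, 1.0, 0.9, 3.5, 5.0, 1.2]
print(list(peak_g_per_punch(stream)))   # -> [7.8, 5.0]
```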

The Tiny, Pocket-Sized Robot Meant For Hacking

The world is full of robots for STEAM education, but we haven’t seen one as small or as cute as the Skoobot, an entry in this year’s Hackaday Prize. It’s barely bigger than an inch cubed, but it’s still packed with motors, a battery, sensors, and a microcontroller powerful enough to turn it into a pocket-sized sumo robot.

The hardware inside each Skoobot is small, but powerful. The main microcontroller is a Nordic nRF52832, giving this robot an ARM Cortex-M4F brain and Bluetooth. The sensors include a VL6180X time of flight sensor that has a range of about 100mm. Skoobot also includes a light sensor for all your robotic photovoring needs. Other than that, the Skoobot is just about what you would expect, with a serial port, a buzzer, and some tiny wheels mounted in a plastic frame.

The idea behind the Skoobot is to bring robotics to the classroom, introducing kids to fighting/sumo robots, while still being small, cheap, and cute. To that end, the Skoobot is completely controllable via Bluetooth so anyone with a phone, a Pi, or any other hardware can make this robot move, turn, chase after light, or sync multiple Skoobots together for a choreographed dance.
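
Since the robot speaks standard Bluetooth Low Energy, driving it from a laptop or a Pi can be as simple as writing a byte to a GATT characteristic. Here’s a sketch using the Python bleak library; the address, characteristic UUID, and command byte below are placeholders, not the Skoobot’s documented values:

```python
import asyncio
from bleak import BleakClient

# Placeholder values: the Skoobot's real service/characteristic UUIDs and
# command bytes live in its project documentation, not here.
SKOOBOT_ADDRESS = "AA:BB:CC:DD:EE:FF"
COMMAND_CHAR_UUID = "00001525-1212-efde-1523-785feabcd123"
CMD_FORWARD = bytes([0x01])

async def drive_forward():
    async with BleakClient(SKOOBOT_ADDRESS) as client:
        # Write the "move forward" command to the control characteristic.
        await client.write_gatt_char(COMMAND_CHAR_UUID, CMD_FORWARD)

asyncio.run(drive_forward())
```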

While the Skoobot is an entry for this year’s Hackaday Prize, its creator, [Bill Weiler], is also making these available on Crowd Supply.

Speech Recognition Without A Voice

The biggest change in Human Computer Interaction over the past few years is the rise of voice assistants. The Siris and Alexas are our HAL 9000s, and soon we’ll be using these assistants to open the garage door. They might just do it this time.

What would happen if you could talk to these voice assistants without saying a word? Would that be telepathy? That’s exactly what [Annie Ho] is doing with Cerebro Voice, a project in this year’s Hackaday Prize.

At its core, the idea behind Cerebro Voice is based on subvocal recognition, a technique that detects electrical signals from the vocal cords and other muscles involved in speaking. These electrical signals are collected by surface EMG devices, then sent to a computer for processing and reconstruction into words. It’s a proven technology, and even NASA is calling it ‘synthetic telepathy’.

The team behind this project is just in the early stages of prototyping this device, and so far they’re using EMG hardware and microphones to train a convolutional neural network that will translate electrical signals into a user’s inner monologue. It’s an amazing project, and one of the best we’ve seen in the Human Computer Interface challenge in this year’s Hackaday Prize.
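
For a sense of what that training step looks like, here’s a toy 1-D convolutional classifier in PyTorch that maps a window of multi-channel EMG samples to one of a handful of words. The channel count, window length, and vocabulary size are invented for illustration and aren’t the team’s actual parameters:

```python
import torch
import torch.nn as nn

class SubvocalCNN(nn.Module):
    """Toy 1-D CNN: classifies a window of multi-channel surface-EMG samples
    into one of a small vocabulary of words (all sizes are placeholders)."""
    def __init__(self, channels=8, vocab=20):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(channels, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(64, vocab)

    def forward(self, x):            # x: (batch, channels, samples)
        return self.classifier(self.features(x).squeeze(-1))

model = SubvocalCNN()
window = torch.randn(1, 8, 512)      # one 512-sample window of 8-channel EMG
print(model(window).shape)           # -> torch.Size([1, 20])
```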

Using Motors As Encoders

If you have a brushless motor, you have some magnets, a bunch of coils arranged in a circle, and theoretically, all the parts you need to build a rotary encoder. A lot of people have used brushless or stepper motors as rotary encoders, but they all seem to do it by using the motor as a generator and looking at the phases and voltages. For their Hackaday Prize project, [besenyeim] is doing it differently: they’re using motors as coupled inductors, and it looks like this is a viable way to turn a motor into an encoder.

The experimental setup for this project is a Blue Pill microcontroller board based on the STM32F103. This, combined with a set of half-bridges used to drive the motor, is really all that’s needed to both spin the motor and detect where it is. The circuit works by using six digital outputs to drive the high and low sides of the half-bridges, and three analog inputs as feedback. The resulting waveform graph looks like three weird stairsteps that are out of phase with each other, and with the right processing, that’s enough to detect the position of the motor.
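
As a very rough illustration of the kind of processing involved, three phase-shifted feedback signals can be ranked against each other to pick one of six coarse sectors per electrical revolution, much like decoding Hall sensors. The real project interpolates within the waveforms for far finer resolution than this sketch suggests:

```python
def electrical_sector(a, b, c):
    """Map three phase-shifted feedback readings to one of six coarse sectors
    per electrical revolution by ranking the phases.  Sector numbering here
    is arbitrary; it only shows the idea."""
    order = sorted(range(3), key=lambda i: (a, b, c)[i])
    # (lowest phase, highest phase) uniquely identifies one of six sectors.
    table = {(0, 1): 0, (0, 2): 1, (1, 2): 2, (1, 0): 3, (2, 0): 4, (2, 1): 5}
    return table[(order[0], order[-1])]

print(electrical_sector(0.9, 0.4, 0.1))   # phase A high, C low -> sector 4
```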

Right now, the project is aiming to send a command over serial to a microcontroller and have the motor spin to a specific position. No, it’s not a completely closed-loop control scheme for turning a motor, but it’s actually not that bad. Future work is going to turn these motors into haptic feedback controllers, although we’re sure there are a few Raspberry Pi robots out there that would love odometry in the motor. You can check out a video of this setup in action below.

Continue reading “Using Motors As Encoders”

Video Quick Bit: The Best In Human Computer Interfaces

We’re neck deep in the Hackaday Prize, and we just wrapped up the Human Computer Interface Challenge. This is an incredible contest that pushes past traditional mice and keyboards to find new ways to transfer your desires directly into a computer. Majenta Strongheart is back at it again, giving us a look at some of the coolest Human Computer Interface builds in this year’s Hackaday Prize.

The Hackaday Prize is all about hacking, really, and no project demonstrates this better than [Curt White]’s hacked fitness tracker. This is a tiny, $35 fitness tracker that’s loaded up with Bluetooth and an ECG front end. With a few slight modifications, this cheap bit of consumer electronics can become a prototyping platform for ECG/EMG/EEG projects. Awesome work.

But when it comes to Human Computer Interfaces, what’s really cool is games. Remember the Power Glove? Of course, everyone does. How about the Sega Activator, the first full-body motion controller? Yeah, now we’re getting into the good stuff. [Arcadia Labs] built a Head Tracker for their favorite space flight sims, and the results are remarkable. Take a look at the videos and you can see the promise of this kind of tech.

The biggest advance in Human-Computer Interaction in the last few years is obviously VR. Once the domain of some early-90s not-quite cyberpunk, VR is now showing up in living rooms. The HiveTracker is an ingenious device that reverse engineers the technology behind the Vive Tracker from HTC. This is a tiny little device that allows for sub-millimeter 3D positioning, and also adds a 9DOF IMU to the mix. If you’ve ever wanted to know exactly where you are, this is the project for you.
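
For the curious, the lighthouse math the HiveTracker builds on boils down to timing: the base station fires a sync flash and then sweeps a laser line at a known rotation rate, so the delay between the two encodes an angle. Here’s a back-of-the-envelope sketch using the nominal 60 Hz rotor speed; everything else is simplified:

```python
SWEEP_HZ = 60.0        # nominal lighthouse rotor speed, revolutions per second

def sweep_angle_deg(t_sync, t_hit):
    """Angle of the photodiode within the sweep plane, in degrees, given the
    timestamps (in seconds) of the sync flash and of the laser line hitting
    the sensor.  One full rotor revolution is 360 degrees in 1/60 s."""
    return (t_hit - t_sync) * SWEEP_HZ * 360.0

# A hit 4.63 ms after the sync pulse sits about 100 degrees into the sweep.
print(sweep_angle_deg(0.0, 0.004630))   # ~100.0
```

Two such angles from each base station give a bearing to the sensor, and bearings from two stations are enough to triangulate a position in 3D.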

Right now we’re plowing through the Musical Instrument Challenge where we’re asking you to build something that pushes the boundaries of instrumentation. If you’re building a synth, we want to see it. If you’re making music with vacuum tubes, we want to see it. Got one of those guitars that are like, double guitars? Yes, we want that too. Twenty of the Musical Instrument Challenge submissions will be selected to move on to the finals and win $1000 in the process. The top five entries of the 2018 Hackaday Prize will split $100,000! This is your chance, so enter now!