Bluetooth Dongle Gives Up Its Secrets With Quick Snooping Hack

There’s a lot going on in our wireless world, and the number of packets whizzing back and forth between our devices is staggering. All this information can be a rich vein to mine for IoT hackers, but how do you zero in on the information that matters? That depends, of course, but if your application involves Bluetooth, you might be able to snoop in on the conversation relatively easily.

By way of explanation, we turn to [Mark Hughes] and his Boondock Echo, a device we’ve featured in these pages before. [Mark] needed to know how long the Echo would operate when powered by a battery bank, as well as specifics about the power draw over time. He had one of those Fnirsi USB power meter dongles, the kind that talks to a smartphone app over Bluetooth. To tap into the conversation, he enabled Host Controller Interface (HCI) logging on his phone and let the dongle and the app talk for a bit. The captured log file was then filtered through Wireshark, leaving behind a list of all the Bluetooth packets to and from the dongle’s address.

That’s when the fun began. Using a little wetware pattern recognition, [Mark] was able to figure out the basic structure of each frame. Knowing the voltage range of USB power delivery helped him find the bytes representing voltage and current, which allowed him to throw together a Python program to talk to the dongle in real-time and get the critical numbers.
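For flavor, here’s roughly what that last step looks like in Python. The frame layout below is entirely hypothetical (the real offsets and scale factors have to be worked out from your own capture, just as [Mark] did), but it shows how little code the decoding takes once you know where the bytes live:

```python
import struct

def parse_frame(payload: bytes):
    """Decode voltage and current from a notification payload.

    The offsets and scale factors here are hypothetical stand-ins;
    the real Fnirsi frame layout has to be reverse engineered from
    your own packet capture.
    """
    # Assume bytes 2-3 hold voltage in millivolts and bytes 4-5 hold
    # current in milliamps, both little-endian unsigned shorts.
    voltage_mv, current_ma = struct.unpack_from("<HH", payload, 2)
    return voltage_mv / 1000.0, current_ma / 1000.0

# Example: a fabricated payload carrying 5.043 V and 0.512 A
frame = bytes([0xAA, 0x01]) + struct.pack("<HH", 5043, 512)
print(parse_frame(frame))  # (5.043, 0.512)
```

Spotting which bytes track the known supply voltage as it varies is exactly the "wetware pattern recognition" step; once found, a few lines like these turn the raw notifications into numbers.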

It’s not likely that all BLE-connected devices will be as amenable to reverse engineering as this dongle was, but this is still a great technique to keep in mind. We’ve got a couple of applications for this in mind already, in fact.

Continue reading “Bluetooth Dongle Gives Up Its Secrets With Quick Snooping Hack”

Your ESP32 As A USB Bluetooth Dongle

Using Bluetooth on a desktop computer is now a seamless process: it’s built-in and it just works. Behind that ubiquity is a protocol layer called HCI, or Host Controller Interface, a set of commands that lets a host computer talk to a Bluetooth interface. That interface doesn’t have to be anything special, and [Dakhnod] is here to show us that it can be an ESP32 microcontroller, speaking HCI over its USB interface.
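To get a feel for what travels across that interface, here’s a quick Python sketch that decodes the header of a standard HCI command packet. Per the Bluetooth spec, the 16-bit opcode packs a 6-bit Opcode Group Field (OGF) and a 10-bit Opcode Command Field (OCF):

```python
import struct

def decode_hci_command(packet: bytes):
    """Split an HCI command packet into (OGF, OCF, parameters).

    HCI command packets begin with a 16-bit little-endian opcode:
    the upper 6 bits are the Opcode Group Field, the lower 10 bits
    the Opcode Command Field. A parameter-length byte and the
    parameters themselves follow.
    """
    opcode, plen = struct.unpack_from("<HB", packet, 0)
    ogf = opcode >> 10
    ocf = opcode & 0x3FF
    return ogf, ocf, packet[3:3 + plen]

# HCI_Reset: OGF 0x03 (Controller & Baseband), OCF 0x0003, no params
reset = struct.pack("<HB", (0x03 << 10) | 0x0003, 0)
print(decode_hci_command(reset))  # (3, 3, b'')
```

Commands like this flow from host to controller, with events coming back the other way; the ESP32’s job in this project is to shuttle those packets between USB and its radio.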

The linked repository doesn’t tell us which of the ESP32 variants it works with, but since not all of them have a USB peripheral, we’re guessing it’s one of the newer ones. It works with Linux computers, and we’re told it should work with Windows too if an HCI driver is present. We might ask ourselves why such a project is necessary given the ubiquity of Bluetooth interfaces, but for us it’s provided the impetus to read up on how it all works.

We can’t find anyone else in our archive who’s made a Bluetooth dongle in this way, but we’ve certainly seen sniffing of HCI commands to reverse engineer a speaker’s communications.

A Tongue Operated Human Machine Interface

For interfacing with machines, most of us use our hands and fingers. When you don’t have use of your hands (permanently or temporarily), there are limited alternatives. [Dorothee Clasen] has added one more option, [In]Brace, which is basically a small slide switch that you can operate with your tongue.

[In]Brace consists of a custom-molded retainer for the roof of your mouth, on which a small ball with an embedded magnet slides along a wire track. Above the track is a set of three magnetic sensors that detect the position of the ball. On the prototype, a wire from the three sensors runs out of the corner of the user’s mouth to a wireless microcontroller (which looks to us like an ESP8266) hooked behind the user’s ear. In a final product, it would obviously be preferable if everything were sealed in the retainer. We think there is even more potential if one of the many 3-axis Hall-effect sensors were used, with a small joystick or rolling ball. The device could be used by disabled persons, for physical therapy, or just in cases where a person’s hands are otherwise occupied. [Dorothee] created a simple demonstration where she plays Pong, or Tong in this case, using only the [In]Brace. Hygiene, and making sure it doesn’t somehow become a choking hazard, will be very important if this ever becomes a product, but we think there is real potential.
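The sensing side is simple enough to sketch in a few lines of Python. Everything here is invented for illustration (sensor count, readings, and threshold), but it shows the idea: the ball sits under whichever sensor reports the strongest field:

```python
def ball_position(readings, threshold=100):
    """Return the slide-switch position (an index), or None.

    `readings` is one sample per magnetic sensor along the track.
    The magnet-loaded ball sits under whichever sensor reports the
    strongest field, provided that reading clears a noise threshold.
    The threshold value is an arbitrary placeholder to tune.
    """
    strongest = max(range(len(readings)), key=lambda i: readings[i])
    return strongest if readings[strongest] >= threshold else None

print(ball_position([820, 140, 95]))  # 0 -- ball at the first sensor
print(ball_position([30, 40, 35]))    # None -- ball between positions
```

A 3-axis Hall-effect sensor would let the same argmax-style logic resolve finer positions, or even two-dimensional motion for a joystick.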

[Kristina Panos] did a very interesting deep dive into the tongue as an HMI device a while ago, so this isn’t a new idea, but the actual implementations differ quite a lot. Apparently it’s also possible to use your ear muscles as an interface!

Thanks for the tip [Itay]!

Assistive Technology Switch Is Actuated Using Your Ear Muscles

Assistive technology is extremely fertile ground for hackers to make a difference, because of the unique requirements of each user and the high costs of commercial solutions. [Nick] has been working on Earswitch, an innovative assistive tech switch that can be actuated using voluntary movement of the middle ear muscle.

Most people don’t know they can contract their middle ear muscle, technically called the tensor tympani, but will recognize it as a rumbling sound or muffling effect in your hearing when yawning or tightly closing your eyes. Its function is actually to protect your hearing from loud sounds such as screaming or chewing. [Nick] ran a survey and found that 75% of people can consciously contract the tensor tympani, and 17% can do it in isolation from other movements. Using a cheap USB auroscope (an ear camera like the one [Jenny] reviewed in November), he was able to detect the movement using iSpy, an open-source software package meant for video surveillance. The output from iSpy is used to control Grid3, a commercial assistive technology software package. [Nick] also envisions the technology being used as a control interface for consumer electronics via earphones.
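The detection iSpy performs boils down to frame differencing, which is easy to sketch in Python with NumPy: declare movement when enough pixels change between consecutive camera frames. The thresholds below are placeholders to be tuned against a real feed:

```python
import numpy as np

def movement_detected(prev, curr, pixel_delta=25, min_changed=0.02):
    """Return True when the imaged region moved between two frames.

    `prev` and `curr` are grayscale frames as uint8 arrays. A pixel
    counts as changed when it differs by more than `pixel_delta`;
    movement is declared when more than `min_changed` of all pixels
    changed. Both thresholds are arbitrary values to tune.
    """
    diff = np.abs(prev.astype(np.int16) - curr.astype(np.int16))
    changed = np.count_nonzero(diff > pixel_delta)
    return changed / diff.size > min_changed

still = np.full((48, 64), 120, dtype=np.uint8)
moved = still.copy()
moved[10:20, 10:30] = 200  # simulate the eardrum twitching in frame
print(movement_detected(still, still))  # False
print(movement_detected(still, moved))  # True
```

Turning that boolean into a keypress or switch event is then just a matter of plumbing it into whatever software, like Grid3, is listening.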

With the proof of concept done, [Nick] is looking at ways to make the tech more practical to actually use, possibly with a CMOS camera module inside a standard pair of noise-canceling headphones. Simpler optical sensors, like reflectance or time-of-flight, are also options being investigated. If you have suggestions or a possible use case, drop by the project page.

Assistive tech always makes for interesting hacks. We recently saw a robotic arm that helps people feed themselves, and the 2017 Hackaday Prize has an entire stage that was focused on assistive technology.

Twenty Projects That Just Won The Human Computer Interface Challenge

The greatest hardware competition on the planet is going on right now. The Hackaday Prize is the Oscars of Open Hardware. It’s the Nobel Prize of building a thing. It’s the Fields Medal of firmware development, and simply making it to the finals grants you a knighthood in the upper echelon of hardware developers.

Last week, we wrapped up the fourth challenge in The Hackaday Prize, the Human Computer Interface challenge. Now we’re happy to announce twenty of those projects have been selected to move on to the final round and have been awarded a $1000 cash prize. Congratulations to the winners of the Human Computer Interface Challenge in this year’s Hackaday Prize. Here are the winners, in no particular order:

Human Computer Interface Challenge Hackaday Prize Finalists:

Continue reading “Twenty Projects That Just Won The Human Computer Interface Challenge”

This Is Your Last Chance To Design The Greatest Human Computer Interface

This is your last chance to get your project together for the Human Computer Interface Challenge in this year’s Hackaday Prize. We’re looking for innovative interfaces for humans to talk to machines or machines to talk to humans. These are projects that make technology more intuitive, more fun, and more natural to use. This is your time to shine, and we’re accepting entries in the Human Computer Interface Challenge in this year’s Hackaday Prize until August 27th. This is your last weekend to work on your project, folks.

This is one of the best years of the Hackaday Prize yet, with almost one thousand projects vying for the top prize of $50,000 USD. That doesn’t mean everyone else is going home empty-handed; we’ve already awarded $1000 prizes to twenty projects in each of the first three challenges, and this coming Monday, we’ll be figuring out the winners of the Human Computer Interface challenge. Twenty of those finalists will be awarded $1000 USD and move on to the final round, where they’re up for the Grand Prize.

Don’t miss your last chance to get in on the Human Computer Interface Challenge in this year’s Hackaday Prize. We’re looking for an interface that could be visual, auditory, haptic, olfactory, or something never before imagined. We’re sure we’re going to see an Alexa duct taped to a drone, and that’s awesome. We’re taking all comers. Don’t wait — start your entry now.

Continue reading “This Is Your Last Chance To Design The Greatest Human Computer Interface”

Human-Computer Interface Challenge: Change How We Interact With Computers, Win Prizes

Pay no attention to the man behind the curtain. It’s a quote from the Wizard of Oz but also an interesting way to look at our interactions with electronics. The most natural interactions free us from thinking about the ones and zeros behind them. Your next challenge is to build an innovative interface for humans to talk to machines and machines to talk to humans. This is the Human-Computer Interface Challenge!

The Next Gen of HCI

A Human-Computer Interface (or HCI) is what we use to control computers and what they use to get information to us. HCIs have been evolving since the beginning. The most recent breakthroughs include touchscreens and natural-language voice interaction. But HCI goes beyond the obvious. The Nest thermostat used a novel approach to learning your habits: observing the times and days that people are near it, and when the temperature setting is changed. This sort of behavior feels more like the future than having to program specific times for temperature control adjustments. But of course we need to go much further.
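That learn-by-observation behavior can be sketched as little more than bucketed averaging: log each manual setpoint change against its time slot, then predict from the history. This toy Python class is our own invention for illustration, not how Nest actually does it:

```python
from collections import defaultdict

class ScheduleLearner:
    """Toy Nest-style learner: average observed setpoints per slot."""

    def __init__(self, default=20.0):
        self.default = default
        self.history = defaultdict(list)  # (day, hour) -> setpoints

    def observe(self, day, hour, setpoint):
        """Record a manual temperature change for this time slot."""
        self.history[(day, hour)].append(setpoint)

    def predict(self, day, hour):
        """Suggest a setpoint: the slot's average, else the default."""
        past = self.history.get((day, hour))
        return sum(past) / len(past) if past else self.default

learner = ScheduleLearner()
learner.observe("Mon", 7, 21.0)
learner.observe("Mon", 7, 23.0)
print(learner.predict("Mon", 7))  # 22.0
print(learner.predict("Sat", 7))  # 20.0 -- no history, use default
```

The interface insight is that the user never "programs" anything; the ordinary act of nudging the dial becomes the input.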

You don’t need to start from scratch. There are all kinds of great technologies out there offering APIs that let you harness voice commands, recognize gestures, and build on existing data sets. There are chips that make touch sensing a breeze, and open source software suites that let you get up and running with computer vision. The important thing is the idea: find something that should feel more intuitive, more fun, and more natural.

The Best Interfaces Have Yet to Be Dreamed Up

No HCI is too simple; a subtle cue that makes sure you don’t miss garbage collection day can make your day. Of course no idea is too complex; who among you will work on a well-spoken personal assistant that puts Jarvis to shame? We just saw that computers sound just like people if you only tell them to make random pauses while speaking. There’s a ton of low-hanging fruit in this field waiting to be discovered.
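The random-pause trick is easy to play with using SSML, the markup most speech synthesizers accept. This Python sketch sprinkles <break/> tags between words; the probabilities and pause lengths are arbitrary choices, not values from the research:

```python
import random

def humanize(text, pause_chance=0.3, max_ms=400, seed=None):
    """Insert random SSML <break/> pauses between words.

    Wraps the result in <speak> tags for a TTS engine that accepts
    SSML. The pause probability and duration range are arbitrary
    placeholders to experiment with.
    """
    rng = random.Random(seed)
    out = []
    for word in text.split():
        out.append(word)
        if rng.random() < pause_chance:
            out.append(f'<break time="{rng.randrange(100, max_ms)}ms"/>')
    return "<speak>" + " ".join(out) + "</speak>"

print(humanize("well let me think about that", seed=42))
```

Feed the result to any SSML-capable synthesizer and the hesitations do a surprising amount of work toward sounding human.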

An HCI can be in an unexpected place, or leverage interactions not yet widely used, like olfactory or galvanic responses. A good example of this is the Medium Machine, which is pictured above. It stimulates the muscles in your forearm, causing your finger to press the button. The application is up to you, and we really like that [Peter] mentions the Medium Machine reaches for something that wouldn’t normally come to mind when you think about these interfaces: something that hasn’t been dreamed up yet. Get creative, get silly, have some fun, and show us how technology can be a copilot and not a dimwitted sidekick.

You have until August 27th to put your entry up on Hackaday.io. The top twenty entries will each get $1,000 and go on to the finals where cash prizes of $50,000, $20,000, $15,000, $10,000, and $5,000 await.