A Tongue Operated Human Machine Interface

For interfacing with machines, most of us use our hands and fingers. When you don’t have use of your hands (permanently or temporarily), there are limited alternatives. [Dorothee Clasen] has added one more option, [In]Brace, which is basically a small slide switch that you can operate with your tongue.

[In]Brace consists of a custom-moulded retainer for the roof of your mouth, on which a small ball with an embedded magnet slides along wire tracks. Above the track is a set of three magnetic sensors that can detect the position of the ball. On the prototype, a wire from the three sensors runs out of the corner of the user's mouth to a wireless microcontroller (which looks to us like an ESP8266) hooked behind the user's ear. In a final product, it would obviously be preferable if everything were sealed in the retainer. We think there is even more potential if one of the many 3-axis Hall effect sensors were used, with a small joystick or rolling ball.

The device could be used by disabled persons, for physical therapy, or just for cases where a person's hands are otherwise occupied. [Dorothee] created a simple demonstration where she plays Pong, or Tong in this case, using only the [In]Brace. Hygiene and making sure that it doesn't somehow become a choking hazard will be very important if this ever becomes a product, but we think there is some potential.
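For the curious, decoding which of the three sensors the magnet ball sits under could be as simple as picking the strongest reading. This is only a sketch of the idea; the sensor values and threshold are invented for illustration, not taken from [Dorothee]'s firmware:

```python
# Toy decoder for three magnetic sensors arranged along the slide track.
# The threshold and field values are assumptions for illustration only.

def decode_position(readings, threshold=100):
    """readings: field magnitudes from the three sensors, left to right.
    Returns the index (0-2) of the sensor nearest the magnet ball,
    or None if the field is too weak everywhere (ball mid-travel)."""
    strongest = max(range(len(readings)), key=lambda i: readings[i])
    if readings[strongest] < threshold:
        return None
    return strongest

# Ball sitting under the middle sensor:
print(decode_position([12, 480, 35]))  # -> 1
```

A real implementation would likely add hysteresis so the reported position doesn't chatter as the ball passes between sensors.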

[Kristina Panos] did a very interesting deep dive into the tongue as an HMI device a while ago, so this isn’t a new idea, but the actual implementations differ quite a lot. Apparently it’s also possible to use your ear muscles as an interface!

Thanks for the tip [Itay]!

Assistive Technology Switch Is Actuated Using Your Ear Muscles

Assistive technology is extremely fertile ground for hackers to make a difference, because of the unique requirements of each user and the high costs of commercial solutions. [Nick] has been working on Earswitch, an innovative assistive tech switch that can be actuated using voluntary movement of the middle ear muscle.

Most people don’t know they can contract their middle ear muscle, technically called the tensor tympani, but will recognise it as the rumbling sound or muffling effect on your hearing when yawning or tightly closing your eyes. Its function is actually to protect your hearing from loud sounds like screaming or chewing. [Nick] ran a survey and found that 75% of people can consciously contract the tensor tympani, and 17% can do it in isolation from other movements. Using a cheap USB auriscope (an ear camera like the one [Jenny] reviewed in November), he was able to detect the movement using iSpy, an open source software package meant for video surveillance. The output from iSpy is used to control Grid3, a commercial assistive technology software package. [Nick] also envisions the technology being used as a control interface for consumer electronics via earphones.
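The motion detection that software like iSpy performs boils down to comparing consecutive frames and counting how many pixels changed. Here's a toy sketch of that idea; the frame format and thresholds are our assumptions, not iSpy internals:

```python
# Minimal frame-differencing motion detector, the core trick behind
# camera-based switches. Thresholds here are illustrative assumptions.

def motion_detected(prev_frame, curr_frame, pixel_delta=25, min_changed=0.02):
    """Frames are flat lists of 0-255 grayscale values of equal length.
    Returns True when the fraction of changed pixels exceeds min_changed."""
    changed = sum(
        1 for a, b in zip(prev_frame, curr_frame) if abs(a - b) > pixel_delta
    )
    return changed / len(prev_frame) > min_changed

still = [100] * 100
moved = [100] * 95 + [200] * 5          # 5% of pixels changed
print(motion_detected(still, moved))    # -> True
```

In the Earswitch case, the "motion" is the eardrum visibly twitching in the auriscope's field of view, so even this crude approach can fire reliably.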

With the proof of concept done, [Nick] is looking at ways to make the tech more practical to actually use, possibly with a CMOS camera module inside a standard pair of noise-cancelling headphones. Simpler optical sensors like reflectance or time-of-flight sensors are also being investigated. If you have suggestions or a possible use case, drop by the project page.

Assistive tech always makes for interesting hacks. We recently saw a robotic arm that helps people feed themselves, and the 2017 Hackaday Prize had an entire stage focused on assistive technology.

Twenty Projects That Just Won The Human Computer Interface Challenge

The greatest hardware competition on the planet is going on right now. The Hackaday Prize is the Oscars of Open Hardware. It’s the Nobel Prize of building a thing. It’s the Fields Medal of firmware development, and simply making it to the finals grants you a knighthood in the upper echelon of hardware developers.

Last week, we wrapped up the fourth challenge in The Hackaday Prize, the Human Computer Interface challenge. Now we’re happy to announce twenty of those projects have been selected to move onto the final round and have been awarded a $1000 cash prize. Congratulations to the winners of the Human Computer Interface Challenge in this year’s Hackaday Prize. Here are the winners, in no particular order:

Human Computer Interface Challenge Hackaday Prize Finalists:

Continue reading “Twenty Projects That Just Won The Human Computer Interface Challenge”

This Is Your Last Chance To Design The Greatest Human Computer Interface

This is your last chance to get your project together for the Human Computer Interface Challenge in this year’s Hackaday Prize. We’re looking for innovative interfaces for humans to talk to machines or machines to talk to humans. These are projects that make technology more intuitive, more fun, and a more natural activity. This is your time to shine, and we’re accepting entries in the Human Computer Interface Challenge in this year’s Hackaday Prize until August 27th. This is your last weekend to work on your project, folks.

This is one of the best years of the Hackaday Prize yet, with almost one thousand projects vying for the top prize of $50,000 USD. That doesn’t mean everyone else is going home empty handed; we’ve already awarded $1000 prizes to twenty projects in each of the first three challenges, and this coming Monday, we’ll be figuring out the winners of the Human Computer Interface challenge. Twenty of those finalists will be awarded $1000 USD and move onto the final round, where they’re up for the Grand Prize.

Don’t miss your last chance to get in on the Human Computer Interface Challenge in this year’s Hackaday Prize. We’re looking for an interface that could be visual, auditory, haptic, olfactory, or something never before imagined. We’re sure we’re going to see an Alexa duct taped to a drone, and that’s awesome. We’re taking all comers. Don’t wait — start your entry now.

Continue reading “This Is Your Last Chance To Design The Greatest Human Computer Interface”

Human-Computer Interface Challenge: Change How We Interact With Computers, Win Prizes

Pay no attention to the man behind the curtain. It’s a quote from the Wizard of Oz but also an interesting way to look at our interactions with electronics. The most natural interactions free us from thinking about the ones and zeros behind them. Your next challenge is to build an innovative interface for humans to talk to machines and machines to talk to humans. This is the Human-Computer Interface Challenge!

The Next Gen of HCI

A Human-Computer Interface (or HCI) is what we use to control computers and what they use to get information to us. HCIs have been evolving since the beginning. The most recent breakthroughs include touchscreens and natural-language voice interaction. But HCI goes beyond the obvious. The Nest thermostat used a novel approach to learning your habits by observing the times and days that people are near it, and when the temperature setting is changed. This sort of behavior feels more like the future than having to program specific times for temperature control adjustments. But of course we need to go much further.
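We don't know Nest's actual algorithm, but the general idea of learning setpoints from observed adjustments is easy to sketch: log every manual change keyed by time slot, then suggest a value for that slot later. Everything below, including the averaging policy, is an assumption for illustration:

```python
# Hedged sketch of schedule learning: remember manual setpoint changes
# per (weekday, hour) slot and average them when suggesting a temperature.

from collections import defaultdict

class ScheduleLearner:
    def __init__(self):
        self.observations = defaultdict(list)  # (weekday, hour) -> temps

    def record(self, weekday, hour, temp):
        """Log a manual setpoint change the user made."""
        self.observations[(weekday, hour)].append(temp)

    def suggest(self, weekday, hour, default=20.0):
        """Average of past adjustments for this slot, or a fallback."""
        temps = self.observations.get((weekday, hour))
        return sum(temps) / len(temps) if temps else default

learner = ScheduleLearner()
learner.record("Mon", 7, 21.0)    # user nudges the heat up two mornings
learner.record("Mon", 7, 23.0)
print(learner.suggest("Mon", 7))  # -> 22.0
```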

You don’t need to start from scratch. There are all kinds of great technologies out there offering APIs that let you harness voice commands, recognize gestures, and build on existing data sets. There are chips that make touch sensing a breeze, and open source software suites that let you get up and running with computer vision. The important thing is the idea: find something that should feel more intuitive, more fun, and more natural.

The Best Interfaces Have Yet to Be Dreamed Up

No HCI is too simple; a subtle cue that makes sure you don’t miss garbage collection day can make your day. Of course no idea is too complex; who among you will work on a well-spoken personal assistant that puts Jarvis to shame? We just saw that computers sound just like people if you only tell them to make random pauses while speaking. There’s a ton of low-hanging fruit in this field waiting to be discovered.
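That random-pause trick is simple enough to sketch: sprinkle pause markers between words before handing the text to a speech synthesizer. The SSML-style break tag below is an assumption; any synthesizer-specific pause token would do:

```python
# Insert pause markers at random points in text bound for a TTS engine,
# to make the speech cadence feel more human. The marker format is assumed.

import random

def add_pauses(text, probability=0.2, marker='<break time="300ms"/>', seed=None):
    rng = random.Random(seed)  # seedable for reproducibility
    out = []
    for word in text.split():
        out.append(word)
        if rng.random() < probability:
            out.append(marker)
    return " ".join(out)

print(add_pauses("well that is an interesting question", seed=1))
```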

An HCI can be in an unexpected place, or leverage interactions not yet widely used like olfactory or galvanic responses. A good example of this is the Medium Machine, which is pictured above. It stimulates the muscles in your forearm, causing your finger to press the button. The application is up to you, and we really like that [Peter] mentions the Medium Machine reaches for something that wouldn’t normally come to mind when you think about these interfaces; something that hasn’t been dreamed up yet. Get creative, get silly, have some fun, and show us how technology can be a copilot and not a dimwitted sidekick.

You have until August 27th to put your entry up on Hackaday.io. The top twenty entries will each get $1,000 and go on to the finals where cash prizes of $50,000, $20,000, $15,000, $10,000, and $5,000 await.

3D Printering: Laser Cutting 3D Objects

3D printing can create just about any shape imaginable, but ask anyone who has babysat a printer for several hours, and they’ll tell you 3D printing’s biggest problem: it takes forever to produce a print. The HCI lab at Potsdam University has come up with a solution to this problem using the second most common tool found in a hackerspace. They’re using a laser cutter to speed up part production by a factor of twenty or more.

Instead of printing a 3D file directly, this system, Platener, breaks a model down into its component parts. These parts can then be laser cut out of acrylic or plywood, assembled, and iterated on much more quickly.

You might think laser-cut parts would only be good for flat surfaces, but with techniques like kerf bending, and stacking layer upon layer of material on top of each other, just about anything that can be produced with a 3D printer is also possible with Platener.
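The stacking approach is easy to reason about: the number of sheets is just the part height divided by the material thickness, rounded up. The 3 mm acrylic stock assumed here is our choice for illustration, not Platener's:

```python
# Sketch of the "stacked layers" strategy: approximate a solid's height
# with laser-cut sheets of a given stock thickness (3 mm is assumed).

import math

def layers_needed(part_height_mm, sheet_thickness_mm=3.0):
    """Number of sheets to stack to reach at least the part height."""
    return math.ceil(part_height_mm / sheet_thickness_mm)

print(layers_needed(25))  # 25 mm tall part in 3 mm acrylic -> 9 sheets
```

The time saving follows directly: cutting nine flat outlines takes minutes, while printing the same 25 mm of height can take hours.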

To test their theory that Platener is faster than 3D printing, the team behind Platener downloaded over two thousand objects from Thingiverse. The production time for these objects can be easily calculated for both traditional 3D printing and the Platener system, and it turns out Platener is more than twenty times faster than 3D printing for over thirty percent of the objects.

You can check out the team’s video presentation below, with links to a PDF and slides on the project’s site.

Thanks [Olivier] for the tip.

Continue reading “3D Printering: Laser Cutting 3D Objects”

Tablet Interacts With Magnets, How Does That Work?

Making computers interact with physical objects is a favorite of the HCI gurus out there, but these builds usually take the form of image recognition of barcodes or colors. Of course there are new near field communication builds coming down the pipe, but [Andrea Bianchi] has figured out an easier way to provide a physical bridge between computer and user. He’s using magnets to interact with a tablet, and his idea opens up a lot of ideas for future tangible interfaces.

Many tablets currently on the market have a very high-resolution, low-latency magnetometer meant for geomagnetic field detection. Yes, it’s basically a compass, but Android allows for the detection of magnets and conveniently provides the orientation and magnitude of magnets around the tablet.

[Andrea] came up with a few different interfaces using magnets. The first is just magnets of varying strengths embedded into some polymer clay. When these colorful magnetic cubes are placed on the tablet, [Andrea]’s app is able to differentiate between small, medium, and large magnets.
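Telling the tokens apart could be as simple as thresholding the magnetometer's field magnitude. The thresholds below are invented for illustration, not from [Andrea]'s app; a real version would calibrate against Earth's roughly 25-65 µT background field:

```python
# Classify a magnet token by the magnitude of the three-axis field reading.
# Thresholds (in microtesla) are assumptions for illustration.

import math

def classify_magnet(x, y, z, thresholds=((80, "small"), (200, "medium"))):
    magnitude = math.sqrt(x * x + y * y + z * z)
    for limit, label in thresholds:
        if magnitude < limit:
            return label
    return "large"

print(classify_magnet(30, 40, 0))   # magnitude 50 -> "small"
print(classify_magnet(0, 0, 250))   # -> "large"
```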

There are a few more things [Andrea]’s app can do; by placing two magnets on an ‘arrow’ token, the app can detect the direction in which the arrow is pointing. It’s a very cool project that borders on genius with its simplicity.
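Recovering the arrow's direction is a textbook atan2 job on the in-plane field components. How the magnetometer axes map to the tablet's screen orientation is our assumption here:

```python
# Estimate the heading of an 'arrow' token from the horizontal components
# of the magnetic field reading. Axis mapping to the screen is assumed.

import math

def arrow_heading_degrees(x, y):
    """Heading of the field vector in the tablet's plane, 0-360 degrees."""
    return math.degrees(math.atan2(y, x)) % 360

print(arrow_heading_degrees(0, 1))   # pointing along +y -> 90.0
print(arrow_heading_degrees(-1, 0))  # -> 180.0
```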

You can check out [Andrea]’s demo video after the break.

Continue reading “Tablet Interacts With Magnets, How Does That Work?”