TouchYou: Wearable Touch Sensor And Stimulator

Some of us might never know the touch of another human, but this project in the Hackaday Prize might just be the solution. It’s TouchYou, [Leonardo]’s idea for a wearable device that lets anyone send tactile and multisensory stimulation across the Internet. It’s touching someone over the Internet, and yes, this technology is right here today.

Inside the TouchYou is an Arduino Pro Mini connected to a Bluetooth module. The Arduino reads force and touch sensors and also drives a small vibration motor. With that Bluetooth module, the TouchYou becomes an Internet of Things thing, capable of communicating with other TouchYous across the world. It’s an interconnected, worldwide touching experience, and one of the best examples of Human-Computer Interaction we’ve ever seen.
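For a sense of how little firmware something like this needs, here’s a minimal Arduino sketch in the spirit of TouchYou. To be clear, the pin assignments, the HC-05-style serial Bluetooth link, and the single-digit intensity protocol are all our own assumptions for illustration, not details from [Leonardo]’s build:

```cpp
// Hypothetical TouchYou-style firmware: read a force sensor, send the
// reading over a Bluetooth serial link, and drive a vibration motor with
// whatever intensity arrives from the remote device.
#include <SoftwareSerial.h>

const int FORCE_PIN = A0;          // force-sensitive resistor in a voltage divider
const int MOTOR_PIN = 5;           // PWM-capable pin on a Pro Mini
SoftwareSerial bluetooth(10, 11);  // RX, TX to an HC-05-style module (assumed)

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
  bluetooth.begin(9600);
}

void loop() {
  // Scale the 10-bit ADC reading down to a single digit and send it out.
  int force = analogRead(FORCE_PIN);  // 0..1023
  bluetooth.write('0' + map(force, 0, 1023, 0, 9));

  // Any digit received from the far end sets the local motor intensity.
  if (bluetooth.available()) {
    char c = bluetooth.read();
    if (c >= '0' && c <= '9') {
      analogWrite(MOTOR_PIN, map(c - '0', 0, 9, 0, 255));
    }
  }
  delay(50);  // ~20 updates a second is plenty for haptics
}
```

Pair two of these over a Bluetooth serial bridge and a squeeze on one end becomes a buzz on the other.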

A project like this demands large touch sensors, and if you’re not aware, those aren’t exactly cheap. That’s okay, because [Leonardo] came up with a way to create large, flexible touch sensors on the cheap. The process begins much like making a PCB at home: print the two sides of the design on a laser printer, then wrap the printout around a copper foil and Kapton laminate. From there, it’s just a little bit of etching in ferric chloride and carefully soldering fine wires to the flex PCB connections.

From a great idea to some rather impressive work in building DIY flex PCBs, this is one of the better projects in the Hackaday Prize and was named as a finalist in the Human-Computer Interface Challenge.

A CNC Woodworking Tool That Does The Hard Parts

Drawn along in the wake of the 3D printing/home shop revolution has been the accessibility of traditional subtractive CNC equipment, especially routers and mills. Speaking of, want a desktop mill? Try a Bantam Tools (née Othermachine) Desktop Milling Machine, a Carvey, or a Carbide 3D Nomad. A tiny but many-axis general-purpose mill? Maybe a Pocket NC. A router for the shop? Perhaps a Shapeoko, an X-Carve, a ShopBot, or a… you get the picture. [Rundong]’s MatchSticks device is a CNC tool for the shop, and while it might be classified as a milling machine, it doesn’t quite work the way a more traditional machine tool does. It computer-controls the woodworker, too.

Sample joints the MatchSticks can cut

At a glance, MatchSticks probably looks most similar to a Pocket NC with a big Makita router sticking out the side. There’s an obvious X-axis spoilboard with holes for fixturing material, mounted to a gantry for Z-axis travel. Below the big friendly handle on top is the router, attached to its own Y-axis carriage. The only oddities might be the tablet bolted to the other side and, come to think of it, the surprisingly small size for such an overbuilt machine. What would it be useful for? MatchSticks doesn’t process an entire piece of stock at once (that’s what you’re for, adaptable human woodworker); it’s really a tool for doing the complex part of the job, the joinery, and explaining to the human how to do the rest.

The full MatchSticks creation flow goes like this:

  1. Choose a design to make on the included interface and specify the parameters you want (size, etc.).
  2. The MatchSticks tool suggests what material stock you need, then asks you to cut it to size and prepare it using other tools.
  3. For any parts that require CNC work, the tool guides the user in fixturing the stock to its bed, then does the cutting itself.
  4. Once everything is ready for final assembly, MatchSticks once again provides friendly instructions for where to pound the mallet.

In this way [rundong], [sarah], [jeremy], [ethan], and [eric] were able to build a much smaller machine tool without sacrificing much practical functionality. It’s almost software-like in its focus on a singular purpose. Why reinvent what the table saw can do when the user probably already has access to a table saw that will cut stock better? MatchSticks is an entire machine bent around one goal: making the hard stuff easier.

It’s worth noting that MatchSticks was designed as an exploration into computer/human interaction for the ACM Conference on Human Factors in Computing Systems, so it’s not quite a commercial product yet (we’re eagerly waiting!). For a much more in-depth look at the project, its goals, and its learnings, the full research paper is available here. Their intro video is down after the break.

Thanks [ethan] for the tip!

Continue reading “A CNC Woodworking Tool That Does The Hard Parts”

Human-Computer Interface Challenge: Change How We Interact With Computers, Win Prizes

Pay no attention to the man behind the curtain. It’s a quote from The Wizard of Oz, but it’s also an interesting way to look at our interactions with electronics. The most natural interactions free us from thinking about the ones and zeros behind them. Your next challenge is to build an innovative interface for humans to talk to machines and machines to talk to humans. This is the Human-Computer Interface Challenge!

The Next Gen of HCI

A Human-Computer Interface (or HCI) is what we use to control computers and what they use to get information to us. HCIs have been evolving since the beginning, and the most recent breakthroughs include touchscreens and natural-language voice interaction. But HCI goes beyond the obvious. The Nest thermostat used a novel approach to learning your habits: it observes the times and days that people are near it, and when the temperature setting gets changed. That sort of behavior feels more like the future than having to program specific times for temperature adjustments. But of course we need to go much further.
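To make that concrete, here’s a toy sketch of what schedule learning can look like. It is emphatically not Nest’s actual algorithm, just the general idea: log manual setpoint changes against weekday/hour slots, then replay the average for each slot later:

```cpp
// Toy schedule learner: observe manual setpoint changes, predict future ones.
#include <array>
#include <cstdio>

struct ScheduleLearner {
  // Running sums and counts for each of the 7x24 weekday/hour slots.
  std::array<std::array<double, 24>, 7> sum{};
  std::array<std::array<int, 24>, 7> count{};

  void observe(int weekday, int hour, double setpointC) {
    sum[weekday][hour] += setpointC;
    count[weekday][hour] += 1;
  }

  // Average of what the humans asked for in this slot, else a default.
  double predict(int weekday, int hour, double fallbackC = 18.0) const {
    int n = count[weekday][hour];
    return n ? sum[weekday][hour] / n : fallbackC;
  }
};

int main() {
  ScheduleLearner learner;
  learner.observe(1, 7, 21.0);  // Monday 7am: someone turned the heat up
  learner.observe(1, 7, 21.5);  // ...and did so again the following week
  std::printf("Monday 7am target: %.1f C\n", learner.predict(1, 7));
}
```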

You don’t need to start from scratch. There are all kinds of great technologies out there offering APIs that let you harness voice commands, recognize gestures, and build on existing data sets. There are chips that make touch sensing a breeze, and open source software suites that let you get up and running with computer vision. The important thing is the idea: find something that should feel more intuitive, more fun, and more natural.
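As an example of how far those open source suites get you, here’s roughly what webcam face detection looks like with OpenCV. Consider it a quick sketch rather than production HCI code, and note the cascade file path is an assumption; use whichever cascade your OpenCV install ships with:

```cpp
// Minimal OpenCV face tracking: a webcam-based HCI building block.
#include <opencv2/opencv.hpp>
#include <vector>

int main() {
  cv::VideoCapture cam(0);
  // Path is an assumption; point it at the cascade bundled with OpenCV.
  cv::CascadeClassifier faces("haarcascade_frontalface_default.xml");
  if (!cam.isOpened() || faces.empty()) return 1;

  cv::Mat frame, gray;
  while (cam.read(frame)) {
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
    std::vector<cv::Rect> hits;
    faces.detectMultiScale(gray, hits);
    for (const auto& r : hits) cv::rectangle(frame, r, {0, 255, 0}, 2);
    cv::imshow("faces", frame);
    if (cv::waitKey(1) == 27) break;  // Esc quits
  }
}
```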

The Best Interfaces Have Yet to Be Dreamed Up

No HCI is too simple; a subtle cue that makes sure you don’t miss garbage collection day can make your day. And of course no idea is too complex; who among you will build a well-spoken personal assistant that puts Jarvis to shame? We just saw that computers can sound just like people if you simply have them make random pauses while speaking. There’s a ton of low-hanging fruit in this field waiting to be picked.

An HCI can live in an unexpected place, or leverage interactions not yet widely used, like olfactory or galvanic responses. A good example of this is the Medium Machine, which is pictured above. It stimulates the muscles in your forearm, causing your finger to press the button. The application is up to you, and we really like that [Peter] describes Medium Machine as reaching for something that wouldn’t normally come to mind when you think about these interfaces; something that hasn’t been dreamed up yet. Get creative, get silly, have some fun, and show us how technology can be a copilot and not a dimwitted sidekick.

You have until August 27th to put your entry up on Hackaday.io. The top twenty entries will each get $1,000 and go on to the finals where cash prizes of $50,000, $20,000, $15,000, $10,000, and $5,000 await.

Human-Machine Interface Projects At TEI 2016

For many of us, interacting with computers is as glorious as punching keys and smearing touchscreens with sweaty fingers and really bad posture. That may be functional, but it’s worth reimagining a world where our conversation with technology is far more intuitive, ergonomic, and engaging. Enter TEI, an annual conference devoted to human-computer interaction and a landmark for novel projects that reinvent the conventional ways we engage with our computers. TEI isn’t just another sit-down conference for soaking in a wealth of paper talks. It’s an interactive weekend that combines those talks with a host of workshops given by the speakers themselves.

Last year’s TEI brought us projects like SPATA, digital calipers that sped up our CAD modeling by eliminating the need for a third hand, and TorqueScreen, a force-feedback mechanism for tablets and other handhelds.

Next February’s conference will be no exception when it comes to new ways of interacting with novel technology. To get a sense of what’s to come, here’s a quick peek back at last year’s projects:

Continue reading “Human-Machine Interface Projects At TEI 2016”

HuddleLamp Turns Multiple Tablets Into Single Desktop

Imagine you’ve got a bunch of people sitting around a table with their various mobile display devices, and you want these devices to act together. Maybe you’d like them to be peepholes into a single larger display, revealing different sections of the display as you move them around the table. Or maybe you want to be able to drag and drop across these devices with finger gestures. HuddleLamp lets you do all this.

How does it work? Basically, a 3D camera sits above the tabletop, and watches for your mobile displays and your hands. Through the magic of machine vision, a server sends the right images to each screen in the group. (The “lamp” in HuddleLamp is a table lamp arranged above the space with a 3D camera built into it.)

A really nice touch is that the authors also provide JavaScript objects that you can embed into web apps to enable devices to join the group without downloading special software. A new device will flash an identifying pattern that the computer vision routine will recognize. Once that’s done, the server starts sending the correct parts of the overall display to the new device.
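That pairing trick is fun enough to sketch out. One simple way to spot a newly flashing screen, offered here as an illustration rather than the authors’ actual implementation, is frame differencing: any large region that changes sharply between consecutive camera frames is a candidate device:

```cpp
// Spot a flashing screen on the tabletop by differencing consecutive frames.
#include <opencv2/opencv.hpp>
#include <vector>

int main() {
  cv::VideoCapture cam(0);  // stand-in for the lamp's overhead camera
  cv::Mat frame, gray, prev, diff, mask;

  while (cam.read(frame)) {
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
    if (!prev.empty()) {
      cv::absdiff(gray, prev, diff);  // what changed since the last frame?
      cv::threshold(diff, mask, 60, 255, cv::THRESH_BINARY);

      std::vector<std::vector<cv::Point>> contours;
      cv::findContours(mask, contours, cv::RETR_EXTERNAL,
                       cv::CHAIN_APPROX_SIMPLE);
      for (const auto& c : contours) {
        cv::Rect r = cv::boundingRect(c);
        if (r.area() > 5000)  // large blinking region: candidate screen
          cv::rectangle(frame, r, {0, 0, 255}, 2);
      }
    }
    gray.copyTo(prev);
    cv::imshow("join detection", frame);
    if (cv::waitKey(1) == 27) break;  // Esc quits
  }
}
```

A real system would then watch that region over several frames to decode the identifying blink pattern; this sketch only finds where to look.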

The video, below the break, demonstrates the possible interactions.

Continue reading “HuddleLamp Turns Multiple Tablets Into Single Desktop”

Pen-Based Input Improvements

Video: http://www.youtube.com/watch?v=EcE3XBytN-U

Lately we’ve been focusing on multitouch technologies, but that doesn’t mean there isn’t interesting research going on in other areas of human-computer interaction. [Johnny Lee] posted a roundup of some of the work that [Gonzalo Ramos] and others have done with pen-based input. The video embedded above shows how pressure can be used to increase control precision. Have a look at his post to see how pen gestures can be used for seamless workspace sharing, and how pen rolling can give additional control.