Chorded Keyboard For Touchscreens

For over a hundred years, good typists haven’t ‘hunted and pecked’ but have instead kept their fingers on the home row. That technique depends on physical keys, though, and on the on-screen keyboards of tablets and other touchscreen devices, touch typists have a very hard time. [Zach] is working on a new project called ASETNIOP that brings a chorded keyboard to these devices.

Instead of training a typist where to place their fingers (the technique used in most other keyboard replacements), ASETNIOP trains the typist which fingers to press. For example, typing ‘H’ requires the typist to press the index and middle fingers of their right hand against the touchscreen. In addition to touchscreens, ASETNIOP can be used with projection systems, Nintendo Power Glove replicas, and extremely large touchpads, including repurposed Nooks and Kindles.
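
To give a feel for how a chorded layout turns finger combinations into characters, here is a minimal C++ sketch of the lookup. Only the ‘H’ chord (right index plus middle finger) comes from the write-up above; the bit assignments and the single-finger letters are guesses based on the name ASETNIOP, not the project’s actual layout.

```cpp
#include <cstdint>
#include <iostream>

// One bit per home-position finger; a chord is the OR of whichever bits are
// down when the hand releases. The bit order here is an arbitrary choice.
enum Finger : uint8_t {
    L_PINKY = 1 << 0, L_RING = 1 << 1, L_MIDDLE = 1 << 2, L_INDEX = 1 << 3,
    R_INDEX = 1 << 4, R_MIDDLE = 1 << 5, R_RING = 1 << 6, R_PINKY = 1 << 7,
};

// Tiny illustrative chord table. Only the 'h' entry is taken from the post;
// the single-finger letters are guessed from the name A-S-E-T-N-I-O-P.
char chordToChar(uint8_t chord) {
    switch (chord) {
        case R_INDEX | R_MIDDLE: return 'h';   // right index + middle finger
        case L_PINKY:            return 'a';
        case L_RING:             return 's';
        case L_MIDDLE:           return 'e';
        case L_INDEX:            return 't';
        case R_INDEX:            return 'n';
        case R_MIDDLE:           return 'i';
        case R_RING:             return 'o';
        case R_PINKY:            return 'p';
        default:                 return '?';   // unmapped chord
    }
}

int main() {
    std::cout << chordToChar(R_INDEX | R_MIDDLE) << '\n';   // prints 'h'
}
```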

If you’d like to try out ASETNIOP, there’s a tutorial that lets you test it on a physical keyboard, as well as one for the iPad. It feels a little weird at first, but it’s surely no more difficult to learn than a Dvorak layout.

A Tale Of (un)bricking A $10k Microsoft Surface Unit

We’ve all had that sinking feeling as a piece of hardware stops responding and the nasty thought of “did I just brick this thing?” rockets to the front of our minds. [Florian Echtler] recently experienced this in extremis as his hacking on the University of Munich’s Microsoft Surface 2.0 left it unresponsive. He says this is an 8,000 Euro piece of hardware, which translates to around $10,000! Obviously it was his top priority to get the thing working again.

So what’s the first thing you should do if you get your hands on a piece of hardware like this? Try to run Linux on it, of course. [Florian] managed to make that happen pretty easily (there’s a quick proof-of-concept video after the break). He took a Linux kernel driver written for a different purpose and altered it to interface with the MS Surface. After working out a few error messages he packaged it up and called it good. Some time later the department called him and asked if his Linux kernel work might have anything to do with the display being dead. Yikes.

He dug into the driver and found that a bug may have caused the firmware on the USB interface chip to be overwritten. The big problem is that the firmware image for this chip isn’t something you can simply download. So he ended up dumping what was left from the EEPROM and rebuilding the header byte by byte.

Continue reading “A Tale Of (un)bricking A $10k Microsoft Surface Unit”

Update: Using Your Forearms As A UI

This image should look familiar to regular readers. It’s a concept [Chris Harrison] has been working on for a while, and this hardware upgrade uses equipment with which we’re all familiar.

The newest rendition, named Omnitouch, uses a shoulder-mounted system for both input and output. The functionality is the same as his Skinput project, but it’s achieved in a different way: Skinput used an arm cuff to electrically sense when and where you were touching your arm or hand, while this version uses a depth camera to do the sensing. In both cases, a pico projector provides the interactive feedback.
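
The basic question a depth camera has to answer here is whether a fingertip is resting on the surface behind it or merely hovering. Below is a minimal sketch of that test, assuming some finger-tracking step has already produced a fingertip pixel and the depth frame is in millimeters; the sampling pattern and thresholds are illustrative guesses, not values from the Omnitouch work.

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// One depth frame, row-major, values in millimetres (0 = no reading).
struct DepthFrame {
    int width, height;
    std::vector<uint16_t> mm;
    uint16_t at(int x, int y) const { return mm[y * width + x]; }
};

// Estimate the surface depth from a ring of pixels just outside the finger,
// then call it a touch when the fingertip sits within a small band above
// that surface. Both the ring radius and the band are placeholder values.
bool isTouching(const DepthFrame& f, int fx, int fy) {
    const int ring = 12;          // pixels away from the fingertip to sample
    const int touchBandMm = 10;   // fingertip within ~1 cm of the surface
    long sum = 0; int n = 0;
    for (int dx = -ring; dx <= ring; dx += ring)
        for (int dy = -ring; dy <= ring; dy += ring) {
            if (dx == 0 && dy == 0) continue;
            int x = fx + dx, y = fy + dy;
            if (x < 0 || y < 0 || x >= f.width || y >= f.height) continue;
            uint16_t d = f.at(x, y);
            if (d) { sum += d; ++n; }
        }
    if (n == 0) return false;
    int surface = int(sum / n);
    int finger = f.at(fx, fy);
    return finger && surface - finger >= 0 && surface - finger <= touchBandMm;
}

int main() {
    DepthFrame f{320, 240, std::vector<uint16_t>(320 * 240, 900)};  // flat surface 90 cm away
    f.mm[120 * 320 + 160] = 895;                                    // fingertip 5 mm above it
    std::printf("touching: %s\n", isTouching(f, 160, 120) ? "yes" : "no");
}
```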

There are a couple of really neat things about this upgrade. First, it has pretty accurate multitouch capability. Second, it works on more surfaces than just your arm. In fact, it can track moving surfaces and adjust accordingly, which is shown in the clip after the break when a printed document is edited in real time. Pretty neat stuff!

Continue reading “Update: Using Your Forearms As A UI”

Replicating The Fancy Touch Sensor That Uses Anything

[Sprite_tm], a name many of you will recognize from these pages, has wasted no time in replicating the latest cool thing in a much simpler fashion. En Garde is a touch sensor that can detect up to 32 different points of contact on… whatever you use as the surface. He couldn’t sit idly by and let the Disney-funded one from yesterday keep the spotlight. As you can see in the video, it works pretty well; if he didn’t tell you that his can only detect up to 32 points as opposed to the 200 of the other, you probably wouldn’t even notice the difference. Of course, [Sprite_tm] also shares how you could easily beef his up to be even more precise. You can also download his source code and schematics from his site and give it a try yourself.

Continue reading “Replicating The Fancy Touch Sensor That Uses Anything”

Turning Anything Into A Touch Sensor

This year at the CHI conference in Austin, [Munehiko Sato], [Ivan Poupyrev], and [Chris Harrison] out of the Disney Research lab in Pittsburgh demonstrated their way to make touch sensors out of anything. Not only do they suggest using the surface of your skin to control cell phones and MP3 players, they’re also able to recognize touch gestures, like poking or grasping an object. That sounds a little heady, so check out the video of the Touché tech in action.

Like the capacitive touch sensors in our phones and tablets, Touché measures the rise and fall of a capacitor’s charge over time. Unlike other touch sensors, though, Touché scans the capacitor at different rates, building a ‘capacitive profile’ that is used to recognize touch gestures.
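
Here is a rough Arduino-style sketch of that swept-frequency idea, assuming the electrode is excited from one digital pin (through a resistor) and read back on an analog pin. The pin choices, the sweep, and the crude peak measurement are placeholders rather than anything from Touché or [Sprite_tm]’s code; the point is just to show how a ‘capacitive profile’ gets built.

```cpp
// Swept-frequency sensing, in spirit: excite the electrode at a range of
// frequencies and record the response at each one. The resulting vector
// (the "capacitive profile") changes shape depending on how the object is
// touched, so a classifier can tell a poke from a grasp.
const int drivePin = 9;    // drives the electrode through a resistor (placeholder wiring)
const int sensePin = A0;   // reads the electrode voltage back
const int numSteps = 16;   // how many frequencies to sweep

int profile[numSteps];

void setup() {
  pinMode(drivePin, OUTPUT);
  Serial.begin(115200);
}

void loop() {
  for (int i = 0; i < numSteps; i++) {
    int halfPeriodUs = 200 - i * 10;   // square wave gets faster each step
    int peak = 0;
    for (int cycle = 0; cycle < 50; cycle++) {
      digitalWrite(drivePin, HIGH);
      delayMicroseconds(halfPeriodUs);
      int v = analogRead(sensePin);    // crude amplitude estimate; analogRead
      if (v > peak) peak = v;          // is slow, so this is only approximate
      digitalWrite(drivePin, LOW);
      delayMicroseconds(halfPeriodUs);
    }
    profile[i] = peak;
    Serial.print(peak);
    Serial.print(i == numSteps - 1 ? '\n' : '\t');
  }
  // A real build would compare this profile against stored gesture templates
  // (nearest neighbour or similar) instead of just printing it.
}
```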

The applications for this tech are nearly innumerable; the team demonstrated scolding someone for eating cereal with chopsticks (yeah, we know…), an on-body music player interface, and gestures for an office doorknob that notifies passersby if you’ve stepped out for a minute or are gone for the day.

It’s a very interesting build, and we give it two weeks until someone replicates it. We’ll be sure to post it then.

Continue reading “Turning Anything Into A Touch Sensor”

Reaching Out To A Touch Screen With A Microcontroller

We love capacitive touch screens. They’re much more robust than resistive touch screens and if the UI is programmed well they produce a great user experience. But getting your electronics project to interact with one is a bit tough. [RobB] has been experimenting in that area, and managed to build a simple touchscreen actuator for microcontroller use.

In the video after the break you can see his proof of concept. He’s using an Arduino to enter the number 2 on a calculator app once every second. It doesn’t take much to pull off this trick: [RobB] just taped a piece of tin foil to the screen and connected it to the Arduino with a jumper wire. The pin is left floating until a screen tap is needed, at which point it is pulled to ground.

A custom app operating at slow speeds could use this as an input technique. Two pieces of foil (one acting as clock, the other data) would provide a rudimentary serial transfer system.
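
If you want to try the trick yourself, a minimal Arduino sketch looks something like the following; the pin number is arbitrary, and depending on the screen you may need a larger patch of foil (or a shared ground with the device) before a tap registers.

```cpp
// Fake a capacitive "tap" with a piece of foil taped over the target button.
// The foil floats (pin set to INPUT, high impedance) while idle; pulling it
// to ground looks enough like a finger that the screen registers a touch.
const int foilPin = 2;   // jumper from this pin to the foil (arbitrary pin choice)

void tap() {
  pinMode(foilPin, OUTPUT);
  digitalWrite(foilPin, LOW);   // ground the foil: touch down
  delay(50);                    // hold long enough for the screen to register it
  pinMode(foilPin, INPUT);      // float the foil again: touch released
}

void setup() {
  pinMode(foilPin, INPUT);      // start floating so nothing is "pressed"
}

void loop() {
  tap();          // press the on-screen key under the foil
  delay(1000);    // once a second, like the calculator demo
}
```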

Continue reading “Reaching Out To A Touch Screen With A Microcontroller”

Multitouch Table Uses A Kinect For A 3D Display

[Bastian] sent in a coffee table he built. This isn’t a place to set your drinks and copies of Make, though: it’s a multitouch table with a 3D display. Since no description can do this table justice, take a look at the video.

The build was inspired by the subject of this Hackaday post, where [programming4fun] was able to build a ‘holographic display’ using a regular 2D projector and a Kinect. Both builds work on the principle of redrawing the 3D space in relation to the user’s head: as [Bastian] moves his head around the coffee table, the Kinect tracks his location and moves the three-dimensional grid of boxes in the opposite direction. It’s extremely clever, and looks to be a promising user interface.
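
In code, that head-coupled rendering boils down to shifting the drawn scene opposite to the tracked head position every frame. Here is a bare-bones sketch of the mapping, with a made-up scale factor and none of the Kinect plumbing; [Bastian]’s build may well use a proper off-axis projection instead.

```cpp
#include <cstdio>

struct Vec3 { float x, y, z; };

// Translate the rendered grid of boxes opposite to the tracked head position
// (as described above), so the scene appears fixed in space behind the table
// top. The scale factor is a guess chosen to exaggerate the parallax a bit.
Vec3 sceneOffsetFromHead(const Vec3& headMetres) {
    const float scale = 1.5f;   // arbitrary gain on head movement
    return { -headMetres.x * scale, -headMetres.y * scale, 0.0f };
}

int main() {
    Vec3 head = { 0.10f, -0.05f, 0.80f };   // example: head 10 cm right of centre, 80 cm away
    Vec3 offset = sceneOffsetFromHead(head);
    std::printf("shift scene by (%.2f, %.2f, %.2f)\n", offset.x, offset.y, offset.z);
    // A full renderer would apply this as a translation (and ideally an
    // off-axis projection) before drawing each frame.
}
```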

In addition to a Kinect, the coffee table uses a Microsoft Surface-like display: four infrared lasers are placed at the corners and detected by a camera next to the projector in the base.
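
On the sensing side, the camera’s job is to find the bright spots where fingers scatter the laser light plane and turn each one into a touch coordinate. Here is a small, library-free sketch of that step on a grayscale frame; the threshold and minimum blob size are guesses, and the calibration from camera pixels to projected screen coordinates is left out.

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// Grayscale camera frame, row-major, 0..255.
struct Frame { int w, h; std::vector<uint8_t> px; };
struct Blob  { float cx, cy; int area; };

// Threshold the bright spots, flood-fill each one, and report its centroid.
// A real rig would also subtract a background frame and ignore reflections.
std::vector<Blob> findTouches(const Frame& f, uint8_t thresh = 200, int minArea = 20) {
    std::vector<uint8_t> seen(f.w * f.h, 0);
    std::vector<Blob> blobs;
    for (int y = 0; y < f.h; ++y)
        for (int x = 0; x < f.w; ++x) {
            int start = y * f.w + x;
            if (seen[start] || f.px[start] < thresh) continue;
            long sx = 0, sy = 0; int area = 0;
            std::vector<int> stack{start};
            seen[start] = 1;
            while (!stack.empty()) {
                int i = stack.back(); stack.pop_back();
                int ix = i % f.w, iy = i / f.w;
                sx += ix; sy += iy; ++area;
                const int dx[] = {1, -1, 0, 0}, dy[] = {0, 0, 1, -1};
                for (int k = 0; k < 4; ++k) {
                    int nx = ix + dx[k], ny = iy + dy[k];
                    if (nx < 0 || ny < 0 || nx >= f.w || ny >= f.h) continue;
                    int j = ny * f.w + nx;
                    if (!seen[j] && f.px[j] >= thresh) { seen[j] = 1; stack.push_back(j); }
                }
            }
            if (area >= minArea)
                blobs.push_back({float(sx) / area, float(sy) / area, area});
        }
    return blobs;
}

int main() {
    Frame f{64, 48, std::vector<uint8_t>(64 * 48, 0)};
    for (int y = 20; y < 26; ++y)            // paint one fake finger spot
        for (int x = 30; x < 36; ++x) f.px[y * 64 + x] = 255;
    for (const Blob& b : findTouches(f))
        std::printf("touch at (%.1f, %.1f), area %d\n", b.cx, b.cy, b.area);
}
```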

After the break you can see the demo video and a gallery of the images [Bastian] put up on the NUI Group forum.

Continue reading “Multitouch Table Uses A Kinect For A 3D Display”