Cloud Mirror Adds Internet To Your Morning Ritual

This mirror has a large monitor behind it which can be operated using hand gestures. It’s the result of a team effort from [Daniel Burnham], [Anuj Patel], and [Sam Bell] to build a web-enabled mirror for their ECE 4180 class at the Georgia Institute of Technology.

So far they’ve implemented four widgets for the system. You can see the icons that activate each one in the column to the right of the mirror. From top to bottom they are Calendar, News, Traffic, and Weather. The video after the break shows the gestures used to control the display. First, select a widget by holding your hand over the appropriate icon. Next, bring that widget to the main display area by swiping from right to left along the top of the mirror.

Hardware details are shared more freely in their presentation slides (PDF). A sonar distance sensor activates the device when a user is close enough to the screen. Seven IR reflectance sensors detect a hand placed in front of them. We like this input method, as it keeps the ‘display’ area fingerprint-free. But we wonder if the IR sensors could be placed behind the glass instead of beside it.
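For the curious, the sensing chain boils down to a little state logic: a sonar reading wakes the UI, a dwell on one of the seven IR sensors selects a widget, and a right-to-left sweep registers as a swipe. Here's a minimal Python sketch of how that might work; the thresholds, sample counts, and function names are our own guesses, not the team's actual firmware:

```python
# Hypothetical sketch of the Cloud Mirror's input chain. All names and
# thresholds here are assumptions, not the team's code.

WAKE_DISTANCE_CM = 90   # assumed sonar threshold for "user is close enough"
DWELL_SAMPLES = 5       # consecutive reads needed to count as a selection

def is_awake(sonar_cm):
    """Wake the display when the sonar sees someone nearby."""
    return sonar_cm < WAKE_DISTANCE_CM

def select_widget(ir_history):
    """ir_history: list of per-sample tuples of 7 booleans (sensor covered).
    A widget is selected when its sensor stays covered for a full dwell."""
    if len(ir_history) < DWELL_SAMPLES:
        return None
    recent = ir_history[-DWELL_SAMPLES:]
    for idx in range(7):
        if all(sample[idx] for sample in recent):
            return idx          # sensor idx held long enough -> widget idx
    return None

def is_right_to_left_swipe(trigger_order):
    """trigger_order: sensor indices in the order they fired (left = 0).
    A strictly decreasing sequence means the hand moved right to left."""
    return (len(trigger_order) >= 3 and
            all(a > b for a, b in zip(trigger_order, trigger_order[1:])))
```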

Continue reading “Cloud Mirror Adds Internet To Your Morning Ritual”

Multitouch Tower Defense Uses Physical Towers

If you’re tired of playing flash games with a mouse, perhaps you’ll draw inspiration from this project. [Arthur] built a multitouch interface that uses objects as part of the control scheme. In the image above you can see that the game board for a tower defense game is shown on the display. There is a frustum-shaped game piece resting on the surface. Just place that piece where you want to build your next tower, and then select the tower type from the list.

The controller itself is pretty straightforward. The surface is a piece of acrylic topped with some light-diffusing material. A projector shines through another acrylic window on the side of the unit, reflecting off a mirror positioned at a 45-degree angle. As for the multitouch detection, the hardware uses a series of IR LEDs along with a modified PS3 Eye camera. [Arthur] chose the reacTIVision software package to process the input from the camera. Check out a couple of videos after the break to see the hardware and some gameplay.
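reacTIVision reports each tracked fiducial marker as an ID plus a normalized (0..1) position, so placing a tower mostly reduces to snapping that position onto a grid cell. Here's a rough Python sketch of the idea; the grid size and function names are assumptions for illustration, not [Arthur]'s actual code:

```python
# Sketch of snapping a tracked fiducial to a tower-defense grid cell.
# reacTIVision reports marker positions as normalized (0..1) TUIO
# coordinates; the board dimensions below are invented.

GRID_COLS, GRID_ROWS = 16, 12   # assumed board dimensions

def fiducial_to_cell(x_norm, y_norm):
    """Map a normalized TUIO position onto a grid cell (col, row)."""
    col = min(int(x_norm * GRID_COLS), GRID_COLS - 1)
    row = min(int(y_norm * GRID_ROWS), GRID_ROWS - 1)
    return col, row

def place_tower(board, x_norm, y_norm, tower_type):
    """Build a tower in the cell under the physical game piece, if free."""
    col, row = fiducial_to_cell(x_norm, y_norm)
    if board.get((col, row)) is None:
        board[(col, row)] = tower_type
        return True
    return False
```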

Continue reading “Multitouch Tower Defense Uses Physical Towers”

Adding Extra Buttons To A Cintiq Drawing Pad


[David Revoy] recently picked up a brand new Cintiq 21UX, and while he liked the drawing pad overall, he was less than impressed with the tablet’s buttons. He says that most 2D Linux apps require a good bit of keyboard interaction, and the built-in buttons just were not cutting it.

After seeing a fellow artist use a joypad to augment his tablet, [David] thought that he might be able to do something similar, but he wanted to add a lot more buttons. He dug out an old Logitech game pad that was collecting dust, and disassembled it, rearranging some buttons in the process. Once he was happy with the layout, he built a cardboard enclosure for the PCB and hooked it up to the Wacom via USB.

He spent a few minutes mapping buttons to key presses using Qjoypad, and was up and running with an additional 14 buttons in short order. He says that the extra buttons make his job a ton easier, and add a little bit of comfort to his long drawing sessions. We like the fact that it is a non-permanent fixture, and that he was able to repurpose an old game pad in the process.
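Under the hood, a mapper like Qjoypad just turns joypad button events into synthetic key presses. Here's a toy Python sketch of that idea, with an invented mapping; [David]'s actual Qjoypad profile isn't published in the post:

```python
# Minimal sketch of joypad-to-keyboard mapping, the job Qjoypad does.
# The button-to-chord table below is invented for illustration.

BUTTON_MAP = {
    0: "ctrl+z",   # undo
    1: "ctrl+y",   # redo
    2: "e",        # eraser
    3: "b",        # brush
}

def translate(button_events):
    """Turn a stream of (button, pressed) events into key chords,
    ignoring releases and unmapped buttons."""
    return [BUTTON_MAP[btn] for btn, pressed in button_events
            if pressed and btn in BUTTON_MAP]
```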

Check out the video below for a quick demonstration of his drawing pad hack.

[via Adafruit blog]

Continue reading “Adding Extra Buttons To A Cintiq Drawing Pad”

Extending The Usability Of Touchscreens

Fanboys may be in shock from seeing duct tape applied to the screen of an iPad, but we can assure you it’s in the name of science. [Michael Knuepfel] is working on his thesis for the ITP graduate program at the Tisch School of the Arts. He managed to augment the usability of touchscreen devices by adding hardware to them.

What he’s come up with are devices for both input and output. The output devices generally rely on light: the color displayed on a region of the screen is picked up by a light sensor. The input devices use conductive material to complete a path between your hand and the screen, which lets the capacitive sensing screen detect the presence of your hand through the conductor. Some of his example devices include gaming controller overlays, encoder rings, and multiple stylus designs.
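That output-side trick is essentially a one-pixel optical data link: flash a region of the screen and sample it with the light sensor. Here's a hedged Python sketch of the receiving end, with made-up thresholds and framing, just to show the principle:

```python
# Decoding a screen-to-light-sensor link: one brightness sample per
# displayed frame becomes one bit. Threshold and framing are assumptions.

THRESHOLD = 128   # assumed cutoff between a "dark" and "lit" screen region

def samples_to_bits(samples):
    """samples: per-frame brightness readings (0-255) from the sensor."""
    return [1 if s >= THRESHOLD else 0 for s in samples]

def bits_to_byte(bits):
    """Pack eight bits, most significant first, into one byte."""
    value = 0
    for b in bits[:8]:
        value = (value << 1) | b
    return value
```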

After the break we’ve embedded [Michael’s] teaser trailer, which jumps through several demonstrations. It’s plenty to get your mind rolling, but if you want to know more you should watch his thesis presentation. It’s available as an MP4 download on this page; just search for his name, [Michael Knuepfel], for the proper link.

Continue reading “Extending The Usability Of Touchscreens”

Touch-based Synthesizer Is A Wiring Nightmare

[Jane] wrote in to let us know about the touch-based synthesizer she and her classmates just built. They call it the ToneMatrix Touch, as it was inspired by a flash application called ToneMatrix. We’re familiar with that application as it’s been the inspiration for other physical builds as well.

A resistive touch screen on the surface glass of the device lets you interact by tapping the cells you wish to turn on or off. Below the glass is a grid of LEDs which represent the sound bits in the looping synthesizer track. Fifteen shift registers drive the LED matrix, with the entire system controlled by an ATmega644 microcontroller. Although the control scheme is very straightforward, the jumper wires used to connect the matrix to the shift registers make for a rat’s nest of wiring that has been hidden away inside the case. Check out the demonstration video after the break to see what this looks and sounds like in use.
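At its core, a ToneMatrix-style sequencer is just a 2D grid scanned one column per tick, with every lit cell in the current column sounding a note. Here's a quick Python sketch of that loop; the pentatonic scale and grid dimensions are assumptions based on the original flash app, not the team's AVR firmware:

```python
# Step-sequencer core: scan the grid one column per tick and sound
# every lit cell in that column. Scale values are approximate and
# invented for illustration.

PENTATONIC_HZ = [220.0, 247.5, 277.2, 330.0, 371.25]  # rough A pentatonic

def column_notes(grid, step):
    """grid[row][col] is True when a cell is on; return the pitches of
    the active rows in the current column."""
    return [PENTATONIC_HZ[row % len(PENTATONIC_HZ)]
            for row in range(len(grid)) if grid[row][step]]

def tick(grid, step):
    """Advance the loop one column, wrapping at the end of the pattern."""
    return (step + 1) % len(grid[0])
```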

Continue reading “Touch-based Synthesizer Is A Wiring Nightmare”

Interactive Sun Exhibit Uses 3D Projection Screen And Kinect

A few common components come together to make this interactive museum exhibit that teaches about the sun (translated). It uses three main physical components to pull this off. The first is a custom projection surface: a hemisphere representing the sun with a slice cut out of it, presumably coated with the paint you’d use to turn a wall into a projection surface. Software warps the projected image so it maps correctly onto the topographic surface, resulting in what you see above, with a Kinect for user input.

Take a look at the video embedded after the break to see how the exhibit works. It instructs patrons to stand on a pair of footprint markers on the floor. This positions them at the proper range from a Kinect depth camera, which translates their outline into cursor commands. By moving a hand around they can explore the different parts of the sun.
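Turning a tracked hand into a cursor is mostly a matter of scaling and clamping coordinates. Here's a hedged Python sketch of the mapping; the interaction-box bounds and screen size are invented for illustration, since the exhibit's actual calibration isn't documented:

```python
# Map a depth-camera hand position into on-screen pixel coordinates.
# The "interaction box" bounds and screen size below are assumptions.

HAND_X_RANGE = (-0.4, 0.4)   # metres, left/right of the sensor axis
HAND_Y_RANGE = (-0.3, 0.3)   # metres, below/above the sensor axis
SCREEN_W, SCREEN_H = 1280, 800

def hand_to_cursor(hand_x, hand_y):
    """Scale and clamp a hand position into pixel coordinates."""
    def scale(v, lo, hi, out_max):
        t = (v - lo) / (hi - lo)
        return int(max(0.0, min(1.0, t)) * (out_max - 1))
    # Screen y grows downward, so flip the vertical axis.
    return (scale(hand_x, *HAND_X_RANGE, SCREEN_W),
            (SCREEN_H - 1) - scale(hand_y, *HAND_Y_RANGE, SCREEN_H))
```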

We’re in love with how easy this type of interaction is becoming. Granted, there’s a fair amount of work that goes into the coding for the project, but the physical build is quick and relatively inexpensive.

Continue reading “Interactive Sun Exhibit Uses 3D Projection Screen And Kinect”

Zork On The Microtouch

[Rossum] just finished porting Zork over to the Microtouch. This hardware, which he originally designed, is now available for purchase through Adafruit. It’s a tiny 320×240 TFT touchscreen driven by an AVR ATmega32u4 microcontroller. The device draws power from a lithium battery, and also boasts a USB connection and a MicroSD slot.

The hack here is getting Zork to run with the limited resources available on the device. Zork is compiled for the Z-machine virtual machine, so [Rossum] needed an interpreter for it, but he didn’t want to use extra hardware the way [Sprite_TM] did when he emulated a Z80 using an AVR. Instead, this is based on a stripped-down implementation of Frotz, a well-known Z-machine interpreter. The final code is too big to fit on the chip alongside the bootloader. This means you’ll need to use an ISP programmer in order to flash this example to the chip. We’re pretty sure that AVRdude can program the ATmega32u4, so pretty much any ISP (including an Arduino) can be used to do the programming.
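For context, a Z-machine interpreter like Frotz starts by reading the story file's 64-byte header to find the code, dictionary, and object tables. The field offsets below come from the Z-machine standard; this Python sketch (and its synthetic example bytes) is just an illustration of the format, not [Rossum]'s AVR code:

```python
# Parse a few well-known fields from a Z-machine story file header.
# Offsets are per the Z-machine standard; all words are big-endian.

def parse_header(story):
    """story: the raw bytes of a Z-machine story file (>= 64 bytes)."""
    word = lambda off: (story[off] << 8) | story[off + 1]
    return {
        "version":     story[0x00],   # Zork I shipped as version 3
        "release":     word(0x02),
        "high_memory": word(0x04),
        "initial_pc":  word(0x06),    # where execution begins
        "dictionary":  word(0x08),
        "objects":     word(0x0A),
        "globals":     word(0x0C),
        "static_mem":  word(0x0E),
    }
```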