For many of us, interacting with computers amounts to punching keys and smearing touch screens with sweaty fingers, often with really bad posture to match. While functional, this setup leaves plenty of room to reimagine a world where our conversation with technology is far more intuitive, ergonomic, and engaging. Enter TEI, an annual conference devoted to human-computer interaction and a landmark for novel projects that reinvent the conventional ways we engage with our computers. TEI isn’t just another sit-down conference where you soak in a wealth of paper talks. It’s an interactive weekend that combines those talks with a host of workshops provided by the speakers themselves.
Last year’s TEI brought us projects like SPATA, digital calipers that sped up our CAD modeling by eliminating the need for a third hand, and TorqueScreen, a force-feedback mechanism for tablets and other handhelds.
Next February’s conference promises more new ways to interact with novel technology. To get a sense of what’s to come, here’s a quick peek back at last year’s projects:
Interactive Projection Lab
What if we didn’t need to limit our interaction with computer screens to touch and buttons? What if we could manipulate the screen directly with objects in our environment? [Patrick] and [Jordi’s] Interactive Projection Lab (IPL) does just that, mixing objects like Post-its, markers, and paper boxes directly into a modified version of the original Super Mario Bros. The result? Platforms can appear spontaneously via Post-it. What’s more, pushing and sliding these platforms is as intuitive as moving their real-world counterparts around.
Display Skin
Let’s face it: there’s a wealth of information we’d love to have at our fingertips without having to reach into our pockets to consult our phones. Bluetooth watches aside, Display Skin is a glance-based interface: a supplementary means of receiving small bits of information, literally at hand, without interrupting one’s primary activity. Display Skin mounts a Plastic Logic flexible display onto the user’s wrist. An onboard IMU maintains an internal model of the user’s arm and keeps the displayed data oriented toward the user’s eyes regardless of arm pose.
The theme for TEI 2016 is “our body is our manual.” If you’re near the Netherlands, drop by and join in for a workshop or paper talk with the folks at TEI as the conference takes a weekend to look at our bodies as the driving inspiration to which technology can conform itself.
Thanks for the tip, [Paul]!
If you want to get involved, send in your papers or demos by August 2nd. Find out more about the submission process at http://www.tei-conf.org/16/ or via the conference’s Facebook page: https://www.facebook.com/teiconf
You can also get in touch with [Paul] directly: paul dot strohmeier at gmail dot com.