Impress: Tactile Interface


Touch screen interfaces are generally hard and flat. Impress breaks from that tradition by making the display flexible, letting you feel more like you're interacting with the display itself. In the image above, the circles seem to physically fall into the dent made by your fingers. Another application shows some rudimentary 3D modeling done by physically pushing on the vertices. This prototype is very interesting, though we'd love to see much higher resolution on the input side of things. The project claims pressure sensitivity, but we weren't able to distinguish it in the video. Maybe you can; catch the video after the break. Laying one of these on some foam might be another alternative.
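The prototype reads a coarse grid of force sensors under the foam, and the software presumably smooths those readings into the continuous dent the circles fall into. Here is a minimal sketch of that smoothing step in Python. The grid layout, the 10-bit Arduino-style reading range, and the use of bilinear interpolation are all assumptions for illustration, not details from the project.

```python
# Hypothetical sketch: smooth a coarse grid of force-sensor readings
# into a continuous "dent" depth map. Sensor layout and 0..1023 scale
# (an Arduino analog input range) are assumptions.

def depth_map(sensors, out_w, out_h):
    """Bilinearly interpolate a 2D grid of raw force readings (0..1023)
    into an out_w x out_h depth map normalized to 0..1."""
    rows, cols = len(sensors), len(sensors[0])
    result = []
    for y in range(out_h):
        gy = y * (rows - 1) / (out_h - 1)      # position in sensor-grid space
        y0 = int(gy)
        y1 = min(y0 + 1, rows - 1)
        fy = gy - y0
        row = []
        for x in range(out_w):
            gx = x * (cols - 1) / (out_w - 1)
            x0 = int(gx)
            x1 = min(x0 + 1, cols - 1)
            fx = gx - x0
            # blend the four surrounding sensor readings
            top = sensors[y0][x0] * (1 - fx) + sensors[y0][x1] * fx
            bot = sensors[y1][x0] * (1 - fx) + sensors[y1][x1] * fx
            row.append((top * (1 - fy) + bot * fy) / 1023.0)
        result.append(row)
    return result
```

With a 3x3 grid where only the center sensor reads full scale, the upsampled map peaks at 1.0 in the middle and falls off smoothly toward the edges, which is roughly the dent effect seen in the demo.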


13 thoughts on “Impress: Tactile Interface”

  1. This seems to be a prime candidate for rear projection. Since they’re using fabric, it should be relatively easy to see light coming from behind.

    Their major changes would be removing the foam padding and changing the force sensitivity options. They would likely be able to maintain or improve force sensitivity accuracy by mounting the sensors around the edges of the fabric. This new model would also likely allow for upward motions such as pinching and grabbing.

    Most importantly, switching to rear projection would allow the user to have a more obvious view of the way they’re affecting the image. The front projection made it difficult to see how the image was being warped because there was so much overlap with the user’s hands.

  2. Interesting idea but I see two problems:
    1) It looks like you have to press REALLY hard to do anything, so that's not really practical for long periods of time, and
    2) Using a top-down projector means your hands block the display whenever you're using it.

  3. Wicked idea, I really like it! Another prime example of using an Arduino well! Unlike my efforts…

    Rear projection wouldn't work with his method though, as he's got force sensors under the foam, which is projected onto from above.
    If it were rear projection you'd need some sort of camera system to read the depth of the hand and fingers pushing on the fabric.

    give that man a medal!

  4. I wonder if he could use a continuous beam-break system to detect the degree to which the surface is displaced, rather than force sensors? The cross section of the displaced volume will grow as he pushes harder on the screen. That way he could go to rear projection.

  5. @marco: I’m not sure, but one other possibility would be to use regular fabric and project beams of IR light from the sides. When viewed from the bottom, this series of beams would form a contour map where they hit the fabric.

    I could see there being some issues with multi-touch, since the beams could be occluded by other deformations…

  6. Yeah, you can definitely see the person’s hand shaking during the 3D modeling example, maybe others but I didn’t watch any further.

    Interesting concept, but I can’t see it being all that useful except in store displays, etc.

  7. This is an amazing concept and idea.
    But Robo, you're right except for one thing:
    you need a little imagination. Just think what money, time, and a little evolution could do.
    You would simply make today's advertisements look primitive, and create a whole separate entertainment industry, aside from every major company wanting one.
    I honestly can't wait to see what they come out with soon. There is so much new technology growing exponentially that I think it will soon be absolutely crazy.

    But that was righteous though. Peace.

  8. Very cool idea, but it needs more sensors. Try something like
    multiple laser lines under the material for 3D scanning,
    or just many more sensors. Don't forget the competition from Microsoft's Kinect, which can basically read you without an interface. Think about where it would be most useful to be able to feel and read speed and strength.
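The side-projected IR beam scheme proposed in comment 5 can be sketched in a few lines. At each column, the deepest occluded beam gives a lower bound on how far the fabric has been pushed down. The beam heights and occlusion data here are hypothetical, and, as comment 5 notes, a second finger can shadow beams and corrupt the profile.

```python
# Hypothetical sketch of the side-projected IR beam idea: horizontal
# beams cross under the fabric at known depths; a broken beam means
# the fabric has been pushed at least that far down at that column.

BEAM_DEPTHS_MM = [2, 4, 6, 8, 10]  # assumed depths below the rest surface

def depth_profile(broken):
    """broken[i][x] is True if beam i is occluded at column x.
    Returns, per column, the deepest occluded beam (mm), i.e. a
    lower bound on fabric displacement there."""
    cols = len(broken[0])
    profile = [0] * cols
    for depth, row in zip(BEAM_DEPTHS_MM, broken):
        for x, hit in enumerate(row):
            if hit:
                profile[x] = max(profile[x], depth)
    return profile
```

Resolution is limited by beam spacing, so this trades the prototype's analog force readings for a quantized depth staircase; denser beams would sharpen it.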
