Shakey robot plays Angry Birds

At this year’s Pycon [Jason Huggins] gave a talk about his Angry Birds playing robot. He built a delta robot which includes a pen actuator for controlling a capacitive touch screen. The video after the break starts with a demo of the bot beating a level of Angry Birds on the iPad.

The idea behind the build is that robots like this could be used for app testing. In this case [Jason] tweaked the servo commands manually to achieve the results, but during the talk he does demonstrate some machine vision to analyze and win a game of tic-tac-toe.
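
For a sense of what hand-tuned servo commands might look like, here is a minimal sketch (our own illustration, not [Jason]’s code) that replays a hard-coded list of stylus positions over a serial link; the port name, the baud rate, and the plain “x,y,z” line protocol are all assumptions.

    import time
    import serial  # pyserial

    # Hand-tuned (x, y) tap targets in millimetres -- hypothetical values.
    TAPS = [(42.0, 110.0), (15.0, 85.0)]

    TOUCH_Z = 2.0   # stylus pressed against the glass
    HOVER_Z = 12.0  # stylus lifted clear of the screen

    def send(port, x, y, z):
        """Send one target position as a plain 'x,y,z' line (assumed protocol)."""
        port.write(f"{x:.1f},{y:.1f},{z:.1f}\n".encode("ascii"))
        time.sleep(0.3)  # crude settling delay instead of real position feedback

    with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as bot:
        time.sleep(2)  # give the controller time to reset after the port opens
        for x, y in TAPS:
            send(bot, x, y, HOVER_Z)  # move above the target
            send(bot, x, y, TOUCH_Z)  # press
            send(bot, x, y, HOVER_Z)  # lift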

We do enjoy seeing the robot, but we’re not sold on the idea that app testing will be done by robots. Perhaps there is a niche need for this type of thing, but we assume the majority of automated testing can be done in the emulator for the device on which you are developing. What we really want to know is how the capacitive stylus works. We didn’t catch him talking about it at all. We want a reliable, yet simple way to electronically trigger touchscreen inputs (along the lines of this project). If you know how [Jason's] stylus is working please share your thoughts in the comments section.
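
For the emulator case, no extra hardware is needed at all. As a rough sketch (assuming an Android emulator is running and the SDK’s adb tool is on the PATH), touch and swipe events can be injected from a script like this:

    import subprocess

    def tap(x, y):
        """Inject a single tap into the attached emulator (-e) via adb."""
        subprocess.run(["adb", "-e", "shell", "input", "tap", str(x), str(y)],
                       check=True)

    def swipe(x1, y1, x2, y2, duration_ms=300):
        """Inject a drag; the duration argument needs a reasonably recent Android build."""
        subprocess.run(["adb", "-e", "shell", "input", "swipe",
                        str(x1), str(y1), str(x2), str(y2), str(duration_ms)],
                       check=True)

    tap(540, 960)                # tap the centre of a 1080x1920 screen
    swipe(800, 1200, 400, 1500)  # drag down and to the left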

[via Reddit]

Comments

  1. brad says:

    Oo, a video! Oh, it’s 30 minutes long. Nevermind.

  2. NewCommentor1283 says:

    My guess is the touch screens use the same detection as those capacitive touch switches,
    meaning…

    trigger it with a 22 kHz (AC) electric field??? Like a “radio signal”, but low voltage and low frequency so it doesn’t go farther than the sensing electrode (pixel/cell) width.

    Formula? Screw that!
    Use adjustable resistors and “fine tune” it XD

  3. killr says:
  4. tinyworkshop says:

    The stylus uses a piece of conductive foam, like the stuff ICs are packed in. A small piece will not trigger the screen, but when switched to ground (with something like a FET) or connected to a piece of metal (even a longish wire) it has enough capacitance to simulate a touch. This stylus has a piece grounded by a wire on the end of the pen.
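
    A minimal sketch of that switched-to-ground approach (assuming a Raspberry Pi drives the gate of an N-channel MOSFET that ties the foam pad to ground; the BCM pin number and timings are made up):

        import time
        import RPi.GPIO as GPIO

        GATE_PIN = 18  # BCM numbering; drives the MOSFET gate (assumption)

        GPIO.setmode(GPIO.BCM)
        GPIO.setup(GATE_PIN, GPIO.OUT, initial=GPIO.LOW)

        def tap(duration=0.1):
            """Ground the foam pad briefly so the screen registers a touch."""
            GPIO.output(GATE_PIN, GPIO.HIGH)  # MOSFET on: pad tied to ground
            time.sleep(duration)
            GPIO.output(GATE_PIN, GPIO.LOW)   # MOSFET off: pad floats, touch released
            time.sleep(0.1)

        try:
            tap()
        finally:
            GPIO.cleanup()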

  5. tinyworkshop says:

    It uses a piece of conductive foam (similar to what ICs are packed in). When switched to ground or connected to a large enough wire, it has enough capacitance with the screen to trigger a touch. If it is just a small piece of the foam, it doesn’t trigger. This stylus has a piece of the foam on the tip of the pen, but it is connected to ground via a wire.

  6. Gdogg says:

    They use robots to test touch screens for touch response/latency.

    You’re right, though, that for a lot of other testing they can just send the touch events in software and not require extra hardware.

  7. zaprodk says:

    On Danish television, a program called “Natholdet” (The Night Team) found out that a sausage works just fine as a stylus – even when shrink-wrapped: http://youtu.be/5K8d6pcrhHA

  8. Jorge says:

    One should not show code in a presentation.
    It makes the presentation boring.

  9. Jordan says:

    If you have enough metal, you don’t need anything else. A conductive foam tip provides a non-scratching contact point with the screen. I have tested this by insulating myself from different materials and using them on capacitive-touch devices. Batteries tend to work particularly well.

    • Jordan says:

      (@ Mike Szczys) [quote]What we really want to know is how the capacitive stylus works. We didn’t catch him talking about it at all. We want a reliable, yet simple way to electronically trigger touchscreen inputs (along the lines of this project). If you know how [Jason's] stylus is working please share your thoughts in the comments section.[/quote]

  10. joeisi says:

    Well, wouldn’t it just be a case of sticking a piece of metal on the end of the arm? That should disrupt the electric field. Maybe he also charges it; that would disturb the screen even more.

    All my knowledge is from this article.

  11. Aaron Leclair says:

    Maybe it’s just my eyes, but it would appear that the “hand” of the delta bot has a capacitive touch stylus attached to it. I believe they use a conductive rubber tip for those, maybe carbon-based? I might be completely wrong though.

  12. Aaron Leclair says:

    Also, an aluminium rod would work; I’ve tried it with my Invisi shield on my smartphone. It really just needs something with a conductivity similar to human tissue.

  13. Gumby says:

    You can make a capacitive-touch stylus using some of the slightly conductive anti-static foam that ICs are stored in.

  14. hugs says:

    Hi, Jason Huggins here (aka the guy who created the robot) :-)

    “What we really want to know is how the capacitive stylus works. We didn’t catch him talking about it at all.”

    Sorry about that. In my demo at PyCon, I used a Targus iPad stylus as the end effector.

    However, in my early prototypes, I simply used anti-static foam that chips are often shipped with. I then stuffed the foam into a hollow brass rod. While the foam/rod solution was way cheaper, I found the Targus stylus was more reliable; the foam kept falling apart or falling out of the tube. In the future, I hope to get my hands on some conductive rubber and roll my own more durable stylus.

    In either case (foam or commercial stylus), I also had to attach a grounding wire. I don’t understand all the physics involved, but I needed that grounding wire to get the touch to properly register on the screen surface.

    Also regarding:

    “but we assume the majority of automated testing can be done in the emulator for the device”

    Yes, most testing should be done on an emulator, but in the case of Android for example, there is a huge difference between emulator and real device performance. Developers would rather see how their app behaves on a real device. Combined with the fact that new devices are coming out all the time and that proper testing APIs often lag the initial release, a ‘robot finger’ is useful when the only other option is manual testing.

  15. nes says:

    Robots are routinely used for testing the UI of cellphones. I worked on a custom system which pressed buttons and had cameras for spotting dead pixels and LCD artefacts using a neural network. It was used on a production line back in the mid ’90s; the brains were a rack full of DOS PCs.

  16. Tom the Brat says:

    Maybe it can get me past the level that’s had me stuck for weeks.
