3D Printed Robotic Arms For Sign Language

A team of students in Antwerp, Belgium is responsible for Project Aslan, which is exploring the feasibility of using 3D printed robotic arms to assist with and translate sign language. The idea came about because sign language translators are few and far between, and it’s a task that robots may be able to help with. In addition to translation, robots may also be able to assist with teaching sign language.

The project set out to use 3D printing and other technology to explore whether low-cost robotic signing could be of any use. So far the team has an arm that can convert text into finger spelling and counting. It’s an interesting use for a robotic arm; signing is an application for which range of motion is important, but there is no real need to carry or move any payloads whatsoever.
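
As a rough illustration of how such a conversion can work conceptually, each letter maps to a preset handshape, and the controller simply steps through them. Here is a minimal Python sketch of that idea; the pose table, servo angles, and move_servos callback are hypothetical stand-ins, not Project Aslan’s actual code.

    import time

    # Partial pose table: letter -> (thumb, index, middle, ring, pinky)
    # servo angles in degrees. The values are invented for illustration;
    # real ASL handshapes also involve wrist and forearm orientation.
    FINGER_POSES = {
        "a": (90, 0, 0, 0, 0),         # fist, thumb alongside the fingers
        "b": (0, 180, 180, 180, 180),  # fingers straight, thumb folded in
        "l": (180, 180, 0, 0, 0),      # thumb and index extended
    }

    def spell(text, move_servos, hold_s=0.6):
        """Finger spell text one letter at a time, skipping unknown characters."""
        for ch in text.lower():
            pose = FINGER_POSES.get(ch)
            if pose is None:
                continue  # no pose defined (e.g. "z", which needs arm movement)
            move_servos(pose)   # drive the hand to the target handshape
            time.sleep(hold_s)  # hold it long enough to be read

    # Stand-in actuator that just prints the target angles:
    spell("lab", lambda pose: print("servo angles:", pose))

A real controller would interpolate smoothly between poses rather than snapping, but this lookup-and-hold loop is the core of text-to-finger-spelling.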

Closeup of hand actuators and design.

A single articulated hand is a good proof of concept, and these early results show promise, but there is still a long way to go. Sign language involves more than just hands: it is performed with both hands, arms, and shoulders, and incorporates motion and facial expressions. Also, the majority of sign language is not finger spelling (which is reserved primarily for proper names and specific nouns), but a robot hand that can finger spell is an important first step toward everything else.

Future directions for the project include adding a second arm, adding expressiveness, and exploring the use of cameras for teaching new signs. The ability to teach different signs is important, because any project that aims to act as a translator or facilitator needs the ability to learn and update. There is a lot of diversity among sign languages across the world. For people unfamiliar with signing, it may come as a surprise that, for example, American Sign Language (ASL) is related to French Sign Language, yet both are entirely different from British Sign Language (BSL). A video of the project is embedded below.

Sign language projects show up fairly regularly in both the hobbyist and academic worlds. We’ve seen puppet-like 3D printed hands used to convert spoken words to ASL, as well as these sensor-covered gloves which aim to convert ASL signs and motions into speech. It’s fascinating to see the progression in technology and approaches taken over time as sensor technology and tools like 3D printing become more advanced.

[via 3DHubs newsroom]

30 thoughts on “3D Printed Robotic Arms For Sign Language”

  1. Very nicely done! I released a project years ago for kids to make hands like this from drinking straws, and one kid tried to get his to do ASL. While inspired, drinking straws weren’t the right tech for it. You guys did a wonderful job!

    1. There is just one quite big issue with their approach: it’s basically just spelling each word out, letter by letter, rather than fully utilizing ASL. That would make it very difficult to follow, say, a speaker at a conference. Just imagine it: instead of seeing a complete word, you’d only get to see its letters one by one. And I think it wouldn’t even be able to sign the letter Z, as it’s the only one requiring arm movement.

      Proper ASL usage would mainly rely on… let’s call them “wordsigns”: one sign means one word. Only by using those is it possible to simultaneously translate a spoken talk. The spelling signs are mostly used when there is no sign for a certain word, e.g. names.

      1. Yes, true, but are they far off? It doesn’t seem like a big leap to switch to wordsigns where appropriate. The motions of that sample arm look flexible enough to do at least some.

        1. They’d need to build a complete upper torso with a head and two arms, because some signs require combinations of those. From what I remember, even a simple “Thank you” involves interaction between the right hand and the chin.

          1. I believe they wanted to prove the concept before investing all of that time, money and energy into an unproven system. I think now that the one arm is up they will add the second. Then they can add a head and torso to do the more expressive ASL signs.

        1. Yeah, you can learn language without knowing how to read.

          Exhibit A: Children speak, and they don’t know how to read or write! :D
          Exhibit B: Koko the gorilla ( https://en.wikipedia.org/wiki/Koko_(gorilla) )

          Somebody here proposed making a CGI video, and that’s the only real alternative to something like this robot, since sign language can’t be translated from spoken/written language word for word; it uses basic words/concepts combined with signs for place and time, along with additional motions and expressions.

          For example (this is from sign language in my country, not ASL, but I don’t think there is any major difference): there is no past tense, but you can sign “yesterday” or “before” and then continue with what happened. Giving and getting is the same sign; the exact meaning depends on whether you start with your hands near your body and push them out, or start with them far away and pull them in.

          1. The question wasn’t about whether you can, but whether people really teach deaf people to sign but not read. That seems like a huge omission and an outright tragedy.

            How are they reading the newspaper? Traffic signs? Closed captions on television?

            If it’s true that a significant number of deaf people can’t read, that is a much more immediate issue.

          2. Some of it is cultural.

            Deafness isolates people from the non-deaf, and just like any language barrier you end up with a community that predominantly socialises within itself. Even the schools are separate. English (or whatever the predominant spoken language is) ends up being a second language that you have to learn in school, but isn’t vital for daily use – all their friends are deaf, their teachers are deaf, their parents are not deaf but they learned sign language for their child’s sake. They have deaf-specific sources for news and entertainment. They use just enough English (or whatever) to get by when they have to interact with the non-deaf world, but it’s not something they necessarily do every day. If you don’t use it, you lose it.

          3. >They use just enough English (or whatever) to get by when they have to interact with the non-deaf world, but it’s not something they necessarily do every day

            I find that hardly credible.

            Do deaf people not watch TV? Don’t they use the internet, send text messages, chat online, read books, write emails, or keep their own diaries? Are you saying they’re only looking at picture books and Skype, so they could go days between actually reading a full sentence in English?

            For them to be so isolated from the surrounding society sounds like bullshit to me.

          4. The point was that it’s a bigger tragedy that a lot of deaf people couldn’t read, than not having a robot hand to perform sign language for them.

            And as others have pointed out, the hand isn’t exactly giving them sign language but merely repeating letters, which the deaf people wouldn’t understand anyhow unless they were familiar with reading and the written language, so it’s doubly pointless.

          5. My experience: I worked as a professor at a local polytechnic and had deaf students. They knew how to read (our language, not English), but their writing was very poor. That wasn’t a problem when doing projects, since they knew how to convey an idea, but it was problematic during exams.

    1. That’s a good question. The answer is mostly that this project is not really trying to solve the problem of how to communicate with Deaf people.

      The project is trying to discover two things. Long-term: whether it’s possible to replace a signing human (with expressiveness, etc.) with a robot. Short-term: whether 3D printing and other modern techniques could be used to cheaply make a robot arm that is even capable of signing in the first place. It’s still early in the short-term part, and so far the answer seems to be a yes.

  2. This could be a very useful text-to-sign app for conferences, meetings, and whatnot.

    But I doubt a robot is the best way to do this. I think a CG hand would be much nicer (and cheaper), since you could project it anywhere you like.

    I must say: very well done. The robot looks very fast.

  3. I don’t know where exactly in this discussion to fit this in, but I suspect some people don’t really realize the following: there are individual, cultural, and language differences, but the only thing Deaf people are unable to do is hear.
