Sensor Glove Translates Sign Language

Sign language is a language that uses the position and motion of the hands in place of sounds made by the vocal tract. If one could readily capture those hand positions and movements, one could theoretically digitize and translate that language. [ayooluwa98] built a set of sensor gloves to do just that.

The brains of the operation is an Arduino Nano. It’s hooked up to a series of flex sensors woven into the gloves, along with an accelerometer. The flex sensors detect the bending of the fingers and the gestures being made, while the accelerometer captures the movements of the hand. The Arduino interprets these sensor signals to match the user’s movements against a pre-stored list of valid signs. It then transmits the detected sign via a Bluetooth module to an Android phone, which speaks the translation aloud using text-to-speech software.
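The matching step described above can be sketched as a simple nearest-template classifier. This is only an illustrative Python sketch, not the project’s actual firmware: the sign names, the five-value flex readings, and the distance threshold are all hypothetical stand-ins for whatever calibration the real gloves use.

```python
import math

# Hypothetical calibration table: each sign maps to five normalized
# flex-sensor readings (0.0 = straight finger, 1.0 = fully bent).
# A real build would record these values from the gloves themselves.
SIGN_TEMPLATES = {
    "hello":     [0.1, 0.1, 0.1, 0.1, 0.1],
    "yes":       [0.9, 0.9, 0.9, 0.9, 0.9],
    "thank you": [0.2, 0.8, 0.8, 0.8, 0.2],
}

def classify_sign(flex, threshold=0.5):
    """Return the stored sign closest to the flex readings,
    or None if nothing falls within the distance threshold."""
    best_sign, best_dist = None, threshold
    for sign, template in SIGN_TEMPLATES.items():
        # Euclidean distance between the live readings and the template.
        dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(flex, template)))
        if dist < best_dist:
            best_sign, best_dist = sign, dist
    return best_sign

print(classify_sign([0.12, 0.08, 0.11, 0.09, 0.10]))  # → hello
print(classify_sign([0.5, 0.5, 0.5, 0.5, 0.5]))       # → None (ambiguous pose)
```

On the Arduino itself the same idea would run over raw `analogRead()` values, with the winning label forwarded over the Bluetooth serial link to the phone.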

The idea of capturing sign language via hand tracking is a compelling one; we’ve seen similar projects before, too. Meanwhile, if you’re working on your own accessibility projects, be sure to drop us a line!

17 thoughts on “Sensor Glove Translates Sign Language”

  1. As I understand it, there are some semantics communicated in sign language that aren’t trivially conveyable in a transcript of the signs. For example, when talking about people, the signer might indicate a position in virtual space for a person, then use that space for “she said, but then he said” type communication.

  2. I’ve kind of considered ‘glove based’ electronics projects a number of times for various use cases, including assistive, but in the end have just had too many other projects in the mix to get around to it.

    And I mean, of course, yes you can just take any pair of cheap gloves and ‘roll your own’, but I did also discover the SaeboGlove. I mean, yes, at $299, it is a bit pricey, especially as it has *no* electronics on it, while at the same time, is used for a legit purpose.

    But I think it provides an *excellent* professional base to build electronics projects on, if anyone is interested in doing such a thing.

  3. There is no technology available simply using gloves to even remotely give an accurate interpretation (note, not translation) of ASL into English, or any other spoken language. Vocabulary is only a small part of the language as a lot of meaning is portrayed by using NMMs (non-manual markers) such as facial expressions and body shifting. Gloves don’t capture this.

    There are 5 parameters to a sign:
    Handshape (gloves can capture this)
    Palm orientation (gloves can sort of capture this)
    Location (gloves will struggle with this)
    Movement (again, they’ll struggle)
    NMMs (nope, gloves will completely miss these)

    With that being said, can you see how difficult these would be to use in any sort of everyday scenario? People have been trying to make gloves like this ad nauseam, without realizing that it’s really focused on only helping the hearing person who doesn’t know sign language.

    A better use of time, energy, and resources is to bolster the educational system and add more programs that teach ASL. Instead, many schools are eliminating these programs. It’s so backwards!

    1. There are two objections I can see to this, although I’ll concede all your points. First, no matter how much you teach ASL, no more than a minority of hearing people will speak it; some kind of sign language translator would help deaf people communicate, in the language they are most comfortable with, with people who don’t understand it. You also know that deaf people need sign language interpreters in many situations, which can be both inconvenient and expensive. And second, unlike English, ASL is not universal: how does someone who uses ASL communicate with someone who uses BSL or AuSL or NZSL or ISL, etc.? Sure, a sign language translator is a very difficult task, but one worth pursuing.

      1. This is what confuses me. Why didn’t the American, British, New Zealand (etc.) Deaf communities get together and come up with *one* sign language? They all read and write the same language.

        1. Why do we have French, English, German, etc.? Why don’t hearing people use Esperanto or Latin? The reasons are geographical, cultural, and historical. Deaf people (and hearing people) developed a language in each country, often with influences from other languages and cultures, and each language grew and developed. Spain has multiple native languages, as does China, and the United Kingdom. Sign language is not an artificial construct but a living language. (FYI: ASL is in the French Sign Language group, much like English belongs to the Indo-European / West Germanic language group.)

  4. Here is the Deaf community’s response to the Sign Language Gloves trope:

    “sign-language glove, which purports to translate sign language in real time to text or speech as the wearer gestures. For people in the Deaf community, and linguists, the sign-language glove is rooted in the preoccupations of the hearing world, not the needs of Deaf signers.”

  5. Signed languages are not dependent on a series of individual signs produced by one hand. As anyone who knows ASL or any other signed language can tell you the actual lexical signs give very little meaning unless they have facial grammar, spatial location, affective information, relative body positioning, relative hand positions, nonmanual responses to interlocutors, prosody of production, etc. Some signs are produced solely through the face, which means the glove wouldn’t capture it at all.
    EX – the sign for coffee with different facial expressions means: Is there any coffee? There is no coffee. The coffee isn’t good. Coffee! Etc.
    That’s just one sign.
    Try making a full sentence – with adverbs on the face, size and shape specifiers in relative space, questions on the face, etc., and you will see how insufficient this is.
    Not to mention – who is going to use it? And how? And why?

    If engineers would start thinking about communications holistically rather than assuming it is word by word, we wouldn’t have to keep explaining this info over and over and over again.
    Try adding a Deaf signing person into the team and perhaps you’d come up with something useful.

    Congrats to them for making a sensor glove – BUT I cannot congratulate a project that doesn’t even start out by investigating what language and communication is before building an object that claims to translate but instead offers inaccurate and incomplete content.

  6. Translates half of a conversation, poorly. Deludes some hearing people into thinking they don’t need to hire interpreters. May be a fun project for someone to tinker with in their garage but not an accessibility tool.

    1. @Dan Parvaz said: “Ugh. This again. Those of us who study and research signed languages are a little tired of this recurring narrative.”

      I bet sooner than you think, this will be cracked by AI/ML watching lots of people sign along with the correct translation. They’ll call it GPTSign.

      1. I agree. I think a combination of AI and a bunch of videos of people signing, different people, different situations, would be better. Just put a camera in front of the person, with the AI software, and let it record and translate. I love the idea of gloves, but just like there’s more to communication than just speech, there’s more than hand gestures. I’d like to know what led the originator to try this method.

  7. I’m learning BSL and was happy to see lots of people have already gotten here to debunk the translation angle.

    Given that the Deaf community has raised multiple well-thought-out objections to this glove technology (see the links), let’s be good hackers and engineers and respect them enough to drop the first paragraph from this article.
