Sometimes, simple things can make a world of difference. Take, for example, a non-verbal person who can’t reliably control a touch screen to tell someone else what they need, want, or think.
This is where Augmentative and Alternative Communication (AAC) devices come in. Recently tasked with building such a device, [Thornhill!] came up with a great design that houses 160 different phrases in a fairly small package and runs on CircuitPython.
Basically, the client presses the appropriate snap-dome button and the corresponding phrase is spoken through the speaker. The 10×16 grid of buttons is covered with a membrane that both feels nice and gives a bit of protection from spills.
The buttons require a high actuation force and have a crisp tactile response, which should go a long way toward keeping the user from getting frustrated.
This handy AAC board is built on the Adafruit RP2040 Prop-Maker Feather and two keypad matrices. If this weren’t useful enough as it is, [Thornhill!] also built an even smaller version with 16 buttons for the client to wear around their neck.
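To give a flavor of how little firmware a build like this needs, here’s a minimal CircuitPython sketch of the core loop: scan a key matrix and play the matching WAV through the Prop-Maker Feather’s onboard I2S amp. To be clear, this is our illustration rather than [Thornhill!]’s code — the real build reads its 160 keys through two keypad matrix boards, while this sketch scans a small 4×4 matrix directly with the built-in keypad module, and the pin choices and /phrases file layout are assumptions.

```python
# Illustrative sketch only, not [Thornhill!]'s firmware. Assumes CircuitPython
# on an RP2040 Prop-Maker Feather, a 4x4 snap-dome matrix on the pins below,
# and WAV files named /phrases/000.wav ... /phrases/015.wav on the drive.
import board
import digitalio
import keypad
import audiocore
import audiobusio

# The Prop-Maker Feather gates its onboard I2S amplifier behind EXTERNAL_POWER.
power = digitalio.DigitalInOut(board.EXTERNAL_POWER)
power.switch_to_output(value=True)

audio = audiobusio.I2SOut(
    board.I2S_BIT_CLOCK, board.I2S_WORD_SELECT, board.I2S_DATA
)

# Scan the matrix with the built-in keypad module; key_number is
# row * number_of_columns + column.
matrix = keypad.KeyMatrix(
    row_pins=(board.D5, board.D6, board.D9, board.D10),
    column_pins=(board.D11, board.D12, board.D13, board.D24),
)

while True:
    event = matrix.events.get()
    if event and event.pressed:
        with open("/phrases/%03d.wav" % event.key_number, "rb") as f:
            audio.play(audiocore.WaveFile(f))
            while audio.playing:  # block so phrases don't cut each other off
                pass
```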
Did you know? AAC boards aren’t just for humans.
That is a nice looking build! I am wondering though, for some of the simple words or phrases like, “goodbye” or “thank you”, would it not be better to use sign language/gestures and save the board tiles for more abstract concepts that are harder to convey through gestures? Admittedly, I don’t know anything about the physical capabilities of the user, and perhaps there are other requirements not mentioned. Regardless, good job with the build, it came out looking very tidy and professional!
While the user might know ASL (or another sign language), it isn’t safe to assume whoever they’re communicating with will.
True, but very simple things like “goodbye” or “I like” can be conveyed rather universally via gestures.
Unfortunately gestures are not a lingua franca like English. Signs that are polite in one area might be an insult elsewhere.
DAMHIKT!
(Don’t Ask Me How I Know That!)
Because some people cannot manage gestures.
You wouldn’t need this board at all if you could handle a mobile or tablet. So this solution screams impaired dexterity and movement to me. The fact that there are icons all over it also suggests cognitive impairment.
Signs often assume the user has full motor control of two hands (if present). In this case, we’re talking about populations that can have some very significant mobility challenges, which frequently go hand-in-hand with significant communication or cognitive challenges. These challenges can be deeply individualized, so having a lot of mechanisms that can be adapted and modified for each individual is, frankly, amazing.
My wife is a special educator, who has a deep love for this population. I’ve written a few programs and designed a few systems to improve interfaces. I’ve ended up building systems for kids who could only press one of two buttons. That kind of thing.
Very nice idea, and interesting build!
No video of it in use, though? It would be interesting to see it in action.
I saw a non-electronic board similar to this in the 1980s.
What struck me most was that its “header” read:
“I am not retarded, I am a mute”
This device strongly reminds me of the point-of-sale systems in fast food restaurants.
If you’re interested in similar products, take a look at the Elgato Stream Deck, which is also supported on Linux. Every key has its own display, so you can reconfigure the layout on the fly for each use.
https://www.elgato.com/us/en/p/stream-deck-mk2-black
https://github.com/timothycrosley/streamdeck-ui
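For anyone who wants to script one directly, here’s a rough sketch using the python-elgato-streamdeck library (a separate project from the streamdeck-ui GUI linked above); the key-to-phrase table and the espeak call are just placeholders for whatever audio you’d actually want:

```python
# Rough sketch: turn a Stream Deck into a phrase board. Assumes the
# python-elgato-streamdeck library is installed and espeak is on PATH.
import subprocess
import time

from StreamDeck.DeviceManager import DeviceManager

# Hypothetical key-to-phrase table; a real board would map every key.
PHRASES = {0: "hello", 1: "thank you", 2: "goodbye"}

def on_key(deck, key, pressed):
    if pressed and key in PHRASES:
        # Speak the phrase via the espeak CLI (any TTS or WAV player works).
        subprocess.run(["espeak", PHRASES[key]])

deck = DeviceManager().enumerate()[0]
deck.open()
deck.reset()
deck.set_key_callback(on_key)

while True:  # keep the script alive; callbacks fire on a reader thread
    time.sleep(1)
```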
I’d recommend a fixed position for the “where is the ‘Hello’ button this time?” button.
This type of device is pretty common for kids and people who don’t have good dexterity.
This build is a DIY version of a common assistive device used by non-verbal people. While a mute person has a physical problem that prevents speech, a non-verbal person has a linguistic cognitive impairment – their brain’s sentence-maker is busted. They may be quite capable in other areas, or even live somewhat independently, but they wouldn’t be able to use a TTS system. (Note that there are also some non-verbal people who can write, but are cognitively prevented from speaking. These people often do use TTS.)
The other big user group for this kind of “speaking board” is people with severe motor impairments that prevent both speech and typing. That may be why this build has those raised barriers between the buttons: they would help someone with poor control of their hands hit the right button.
Agree!
This makes sense; the article and link don’t describe the end user beyond “non-verbal” and “has difficulty controlling a touch screen,” but I kinda assumed there was some sort of serious motor impairment not explicitly mentioned.
A better way might be https://www.blissymbolics.org/
Maybe create a paper sheet laid out like a Bliss symbol table.