For their final project in their embedded microcontroller class, [Aaheli, Jun, and Naomi] turned their focus toward assistive technology and created an Electronic Travel Aid (ETA) for the visually impaired that uses haptic feedback to report the presence of obstacles.
We have seen a few of these types of devices in the past, and they almost always use ultrasonic sensors to gauge distance. Not so with this ETA; it uses six VL53L0X time-of-flight (ToF) sensors mounted at slightly different angles from each other, which provides a wide sensing map. It is capable of detecting objects in a one-meter-wide swath at a range of one meter from the sensors.
The device consists of two parts: a wayfinding wand and a feedback module. The six ToF sensors are strapped across the end of a flashlight body and wired to an Arduino Mini inside the body. The Mini receives the sensor data over UART and sends it on to the requisite PIC32, which is attached to a sleeve on the user’s forearm. The PIC decodes these UART signals into PWM and drives six corresponding vibrating disc motors that dangle from the sleeve and form a sensory cuff bracelet around the upper forearm.
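The cuff-side logic boils down to a distance-to-intensity mapping: each sensor's reading drives one motor, and closer obstacles vibrate harder. Here's a minimal sketch of that mapping in Python; the range limits and the linear ramp are our assumptions, not the team's actual firmware (which runs on the PIC32 in C).

```python
# Hypothetical sketch of the cuff-side mapping: each of the six ToF
# distance readings (in mm) sets one vibration motor's PWM duty cycle.
# Closer obstacles vibrate harder; beyond MAX_RANGE_MM the motor is off.

MAX_RANGE_MM = 1000   # the build senses out to roughly one meter
MIN_RANGE_MM = 50     # readings below this saturate the motor

def distance_to_duty(distance_mm):
    """Map one sensor reading to an 8-bit PWM duty (0 = off, 255 = full)."""
    if distance_mm >= MAX_RANGE_MM:
        return 0
    clamped = max(distance_mm, MIN_RANGE_MM)
    # Linear ramp: nearest obstacle -> strongest vibration.
    span = MAX_RANGE_MM - MIN_RANGE_MM
    return round(255 * (MAX_RANGE_MM - clamped) / span)

def update_motors(readings_mm):
    """Return the PWM duty for each of the six motors, in sensor order."""
    return [distance_to_duty(d) for d in readings_mm]
```

A nonlinear ramp (for example, inverse-square) would emphasize close-in obstacles even more, but a linear map keeps the feel predictable.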
We like the use of ToF over ultrasonic for wayfinding. Whether or not ToF is faster, the footprint is much smaller, so it’s more practical for discreet assistive wearables. Plus, you know, lasers. You can see how well it works in the demo video after the break.
This device is intended to augment the traditional white cane, not replace it. This virtual cane we saw a few years ago is another story.
Continue reading “Find Your Way with Tiny Laser Beams”
HaptiVision is a haptic feedback system for the blind that builds on the work of a wide array of vibration belts and haptic vests. It’s a smart concept, giving the wearer a warning when an obstruction comes into sensor view.
The earliest research into haptic feedback wearables used ultrasonic sensors, and more recent developments used a Kinect. The project team for HaptiVision chose the Intel RealSense camera because of its svelte form factor. Part of the goal was to make the HaptiVision as discreet as possible, so fitting the whole rig under a shirt was part of the plan.
In addition to the RealSense camera, the team used an Intel Up board for the brains, mostly because it natively supports the RealSense camera. It takes a 640×480 IR snapshot and selectively triggers the 128 vibration motors to tell you what’s close. The motors are driven by eight PCA9685-based PWM expander boards.
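Going from a 640×480 depth frame to 128 motors means collapsing the image into a coarse grid, one cell per motor. The sketch below shows one plausible way to do that; the 16×8 grid shape, the nearest-pixel-wins rule, and the inverse-depth intensity mapping are our assumptions for illustration, not the project's published code.

```python
# Hedged sketch of a depth-to-motor mapping: a 640x480 depth frame is
# reduced to a 16x8 grid (one cell per motor, 128 total), and the
# nearest depth reading in each cell sets that motor's intensity.

GRID_COLS, GRID_ROWS = 16, 8          # 16 * 8 = 128 motors
FRAME_W, FRAME_H = 640, 480

def frame_to_intensities(depth_mm, max_range_mm=3000):
    """Collapse a depth frame (row-major list of lists, in mm, 0 = no
    reading) into 128 motor intensities in 0..255; nearer is stronger."""
    cell_w = FRAME_W // GRID_COLS     # 40 px per cell
    cell_h = FRAME_H // GRID_ROWS     # 60 px per cell
    intensities = []
    for gy in range(GRID_ROWS):
        for gx in range(GRID_COLS):
            # Nearest valid (> 0) depth within this cell, if any.
            nearest = min(
                (depth_mm[y][x]
                 for y in range(gy * cell_h, (gy + 1) * cell_h)
                 for x in range(gx * cell_w, (gx + 1) * cell_w)
                 if depth_mm[y][x] > 0),
                default=None)
            if nearest is None or nearest >= max_range_mm:
                intensities.append(0)
            else:
                intensities.append(round(255 * (1 - nearest / max_range_mm)))
    return intensities
```

Eight 16-channel PCA9685 boards give exactly the 128 PWM outputs needed, so each grid cell maps directly onto one expander channel.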
The project is based on David Antón Sánchez’s OpenVNAVI project, which also featured a 128-motor array. HaptiVision aims to create an easy-to-replicate haptic system. Everything is Open Source, and all of the wiring clips and motor mounts are 3D-printable.
While medical facilities continue to improve worldwide, access to expensive treatments still eludes a vast number of people. This is especially true of prosthetics: many people can’t afford something so personalized, even though the need for assistive devices is extremely high. With that in mind, [Guillermo Herrera-Arcos] started working on ALICE, a robotic exoskeleton that is low-cost, easy to build, and, as an added bonus, 100% Open Source.
ALICE’s creators envision that the exoskeleton will have applications in rehabilitation, human augmentation, and even gaming. Since it’s Open Source, it could also serve as a platform for STEM students to learn from. Currently, the team is testing the electronics in the legs of the exoskeleton, but they have already come a long way with their control system and have a workable prototype in place. Moving forward, the creators, along with anyone else who builds on the platform, can keep improving and extending it thanks to the nature of Open Source hardware.
The days of the third hand’s dominance of workshops the world over are coming to an end. For those moments when a third hand alone is not enough, a fourth is there to save the day.
Dubbed MetaLimbs and developed by a team from the [Inami Hiyama Laboratory] at the University of Tokyo and the [Graduate School of Media Design] at Keio University, the device is designed to be worn while sitting — strapped to your back like a knapsack — but use while standing stationary is possible, if perhaps a little unintuitive. Basic motion is controlled by the position of the leg — specifically, by sensors attached to the foot and knee — and flexing one’s toes actuates the robotic hand’s fingers. There’s even some haptic feedback built in to assist anyone who isn’t used to using their legs as arms.
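The toe-to-finger control described above amounts to normalizing a flex-sensor reading into a grip-closure command. Here's a minimal sketch of that idea; the raw sensor range and the linear mapping are illustrative assumptions, not details from the MetaLimbs paper.

```python
# Hedged sketch: a toe-flex sensor reading is normalized and mapped to
# how far the robot hand's fingers close. Sensor calibration values are
# assumed for illustration.

FLEX_MIN, FLEX_MAX = 200, 800   # assumed raw ADC counts: relaxed vs fully flexed

def toe_to_grip(raw_flex):
    """Map a raw toe-flex reading to finger closure: 0.0 (open) to 1.0 (closed)."""
    normalized = (raw_flex - FLEX_MIN) / (FLEX_MAX - FLEX_MIN)
    # Clamp so noisy or out-of-range readings can't command an invalid pose.
    return max(0.0, min(1.0, normalized))
```

In practice you'd calibrate `FLEX_MIN` and `FLEX_MAX` per wearer, since toe flexibility varies a lot from person to person.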
The team touts the option of customizable hands, though a soldering iron attachment may not be as precise as needed at this stage. Still, it would be nice to be able to chug your coffee without interrupting your work.
Continue reading “Robotic Arms Controlled By Your… Feet?”
Almost every big corporation has a research and development organization, so it came as no surprise when we found a tip about Disney Research in the Hackaday Tip Line. And that the project in question turned out to involve human-safe haptic telepresence robots makes perfect sense, especially when your business is keeping the Happiest Place on Earth running smoothly.
That Disney wants to make sure their Animatronics are safe is good news, but the Disney project is about more than keeping guests healthy. The video after the break and the accompanying paper (PDF link) describe a telepresence robot with a unique hydrostatic transmission coupling it to the operator. The actuators are based on a rolling-diaphragm design that limits hydraulic pressure. In a human-safe system that’s exactly what you want.
The system is a hybrid hydraulic-pneumatic design; two actuators, one powered by water pressure and the other with air, oppose each other in each joint. The air-charged actuators behave like a mass-efficient spring that preloads the hydraulic actuator. This increases safety by allowing the system to be de-energized instantly by venting the air lines. What’s more, the whole system presents very low mechanical impedance, allowing haptic feedback to the operator through the system fluid. This provides enough sensitivity to handle an egg, thread a needle — or even bop a kid’s face with impunity.
There are some great ideas here for robotics hackers, and you’ve got to admire the engineering that went into these actuators. For more research from the House of Mouse, check out this slightly creepy touch-sensitive smart watch, or this air-cannon haptic feedback generator.
Continue reading “Keeping Humanity Safe from Robots at Disney”
The “absorbed device user” meme, like someone following Google Maps on a smart phone so closely that they walk out into traffic, is becoming all too common. Not only can an interface that requires face time be a hazard to your health in traffic, it’s also not particularly useful to the visually impaired. Haptic interfaces can help the sighted and the visually impaired alike, but a smart phone really only has one haptic trick – vibration. But a Yale engineer has developed a 3D printed shape-shifting navigation tool that could be a haptics game changer.
Dubbed the Animotus by inventor [Ad Spiers], the device is a hand-held cube split into two layers. The upper layer can swivel left or right and extend or retract, giving the user both tactile and visual clues as to which direction to walk and how far to the goal. For a field test of the device, [Ad] teamed up with a London theater group in an interactive production of the play “Flatland”, the bulk of which was staged in an old church in total darkness. As you can see in the night-vision video after the break, audience members wearing tracking devices were each given an Animotus to allow them to navigate through the interactive sets. The tracking data indicated users quickly adapted to navigation in the dark while using the Animotus, and some became so attached to their device that they were upset by the ending of the play, which involved its mock confiscation and destruction.
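The cube's two cues — swivel for direction, extension for distance — can be thought of as a simple pose computed from the wearer's tracked position and heading. The sketch below is an illustrative model only (the swivel limit, extension travel, and saturation distance are assumed values, not [Ad]'s actual firmware).

```python
# Illustrative sketch of how an Animotus-style cube could encode
# navigation: the top layer swivels toward the goal heading and extends
# in proportion to the remaining distance. All constants are assumptions.

import math

MAX_SWIVEL_DEG = 40.0      # assumed mechanical limit of the top layer
MAX_EXTEND_MM = 20.0       # assumed travel of the extending section
FULL_EXTEND_DIST_M = 10.0  # distance at which extension saturates

def device_pose(user_xy, user_heading_deg, goal_xy):
    """Return (swivel_deg, extension_mm) guiding the user toward goal_xy.
    Positive swivel means 'turn right'; heading 0 faces the +y axis."""
    dx = goal_xy[0] - user_xy[0]
    dy = goal_xy[1] - user_xy[1]
    bearing = math.degrees(math.atan2(dx, dy))        # compass-style bearing
    # Wrap the heading error into (-180, 180] so we always turn the short way.
    error = (bearing - user_heading_deg + 180) % 360 - 180
    swivel = max(-MAX_SWIVEL_DEG, min(MAX_SWIVEL_DEG, error))
    dist = math.hypot(dx, dy)
    extension = MAX_EXTEND_MM * min(dist, FULL_EXTEND_DIST_M) / FULL_EXTEND_DIST_M
    return swivel, extension
```

Clamping the swivel keeps the cue readable by thumb: beyond the limit the device simply says "turn hard" rather than trying to represent the exact angle.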
Performing art applications aside, there’s plenty of potential for haptics with more than one degree of freedom. Imagine a Bluetooth interface to the aforementioned Google Maps, or an electronic seeing-eye dog that guides a user around obstacles using an Animotus and a camera. There’s still plenty of utility in traditional haptics, though, as this Hackaday Prize semi-finalist shows.
Continue reading “Experimental Theater Helps Field Test Haptic Navigation Device”
There are 3.6 million deafblind people in the world, and by far their greatest problem is one of communication. For his entry in the Hackaday Prize, [Anderson], our own miracle worker on hackaday.io, is creating a system that enables haptic communication across a variety of devices. It’s called Tact Tiles, and instead of building a single device, [Anderson] is building an entire system that can power a multitude of communication devices for deafblind people.
The basic unit of the Tact Tiles system is a small, touch-sensitive vibrating pad. These tiny PCBs can be fitted to just about anything, including a wired glove or whatever haptic interface anyone can dream up. The core of the system is a small PCB that can control 32 of these vibrating pads and communicates with a smartphone or computer over a Bluetooth connection.
With a little bit of software, the Tact Tiles can be configured in any way imaginable: mapping individual tiles to letters of the alphabet, mapping gestures to letters, or any combination in between. [Anderson] has a great video demoing the possibilities of his device; you can check that out below.
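With 32 pads per controller, the simplest imaginable configuration gives each letter its own tile, with six pads to spare for punctuation or control gestures. The layout below is purely an illustrative assumption, not [Anderson]'s actual encoding.

```python
# One possible mapping layer, sketched for illustration: the 26-letter
# alphabet fits one letter per pad on a 32-pad controller, leaving six
# pads free for punctuation or control gestures.

import string

LETTER_TO_TILE = {ch: i for i, ch in enumerate(string.ascii_lowercase)}

def encode_message(text):
    """Translate a message into the sequence of tile indices to pulse,
    skipping characters without a tile assignment."""
    return [LETTER_TO_TILE[ch] for ch in text.lower() if ch in LETTER_TO_TILE]
```

A gesture-based scheme would replace this table with chords or swipe patterns, which is exactly the kind of remapping the software layer makes easy.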
Continue reading “Hackaday Prize Semifinalist: Tact Tiles”