There are 3.6 million deafblind people in the world, and by far their greatest problem is one of communication. For his Hackaday Prize entry, [Anderson] is creating a system on hackaday.io that enables haptic communication for a variety of devices. It’s called Tact Tiles, and instead of creating a single device, [Anderson] is building an entire system that can power a multitude of communication devices for deafblind people.
The basic unit of the Tact Tiles system is a small, touch-sensitive vibrating pad. These tiny PCBs can be fitted to just about anything, including a wired glove or whatever other haptic interface anyone can dream up. The core of the system is a small controller board that drives 32 of these vibrating pads and communicates with a smartphone or computer over a Bluetooth connection.
With a little bit of software, the Tact Tiles can be configured in any way imaginable: mapping individual tiles to letters of the alphabet, mapping gestures to letters, or any combination in between. [Anderson] has a great video demoing the possibilities of his device; you can check that out below.
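A mapping layer like the one described might look something like this minimal sketch. The tile layout, the function names, and the idea of reserving spare tiles are all illustrative assumptions, not details from the actual Tact Tiles firmware:

```python
# Hypothetical sketch of a tile-mapping layer: assign each letter of
# the alphabet to one of the 32 tile indices, then turn a word into
# the ordered sequence of tiles to pulse. Layout is an assumption.
import string

# Map 'a'..'z' onto tiles 0..25, leaving six tiles spare for
# punctuation or control gestures.
LETTER_TO_TILE = {letter: i for i, letter in enumerate(string.ascii_lowercase)}

def word_to_tile_sequence(word):
    """Return the ordered list of tile indices to vibrate for a word."""
    return [LETTER_TO_TILE[ch] for ch in word.lower() if ch in LETTER_TO_TILE]

print(word_to_tile_sequence("hi"))  # tiles for 'h' and 'i'
```

Because the mapping lives in software on the phone side, swapping in a gesture-based scheme is just a matter of replacing the dictionary.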
Continue reading “Hackaday Prize Semifinalist: Tact Tiles”
For his project entered in the Hackaday Prize, [Neil] is working on a navigation aid for the blind. He’s calling his device Pathfinder, and it’s designed to allow greater freedom of motion for the disabled.
Pathfinder is a relatively simple device: a cheap, off-the-shelf ultrasonic distance sensor, an ATmega, and a few passives. On its own, the ultrasonic distance sensor is only accurate to about 5%. By incorporating a temperature sensor, [Neil] was able to nail down the accuracy to about 1%. Impressive!
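The temperature sensor is likely there because the speed of sound in air drifts with temperature, which throws off any fixed time-to-distance conversion. The constants below are textbook physics for dry air, not values pulled from [Neil]’s firmware:

```python
# Sketch of temperature-compensated ultrasonic ranging: the speed of
# sound in air rises roughly 0.6 m/s per degree C, so an uncorrected
# reading drifts with ambient temperature.

def speed_of_sound(temp_c):
    """Approximate speed of sound in dry air, m/s."""
    return 331.3 + 0.606 * temp_c

def distance_m(echo_time_s, temp_c):
    """Convert a round-trip echo time to a one-way distance."""
    return speed_of_sound(temp_c) * echo_time_s / 2.0

# The same 5.8 ms echo, converted at 20 C versus 35 C air:
print(distance_m(0.0058, 20.0))  # ~0.996 m
print(distance_m(0.0058, 35.0))  # ~1.022 m, about 2.6% more
```

That few-percent spread across an ordinary temperature range lines up with the jump from 5% to 1% accuracy once the correction is applied.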
For the machine-to-human interface, [Neil] chose haptic feedback: small vibration motors tucked away inside a wristband. It’s by far the easiest way to add the necessary output, and with a dedicated haptic motor driver, it’s easy to play specialized drive patterns on the vibration motors.
You can check out [Neil]’s quarterfinal entry video for the Pathfinder below.
Continue reading “Hackaday Prize Semifinalist: Haptic Navigation”
Many of us have gone on a stationary romp through some virtual or augmented scape with one of the few headsets out in the wild today. While the experience of viewing a convincing figment of reality is an exciting sensation in itself, [Mark Lee] and [Kevin Wang] are figuring out how to tie other senses into the mix.
The duo from Cornell University have built a mechanical exoskeleton that responds to light with haptic feedback, meaning the wearer can touch the sphere of light around a source as if it were a solid object. Photoresistors are mounted like antennae on the tip of each finger, filed down around the edges to receive a more diffuse spread of light. When the wearer moves a hand toward a light source, the sensors trigger servo motors mounted on the back of the hand, which actuate and retract a series of 3D-printed tendons that arch upward and connect to the individual fingers. Because each photoresistor receives a different amount of light, the fingers can react independently to simulate physical contours.
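The per-finger control loop can be sketched as a simple independent mapping from light level to servo angle. The ADC range and servo travel below are placeholder values, not the actual calibration from the Cornell build:

```python
# Minimal sketch: each fingertip photoresistor reading is mapped
# independently to a servo angle, so brighter light pulls that
# finger's tendon further. Ranges are assumed, not measured.

ADC_MIN, ADC_MAX = 100, 900      # assumed photoresistor ADC range
SERVO_MIN, SERVO_MAX = 0, 120    # assumed servo travel in degrees

def light_to_angle(adc_value):
    """Clamp and linearly map a light reading to a servo angle."""
    adc_value = max(ADC_MIN, min(ADC_MAX, adc_value))
    fraction = (adc_value - ADC_MIN) / (ADC_MAX - ADC_MIN)
    return SERVO_MIN + fraction * (SERVO_MAX - SERVO_MIN)

# Five fingers seeing different light levels react independently:
readings = [100, 300, 500, 700, 900]
print([round(light_to_angle(r)) for r in readings])  # [0, 30, 60, 90, 120]
```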
One of the goals of the project was to produce a working proof of concept with no more than $100 worth of materials, which [Mark] and [Kevin] achieved with some cash to spare. Their parts list can be found on their blog, along with some more details on the project.
Continue reading “Touching Light with Haptic Feedback”
For his entry to The Hackaday Prize, [Sean] is building a haptic vest for gamers and the visually impaired. It’s exactly what you think it is: a vest with proximity sensors and motors that wrap around the wearer, providing haptic feedback of nearby obstacles. Actually building a vest with a few dozen motors is a bit of a challenge, and that’s why this project is in the running for The Hackaday Prize.
Each of the 48 motors is individually controllable with PWM. In any other project, this would require a few dozen microcontrollers, or one with a whole lot of pins. [Sean], however, is using LED drivers. They do exactly what [Sean] needs: an easy-to-interface way to get a whole bunch of PWM lines, and they do it cheaper than any other solution.
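The arithmetic behind the LED-driver trick is worth spelling out. Assuming a 16-channel part (something like a PCA9685, which the post doesn’t name), three chips on one bus cover all 48 motors; the host just has to translate a flat list of intensities into per-chip channel writes:

```python
# Sketch of mapping 48 motor intensities (0.0-1.0) onto three
# 16-channel PWM driver chips, producing (chip, channel, 12-bit duty)
# tuples ready to write out. The 16-channel/12-bit figures assume a
# PCA9685-class part, which is our guess, not [Sean]'s stated choice.

CHANNELS_PER_CHIP = 16

def motors_to_pwm(intensities):
    """Map motor intensities to (chip index, channel, 12-bit duty)."""
    out = []
    for motor, level in enumerate(intensities):
        chip, channel = divmod(motor, CHANNELS_PER_CHIP)
        duty = int(max(0.0, min(1.0, level)) * 4095)
        out.append((chip, channel, duty))
    return out

# Motor 17 at half intensity lands on chip 1, channel 1:
print(motors_to_pwm([0.0] * 17 + [0.5])[17])  # (1, 1, 2047)
```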
For detecting objects surrounding the vest, [Sean] is using the depth sensor on a 1st gen Microsoft Kinect. In testing, [Sean] blindfolded a volunteer and had a few friends move around with cardboard ‘obstacles.’ The volunteer successfully avoided all the obstacles, as seen in the video below.
The project featured in this post is a quarterfinalist in The Hackaday Prize.
Continue reading “THP Semifinalist: A Haptic Vest With 48 Vibration Motors”
The most common way to put haptic feedback in an interface hasn’t changed much since the plug-in rumble pack for the Nintendo 64 controller: put a pager motor in there and set it spinning when the user needs to feel something. This method takes a relatively long time to spin up, and even the very cool Steam Controller with its voice-coil pads can’t ‘stick’, staying high or low to notify the user of something.
[Tim]’s day job is working with very fancy piezoelectric actuators, and when an opportunity came up to visit the Haptics symposium, he jumped at the chance to turn these actuators into some sort of interface. He ended up creating two devices: a two-piezo cellphone-sized device, and a mouse with a left click button that raises and lowers in response to the color of the mousepad.
The cellphone device contains two piezo actuators, each with a 10 gram weight epoxied on. A small microcontroller and piezo driver give this pseudo-phone the smoothest vibration functions you can imagine. The much more innovative color-sensing mouse has a single actuator glued to the left button, and a photosensor in the base. When the mouse rolls over a dark square on a piece of paper, the button raises; rolling over a lighter area lowers it. It’s all very, very cool tech and something we’ll probably see from Apple, Microsoft, or Sony in a few years.
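The button logic of the color-sensing mouse can be sketched as a simple threshold state machine. The thresholds and the hysteresis band are our assumptions, not values from [Tim]’s firmware, but some dead band is the usual way to keep an actuator from chattering near the light/dark boundary:

```python
# Sketch of the color-sensing button logic: raise the button over
# dark areas, lower it over light ones, and hold state in between.
# Threshold values are assumed for illustration.

DARK_THRESHOLD = 300   # below this: dark mousepad -> button up
LIGHT_THRESHOLD = 600  # above this: light mousepad -> button down

def update_button(sensor_value, currently_raised):
    """Return the new raised/lowered state for one sensor sample."""
    if sensor_value < DARK_THRESHOLD:
        return True
    if sensor_value > LIGHT_THRESHOLD:
        return False
    return currently_raised  # in the dead band: hold the last state

state = False
for sample in [700, 250, 450, 650]:
    state = update_button(sample, state)
    print(state)  # False, True, True (held), False
```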
Videos of both devices below.
Continue reading “Piezos For Haptic Feedback”
While I was at Heatsync Labs in Mesa, Arizona, [Nate] mentioned that he was really proud of helping someone build a robotic hand. The project looked pretty cool, so I tracked it down.
[Macguyver603] built this robotic hand, which is controlled by a glove with flex sensors. He was originally going to 3D print the structure for the hand, but access to a laser cutter allowed him to create something a little more structurally sound. Haptic feedback comes from vibrating pager motors triggered by sensors in the tips of the robotic hand’s fingers.
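The glove-to-hand loop described above boils down to two independent channels per finger: flex sensor in, servo angle out, plus fingertip sensor in, pager motor out. This sketch uses placeholder ranges and thresholds, not values from [Macguyver603]’s build:

```python
# Sketch of one finger's update: map the glove's flex sensor reading
# to a servo angle, and switch the haptic pager motor on when the
# robot fingertip's contact sensor crosses a threshold. All ranges
# and thresholds here are assumptions for illustration.

FLEX_MIN, FLEX_MAX = 200, 800   # assumed flex sensor ADC range
CONTACT_THRESHOLD = 512         # assumed fingertip sensor threshold

def finger_update(flex_value, tip_sensor):
    """Return (servo angle 0-180, haptic motor on/off) for one finger."""
    flex_value = max(FLEX_MIN, min(FLEX_MAX, flex_value))
    angle = (flex_value - FLEX_MIN) * 180 // (FLEX_MAX - FLEX_MIN)
    return angle, tip_sensor > CONTACT_THRESHOLD

# A half-bent finger whose robotic tip is pressing on something:
print(finger_update(500, 700))  # (90, True)
```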
The total cost of the project was roughly $240, and there’s unfortunately no video. It did, however, earn him second place at the state fair!
[Diego] wrote in to let us know about the haptic feedback arm project he’s hard at work on. He calls it the Vimphin, a name built from the words Virtual Manipulator Physical Interface. Instead of a claw, the robot arm has a hand grip that lets you easily move it around. That is, unless the virtual model of the arm encounters a dense substance, in which case it becomes more difficult to move.
The test arm seen above includes several high quality robotic servo motors. You probably know that servo motors have feedback circuits that let them sense their position, and this is what is used to detect when a user moves the arm. This movement is tracked in the virtual 3D environment seen on the screen. In this case, the base of the robot is sitting in a pool of water. When the end of the virtual arm is in open air it’s pretty easy to move. When it dips below the water line the motors are used to increase resistance, simulating movement through a denser substance.
This sounds like a great piece of hardware to have around when the OASIS is finally developed.
Continue reading “Robot arm provides haptic feedback from the virtual world”