No matter what your parents might say, games are good for us. They teach us to manage resources and give us dopamine rewards just like eating and mating do. Even if you’re no good at games in general, they are still a fun distraction from life.
There are so many games out there that could be enjoyed by the visually impaired, except that they rely on visuals. For example, you can play Yahtzee with nothing more than five dice, a cup, pencil and paper, and knowledge of the rules and scoring. The biggest obstacles are differentiating the dice from each other and keeping score.
One of our esteemed 2019 Hackaday Prize Top 20 Finalists is [JanThar]’s Haptic Games. [JanThar]’s growing collection of games uses 3D printing, vibration motors, and RFID to replace visual cues with sensory feedback. Yahtzee-wise, there’s a set of printed dice and scorecards. The scorecards use spherical magnets and an abacus layout. [JanThar] is also working on a Memory game to teach Braille, though it could be adapted to pure Braille for the visually impaired. Each game piece contains an RFID chip, so players can hold it up to a reader to check what they have.
Our favorite might be the PONG game that’s built on [JanThar]’s 2017 Hackaday Prize entry, the HaptiVision vest. Through the magic of a 16×8 field of vibration motors, players can track the ball’s movement across their torso and control the paddles with sliders. There’s a brief demo of the games after the break.
Scribble is a haptic interface that lets you draw your way through traffic. In an environment where fully automated vehicles are becoming the expected next step in transportation, Scribble provides a friendly alternative that lets you guide your car around, while the automation decides how to actually steer the car around obstacles.
The driver is guided by haptic feedback that alerts them to road conditions or obstacles ahead. The project, conceived by [Felix Ros] for his master’s thesis at Eindhoven University, features a five-bar linkage that moves with two lateral degrees of freedom, a mechanism commonly used in drawing robots.
The code runs on an Arduino Due, controlled over serial by a program written in openFrameworks that communicates with a Unity 3D driving simulator over UDP. Fellow graduate student [Frank van Valeknhoef]’s Haptic Engines serve as the actuators, reporting their position and outputting a variable force.
The kinematics algorithms were based on a clock and weather plotter by SAP, which shares the same servo and drawing-arm assembly. The left and right actuators update based on the desired angle, calculating the proper angles needed to reach the target position.
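Solving a symmetric five-bar linkage for its two servo angles comes down to intersecting circles with the law of cosines. As a rough sketch of the idea (the link lengths, base separation, and elbow conventions below are placeholder values, not Scribble’s actual geometry):

```python
import math

def five_bar_ik(x, y, base_sep=30.0, l1=50.0, l2=70.0):
    """Inverse kinematics for a symmetric five-bar linkage.

    Servos sit at (-base_sep/2, 0) and (+base_sep/2, 0); each drives a
    proximal link of length l1, joined by distal links of length l2 that
    meet at the pen position (x, y). Returns the two servo angles in
    degrees, measured counterclockwise from the +x axis.
    """
    def shoulder_angle(bx, elbow_sign):
        dx, dy = x - bx, y
        d = math.hypot(dx, dy)
        if not abs(l1 - l2) <= d <= l1 + l2:
            raise ValueError("target out of reach")
        # Angle between the base-to-pen line and the proximal link
        # (law of cosines on the triangle base-elbow-pen)
        inner = math.acos((l1**2 + d**2 - l2**2) / (2 * l1 * d))
        return math.degrees(math.atan2(dy, dx) + elbow_sign * inner)

    left = shoulder_angle(-base_sep / 2, +1)   # left elbow bends outward
    right = shoulder_angle(+base_sep / 2, -1)  # right elbow bends outward
    return left, right
```

For a pen position on the centerline, the two angles mirror each other about the vertical, which is a handy sanity check when debugging this kind of mechanism.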
While automated vehicles may be able to travel efficiently from one destination to the next, they can’t necessarily wander off course to explore new places. Scribble takes back some of that freedom and allows drivers to decide for themselves where they want to be. It’s an interesting take at inserting the human back into the driver’s seat in automated cars.
Picture this: You’re in your bed in the middle of the night, and you want to know what time it is. Bedside alarm clocks are a thing of the past and now you rely on your smartphone to tell the time. Only, if you turned the screen on, you’d find that looking at it in the dark is tantamount to staring at the sun without eye protection. [Michael] pictured the same thing and his solution for this scenario is a clever haptic-feedback clock.
The idea behind it is simple: a clock from which you can tell the time without having to use your eyes. This one gives you two options for that, the first being a series of haptic pulses that let you tell the time simply by touching the device. The second, audibly telling the time with voice samples stored in a flash chip, was added in the second revision as [Michael] continues to refine his design. In addition to helping us assess the time in the dark, it’s also worth noting that this could be useful for those with visual impairments as well.
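One way to encode a time as touch is to play distinct pulse lengths for hours, tens of minutes, and remaining minutes. The exact scheme and timings below are our own assumption for illustration, not [Michael]’s firmware:

```python
def time_to_pulses(hour, minute):
    """Encode a time as a vibration pattern: long pulses for the hour
    (12-hour clock), medium pulses for tens of minutes, short pulses for
    remaining minutes. Returns (duration_ms, pause_ms) tuples that a
    motor driver could play back in sequence."""
    LONG, MEDIUM, SHORT, GAP, GROUP_GAP = 400, 250, 120, 150, 600
    h = hour % 12 or 12          # map 0/12/24-hour input onto 1..12
    pulses = []
    for kind, count in ((LONG, h), (MEDIUM, minute // 10), (SHORT, minute % 10)):
        for i in range(count):
            pause = GAP if i < count - 1 else GROUP_GAP
            pulses.append((kind, pause))
    return pulses
```

So 7:42 would come out as seven long buzzes, four medium, then two short, with longer silences marking the group boundaries.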
Until we can see the final product, you can help him out by looking over the designs and sending pull requests over at the project’s GitHub page, or just watch his progress on the Hackaday.io page. We’ve seen some interesting ways to tell the time before, from a game of Tetris to a clock housed inside the shell of an old-school camera flash, but we’ve never seen one that uses haptic feedback before. We hope for the sake of our eyes that it catches on!
A big challenge in the VR world is getting haptic feedback no matter where you are. That’s not so much of a problem when you’re sitting in a chair: the hardware can be attached to the chair or to something near you, in what’s referred to as grounded force feedback. But with VR, we’ve gotten used to at least moving around a room. How then do you feel the recoil of a gun, the pressure against a shield, the inertia of a sword slicing through the air, or the pulsations of a magic sword emitting lightning?
A team of researchers at the [MAKinteract Lab] at KAIST, a university in South Korea, have come up with a small device which straps to your wrist and provides all those types of feedback. It’s called the Wind-Blaster and consists of two ducted propellers which can provide up to 1.5N of force. Both propellers are mounted on servos, and with the help of an IMU, the propellers are oriented as needed. An Arduino doing PWM controls the motor speeds.
Fire a VR shotgun and the propellers quickly spin up to 33,000 RPM for just 250 ms, giving your lower arm a quick backward tug, providing the feel of a recoiling gun. Swing a VR sword through the air and the propellers rotate at 33,000 RPM for 400 ms and then linearly decelerate to a stop in 300 ms. Making the propellers move asynchronously with respect to each other causes rotation torque on your arm for a pulsating feeling for the magic lightning-emitting sword. A connected PC runs the games using the Unity game engine. As with drones, there is noise at around 41 dB but the user’s headphones block it out. Watch it in action in the videos below.
The worst thing about walking around while trying to follow directions is that you have to keep looking down at them to get the next turn. At best, you’ll miss out on the scenery; at worst, you might walk into traffic.
Wouldn’t it be great if you didn’t have to look down? Yes it would, and with Walkity, there’s no need. Walkity is a set of cuffs that slip onto the backs of your shoes, pair with your phone, and use haptic feedback to tell you where to go. Each one has an Arduino Pro Mini, an NRF24L01 to talk to its mate, a Bluetooth module, a vibration motor, and what must be the thinnest, most flexible LiPo currently available on Earth. The specified cell is PGEB0083559, a 65 mAh cell that is 0.8 mm thick!
Your smartphone will vibrate in your pocket during navigation, but our experience has been that of still not knowing which way to turn. Walkity’s feedback is simple and intuitive. The left cuff vibrates to indicate a left turn, right for right, and both vibrate when you reach your destination. Going the wrong way? Walkity will vibrate vigorously to let you know it’s time to pull over. It’s a great entry for the Human Computer Interface Challenge of the Hackaday Prize!
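The mapping from navigation instruction to cuff behavior is simple enough to sketch in a few lines. The intensity values and the instruction names below are assumptions for illustration, not Walkity’s actual protocol:

```python
def cuff_command(instruction):
    """Map a navigation instruction to (left_cuff, right_cuff) vibration
    intensities on a 0-255 PWM scale, mirroring Walkity's scheme: left
    vibrates for a left turn, right for right, both gently on arrival,
    and both at full strength for a wrong turn. Unknown instructions
    leave both cuffs off."""
    patterns = {
        "left":      (180, 0),
        "right":     (0, 180),
        "arrived":   (120, 120),
        "wrong_way": (255, 255),
    }
    return patterns.get(instruction, (0, 0))
```

In the real device, one cuff would receive the instruction over Bluetooth and relay its mate’s half of the pattern over the NRF24L01 link.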
Looking for ideas for your haptics projects? [Destin] of the Smarter Every Day YouTube channel got a tour from the engineers at HaptX of their full-featured VR glove, which provides amazing haptic feedback: a very fine 120-point sense of touch, force feedback for each finger, temperature, and motion tracking.
In hacks, we usually stimulate the sense of touch by vibrating something against the skin. With this glove, they use pneumatics to press against the skin. A single fingertip has multiple roughly 1/8 inch air bladders in contact with it. Each bladder is separately pneumatically controlled by pushing air into it. The air pressure can vary continuously so that the bladders can push lightly, harder or anywhere in between. The glove has 120 of these bladders spread out over the fingers and the palm. Unfortunately, they didn’t allow him to see the valves controlling the pneumatics, but if you are looking for a low-frequency, low-cost way to actuate valves you might consider using syringes. The engineers do tell [Destin] that if your VR scene shows something pressing against your virtual finger, as long as your haptics push against your real finger within around 1/8th of a second, your brain won’t notice the delay.
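Conceptually, driving those bladders means turning per-bladder virtual contact into a target air pressure. A minimal sketch of one such mapping, with a linear depth-to-pressure law and placeholder limits that are our own assumptions, not HaptX’s numbers:

```python
def bladder_pressures(contact_depths_mm, max_depth_mm=5.0, max_kpa=30.0):
    """Map per-bladder virtual contact depths to target air pressures.
    Deeper virtual contact means a firmer press, linearly up to a
    clamped ceiling of max_kpa; negative depths (no contact) give zero.
    Both the depth range and the pressure ceiling are placeholders."""
    return [max_kpa * min(max(d, 0.0) / max_depth_mm, 1.0)
            for d in contact_depths_mm]
```

A controller would then run one pressure loop per bladder, which is exactly why the valve hardware [Destin] wasn’t shown is the interesting part.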
They’re also working on using hot and cold fluids to give a sense of temperature within a glove. This is demonstrated in the first video below when [Destin] feels heat while a dragon in the VR world breathes fire on his hand. Fortunately, one of the engineers mentions that our sense of temperature is one of the slower ones; it can handle even longer latencies than touch. We can see implementing this in a hack using a bladder pressing against the skin while tubes circulate fluids of different temperatures through it. But maybe there’s a way to do it electrically, possibly with thermoelectric modules as is done with this drinks cooler? Though safety issues might prohibit that.
Other features mentioned are force feedback for each finger, and their custom motion tracking which uses both magnetic and optical means to track fingertips. But we’ll leave the rest to the videos below. The first is the technical tour and the second is the glove being used in the VR world.
We have seen a few of these types of devices in the past, and they almost always use ultrasonic sensors to gauge distance. Not so with this ETA; it uses six VL53L0X time-of-flight (ToF) sensors mounted at slightly different angles from each other, which provides a wide sensing map. It is capable of detecting objects in a one-meter-wide swath at a range of one meter from the sensors.
The device consists of two parts, a wayfinding wand and a feedback module. The six ToF sensors are strapped across the end of a flashlight body and wired to an Arduino Mini inside the body. The Mini collects the sensor data and sends it over UART to the requisite PIC32, which is attached to a sleeve on the user’s forearm. The PIC decodes these UART signals into PWM and drives six corresponding vibrating disc motors that dangle from the sleeve and form a sensory cuff bracelet around the upper forearm.
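The distance-to-vibration step is the heart of the feedback module: the closer an obstacle, the stronger the buzz on the corresponding motor. A minimal sketch of how that mapping might look (the linear law and out-of-range handling are assumptions about the build’s firmware):

```python
def distances_to_pwm(distances_mm, max_range_mm=1000):
    """Convert ToF distance readings to vibration-motor duty cycles
    (0-255): a near obstacle gives a strong buzz, a far one a weak buzz.
    Readings at or beyond the one-meter range, and zero/error readings,
    leave the corresponding motor off."""
    duties = []
    for d in distances_mm:
        if 0 < d < max_range_mm:
            duties.append(int(255 * (1 - d / max_range_mm)))
        else:
            duties.append(0)
    return duties
```

With six sensors fanned out at slightly different angles, each motor on the cuff ends up buzzing for its own slice of the one-meter-wide sensing map.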
We like the use of ToF over ultrasonic for wayfinding. Whether or not ToF is faster, the footprint is much smaller, so it’s more practical for discreet assistive wearables. Plus, you know, lasers. You can see how well it works in the demo video after the break.