The “absorbed device user” meme, the person following Google Maps on a smart phone so intently that they walk out into traffic, is becoming all too common. Not only can an interface that demands face time be a hazard to your health in traffic, it’s also not particularly useful to the visually impaired. Haptic interfaces can help the sighted and the visually impaired alike, but a smart phone really only has one haptic trick: vibration. A Yale engineer, however, has developed a 3D printed shape-shifting navigation tool that could be a haptics game changer.
Dubbed the Animotus by inventor [Adam Spiers], the device is a hand-held cube split into two layers. The upper layer can swivel left or right and extend or retract, giving the user both tactile and visual cues as to which direction to walk and how far it is to the goal. For a field test of the device, [Adam] teamed up with a London theater group for an interactive production of the play “Flatland”, the bulk of which was staged in an old church in total darkness. As you can see in the night-vision video after the break, audience members wearing tracking devices were each given an Animotus to navigate through the interactive sets. The tracking data showed that users quickly adapted to navigating in the dark with the Animotus, and some became so attached to their device that they were upset by the ending of the play, which involved the mock confiscation and destruction of the cubes.
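There’s no firmware published here, but the output side of a shape-shifting cube like this is easy to picture. Here’s a minimal sketch assuming hobby servos for the swivel and extension stages; the pin numbers, servo travel, and 5 m full-scale distance are all invented for illustration:

```cpp
// A sketch of an Animotus-style output stage: one actuator swivels the top
// layer toward the target heading, another extends it in proportion to the
// remaining distance. Pin numbers, servo travel, and the 5 m full-scale
// range are invented for illustration, not taken from the actual build.
#include <Servo.h>

Servo headingServo;  // swivels the upper layer left or right
Servo extendServo;   // slides the upper layer forward or back

void setup() {
  headingServo.attach(9);
  extendServo.attach(10);
}

// headingDeg: bearing to the goal relative to the user, -90..+90 degrees
// distanceM: straight-line distance to the goal in meters
void updateCube(float headingDeg, float distanceM) {
  // Center the swivel at 90 degrees of servo travel
  headingServo.write(constrain(90 + (int)headingDeg, 0, 180));
  // Fully extended at 5 m or more from the goal, fully retracted on arrival
  float d = constrain(distanceM, 0.0, 5.0);
  extendServo.write((int)(d / 5.0 * 180.0));
}

void loop() {
  // In the real device these values came from the theater's tracking rig;
  // fixed numbers here just exercise the mapping.
  updateCube(30.0, 2.5);
  delay(100);
}
```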
Performing arts applications aside, there’s plenty of potential for haptics with more than one degree of freedom. Imagine a Bluetooth interface to the aforementioned Google Maps, or an electronic seeing-eye dog that guides a user around obstacles using an Animotus and a camera. There’s still plenty of utility in traditional haptics, though, as this Hackaday Prize semifinalist shows.
Continue reading “Experimental Theater Helps Field Test Haptic Navigation Device”
There are 3.6 million deafblind people in the world, and by far their greatest problem is one of communication. For his Hackaday Prize entry, [Anderson], our own miracle worker on hackaday.io, is creating a system that enables haptic communication across a variety of devices. It’s called Tact Tiles, and instead of a single gadget, it’s an entire platform on which a multitude of communication devices for deafblind people can be built.
The basic unit of the Tact Tiles system is a small, touch-sensitive vibrating pad. These tiny PCBs can be fitted to just about anything: a wired glove, or whatever haptic interface anyone can dream up. The core of the system is a small PCB that can control 32 of these vibrating pads and communicates with a smartphone or computer over a Bluetooth connection.
With a little bit of software, the Tact Tiles can be configured in any way imaginable: individual tiles mapped to letters of the alphabet, gestures mapped to letters, or any combination in between. [Anderson] has a great video demoing the possibilities of his device; you can check that out below.
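The simplest of those mappings, one pad per letter, is easy to sketch. The following is purely illustrative: pin assignments are hypothetical, and the real controller fans out to 32 pads through its own driver hardware and takes text over Bluetooth rather than a serial monitor.

```cpp
// Illustrative letter-to-pad mapping for a Tact Tiles-style interface.
// Pin assignments are hypothetical; a real build with 26+ pads would fan
// out through shift registers or a driver board rather than bare pins.
const int PADS = 26;

int padPin(int pad) { return 2 + pad; }  // pretend pads live on pins 2..27

void setup() {
  for (int i = 0; i < PADS; i++) pinMode(padPin(i), OUTPUT);
  Serial.begin(9600);                    // stand-in for the Bluetooth link
}

void pulsePad(int pad, int ms) {
  digitalWrite(padPin(pad), HIGH);
  delay(ms);
  digitalWrite(padPin(pad), LOW);
  delay(ms);          // gap keeps consecutive letters distinct to the skin
}

void loop() {
  while (Serial.available()) {
    char c = tolower(Serial.read());
    if (c >= 'a' && c <= 'z') pulsePad(c - 'a', 150);  // one pad per letter
  }
}
```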
Continue reading “Hackaday Prize Semifinalist: Tact Tiles”
For his project entered in the Hackaday Prize, [Neil] is working on a navigation aid for the blind. He’s calling his device Pathfinder, and it’s designed to allow greater freedom of motion for the disabled.
Pathfinder is a relatively simple device, with a cheap, off-the-shelf ultrasonic distance sensor, an ATMega, and a few passives. On its own, the ultrasonic distance sensor is only accurate to about 5%. By incorporating a temperature sensor to correct for the speed of sound, [Neil] was able to nail down the accuracy of his sensor to about 1%. Impressive!
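The reason temperature matters: an ultrasonic sensor measures echo time, and the speed of sound in air rises by roughly 0.6 m/s per °C, so an uncompensated reading drifts by a few percent across normal conditions. Here’s a minimal sketch of the idea for an HC-SR04-style sensor; the pins and the TMP36 temperature sensor are assumptions, not details from [Neil]’s build.

```cpp
// Temperature-compensated ultrasonic ranging, sketched for an HC-SR04-style
// sensor. The trigger/echo pins and the TMP36 analog temperature sensor are
// assumptions; the compensation itself is the standard speed-of-sound
// approximation c = 331.3 + 0.606 * T (m/s, T in Celsius).
const int TRIG_PIN = 8;
const int ECHO_PIN = 9;
const int TEMP_PIN = A0;  // TMP36: 500 mV offset, 10 mV per degree C

float readTempC() {
  float mv = analogRead(TEMP_PIN) * (5000.0 / 1023.0);
  return (mv - 500.0) / 10.0;               // TMP36 transfer function
}

float readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  // Round-trip echo time in microseconds, with a 30 ms timeout
  unsigned long us = pulseIn(ECHO_PIN, HIGH, 30000UL);
  if (us == 0) return -1.0;                 // no echo detected
  float c = 331.3 + 0.606 * readTempC();    // speed of sound, m/s
  // distance = time * speed / 2 (sound travels out and back), in cm
  return (us * 1e-6) * c * 100.0 / 2.0;
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  Serial.println(readDistanceCm());
  delay(100);
}
```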
For the machine-to-human interface, [Neil] chose haptic feedback: small vibration motors tucked away inside a wristband. It’s by far the easiest way to add the needed output, and with a haptic motor driver, it’s easy to play specialized drive patterns on the vibration motors.
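Wiring the two halves together is then a matter of mapping range to a drive pattern. As one hedged possibility, swapping a loop() like this into the ranging sketch above would pulse the wristband faster and harder as an obstacle closes in; the motor pin and thresholds are invented.

```cpp
// Hypothetical wristband feedback: reuses readDistanceCm() from the ranging
// sketch above. MOTOR_PIN and the distance thresholds are invented.
const int MOTOR_PIN = 5;          // vibration motor via a transistor driver

void buzz(int strength, int onMs, int offMs) {
  analogWrite(MOTOR_PIN, strength);  // PWM sets perceived intensity
  delay(onMs);
  analogWrite(MOTOR_PIN, 0);
  delay(offMs);
}

void loop() {
  float d = readDistanceCm();
  if (d < 0 || d > 200) {
    delay(100);                   // nothing in range: stay quiet
  } else if (d > 100) {
    buzz(120, 50, 400);           // far obstacle: slow, gentle ticks
  } else {
    buzz(255, 50, (int)(d * 3));  // close obstacle: faster, stronger pulses
  }
}
```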
You can check out [Neil]’s quarterfinal entry video for the Pathfinder below.
Continue reading “Hackaday Prize Semifinalist: Haptic Navigation”
[Roy Shilkrot] and his fellow researchers at the MIT Media Lab have developed the FingerReader, a wearable device that aids in reading text. Worn on the index finger, it takes in printed or digital text and outputs spoken words, and it does this on the go. The FingerReader consists of a camera and sensors that detect the text; a series of algorithms the researchers created works alongside character recognition software to produce the audio feedback.
There is a lot of haptic feedback built into the FingerReader. It was designed with the visually impaired as the primary users, for times when Braille is not practical or simply unavailable. The FingerReader requires the wearer to touch the tip of their index finger to the printed page or digital screen and trace along the line. As the user does so, the FingerReader is busy calculating where lines of text begin and end, taking pictures of the words being traced, and converting them to text and then to speech. When the user reaches the end of a line of text or begins a new one, it vibrates to let them know. If a user’s finger begins to stray, the FingerReader can vibrate from different areas using its two motors, along with an audible tone, to alert them and help them find their place.
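The drift correction is the easiest part to picture in code. This toy sketch assumes the vision pipeline hands over the finger’s vertical offset from the text baseline and simply steers two motors from it; the pins, dead band, and stubbed offset function are all made up.

```cpp
// A toy version of the FingerReader's drift cue: given the finger's vertical
// offset from the text baseline, buzz one of two motors to nudge the user
// back onto the line. Pins, the dead band, and the stubbed offset source are
// assumptions; the real offset comes from the camera pipeline.
const int MOTOR_UP_PIN   = 5;  // buzzes when the finger drifts below the line
const int MOTOR_DOWN_PIN = 6;  // buzzes when the finger drifts above the line
const int DEAD_BAND_PX   = 10; // offset tolerated before any cue fires

int baselineOffsetPx() {
  return 0;  // placeholder: the real device estimates this from video frames
}

void setup() {}

void loop() {
  int off = baselineOffsetPx();
  analogWrite(MOTOR_UP_PIN,   off < -DEAD_BAND_PX ? 200 : 0);
  analogWrite(MOTOR_DOWN_PIN, off >  DEAD_BAND_PX ? 200 : 0);
  delay(30);
}
```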
The current prototype needs to be connected to a laptop, but the researchers are hoping to create a version that needs only a smartphone or tablet. The videos below show a demo of the FingerReader. For a proof of concept, we are very impressed; the FingerReader reads text of various fonts and sizes without a problem. While the project was designed primarily for the blind or visually impaired, the researchers acknowledge that it could be a great help to people with reading disabilities or serve as a learning aid for English. It could make a great on-the-go translator, too. We hope that [Roy] and his team continue working on the FingerReader. Along with the Lorm Glove, it has the potential to make a difference in many people’s lives. Considering our own lousy eyesight and our family medical history, we’ll probably need wearable tech like this in thirty years!
Continue reading “Trace Your Book or Kindle with the FingerReader”
If you’ve been killing time texting or chatting with your pals via smart phone, odds are pretty good that you’re not giving much thought to the two senses that make it happen: your sight and your hearing. Those who are deafblind, however, cannot participate in these activities, and for many of them the remote communication the rest of us enjoy with our phones simply isn’t possible. Enter the Design Research Lab at the Berlin University of the Arts, which has developed the Mobile Lorm Glove, a haptic device that enables two-way remote communication via smart phone.
For the deafblind, Lorm is a standard tactile communication technique: a series of hand-tracing gestures that map to characters of the alphabet. To send a message, the gloved user traces Lorm directly onto pressure-sensitive inputs on the palm of the hand. To receive one, small vibration motors on the back of the hand play the message back, encoded in Lorm.
Ordinarily, to communicate with the deafblind, we must first learn Lorm. With the Mobile Lorm Glove, however, we need only know how to send text messages; the Lorm encoding is handled by a look-up table running on the classic ATmega328 microcontroller. To the sharp-eyed, the back side of the glove seems limited in its ability to render continuous finger traces as discrete motor vibrations. However, with four shift registers and 32 levels of motor intensity, the designers drive the motors with a technique called the “funneling illusion,” in which continuous movement is simulated by gradually shifting intensity from one motor to the next. For more tricks and details, take a look at their conference paper.
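The funneling illusion is simple enough to sketch. Under assumed hardware (a short row of motors on PWM pins, rather than the glove’s shift registers), a virtual contact point cross-fades drive between the two motors that bracket it, and the skin reads the result as one vibration gliding along the hand:

```cpp
// A minimal sketch of the funneling illusion. A virtual contact point p
// sweeps along a row of vibration motors; the two motors bracketing p share
// the drive in complementary proportion, so the skin perceives one smoothly
// moving vibration rather than discrete buzzes. Pin numbers are made up;
// the real glove drives its motors through shift registers.
const int N_MOTORS = 4;
const int motorPin[N_MOTORS] = {3, 5, 6, 9};  // PWM-capable pins

// p ranges from 0.0 (first motor) to N_MOTORS - 1 (last motor)
void setContactPoint(float p) {
  for (int i = 0; i < N_MOTORS; i++) analogWrite(motorPin[i], 0);
  int lo = (int)p;
  float frac = p - lo;               // how far p sits between lo and lo + 1
  analogWrite(motorPin[lo], (int)(255 * (1.0 - frac)));
  if (lo + 1 < N_MOTORS)
    analogWrite(motorPin[lo + 1], (int)(255 * frac));
}

void setup() {}

void loop() {
  // Sweep the phantom vibration from one end of the row to the other
  for (float p = 0.0; p <= N_MOTORS - 1; p += 0.05) {
    setContactPoint(p);
    delay(20);
  }
}
```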
The glove makes everyday communication far easier with anyone who has a smart phone. We’re jazzed that just a Bluetooth module, an ATmega328, and a collection of pressure sensors and motors can let any cell phone user circumvent the learning curve and open up a new conversation.
Continue reading “Mobile Lorm Glove Puts Texting Back Into Everyone’s Hands”
If you’ve been keeping your skills fresh with any console video games in the last 15 years (or you’ve acquired a smartphone), you’ll know that “rumble,” or haptic feedback, plays a key role in augmenting our onscreen (or touch-screen typing) experience. Nevertheless, this sort of rumble feedback is surprisingly boolean, and it hasn’t grown any more precise since it was introduced to gaming over 30 years ago. In response, [Martin] and his fellow design teammates at the University of Salzburg, Austria have introduced the TorqueScreen, a mobile haptic attachment that puts a twist on conventional force feedback.
At its core, the TorqueScreen is a gyroscope attached to a servo, with the two rotational axes perpendicular to each other. When the servo tilts the spinning gyroscope, the user feels the resulting reaction torque as the tablet’s resistance to rotation about the servo’s axis.
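For a sense of scale: with perpendicular axes, the reaction torque from gimballing a spinning disc is tau = I × w_spin × w_gimbal. A quick back-of-the-envelope check with assumed hard-drive-platter numbers suggests the effect lands comfortably in the range fingers can feel:

```cpp
// Back-of-the-envelope check, with assumed numbers: the reaction torque from
// gimballing a spinning disc about a perpendicular axis is
// tau = I * w_spin * w_gimbal. A hard drive platter is small, but it spins
// fast, so the torque lands in a range fingers can plausibly feel.
#include <cstdio>

int main() {
  const double PI = 3.14159265;
  // Assumed platter: 95 mm diameter, ~20 g, treated as a solid disc, 7200 rpm
  const double m = 0.020, r = 0.0475;
  const double I = 0.5 * m * r * r;                // kg*m^2 for a solid disc
  const double w_spin = 7200.0 * 2.0 * PI / 60.0;  // spindle speed, rad/s
  const double w_gimbal = PI;                      // servo sweeping 180 deg/s

  printf("reaction torque: %.4f N*m\n", I * w_spin * w_gimbal);  // ~0.05 N*m
  return 0;
}
```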
The team’s conference demonstration model features a brushless motor plucked from an old hard drive, controlled by an Arduino and driven manually from a Wii nunchuck. Currently only one rotational axis resists changes in rotation, but a gimbal may be the next step in this project.
We’ve certainly seen budget handheld haptic devices before, but this project allows for an entire spread of responses proportional to the speed at which the gyroscope is rotated about the servo axis.
Unless you’re reading this post on your already-TorqueScreen-enabled mobile device, it’s a bit difficult to get a feel for what kind of interactions you can produce with this setup. The video (after the break), though, can give you a pretty good idea of the interactivity you’d expect with the device clipped onto your tablet.
[via the Tangible, Embedded, and Embodied Interaction Conference]
Continue reading “TorqueScreen: Haptic Feedback On-the-Go”
If you’ve been keeping up with augmented and virtual reality news, you’ll remember that spatial haptic feedback devices aren’t groundbreaking new technology. You’ll also remember, however, that a professional system is notoriously expensive: on the order of several thousand dollars. Grad students [Jonas], [Michael], and [Jordi] and their professor [Eva-Lotta] form the design team aiming to bridge that hefty price gap by providing a design that you can build at home.
A quick terminology dive: a spatial haptic device is a physical manipulator that enables exploration of a virtual space through force feedback. A user grips the “manipulandum” (the handle) and moves it within the work area defined by the physical design of the device. Spatial haptic devices have been around for years and serve as excellent tools for telling their users (surgeons, say) what something (a tumor) “feels like.”
In our case, the haptic device is a two-link, two-joint system grounded on a base station and providing force feedback with servo motors and tensioned wire ropes. The manipulator supports three-degree-of-freedom movement of the end effector (translations, but no rotations), which is tracked with encoders placed on all joints. To deliver feedback, the joints are driven through cable transmissions.
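The encoder-to-position math for the planar part of such a linkage is the standard two-link forward kinematics. The sketch below is a simplification with placeholder link lengths; the device’s real kinematics, including its third degree of freedom, are worked out in the paper linked below.

```cpp
// Illustrative forward kinematics for a simplified two-link planar arm:
// given the joint angles read from the encoders, where is the handle?
// Link lengths are placeholders; the device's real kinematics, including
// its third degree of freedom, are worked out in the paper.
#include <cmath>
#include <cstdio>

struct Point { double x, y; };

// theta1: shoulder angle from the base; theta2: elbow angle relative to
// the first link. Angles in radians, lengths in meters.
Point forwardKinematics(double theta1, double theta2,
                        double l1 = 0.20, double l2 = 0.20) {
  return { l1 * cos(theta1) + l2 * cos(theta1 + theta2),
           l1 * sin(theta1) + l2 * sin(theta1 + theta2) };
}

int main() {
  Point p = forwardKinematics(M_PI / 4.0, M_PI / 6.0);
  printf("end effector at (%.3f, %.3f) m\n", p.x, p.y);
  return 0;
}
```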
The design team isn’t new to iterative prototyping. Hailing from CS235, a Stanford course aimed at imparting prototyping techniques to otherwise non-tinkerers, the designers have drawn numerous techniques from this course to deliver a fully functional and reproducible setup. In fact, it’s clear that the designers have a strong understanding of their system’s physics, and they capitalize on a few tricks that don’t immediately jump out as intuitive. For instance, rather than rigidly fixing their cable to the motor shaft, they simply wrap the cable around the shaft a mere five turns, so that the capstan friction greatly exceeds the tension that would otherwise cause slipping. They also chose plywood, not necessarily for its price, but because as a stiff, layered composite it makes ideal “lever arm material” for rigidly transferring forces.
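That cable trick is worth putting a number on: the capstan equation says the tension ratio a wrapped cable can support grows exponentially with wrap angle, T_hold/T_load = e^(μθ). Even with a modest assumed friction coefficient, five wraps buy a huge margin against slip:

```cpp
// Why a few wraps hold: the capstan equation says the tension ratio a
// wrapped cable can support grows exponentially with wrap angle,
// T_hold / T_load = e^(mu * theta). The friction coefficient is an assumed
// value for wire rope on a smooth motor shaft.
#include <cmath>
#include <cstdio>

int main() {
  const double mu = 0.15;                 // assumed coefficient of friction
  for (int wraps = 1; wraps <= 5; wraps++) {
    double theta = wraps * 2.0 * M_PI;    // total wrap angle in radians
    printf("%d wrap(s): holds %.0fx the load-side tension\n",
           wraps, exp(mu * theta));       // 5 wraps: over 100x with mu = 0.15
  }
  return 0;
}
```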
For a full breakdown of their design, take a look at their conference paper (PDF), where they evaluate their design techniques and outline the forward kinematics. They’ve also provided a staggeringly comprehensive bill of materials (Google Spreadsheet). Finally, as proper open source hardware, they’ve packaged their control software and CAD models into a GitHub repository so that you too can jump into the world of quality force feedback simulation without shelling out twenty thousand dollars for a professional system.
[via the 2015 Tangible, Embedded, and Embodied Interaction Conference]
Continue reading “Open Source Haptics Kit Aims to Democratize Force Feedback”