[Roy Shilkrot] and his fellow researchers at the MIT Media Lab have developed the FingerReader, a wearable device that aids in reading text. Worn on the index finger, it takes in printed or on-screen text and outputs spoken words – and it does this on-the-go. The FingerReader consists of a camera and sensors that detect the text. A series of algorithms the researchers created is used along with character recognition software to generate the resulting audio feedback.
There is a lot of haptic feedback built into the FingerReader. It was designed with the visually impaired as the primary users, for times when Braille is not practical or simply unavailable. The FingerReader requires the wearer to touch the tip of their index finger to the printed page or digital screen, tracing along the line. As the user does so, the FingerReader is busy calculating where lines of text begin and end, taking pictures of the words being traced, and converting them to text and then to spoken words. As the user reaches the end of a line of text or begins a new line, it vibrates to let them know. If a user’s finger begins to stray, the FingerReader can vibrate from different areas using its two motors, along with an audible tone, to alert them and help them find their place.
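The drift-correction behavior could be sketched like this. To be clear, this is a hypothetical reconstruction, not the MIT Media Lab code: the function name, the pixel threshold, and the exact motor assignments are all our assumptions; only the two-motors-plus-tone scheme comes from the project description.

```python
# Hypothetical sketch of the FingerReader's drift feedback.
# Given the fingertip's vertical offset (in pixels) from the baseline
# of the line being traced, decide which motor buzzes and whether the
# audible tone sounds. Threshold and polarity are assumed values.

def drift_feedback(offset_px, threshold_px=12):
    """Return (top_motor, bottom_motor, tone) flags."""
    if offset_px > threshold_px:       # finger drifted above the line
        return (True, False, True)
    if offset_px < -threshold_px:      # finger drifted below the line
        return (False, True, True)
    return (False, False, False)       # on the line: stay quiet
```

The same two motors presumably double as the end-of-line and new-line signals mentioned above, just with a different vibration pattern.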
The current prototype needs to be connected to a laptop, but the researchers are hoping to create a version that only needs a smartphone or tablet. The videos below show a demo of the FingerReader. For a proof-of-concept, we are very impressed. The FingerReader reads text of various fonts and sizes without a problem. While the project was designed primarily for the blind or visually impaired, the researchers acknowledge that it could be a great help to people with reading disabilities or as a learning aid for English. It could make a great on-the-go translator, too. We hope that [Roy] and his team continue working on the FingerReader. Along with the Lorm Glove, it has the potential to make a difference in many people’s lives. Considering our own lousy eyesight and our family’s medical history, we’ll probably need wearable tech like this in thirty years!
Continue reading “Trace Your Book or Kindle with the FingerReader”
If you’ve been killing time texting or chatting with your pals via smartphone, odds are pretty good that you’re not giving much thought to the two senses that make it happen: your sight and your hearing. Those who are deafblind, however, cannot participate in these activities; and for many, the remote communication that most of us enjoy with our phones simply isn’t possible. Enter the Design Research Lab at the Berlin University of the Arts. Here, they’ve developed the Mobile Lorm Glove, a haptics device that enables two-way remote communication via smartphone.
For the deafblind, Lorm is a tactile technique for communication: a series of hand-tracing gestures that map to characters of the alphabet. To send a message, the gloved user traces Lorm directly onto the pressure-sensitive inputs on the palm of the glove. To receive messages, small vibration motors on the back of the hand buzz out the message encoded in Lorm.
Ordinarily, to communicate with the deafblind, we must first learn Lorm. With the Mobile Lorm Glove, however, we need only know how to send text messages; the Lorm decoding is handled by a look-up table running on a classic ATmega328 microcontroller. Sharp-eyed readers will notice that the back side of the glove seems limited in its ability to transcribe continuous finger traces into discrete motor vibrations. However, with four shift registers and 32 motor intensity levels, the designers address each motor with a technique called the “funneling illusion,” in which continuous movement is simulated by gradually shifting intensity from motor to motor. For more tricks and details, take a look at their conference paper.
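The funneling illusion boils down to splitting intensity between the two motors nearest a virtual contact point. Here is a minimal sketch of that idea; the 32 intensity levels come from the write-up, but the four-motor row, the linear crossfade, and the function itself are our assumptions, not the glove’s firmware.

```python
# Minimal sketch of the "funneling illusion": render a continuous
# position along a row of motors by crossfading intensity between
# the two nearest motors. 32 levels per the article; the linear
# crossfade and four-motor row are assumptions.

def funnel(position, n_motors=4, levels=32):
    """Map position in [0.0, 1.0] along the motor row to a list of
    per-motor intensity levels (0 .. levels-1)."""
    x = position * (n_motors - 1)    # continuous motor coordinate
    lo = int(x)                      # nearest motor at or below x
    frac = x - lo
    out = [0] * n_motors
    out[lo] = round((1 - frac) * (levels - 1))
    if lo + 1 < n_motors:
        out[lo + 1] = round(frac * (levels - 1))
    return out
```

Sweeping `position` from 0 to 1 over time would make the sensation appear to glide smoothly across the skin even though only discrete motors exist.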
With the glove, everyday communication becomes far easier with anyone carrying a smartphone. We’re jazzed that just a Bluetooth module, an ATmega328, and a collection of pressure sensors and motors can enable any cell phone user to circumvent the learning curve and open up a new conversation.
Continue reading “Mobile Lorm Glove Puts Texting Back Into Everyone’s Hands”
If you’ve been keeping your skills fresh with any console video games in the last 15 years (or you’ve acquired a smartphone), you’ll know that “rumble,” or haptic, feedback plays a key role in augmenting our onscreen (or touch-screen typing) experience. Nevertheless, this sort of rumble feedback is surprisingly boolean, and it hasn’t grown much more precise since it was first introduced to gaming over 30 years ago. In response, [Martin] and his fellow design teammates at the University of Salzburg, Austria have introduced the TorqueScreen, a mobile haptic attachment that puts a twist on conventional force feedback.
At its core, the TorqueScreen is a gyroscope attached to a servo, with their rotational axes perpendicular to each other. When the servo rotates the spinning gyroscope, the user can feel the tablet resist rotation about the servo’s axis.
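The physics here is classic gyroscopic precession: rotating a spinning flywheel’s axis at angular rate ω_servo produces a reaction torque τ = I·ω_spin·ω_servo. A back-of-the-envelope estimate is below; every number in it is illustrative, not a measurement from the TorqueScreen paper.

```python
import math

# Back-of-the-envelope gyroscopic reaction torque, assuming the
# flywheel is a small solid disc. All parameter values passed in are
# illustrative guesses, not figures from the TorqueScreen paper.

def gyro_torque(mass_kg, radius_m, spin_rpm, servo_rad_s):
    """Torque (N*m) felt when the servo tilts the spinning disc."""
    inertia = 0.5 * mass_kg * radius_m ** 2   # solid disc: I = m*r^2 / 2
    w_spin = spin_rpm * 2 * math.pi / 60      # rpm -> rad/s
    return inertia * w_spin * servo_rad_s
```

For a 100 g, 3 cm-radius disc spun at hard-drive speeds (6000 RPM) and tilted at 2 rad/s, that works out to a few hundredths of a newton-meter — small, but plenty to feel through a handheld tablet.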
The team’s conference demonstration model features a brushless motor plucked from an old hard drive, controlled by an Arduino and driven manually with a Wii Nunchuk. Currently only one rotational axis resists changes in rotation, but a gimbal may be the next step in this project.
We’ve certainly seen budget handheld haptic devices before, but this project allows for an entire spread of responses proportional to the speed at which the gyroscope is rotated about the servo axis.
Unless you’re reading this post on your already-TorqueScreen-enabled mobile device, it’s a bit difficult to get a feel for what kind of interactions you can produce with this setup. The video (after the break), though, can give you a pretty good idea of what kind of interactivity you’d expect with the device clipped onto your tablet.
[via the Tangible, Embedded, and Embodied Interaction Conference]
Continue reading “TorqueScreen: Haptic Feedback On-the-Go”
If you’ve been keeping up with augmented and virtual reality news, you’ll remember that spatial haptic feedback devices aren’t groundbreaking new technology. You’ll also remember, however, that a professional system is notoriously expensive, on the order of several thousand dollars. Grad students [Jonas], [Michael], and [Jordi] and their professor [Eva-Lotta] form the design team aiming to bridge that hefty price gap by providing a design that you can build at home.
A quick terminology dive: a spatial haptic device is a physical manipulator that enables exploration of a virtual space through force feedback. A user grips the “manipulandum” (the handle) and moves it within the work area defined by the physical design of the device. Spatial haptic devices have been around for years and serve as excellent tools for telling their users (surgeons) what something (a tumor) “feels like.”
In our case, this haptic device is a two-link, two-joint system grounded on a base station and providing force feedback with servo motors and tensioned wire ropes. The manipulator itself supports 3-degree-of-freedom movement of the end-effector (translations, but no rotations) which is tracked with encoders placed on all joints. To enable feedback, joints are engaged with cable-drive transmissions.
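With encoders on every joint, the end-effector position falls out of the forward kinematics. Here is the planar two-link part of that computation as a sketch; the link lengths are made-up placeholders, and the real device adds a third axis for its full 3-DoF workspace, which the team’s paper covers properly.

```python
import math

# Planar forward kinematics for a two-link arm: given the two
# joint-encoder angles, compute the end-effector position.
# Link lengths are illustrative placeholders; the actual device
# adds a third rotation for full 3-DoF motion.

def forward_kinematics(theta1, theta2, l1=0.2, l2=0.2):
    """End-effector (x, y) in meters for joint angles in radians."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y
```

Running this every control cycle gives the position fed into the force-feedback simulation; the Jacobian of the same equations maps desired end-effector forces back to motor torques.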
The design team isn’t new to iterative prototyping. Hailing from CS235, a Stanford course aimed at imparting prototyping techniques to otherwise non-tinkerers, the designers have drawn numerous techniques from this course to deliver a fully functional and reproducible setup. In fact, it’s clear that the designers have a strong understanding of their system’s physics, and they capitalize on a few tricks that don’t immediately strike us as intuitive. For instance, rather than rigidly fixing their cable to the motor shaft, they simply wrap the cable around the shaft a mere five turns, such that the force of friction greatly exceeds the threshold that would otherwise cause slipping. They also choose plywood, not necessarily because of its price, but because it’s a stiff, layered composite that makes ideal “lever arm material” for rigidly transferring forces.
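The reason a few bare wraps hold so well is the capstan equation: the tension ratio a wrapped cable can support grows exponentially with wrap angle, T_load/T_hold = e^(μθ). A quick sanity check, with an assumed friction coefficient (the paper's actual μ isn't quoted here):

```python
import math

# Capstan equation sanity check: the maximum tension ratio a cable
# wrapped around a shaft can support is exp(mu * theta), where theta
# is the total wrap angle. The friction coefficient is an assumed
# value, not one taken from the team's paper.

def capstan_ratio(mu, turns):
    theta = 2 * math.pi * turns      # total wrap angle in radians
    return math.exp(mu * theta)
```

Even a modest μ of 0.2 over five full turns gives a ratio above 500, so the drive side of the cable would have to pull five-hundred-odd times harder than the slack side before the wrap slips — effectively never in this application.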
For a full breakdown of their design, take a look at their conference paper (PDF), where they evaluate their design techniques and outline the forward kinematics. They’ve also provided a staggeringly comprehensive bill of materials (Google Spreadsheet). Finally, as true open source hardware, they’ve packaged their control software and CAD models into a GitHub repository so that you too can jump into the world of quality force feedback simulation without shelling out twenty thousand dollars for a professional system.
[via the 2015 Tangible Embedded and Embodied Conference]
Continue reading “Open Source Haptics Kit Aims to Democratize Force Feedback”
In June of 2014, [Afrdt] spent two weeks on a boat as an artist-in-residence in Linz, Austria. During that time, she created a dress that detects EMF waves and outputs them to vibration motors and a headphone jack.
[Afrdt] started by making two EMF coil antennas and sewing them to cuffs that snap together. She crafted fashionable fabric stripes that both conceal and carry the cables from the coils to an Adafruit FLORA sewn into the body of the dress. The wearer experiences haptic feedback via vibration motors in the chest, and sonic feedback from a mini female headphone jack built into the collar. The zipper functions as a low-pass filter and volume control for the jack. One side bears resistive tape and runs to the FLORA, which is programmed to play an 800 Hz tone. The other side runs to the headphone jack via conductive thread. As the zipper is opened, the pitch increases toward a maximum of 880 Hz.
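One plausible reading of the zipper control is a potentiometer-style sweep: the resistive tape’s exposed length changes with zipper position, and the firmware maps that to a tone. The 800 Hz and 880 Hz endpoints come from the project description; the linear mapping and the function below are purely our guess at the behavior, not [Afrdt]’s code.

```python
# Guessed zipper-to-pitch mapping for the EMF dress: treat the zipper
# as a potentiometer whose position sweeps the tone linearly from
# 800 Hz (closed) to 880 Hz (fully open). The endpoints are from the
# article; the linear mapping is an assumption.

def zipper_pitch(position):
    """position in [0.0, 1.0]: 0 = zipped shut, 1 = fully open."""
    return 800.0 + 80.0 * position
```

Musically, that 800-to-880 Hz sweep spans a bit under a whole tone, so opening the dress’s zipper bends the drone noticeably but not wildly.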
She drew inspiration for this project from [Aaron Alai]’s EMF detector project and built the code on top of it. Broader documentation and many more pictures are available both at [Afrdt]’s site and the residency program’s site.
This project is an official entry to The Hackaday Prize that sadly didn’t make the quarterfinal selection. It’s still a great project, and worthy of a Hackaday post on its own.
For his entry to The Hackaday Prize, [Sean] is building a haptic vest for gamers and the visually impaired. It’s exactly what you think it is: a vest with proximity sensors and motors that wrap around the wearer, providing haptic feedback of nearby obstacles. Actually building a vest with a few dozen motors is a bit of a challenge, and that’s why this project is in the running for The Hackaday Prize.
Each of the 48 motors is individually controllable with PWM. In any other project, this would require a few dozen microcontrollers or one with a whole lot of pins. [Sean], however, is using LED drivers. They do exactly what [Sean] needs them to do, providing an easy-to-interface way to get a whole bunch of PWM lines, and they do it cheaper than any other solution.
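One way 48 channels could fall out of LED drivers is three daisy-chained 16-channel chips (the TLC5940 is a common hobbyist pick, though the article doesn’t name the part [Sean] used). The addressing then reduces to a trivial index split:

```python
# Hypothetical channel mapping for 48 motors on daisy-chained
# 16-channel LED driver chips. The three-chip, 16-channel layout is
# an assumption; the article doesn't specify [Sean]'s exact part.

def motor_channel(motor, channels_per_chip=16):
    """Map a motor index (0-47) to a (chip, channel) pair."""
    return divmod(motor, channels_per_chip)
```

The microcontroller then just shifts 48 duty-cycle values down the chain each update, instead of wiggling 48 of its own pins.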
For detecting objects surrounding the vest, [Sean] is using the depth sensor on a 1st gen Microsoft Kinect. In testing, [Sean] blindfolded a volunteer and had a few friends move around with cardboard ‘obstacles.’ The volunteer successfully avoided all the obstacles, as seen in the video below.
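The Kinect side could work by carving each depth frame into a grid matching the motor layout and driving each motor harder the closer the nearest obstacle in its cell. This is a sketch of that general scheme, not [Sean]’s actual pipeline; the distance range and the linear ramp are assumed values.

```python
# Sketch of a depth-to-vibration mapping for the haptic vest,
# assuming a simple scheme (not necessarily [Sean]'s): the nearest
# depth reading in a grid cell drives that cell's motor, closer
# obstacles buzzing harder. Range limits and the linear ramp are
# assumptions.

def depth_to_intensity(depth_mm, min_mm=500, max_mm=3000):
    """Map a cell's nearest obstacle depth to a 0-255 PWM value."""
    if depth_mm >= max_mm:
        return 0                     # too far away to matter
    if depth_mm <= min_mm:
        return 255                   # close: full-strength buzz
    return round(255 * (max_mm - depth_mm) / (max_mm - min_mm))
```

Applied per cell across, say, an 8x6 grid, this yields exactly 48 intensity values per frame, one for each motor on the vest.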
The project featured in this post is a quarterfinalist in The Hackaday Prize.
Continue reading “THP Semifinalist: A Haptic Vest With 48 Vibration Motors”
The people at Two Bit Circus are at it again; this time with a futuristic racing simulator where the user controls the experience. It was developed by [Brent Bushnell] and [Eric Gradman] along with a handful of engineers and designers in Los Angeles, California. The immersive gaming chair utilized an actual racing seat in the design, and foot pedals were added to give the driver more of a feeling of actually being in a real race. Cooling fans were placed on top for haptic feedback, and a Microsoft Kinect was integrated into the system as well to detect hand gestures controlling what was placed on the various screens.
The team completed the project within thirty days during a challenge from Best Buy, who wanted to see if they could create the future of viewing experiences. Problems surfaced throughout the time frame, creating obstacles around video cards, monitors, and shipping dates. They got it done, though, and are looking toward integrating their work into restaurants like Dave & Buster’s and other venues like arcades and bars (at least that’s the rumor going around town). The five-part mini-series that was produced around this device can be seen after the break:
Continue reading “Custom Racing Chair with a Kinect and Haptic Feedback”