Sign language is a language that uses the position and motion of the hands in place of sounds made by the vocal tract. If one could readily capture those hand positions and movements, one could theoretically digitize and translate that language. [ayooluwa98] built a set of sensor gloves to do just that.
The brains of the operation is an Arduino Nano. It’s hooked up to a series of flex sensors woven into the gloves, along with an accelerometer. The flex sensors detect the bending of the fingers and the gestures being made, while the accelerometer captures the movements of the hand. The Arduino interprets these sensor signals to match the user’s movements against a pre-stored list of valid signs. It then transmits the detected sign via a Bluetooth module to an Android phone, which speaks the translation using text-to-speech software.
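The matching step is simple enough to sketch in a few lines. This is a hypothetical illustration, not [ayooluwa98]’s actual firmware: it treats each sign as a stored vector of flex and accelerometer readings, picks the nearest one, and rejects anything too far away. All sensor values and the distance threshold are made up.

```python
# Hypothetical sketch: match a glove's sensor readings against stored sign
# templates with a nearest-neighbour search, as such firmware might do.
# Template values and the threshold are illustrative, not from the project.

def match_sign(reading, templates, max_distance=40.0):
    """Return the best-matching sign name, or None if nothing is close enough.

    reading   -- tuple of five flex values plus three accelerometer axes
    templates -- dict mapping sign name -> reference tuple of the same length
    """
    best_name, best_dist = None, float("inf")
    for name, ref in templates.items():
        # Euclidean distance over all sensor channels
        dist = sum((r - t) ** 2 for r, t in zip(reading, ref)) ** 0.5
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= max_distance else None

TEMPLATES = {
    "hello": (10, 12, 11, 10, 14, 0, 0, 90),
    "yes":   (80, 85, 82, 84, 80, 0, 45, 45),
}

print(match_sign((11, 13, 10, 9, 15, 1, 2, 88), TEMPLATES))  # near "hello"
```

The threshold is what keeps the glove from announcing a sign for every random hand position; tuning it is the real work.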
The idea of capturing sign language via hand tracking is a compelling one; we’ve seen similar projects before, too. Meanwhile, if you’re working on your own accessibility projects, be sure to drop us a line!
[Lucas VRTech] has made some significant progress building force-feedback haptic gloves for use with Steam VR games. The idea is pretty straightforward: the end of each finger is attached to a cable, which is pulled from inside a spring-loaded spool, the kind used for hanging ID cards on.
The spool body can rotate, but a peg protruding from it engages with the arm of a co-located servo motor. This produces a programmable stop position. But it is a hard stop: with the current hardware it is not possible to detect precisely when the stop is reached, nor to control the force pushing against it. Such features are not difficult to achieve; it’s just a matter of a little more development with some custom mechatronics.
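The servo-as-hard-stop idea boils down to one mapping: a desired finger-curl limit becomes a servo angle that positions the peg. A minimal sketch, with an assumed 0..180 degree servo travel rather than anything measured from the actual hardware:

```python
# Hypothetical sketch of the programmable hard stop: convert a desired
# finger-curl limit (0.0 = spool free, 1.0 = peg fully engaged) into a servo
# angle. The travel range is an assumption, not from [Lucas VRTech]'s build.

SERVO_MIN_DEG = 0     # peg out of the way, spool free to rotate
SERVO_MAX_DEG = 180   # peg blocks the spool at its current position

def curl_limit_to_servo_angle(limit):
    """Linearly map a 0..1 curl limit onto the servo's travel."""
    limit = max(0.0, min(1.0, limit))          # clamp out-of-range commands
    return SERVO_MIN_DEG + limit * (SERVO_MAX_DEG - SERVO_MIN_DEG)
```

Detecting contact with the stop, or modulating the force at it, would need extra sensing on top of this open-loop positioning, which is exactly the future work described above.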
The current prototype has a focus on cost, which makes it great as an early development platform. It leverages 3D printing and easy-to-source off-the-shelf parts: just a handful (chuckle!) of potentiometers, some servo motors, and any one of the many ESP32 dev boards, and you’re done. The real work is on the software side of things, as the games themselves need to be modified to play ball with the VR glove hardware. This has been achieved with a combination of a custom SteamVR driver called OpenGloves and community-developed per-game mods. A few titles are available to test right now, so this is definitely something some of us could build in a weekend and get involved with.
The hardware source for the glove mount and per-finger units can be found on the project GitHub, together with the ESP32 source for Arduino.
For some other haptic-related inspiration, here’s a force-feedback mouse, and for more hands-off feedback, we have a wind-blaster project.
Continue reading “Low Cost Haptic VR Gloves Work With Hacked Steam Games”
Virtual reality is a slow-moving field in some respects. While a lot of focus is put on optical technologies and headsets, there’s a lot more involved when it comes to believably placing a human being in a virtual environment. So far we’ve gotten a good start on the visuals and head tracking, but interaction technology still lags well behind. [Lucas] is working in just that area, iterating heavily on his homebrew VR gloves.
The glove consists of potentiometers, fitted with spools and attached by a string to the tip of each digit on the wearer’s hand. As the user curls their fingers, the potentiometers turn and the positions of the fingers can be measured. The potentiometers are all read by an Arduino, which communicates back to a PC via USB. A custom driver is then used to interact with Valve’s SteamVR software, allowing the glove to be used with a wide variety of existing software.
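The per-finger arithmetic is simple: calibrate each potentiometer at open hand and closed fist, then normalize. A sketch under assumed values, not taken from [Lucas]’s code:

```python
# Hypothetical sketch of the finger-tracking maths: convert a raw 10-bit ADC
# reading from each potentiometer into a 0..1 curl value using per-finger
# calibration (the open-hand and closed-fist readings). Numbers are made up.

def adc_to_curl(raw, open_raw, closed_raw):
    """Normalise a potentiometer reading to 0.0 (open) .. 1.0 (fully curled)."""
    span = closed_raw - open_raw
    if span == 0:
        return 0.0                      # uncalibrated finger, report open
    curl = (raw - open_raw) / span
    return max(0.0, min(1.0, curl))     # clamp sensor noise past the limits

# Example: index finger calibrated at 120 (open) and 880 (closed)
print(adc_to_curl(500, 120, 880))  # roughly half-curled
```

The clamp matters in practice, since string tension and spool slack push readings slightly past the calibration points.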
Thus far, the system merely tracks finger position, but the spool-and-string design is intended to support motors down the line for each finger to create resistance, so the user can gain the feeling of touching and interacting with virtual objects. The project has the potential to be a cheaper, more accessible alternative to current off-the-shelf solutions. We’ve seen other hand-tracking gloves before, too, though none that track the spread of a wearer’s hand as well as the finger extension. If you’re working on precisely that, please do drop us a line. Video after the break.
Continue reading “Virtual Reality Gloves Aim To Improve Interactivity”
We have to hand it to this team, their entry for the 2020 Hackaday Prize is a classic pincer maneuver. A team from [The University of Auckland] in New Zealand and [New Dexterity] is designing a couple of gloves for both rehabilitation and human augmentation. One style is a human-powered prosthetic for someone who has lost mobility in their hand. The other form uses soft robotics and Bluetooth control to move the thumb, fingers, and an extra thumb (!).
The human-powered exoskeleton places the user’s hand inside a cabled glove. Once it is in place, the wearer arches their shoulders and tightens an artificial tendon across their back, which pulls their hand closed. To pull the fingers evenly, there is a differential box which ensures pressure goes where it is needed, naturally. Once they’ve gripped firmly, the cables stay locked, and they can relax their shoulders. Another big stretch and the cords relax.
In the soft-robotic model, a glove is covered in inflatable bladders. One set spreads the fingers, a vital physical therapy movement. Another bladder acts as a second thumb for keeping objects centered in the palm. A cable system draws the fingers closed like the previous glove, but to lock them they evacuate air from the bladders, so jamming layers retain their shape, like food in a vacuum bag.
We are excited to see what other handy inventions appear in this year’s Hackaday Prize, like the thumbMouse, or how about more assistive tech that uses hoverboards to help move people?
Continue reading “Assistive Gloves Come In Pairs”
Many projects have aimed to replicate the function of the human hand, creating robotic structures that mimic real anatomy. Fewer have attempted to work with human hands directly. SoftGlove is a project by [france.bonde] that uses pneumatics to do just that.
The glove works by attaching a silicone pneumatic actuator to each digit of the human hand. These are created with 3D-printed molds, into which EcoFlex silicone is poured. A FlowIO device, which combines a microcontroller with pneumatic hardware, pumps air in and out of the actuators.
The goal of the project is to use a companion unit, in which a glove with flex sensors is used to make the SoftGlove mimic its movements. This would allow SoftGlove to move the fingers of a person with damaged muscle control, potentially aiding the muscles and nerves to recover when used in a therapeutic setting.
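The mirroring logic such a companion glove implies is a simple proportional map from measured curl to target inflation. The sketch below is purely illustrative: it is not the FlowIO API, and the pressure ceiling is an assumed number, not a figure from the project.

```python
# Hypothetical sketch of the mirroring logic: translate a flex-sensor curl
# value (0..1) from the companion glove into a target inflation pressure for
# the matching SoftGlove actuator. The safe ceiling is an assumed value.

MAX_PRESSURE_KPA = 30.0   # assumed safe limit for the silicone actuator

def curl_to_pressure(curl, max_pressure=MAX_PRESSURE_KPA):
    """Map a 0..1 curl value to a target pressure in kPa."""
    curl = max(0.0, min(1.0, curl))   # clamp sensor noise
    return curl * max_pressure
```

A real therapeutic controller would add rate limiting and a hard safety cutoff on top of this, since the actuators are pressing on an actual hand.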
It’s exciting to see typical maker technologies used to create better outcomes for patients, and we’re excited to see where this project leads next. It also has potential applications for robotic actuators. Programmable Air is another exciting project working in this space. And of course, if you’ve got a hot pneumatics project you’re cooking up in the garage, be sure to let us know!
[Miller] wanted to practice a bit with some wireless modules and wound up creating a robotic hand he could teleoperate with the help of a haptic glove. It looks highly reproducible, as you can see in the video below the break.
The glove uses an Arduino’s analog-to-digital converter to read some flex sensors. Commercial flex sensors are pretty expensive, so he experimented with some homemade ones. The versions using tin foil and graphite didn’t work well, but bent can metal worked better, despite not having good resolution.
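Whatever the sensor material, the usual readout is a voltage divider: a fixed resistor in series with the flex sensor, with the junction on an analog pin. A sketch of recovering the sensor’s resistance from a 10-bit reading, assuming high-side sensor wiring and an illustrative 10 kΩ fixed resistor:

```python
# Hypothetical sketch: recover a flex sensor's resistance from a 10-bit ADC
# reading, assuming a voltage divider with the sensor on the high side and a
# known fixed resistor to ground. Component values are illustrative.

ADC_MAX = 1023        # Arduino Uno/Nano 10-bit ADC full scale
R_FIXED = 10_000.0    # ohms, fixed divider resistor

def adc_to_resistance(adc, r_fixed=R_FIXED):
    """Invert Vout = Vcc * r_fixed / (r_sensor + r_fixed) for r_sensor."""
    if adc <= 0:
        return float("inf")             # no current flowing, open circuit
    return r_fixed * (ADC_MAX - adc) / adc
```

With a homemade sensor the absolute resistance hardly matters; what counts is a repeatable change between straight and bent, which is then calibrated away per finger.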
Continue reading “Haptic Glove Controls Robot Hand Wirelessly”
Controlling a single drone takes a considerable amount of concentration and normally involves wearing silly goggles. It only gets harder if you want to control a swarm. Researchers at the Skolkovo Institute of Science and Technology (Skoltech) decided Jedi mind tricks were the best way, and set up swarm control using hand gestures.
We’ve seen something similar at the Intel booth of the 2016 Maker Faire. In that demo, a single drone was controlled by hand gestures using a hacked Nintendo Power Glove. The Skoltech approach builds on that concept with a lot of innovation. For one, haptics in the fingertips of the glove provide feedback on the current behavior of the drones. Through their research, they found that most operators quickly learned to interpret the vibrations subconsciously.
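One plausible shape for that feedback channel, purely as an illustration (the thresholds are invented, not from the Skoltech paper): map each drone’s deviation from its commanded position onto a vibration duty cycle for the matching fingertip motor.

```python
# Hypothetical sketch of a haptic feedback mapping: a drone's position error
# becomes a 0..1 vibration duty cycle on one fingertip motor. The dead zone
# and saturation point are invented values for illustration only.

DEAD_ZONE_M = 0.05    # ignore tiny position errors
FULL_VIBE_M = 0.50    # error at which the motor runs flat out

def error_to_duty(error_m):
    """Return a 0..1 vibration duty cycle for one drone's position error."""
    if error_m <= DEAD_ZONE_M:
        return 0.0
    scale = (error_m - DEAD_ZONE_M) / (FULL_VIBE_M - DEAD_ZONE_M)
    return min(1.0, scale)
```

The dead zone is what lets operators tune the vibrations out of conscious attention, which is exactly the subconscious interpretation the researchers observed.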
It also increased the safety of the swarm, which is a prime factor in making these technologies usable outside of the lab. Most of us have at one point frantically typed commands into a terminal or pulled cords to keep a project from destroying itself or behaving dangerously. Having an intuitive control means that an operator can react quickly to changes in the swarm behavior.
The biggest advantage, which can be seen in the video after the break, is that the hand control eliminates much of the preprogramming of paths that is currently common in swarm robotics. With tech like this we can imagine a person quickly being trained on drone swarms and then using them to do things like construction surveys with ease. As an added bonus, the researchers were nice enough to post their paper to arXiv if any readers would like to get into the specifics.
Continue reading “Use Jedi Mind Tricks To Control Your Next Drone Swarm”