Pneumatic Glove For Therapy And Experimentation

Many projects have aimed to replicate the function of the human hand, creating robotic structures that mimic real anatomy. Fewer have attempted to work with human hands directly. SoftGlove is a project by [france.bonde] that uses pneumatics to do just that.

The glove works by attaching a silicone pneumatic actuator to each digit of the wearer’s hand. The actuators are cast in 3D printed molds, into which EcoFlex silicone is poured. A FlowIO device, which combines a microcontroller with pneumatic hardware, runs the system by pumping air in and out of the actuators.

The goal of the project is to pair SoftGlove with a companion glove fitted with flex sensors, so that SoftGlove mimics the companion glove’s movements. This would allow SoftGlove to move the fingers of a person with impaired muscle control, potentially helping the muscles and nerves recover when used in a therapeutic setting.
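The FlowIO firmware and the companion glove aren’t detailed here, but the core mimicking loop is conceptually simple. A rough Arduino-style sketch of the idea follows, where the flex-sensor pin, pump and valve pins, and thresholds are all hypothetical placeholders rather than anything from the actual project:

```cpp
// Hypothetical sketch: read a flex sensor on the companion glove and
// inflate or vent one SoftGlove finger actuator to match. Pin numbers,
// thresholds, and the pump/valve wiring are illustrative only -- the real
// project drives the pneumatics through a FlowIO module.
const int FLEX_PIN  = A0;  // voltage divider with the flex sensor
const int PUMP_PIN  = 5;   // MOSFET driving the air pump
const int VALVE_PIN = 6;   // solenoid valve venting the actuator

void setup() {
  pinMode(PUMP_PIN, OUTPUT);
  pinMode(VALVE_PIN, OUTPUT);
}

void loop() {
  int flex = analogRead(FLEX_PIN);          // 0-1023, rises as the finger bends
  int bend = map(flex, 200, 800, 0, 100);   // crude "how bent" percentage
  bend = constrain(bend, 0, 100);

  if (bend > 60) {              // finger bent: inflate the actuator
    digitalWrite(VALVE_PIN, LOW);
    digitalWrite(PUMP_PIN, HIGH);
  } else if (bend < 40) {       // finger straight: vent the actuator
    digitalWrite(PUMP_PIN, LOW);
    digitalWrite(VALVE_PIN, HIGH);
  } else {                      // in between: hold current pressure
    digitalWrite(PUMP_PIN, LOW);
    digitalWrite(VALVE_PIN, LOW);
  }
  delay(20);
}
```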

It’s great to see typical maker technologies used to create better outcomes for patients, and we’re excited to see where this project leads next. It also has potential applications for robotic actuators. Programmable Air is another interesting project working in this space. And of course, if you’ve got a hot pneumatics project you’re cooking up in the garage, be sure to let us know!

Haptic Glove Controls Robot Hand Wirelessly

[Miller] wanted to practice a bit with some wireless modules and wound up creating a robotic hand he could teleoperate with the help of a haptic glove. It looks highly reproducible, as you can see in the video below the break.

The glove uses an Arduino’s analog-to-digital converter to read a set of flex sensors. Commercial flex sensors are pretty expensive, so he experimented with some homemade ones. The sensors made with tin foil and graphite didn’t work well, but ones made from bent can metal worked better, despite not having great resolution.
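For reference, reading a flex sensor (homemade or otherwise) is just a voltage divider on an analog pin. Here’s a minimal sketch of that idea, driving a hobby servo as a stand-in for one robot finger; the pin assignments and calibration numbers are assumptions, not [Miller]’s actual values, and his project transmits the readings over a wireless link instead of driving the servo locally:

```cpp
// Minimal sketch: flex sensor in a voltage divider on A0, hobby servo on
// pin 9 standing in for one robot finger. Calibration values are
// placeholders -- homemade sensors especially will need their own.
#include <Servo.h>

const int FLEX_PIN  = A0;
const int SERVO_PIN = 9;
const int FLEX_STRAIGHT = 300;  // ADC reading with the finger straight
const int FLEX_BENT     = 700;  // ADC reading with the finger fully bent

Servo finger;

void setup() {
  finger.attach(SERVO_PIN);
}

void loop() {
  int raw = analogRead(FLEX_PIN);
  int angle = map(raw, FLEX_STRAIGHT, FLEX_BENT, 0, 180);
  finger.write(constrain(angle, 0, 180));
  delay(15);
}
```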

Continue reading “Haptic Glove Controls Robot Hand Wirelessly”

Use Jedi Mind Tricks To Control Your Next Drone Swarm

Controlling a single drone takes a considerable amount of concentration and normally involves wearing silly goggles. It only gets harder if you want to control a swarm. Researchers at the Skolkovo Institute of Science and Technology decided Jedi mind tricks were the best way forward, and set up swarm control using hand gestures.

We’ve seen something similar at the Intel booth at the 2016 Maker Faire. In that demo, a single drone was controlled by hand gestures using a hacked Nintendo Power Glove. The Skoltech approach builds on that concept with plenty of innovation. For one, haptics in the fingertips of the glove provide feedback on the current behavior of the drones. Through their research, they found that most operators quickly learned to interpret the vibrations subconsciously.
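The paper has the specifics of how swarm state is mapped onto vibration, but conceptually the feedback side boils down to scaling some quantity from the drones into motor PWM. A hypothetical fingertip-feedback loop might look like the following, where the pin, the distance-to-intensity mapping, and the telemetry stub are ours rather than Skoltech’s:

```cpp
// Hypothetical haptic feedback: scale how close the nearest drone is to an
// obstacle (or to its neighbours) into fingertip vibration intensity.
// getNearestDistanceCm() is a stub standing in for whatever telemetry link
// the real glove uses; the mapping and pin are illustrative only.
const int MOTOR_PIN = 3;  // PWM pin driving a small vibration motor

float getNearestDistanceCm() {
  return 150.0f;  // stub: replace with real swarm telemetry
}

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  float d = getNearestDistanceCm();
  // Closer than 200 cm: vibrate, ramping up to full strength at 20 cm.
  int duty = (d < 200.0f) ? map((long)d, 20, 200, 255, 0) : 0;
  analogWrite(MOTOR_PIN, constrain(duty, 0, 255));
  delay(50);
}
```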

It also increases the safety of the swarm, which is a prime factor in making these technologies usable outside of the lab. Most of us have at one point frantically typed commands into a terminal or pulled cords to keep a project from destroying itself or behaving dangerously. Having an intuitive control scheme means that an operator can react quickly to changes in the swarm’s behavior.

The biggest advantage, which can be seen in the video after the break, is that hand control eliminates much of the path preprogramming that is currently common in swarm robotics. With tech like this we can imagine a person quickly being trained on drone swarms and then using them to do things like construction surveys with ease. As an added bonus, the researchers were nice enough to post their paper on arXiv if any readers would like to get into the specifics.

Continue reading “Use Jedi Mind Tricks To Control Your Next Drone Swarm”

Lighting The Way For The Visually Impaired

The latest creation from Bengali roboticist [nabilphysics] might sound familiar. His laser-augmented glove gives users the ability to detect objects horizontally in front of them, much like the cane or pole used by the visually impaired to navigate a physical space.

As a stand-in for the physical cane, he uses the VL53L0X time-of-flight (TOF) sensor, which measures the time it takes a laser pulse to bounce off an object and return to the sensor. These are much more accurate than IR distance sensors and have a much finer focus than ultrasonic sensors, giving excellent directionality.

While these sensors can suffer interference from background light or from other time-of-flight sensors, their main advantages are speed (a single shot is enough to compute the distance to whatever is in front of the sensor) and a simple distance calculation. In contrast to stereo vision, which requires complex correlation algorithms, extracting distance from a time-of-flight measurement is entirely direct and requires very little processing power.

The glove delivers haptic feedback to let the user know when an object is in their way. The feedback is controlled by an Arduino Pro Mini powered by a LiPo battery, with code uploaded via an FTDI adapter. The sketch takes continuous readings from the time-of-flight sensor and checks whether the object in front is within 450 millimeters of the glove, at which point it triggers the vibration motor to alert the user of the object’s presence.
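The logic is simple enough that the whole loop fits in a few lines. Here’s a minimal sketch of that behavior using Adafruit’s VL53L0X library; the motor pin and the exact structure are our assumptions, not necessarily [nabilphysics]’s code:

```cpp
// Minimal sketch: poll a VL53L0X over I2C and buzz a vibration motor
// whenever something is closer than 450 mm. Motor pin is illustrative.
#include <Wire.h>
#include "Adafruit_VL53L0X.h"

Adafruit_VL53L0X lox = Adafruit_VL53L0X();
const int MOTOR_PIN = 3;
const int THRESHOLD_MM = 450;

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
  lox.begin();  // returns false if the sensor isn't found on the I2C bus
}

void loop() {
  VL53L0X_RangingMeasurementData_t measure;
  lox.rangingTest(&measure, false);  // pass true for debug output

  // RangeStatus 4 means "out of range"; anything else is a valid reading.
  bool objectClose = (measure.RangeStatus != 4) &&
                     (measure.RangeMilliMeter < THRESHOLD_MM);
  digitalWrite(MOTOR_PIN, objectClose ? HIGH : LOW);
  delay(50);
}
```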

Since the glove used for the project is a bicycle glove, the form factor is straightforward — the Arduino, motor, battery, and switch are all located inside a plastic box on the top of the glove, while the time-of-flight sensor sticks out to make continuous measurements when the glove is switched on.

In general, the setup is fairly simple, but the choice of a time-of-flight sensor rather than an IR or sonar sensor is interesting. In the broader world of distance sensing, LIDAR is already the de facto sensor for autonomous vehicles and robots. Its three-dimensional data wouldn’t be much use here, though, and this sensor gets by without any mechanical moving parts since it doesn’t rely on the point-by-point laser scanning that LIDAR systems use.

Punch The World With A Raspberry Pi

Robots have certainly made the world a better place. Virtually everything from automobile assembly to food production uses a robot at some point in the process, not to mention the robots that can clean your house or make your morning coffee. But not every robot needs such a productive purpose. This one allows you to punch the world, which, while not producing as much physical value as a welding robot on an assembly line might, certainly seems to have some therapeutic effects at least.

The IoT Planet Puncher comes to us from [8BitsAndAByte] who build lots of different things of equally dubious function. This one allows us to release our frustration on the world by punching it (or rather, a small model of it). A small painted sphere sits in front of a 3D-printed boxing glove mounted on a linear actuator. The linear actuator is driven by a Raspberry Pi. The Pi’s job doesn’t end there, though, as the project also uses a Pi camera to take video of the globe and serve it on a webpage through which anyone can control the punching glove.

While the project isn’t immediately useful, we certainly had fun punching it a few times, and at one point a mysterious hand even entered the shot to make adjustments to the system. Projects like this are good fun, and sometimes you just need to build something, even if it’s goofy, because the urge strikes you. Continue reading “Punch The World With A Raspberry Pi”

This Machine Teaches Sign Language

Sign language, like any language, can be difficult to learn if you’re not immersed in it, or at least learning from someone who is fluent. It’s not easy to know when you’re making minor mistakes or missing nuances. It’s a medium with its own unique challenges for learners, so if you want to learn and don’t have access to someone who knows the language, you might want to reach for the next best thing: a machine that can teach you.

This project comes from three of [Bruce Land]’s senior electrical and computer engineering students, [Alicia], [Raul], and [Kerry], as part of their final design class at Cornell University. Someone who wishes to learn the sign language alphabet slips on a glove outfitted with position sensors for each finger. A computer inside the device shows each letter’s proper sign on a screen, then checks the sensors on the glove to ensure that the hand is in the proper position. Two letters also require a motion, which the device tracks using a gyroscope and compass to ensure that the letter has been properly signed. It appears to cover only the alphabet and not a wider vocabulary, but as a proof of concept it is very effective.
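The checking step boils down to comparing each finger’s sensor reading against a stored template for the target letter, within some tolerance. A simplified sketch of that idea follows; the sensor layout, template values, and tolerance are hypothetical, not the students’ actual firmware:

```cpp
// Simplified position check: compare five flex-sensor readings against a
// per-letter template. Pins and values are illustrative only.
const int FINGER_PINS[5] = {A0, A1, A2, A3, A4};
const int TOLERANCE = 60;  // allowed ADC deviation per finger

// Template for the letter 'A' (fist with thumb alongside) -- made-up numbers.
const int LETTER_A[5] = {820, 830, 815, 840, 300};

bool matchesLetter(const int target[5]) {
  for (int i = 0; i < 5; i++) {
    int reading = analogRead(FINGER_PINS[i]);
    if (abs(reading - target[i]) > TOLERANCE) {
      return false;  // this finger is out of position
    }
  }
  return true;
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (matchesLetter(LETTER_A)) {
    Serial.println("Letter A signed correctly!");
  }
  delay(100);
}
```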

The students show that it is entirely possible to learn the alphabet reliably using the machine as a teaching tool. This type of technology could be useful for other applications as well, such as gesture recognition for a human interface device. If you want to see more of these interesting and well-referenced senior design builds we’ve featured quite a few, from polygraph machines to a sonar system for a bicycle.

Continue reading “This Machine Teaches Sign Language”

Looks Like A Glove, Plays Like A Musical Instrument

The GePS is a musical project that shows how important integration work is when it comes to gesture controls. Creators [Cedric Spindler] and [Frederic Robinson] demonstrate how the output of a hand-mounted IMU (Inertial Measurement Unit) and magnetometer can be used to turn motion, gestures, and quick snap movements into musical output. The GePS is designed to have enough repeatability and low enough latency that feedback is practically immediate. As a result, it can be used and played like any other musical instrument that creates sound from physical movements in a predictable way. It’s not unlike a Theremin in that way, but much more configurable.

To do this, [Cedric] and [Frederic] built GePS around a CurieNano board (based on Intel’s Curie, which has the IMU on board) and an XBee radio that provides a wireless link to software running on a computer, from which the sounds are played. The device’s sensitivity and low lag mean that even small movements can be reliably captured, so the kind of fluid and complex movements that hands make every day can be used as the basis for playing sounds with immediate feedback. In a very real sense, the glove-based GePS is an experimental new kind of instrument, which makes it a fascinating contender for the Musical Instrument Challenge portion of the 2018 Hackaday Prize.
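On the glove side, the data path is essentially “read the IMU, stream it over the radio, let the computer turn it into sound.” A stripped-down sketch of that loop is shown below; the CurieIMU library is the standard one for Curie-based boards, but the packet format, UART wiring to the XBee, and 100 Hz rate are our guesses rather than the actual GePS firmware:

```cpp
// Stripped-down glove firmware sketch for a Curie-based board: read the
// on-board IMU and stream readings out over a serial-connected XBee running
// in transparent mode. The CSV packet format and update rate are assumptions;
// in GePS, the mapping from motion to sound happens on the computer.
#include "CurieIMU.h"

void setup() {
  Serial1.begin(57600);               // hardware UART assumed wired to the XBee
  CurieIMU.begin();
  CurieIMU.setAccelerometerRange(4);  // +/- 4 g
  CurieIMU.setGyroRange(500);         // +/- 500 deg/s
}

void loop() {
  int ax, ay, az, gx, gy, gz;
  CurieIMU.readMotionSensor(ax, ay, az, gx, gy, gz);

  // One comma-separated line per sample; the host parses it and maps motion
  // features (tilt, snap acceleration, etc.) to sound parameters.
  Serial1.print(ax); Serial1.print(',');
  Serial1.print(ay); Serial1.print(',');
  Serial1.print(az); Serial1.print(',');
  Serial1.print(gx); Serial1.print(',');
  Serial1.print(gy); Serial1.print(',');
  Serial1.println(gz);

  delay(10);  // roughly 100 Hz update rate
}
```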