The whole idea behind virtual reality is that you don’t really know what’s going on in the world around you. You only know what your senses tell you is there. If you can fake out your vision, for example, then your brain won’t realize you are floating in a tank providing power for the robot hordes. However, scientists in Japan think you can even fool your feet into thinking they are walking when they aren’t. In a recent paper, they describe an experiment that combined audio cues with buzzing on different parts of the feet to simulate the feel of walking.
The trick only requires four transducers, two on each foot. They tested several different configurations of how the effect was presented in the participant’s virtual reality headgear. Tests performed from a third-person perspective didn’t cause subjects to associate the foot vibrations with walking, but a first-person perspective did produce the sensation of walking, with a full-body avatar working best compared to showing just hands and feet, or no avatar at all.
Making people think they are walking in VR can be tricky, but it does explain how they fit all that stuff in a little holodeck. Of course, it would be nice if you could also sense walking and use it to move your avatar, but that’s another problem.
Humans make walking look simple, but of course that’s an illusion easily shattered by even small injuries. Losing the ability to walk has an enormous impact on every part of your day, so rehabilitative advances are nothing short of life-changing. The Open Exosuit for Differently Abled project is working feverishly on their Hackaday Prize entry to provide a few different layers of help in getting people back on their feet.
We’ve seen a number of exosuit projects in the past, and all of them struggle in a few common places. It’s difficult to incorporate intuitive user control into these builds, and quite important that they stay out of the way of the user’s own balance. This one approaches both issues with a walker that provides a means of steadying oneself and doubles as the command interface for the exosuit. Using the OLED screen and buttons incorporated on the walker, the user can select and control the walking, sitting, and standing modes.
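The mode-selection flow described above might look something like this minimal sketch. The button roles, mode list, and behavior here are guesses for illustration, not the project's actual firmware:

```python
# Hypothetical walker interface: one button cycles the OLED highlight,
# another commits the highlighted mode to the exosuit.
MODES = ("walking", "sitting", "standing")

class ExosuitUI:
    def __init__(self):
        self.cursor = 0           # which mode the OLED is highlighting
        self.active = "standing"  # mode the exosuit is currently in

    def next_button(self):
        """Cycle the OLED highlight to the next mode."""
        self.cursor = (self.cursor + 1) % len(MODES)
        return MODES[self.cursor]

    def select_button(self):
        """Commit the highlighted mode to the exosuit."""
        self.active = MODES[self.cursor]
        return self.active
```

Keeping selection and commitment as separate button presses means a stray bump on the walker can't drop the user straight into a sit.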
The exoskeleton is meant to provide assistance to people with weakness or lack of control. They still walk and balance for themselves, but the hope is that these devices will be an aid at times when human caregivers are not available and the alternative would be unsteady mobility or no mobility at all. Working with the assistive device also helps the user keep building strength on the march to recovery.
The team is hard at work on the design, and with less than two weeks left before the entry deadline of the 2020 Hackaday Prize, we’re excited to see where the final push will bring this project!
Continue reading “Open Exosuit Project Helps Physically Challenged Put One Foot In Front Of Another”
[Project Malaikat] is a 3D printed hybrid bipedal walker and quadcopter robot, but there’s much more to it than just sticking some props and a flight controller to a biped and calling it a day. Not only is it a custom design capable of a careful but deliberate two-legged gait, but the props are tucked away and deployed on command via some impressive-looking linkages that allow it to transform from walking mode to flying mode.
Creator [tang woonthai] has the 3D models available for download (.rar file) and the video descriptions on YouTube contain a bill of materials, but beyond that there doesn’t seem to be much other information available about [Malaikat]. The creator does urge care to be taken should anyone use the design, because while the robot may be small, it does essentially have spinning blades for hands.
Embedded below are videos that show off the robot’s moves, as well as a short flight test demonstrating that while control was somewhat lacking during the test, the robot is definitely more than capable of actual flight.
Continue reading “Hybrid Robot Walks, Transforms, And Takes Flight”
Wearables and robots don’t often intersect, because most robots rely on rigid bodies and programming while we don’t. Exoskeletons are an instance where robots interact with our bodies, and a soft exosuit is even closer to our physiology. Machine learning is closer to our minds than a simple state machine. The combination of machine learning software and a soft exosuit is a match made in heaven for the Harvard Biodesign Lab and Agile Robotics Lab.
Machine learning studies a walker’s steady gait for twenty periods while vitals are monitored to assess how much energy is being expended. Once trained, the machine switches from assessing to assisting. This type of personalization has been done in the past, but the addition of machine learning shows that the necessary customization can be programmed into each machine without a team of humans.
Exoskeletons are no stranger to these pages: our 2017 Hackaday Prize awarded $1000 to an open-source set of robotic legs, and we’ve reported on an exoskeleton designed to keep seniors safe.
Continue reading “Learning Software In A Soft Exosuit”
If you’re working on your own bipedal robot, you don’t have to start from the ground up anymore. [Ted Huntington]’s Two Leg Robot project aims to be an Open Source platform that’ll give any future humanoid-robot builders a leg up.
While we’ve seen quite a few small two-legged walkers, making a pair of legs for something human-sized is a totally different endeavor. [Ted]’s legs are chock-full of sensors, and there’s a lot of software that processes all of the data. That’s full kinematics and sensor info going back and forth between the 3D model and the hardware. Very cool. And to top it all off, “Two Leg” uses affordable motors and gearing. This is a full-sized bipedal robot platform that you might someday be able to afford!
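Going from joint sensors to a 3D model means running forward kinematics. As a simplified illustration only (a planar two-link leg with made-up link lengths, not the project's actual geometry):

```python
import math

def foot_position(hip_angle, knee_angle, thigh=0.44, shank=0.43):
    """Planar forward kinematics for one leg.

    Angles are in radians, measured from straight-down; returns the
    (x, y) of the ankle relative to the hip. The link lengths (meters)
    are illustrative human-ish values.
    """
    knee_x = thigh * math.sin(hip_angle)
    knee_y = -thigh * math.cos(hip_angle)
    ankle_x = knee_x + shank * math.sin(hip_angle + knee_angle)
    ankle_y = knee_y - shank * math.cos(hip_angle + knee_angle)
    return ankle_x, ankle_y
```

With both joints at zero the ankle hangs straight down at the combined link length; running this per leg against live encoder readings is what keeps the on-screen model in sync with the hardware.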
Will walking robots really change the world? Maybe. Will easily available designs for an affordable bipedal platform give hackers of the future a good base to stand on? We hope so! And that’s why this is a great entry for the Hackaday Prize.
There are a lot of ways to try to mathematically quantify how healthy a person is. Things like resting pulse rate, blood pressure, and blood oxygenation are all quite simple to measure and can be used to predict various clinical outcomes. However, one you may not have considered is gait velocity, or the speed at which a person walks. It turns out gait velocity is a viable way to predict the onset of a wide variety of conditions, such as congestive heart failure or chronic obstructive pulmonary disease. As people become sick, elderly, or infirm, they tend to walk slower – just like the little riflemen in your favourite RTS when their health bar’s way in the red. But how does one measure this? MIT’s CSAIL has stepped up, with a way to measure walking speed completely wirelessly.
You can read the paper here (PDF). The WiGait device sends out a low-power radio signal, then measures the reflections to determine a person’s location over time. Alone, however, this is not enough – it’s important to measure walking speed specifically, to avoid false positives triggered by a person simply sitting still while watching television, for example. Algorithms separate the walking activity from the rest of the data set, allowing the device to sit in the background, recording walking-speed data with no user interaction required whatsoever.
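The core idea, filtering a location track down to walking-like motion before averaging speed, can be sketched like this. The thresholds and data format here are illustrative guesses, not the paper's actual algorithm:

```python
import math

def gait_velocity(track, min_speed=0.2, max_speed=2.5):
    """Estimate walking speed (m/s) from a list of (t, x, y) samples.

    Segments that are stationary or implausibly fast are discarded,
    so sitting still (or a tracking glitch) doesn't skew the average.
    The speed thresholds are made-up placeholder values.
    """
    speeds = []
    for (t0, x0, y0), (t1, x1, y1) in zip(track, track[1:]):
        dt = t1 - t0
        if dt <= 0:
            continue
        v = math.hypot(x1 - x0, y1 - y0) / dt
        if min_speed <= v <= max_speed:  # keep only walking-like motion
            speeds.append(v)
    return sum(speeds) / len(speeds) if speeds else None
```

Returning `None` when no walking-like segments survive the filter is what lets the device idle silently through a whole evening of television.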
This form of passive monitoring could have great applications in nursing homes, where staff often have a huge number of patients to monitor. It would allow the collection of clinically relevant data without the need for any human intervention; the device could simply alert staff when a patient’s walking pattern is indicative of a bigger problem.
We see some great health research here at Hackaday – like this open source ECG. Video after the break.
Continue reading “Measuring Walking Speed Wirelessly”
[Basti] was playing around with Artificial Neural Networks (ANNs), and decided that a lot of the “hello world” type programs just weren’t zingy enough to instill his love for the networks in others. So he juiced it up a little bit by applying a reasonably simple ANN to teach a four-legged robot to walk (in German, translated here).
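We don't have [Basti]'s actual code, but the flavor of the approach can be sketched with a tiny linear "network" mapping gait phase to servo angles, trained by random hill climbing against whatever fitness you can measure (distance walked, say). Everything here is a stand-in for illustration:

```python
import math
import random

def servo_angles(weights, phase):
    """One-layer 'network': map a gait phase in [0, 1) to four servo angles."""
    inputs = [math.sin(2 * math.pi * phase), math.cos(2 * math.pi * phase), 1.0]
    return [sum(w * x for w, x in zip(row, inputs)) for row in weights]

def train(fitness, n_servos=4, n_iters=200, rng=random.Random(0)):
    """Random hill climbing: keep a perturbation only if it scores better.

    In practice `fitness` would run the gait on the robot (or a
    simulator) and return something like distance walked.
    """
    best = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(n_servos)]
    best_score = fitness(best)
    for _ in range(n_iters):
        trial = [[w + rng.gauss(0, 0.1) for w in row] for row in best]
        score = fitness(trial)
        if score > best_score:
            best, best_score = trial, score
    return best, best_score
```

Since the score can only ever improve, even a dumb search like this will squirm its way toward a forward-moving gait, which is much of the charm of watching it happen on real servos.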
While we think it’s awesome that postal systems the world over have been machine sorting mail based on similar algorithms for years now, watching a squirming quartet of servos come to forward-moving consensus is more viscerally inspiring. Job well done! Check out the video embedded below.
Continue reading “Train Your Robot To Walk With A Neural Network”