[Fred Hoefler] was challenged to finally do something with that Raspberry Pi he wouldn’t keep quiet about. So he built a machine assist loom for the hand weaver. Many older weavers simply can’t enjoy their art anymore due to the physical strain caused by the repetitive task. Since he had a Pi looking for a purpose, he also had his project.
His biggest constraint was cost. There are plenty of assistive looms on the market, but they start at around ten thousand dollars. So he set a rule: no single part of the build could cost more than the Raspberry Pi itself. The result was a conversion with a BOM cost well under two hundred dollars. Not bad!
The motive parts are simple, cheap 12V geared motors from Amazon, driven by his own motor driver circuits. They get their commands from the Pi, which runs Python. The loom can be controlled by typing commands into the shell or with keyboard shortcuts, and there are also manual switches on the loom itself.
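The article doesn't detail [Fred]'s actual command set or motor interface, but a shell-driven loom controller could look something like this minimal sketch. The command names (`raise`/`lower`) and the driver methods are assumptions for illustration; on the real loom the driver layer would toggle Pi GPIO pins feeding the motor driver circuits.

```python
# Hypothetical sketch of a shell-style command dispatcher for the loom.
# Command names and the motor-driver interface are illustrative assumptions,
# not [Fred]'s actual design.

def make_dispatcher(motor_driver):
    """Map typed commands like 'raise 3' onto motor-driver calls."""
    def dispatch(line):
        parts = line.strip().split()
        if not parts:
            return "no command"
        cmd, *args = parts
        if cmd == "raise" and args:
            motor_driver.lift_shaft(int(args[0]))
            return f"shaft {args[0]} raised"
        if cmd == "lower" and args:
            motor_driver.drop_shaft(int(args[0]))
            return f"shaft {args[0]} lowered"
        return f"unknown command: {cmd}"
    return dispatch

class FakeDriver:
    """Stand-in for the GPIO motor-driver layer, for testing off the Pi."""
    def __init__(self):
        self.log = []
    def lift_shaft(self, n):
        self.log.append(("up", n))
    def drop_shaft(self, n):
        self.log.append(("down", n))
```

Keeping the dispatcher separate from the hardware layer like this also makes it easy to wire the same commands to the loom's manual switches.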
In the end [Fred] met his design goal, and has further convinced his friends that the words Raspberry Pi are somehow involved with trouble.
For humans, moving our arms and hands onto an object to pick it up is pretty easy; for manipulators, it’s a different story. Once we’ve found the object we want our robot to pick up, we still need to plan a path from the robot’s hand to the object, all the while lugging the remaining limbs along for the ride without snagging them on any obstacles. The space of all possible joint configurations is called the “joint configuration space.” Finding a collision-free path through this space is called path planning, and it’s a tricky problem to solve quickly in the world of robotics.
These days, roboticists have nailed down a few workable algorithms, but executing them takes hundreds of milliseconds to compute. The result? Robots spend most of their time “thinking” about a move rather than executing it.
It’s worth asking: why is this problem so hard, and how did hardware make it faster? There are a few layers here, but it’s worth investigating the big ones. Planning a path from point A to point B usually happens probabilistically (randomly iterating toward the finishing point), and if a path exists, the algorithm will find it. The issue, however, arises when we need to lug our remaining limbs through the space to reach that object. The shape in question is called the swept volume, and it’s the entire region that our ‘bot’s limbs envelop while getting from A to B. We need a collision-free path not just for the hand, but for the entire set of joints.
Encoding a map on a computer is done by discretizing the space into a sufficient resolution of 3D voxels. If a voxel is occupied by an obstacle, it gets one state. If it’s not occupied, it gets another. To compute whether or not a path is OK, a set of voxels that represent the swept volume needs to be compared against the voxels that represent the environment. Here’s where the FPGA kicks in with the speed bump. With the hardware implementation, voxel occupation is encoded in bits, and the entire volume calculation is done in parallel. Nifty to have custom hardware for this, right?
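The voxel comparison above can be mimicked in software by packing each volume's occupancy into an integer, one bit per voxel, so the whole-grid check collapses to a single bitwise AND. This is only a loose software analogy for the FPGA's parallelism; the grid size and helper names are assumptions.

```python
# Software sketch of the bit-parallel voxel comparison described above.
# One bit per voxel: packing a whole grid into an integer lets a single
# bitwise AND answer "does the swept volume touch any obstacle?",
# loosely mimicking what the FPGA does across all voxels at once.

GRID = 8  # an 8x8x8 voxel grid -> 512 bits per volume (illustrative size)

def voxel_bit(x, y, z):
    """Index of the bit representing voxel (x, y, z)."""
    return (z * GRID + y) * GRID + x

def pack(voxels):
    """Pack a set of occupied (x, y, z) voxels into one bitmask."""
    mask = 0
    for v in voxels:
        mask |= 1 << voxel_bit(*v)
    return mask

def path_in_collision(swept_mask, obstacle_mask):
    """The whole-volume check is a single AND over all the bits."""
    return (swept_mask & obstacle_mask) != 0
```

On a CPU this AND is still executed word by word; the custom hardware's win is doing the comparison for every voxel, and for many candidate paths, truly in parallel.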
We applaud the folks at Duke University for getting this up-and-running, and we can’t wait to see custom “robot path-planning chips” hit the market some day. For now, though, if you’d like to sink your teeth into seeing how FPGAs can parallelize conventional algorithms, check out our linear-time sorting feature from a few months back.
Boston Dynamics, the lauded robotics company famed for its ‘Big Dog’ robot and other machines which push mechanical dexterity to impressive limits, has produced a smaller version of its ‘Spot’ robot dubbed ‘SpotMini’.
A lightweight at 55-65 lbs, this quiet, all-electric robot lasts 90 minutes on a full charge and boasts partial autonomy, notably in navigation thanks to proprioceptive sensors in the limbs. SpotMini’s most striking features are its sleek new profile and a manipulator arm, which it shows off by loading a glass into a dishwasher and taking out some recycling.
Robots are prone to failure, however, so it’s good to know that our future overlords are just as susceptible to slipping on banana peels as we humans are.
One of the features of fancy modern industrial motor and controller sets is the ability for the motor to act as a mass-spring-damper. For example, let’s say you want a robot to hold an egg. You could have it move to the closed position, but tell the controller you only want to use so much force to do it. It will hold the egg as if there was a spring at its joint.
Another way you could use this is in the application of a robot leg. You tell the controller what kind of spring and shock absorber (damper) combination it is and it will behave as if those parts have been added to the mechanism. This is important if you want a mechanical leg to behave like a biological leg.
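The spring-and-damper behavior described above is usually called impedance control: the commanded torque is stiffness times position error minus damping times velocity. Here's a minimal sketch; the gains, inertia, and the toy Euler simulation are illustrative assumptions, not [Ben]'s actual controller.

```python
# Sketch of the virtual spring-damper ("impedance") behavior described above.
# Commanded torque = stiffness * position error - damping * velocity, so the
# joint behaves as if a spring and shock absorber were bolted onto it.
# All numbers below are illustrative assumptions.

def impedance_torque(theta_des, theta, omega, k=5.0, b=0.8):
    """Virtual spring (k) pulling toward theta_des, plus a virtual damper (b)."""
    return k * (theta_des - theta) - b * omega

def simulate(theta0, theta_des, k=5.0, b=0.8, inertia=0.1, dt=0.001, steps=5000):
    """Integrate a single joint driven only by the virtual spring-damper torque."""
    theta, omega = theta0, 0.0
    for _ in range(steps):
        tau = impedance_torque(theta_des, theta, omega, k, b)
        omega += (tau / inertia) * dt   # torque -> angular acceleration
        theta += omega * dt
    return theta
```

Because `k` and `b` are just numbers in software, stiffness and damping can be retuned on the fly, which is exactly the knob [Ben]'s control software exposes.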
[Ben] had worked on a more formal project which used some very expensive geared motors to build a little running robot. It looks absolutely ridiculous, as you can see in the following video, but it gives an idea of where he’s going with this line of research. He wanted to see if he could replace all those giant geared motors with the cheap, ubiquitous, high-performance brushless DC motors for sale now, especially given his experience with them.
So far he’s done a very impressive amount of work. He’s built a control board. He’s characterized different motors for the application. He’s written a lot of cool software; he can even change the stiffness and damping settings on the fly. He has a single leg that can jump. It’s cool. He’s taking a hiatus from the project, but he’ll be right back at it soon. We’re excited for the updates!
Custom robotic prostheses are on the rise. In numerous projects, hackers and makers have taken on the challenge: Enabling The Future, the Open Hand Project, and OpenBionics, to name a few, plus the myriad prosthesis projects on Hackaday.io. Yet the mechatronics that power most of them are still from the last century. At the end of the day, you can only fit so many miniature motors and gears into a plastic hand, and only so many hydraulics onto an arm or leg, before it becomes a slow, heavy brick – more hindering than helpful. If only we had a few extra of those light, fast, and powerful actuators that help us make it through the day. If only we had artificial muscles.
Every now and then someone gets seriously inspired, and that urge just doesn’t go away until something gets created. For [Paulius Liekis], it led to creating a roughly 1:20 scale version of the T08A2 Hexapod “Spider” Tank from the movie Ghost in the Shell. As he puts it, “[T]his was something that I wanted to build for a long time and I just had to get it out of my system.” It uses two Raspberry Pi computers, 28 servo motors, and required over 250 hours of 3D printing for all the meticulously modeled pieces – and even more than that for polishing, filing, painting, and other finishing work on the pieces after they were printed. The paint job is spectacular, with great-looking wear and tear. It’s even better seeing it in motion — see the video embedded below.
Having to work away from the convenience of a workshop can be tough. But it’s sometimes unavoidable and it always means planning ahead. When the work area also happens to be 150m under a lake’s surface, it’s much more of a challenge – but it’s both doable and more accessible than you might think. To prove it, this DIY research vessel will be part of the robotic exploration of an underwater shipwreck. It is complete with an Ethernet bridge, long-range wireless communications, remotely operated underwater vehicle (ROV), the ability to hold a position, and more. The best part? It can all be packed up and fit into a minivan. We can’t put it any better than the folks at the OpenROV Forums:
In just over a week (June 6th – 9th), a bunch of people from OpenROV are going to attempt to dive a set of specially modified deep-capable ROVs to a 50m-long shipwreck at a depth of 150m below Lake Tahoe. We’ll be using a deployment architecture that we’ve been perfecting over the years that involves a very small boat keeping station over the dive site while the rest of the people on the expedition run the mission from a remote location via long-range broadband radio. Since the mission control site will have an internet connection, we’ll be able to live stream the entire dive over the internet.
The purpose of the design was “to demonstrate that many of the capabilities one might think would require a large research vessel can actually be achieved with off-the-shelf parts that are more portable and much less expensive. […] There’s a lot to discover down there, and the technology readily available these days can allow us to explore it.” This mindset happens to wonderfully complement the kickoff of the Citizen Scientist Challenge portion of the 2016 Hackaday Prize.