We like that the Weedinator Project is thinking big for this year’s Hackaday Prize! This ambitious project by [TegwynTwmffat] builds on a previous effort: a tractor-mounted weeding machine (shown above). It mercilessly shredded any weeds by tilling everything between orderly rows of growing leeks. The system worked, but it wasn’t accurate enough; we suspect it had a nasty habit of shredding the occasional leek as well. The new version takes a different approach.
The new Weedinator will be an autonomous robotic rover using a combination of GPS and colored markers for navigation. With an interesting looking adjustable suspension system to help with fine positioning, the Weedinator will use various attachments to help with plant care. Individual weeds will be identified optically and sent to the big greenhouse in the sky via precise flame from a small butane torch. It’s an ambitious project, but [TegwynTwmffat] is building off experience gained from the previous incarnation and we’re excited to see where it goes.
Robot design traditionally separates the body geometry from the mechanics of the gait, but they both have a profound effect upon one another. What if you could play with both at once, and crank out useful prototypes cheaply using just about any old 3D printer? That’s where Interactive Robogami comes in. It’s a tool from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) that aims to let people design, simulate, and then build simple robots with a “3D print, then fold” approach. The idea behind the system is partly to take advantage of the rapid prototyping afforded by 3D printers, but mainly it’s to change how the design work is done.
To make a robot, the body geometry and limb design are all done and simulated in the Robogami tool, where different combinations can have a wild effect on locomotion. Once a design is chosen, the end result is a 3D printable flat pack which is then assembled into the final form with a power supply, Arduino, and servo motors.
A white paper is available online and a demonstration video is embedded below. It’s debatable whether these devices on their own qualify as “robots” since they have no sensors, but as a tool to quickly prototype robot body geometries and gaits it’s an excitingly clever idea.
A team of students in Antwerp, Belgium is responsible for Project Aslan, which is exploring the feasibility of using 3D printed robotic arms for assisting with and translating sign language. The idea came from the fact that sign language translators are few and far between, and it’s a task that robots may be able to help with. In addition to translation, robots may be able to assist with teaching sign language as well.
The project set out to use 3D printing and other technology to explore whether low-cost robotic signing could be of any use. So far the team has an arm that can convert text into finger spelling and counting. It’s an interesting use for a robotic arm; signing is an application for which range of motion is important, but there is no real need to carry or move any payloads whatsoever.
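Converting text into finger spelling boils down to looking up a hand pose for each letter and sending it to the servos in sequence. Here is a minimal sketch of that dispatch step; the `POSES` table, function names, and servo-angle tuples are illustrative assumptions, not the Aslan project’s actual firmware.

```python
# Hypothetical text-to-fingerspelling dispatch: each letter maps to
# target angles (in degrees) for five finger servos. Only a few letters
# are shown; a real table would cover the whole alphabet.
POSES = {
    "a": (0, 0, 0, 0, 90),        # fingers curled, thumb to the side
    "b": (180, 180, 180, 180, 0), # fingers extended, thumb folded
    "c": (90, 90, 90, 90, 45),    # fingers half-curled into a C shape
}

def spell(word):
    """Yield one (letter, servo pose) pair per letter, skipping any
    character that has no entry in the pose table."""
    for letter in word.lower():
        pose = POSES.get(letter)
        if pose is not None:
            yield letter, pose

# In a real arm, each pose would be written to the servos with a short
# pause between letters so the sign is readable.
for letter, pose in spell("cab"):
    print(letter, pose)
```

The hard part in practice is not the lookup but the transitions: moving smoothly between poses fast enough to keep the spelling legible.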
A single articulated hand is a good proof of concept, and these early results show promise and potential, but there is still a long way to go. Sign language involves more than just hands: it is performed using both hands, arms, and shoulders, and incorporates motions and facial expressions. Also, the majority of sign language is not finger spelling (which is reserved primarily for proper names or specific nouns), but a robot hand that is able to finger spell is an important first step toward everything else.
Future directions for the project include adding a second arm, adding expressiveness, and exploring the use of cameras for the teaching of new signs. The ability to teach different signs is important, because any project that aims to act as a translator or facilitator needs the ability to learn and update. There is a lot of diversity in sign languages across the world. For people unfamiliar with signing, it may come as a surprise that — for example — not only is American Sign Language (ASL) related to French Sign Language, but both are entirely different from British Sign Language (BSL). A video of the project is embedded below.
[Sebastian Goscik]’s entry in the 2017 Hackaday Prize is a line following robot. Well, not really; the end result is a line following robot, but the actual project is about a simple, cheap robot chassis to be used in schools, clubs, and other STEAM education events. Along with the chassis design comes a lesson plan allowing teachers to have a head start when presenting the kit to their students.
The first lesson plan is for the line-following robot, and a second lesson is in design – traffic lights that connect to a main base through a bus and work in sync. The idea is to keep these lessons fairly simple and straightforward for both teachers and students in order to get them more interested in STEM subjects.
What [Sebastian] noticed about other robot kits was that they were expensive, complicated, or lacking in tutorials. Some either came pre-assembled or took a long time to assemble. [Sebastian] simplified things – the only things required after the initial assembly of the chassis are zip-ties, electrical tape, and a few screws. The PCB can’t be disassembled, but the assembled PCB can be reused.
The hardware [Sebastian] came up with consists of some 3mm material that can be laser cut (acrylic or wood) and a sensor board that has 5 IR LEDs and corresponding IR sensors. The chassis can be put together using nothing more than a Phillips screwdriver, and the sensor PCBs are well documented so that soldering them is as easy as possible. An Arduino is used as the brains of the unit.
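A five-sensor IR array like this typically steers the robot by computing a weighted average of which sensors see the line, then nudging the motor speeds proportionally. The sketch below shows that logic in Python for clarity; the weights, gain, and speed values are illustrative assumptions, not [Sebastian]’s actual Arduino code.

```python
# Minimal line-following logic for a 5-sensor IR array, assuming each
# sensor reads 1 over the dark line and 0 over the bare floor.
SENSOR_WEIGHTS = [-2, -1, 0, 1, 2]  # leftmost ... rightmost sensor

def line_error(readings):
    """Weighted average of the active sensors: 0 means centered,
    negative means the line is to the robot's left, positive to its
    right. Returns None when no sensor sees the line."""
    active = sum(readings)
    if active == 0:
        return None
    return sum(w * r for w, r in zip(SENSOR_WEIGHTS, readings)) / active

def steer(readings, base_speed=100, gain=40):
    """Proportional steering: return (left_motor, right_motor) speeds.
    Slowing the motor on the line's side turns the robot back onto it."""
    error = line_error(readings)
    if error is None:
        return (0, 0)  # line lost: stop
    correction = gain * error
    return (base_speed + correction, base_speed - correction)

print(steer([0, 0, 1, 0, 0]))  # centered: both motors at base speed
print(steer([0, 1, 0, 0, 0]))  # line to the left: left motor slows
```

On the real robot this would run in the Arduino’s main loop, reading the sensors each pass and writing the two speeds out as PWM.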
[Sebastian] has come up with a great project, and the idea of a platform like this with a couple of lesson plans included is a great one. He’s released the hardware under an Open Hardware license as well, so others can share and add on. Of course, there are other line following robots, like this miniature one created with analog circuitry, and there are other open source robots for teaching, like this one. But [Sebastian]’s focus on the lesson plans is a unique way of approaching the problem – one that will hopefully be very successful.
[James Bruton] is well known for making robots using electric motors but he’s decided to try his hand at using pneumatics in order to make a fighting robot. The pneumatic cylinders will be used to give it two powerful punching arms. In true [James Bruton] fashion, he’s started with some experiments first, using the pneumatic cylinders from foot pumps. The cylinders he’s tried so far are taken out of single cylinder foot pumps from Halfords Essentials, costing only £6.29, around $8.11 US. That’s far cheaper than a commercial pneumatic cylinder, and perfectly adequate for this first step.
He did have to hack the cylinder a little though, besides removing it from its mounting and moving it to a DIY frame. Normally when you step down on a foot pump’s lever, you compress the cylinder, forcing air out the hose and into whatever you’re inflating. But he wanted to push air in the other direction, into the hose and into the cylinder. That would make the cylinder expand and thereby extend a robot fighting arm. And preferably that would be done rapidly and forcefully. However, a check valve at the hose outlet prevented air from entering the cylinder from the hose. So he removed the check valve. Now all he needed was a way to forcefully, and rapidly, push air into the hose.
For that he bought a solenoid activated valve on eBay, and a compressor with a 24 liter reservoir and a decent air flow rate of 180 liters per minute. The compressor added £110 ($142) to the cost of his project but that was still cheaper than the batteries he normally buys for his electric motor robots.
After working his usual CAD and 3D printing magic, he came up with an arm for the cylinder and a body that could fit two more valve-activated cylinders to act as a working shoulder. A little more 3D printing and electronics, and he had three switches, one for each valve and cylinder. The experiment turned out to be a great success. You can see the entire R&D process in the video below, along with demonstrations of the resulting punching robot arm. We think it’s fairly intimidating for a first step.
Project Kino — inspired by living jewelry — is a set of robotic accessories that use magnetic gripping wheels on both sides of the clothing to move about. For now they fill a mostly aesthetic function, creating kinetic accents to one’s attire, but one day they might be able to provide more interactive functionality. Acting as a phone’s mic, adjusting clothing to suit the weather, serving as high-visibility wear for cyclists or joggers, providing haptic feedback for all manner of applications (haptic sonar bodysuit, anyone?), assembling into large displays, and even functioning as a third — or more! — hand are just the tip of the iceberg for these ‘bots.
As with so many other chores, robots may soon be ironing our clothes for us before we leave for work. Built by a team from the University Carlos III de Madrid’s robotics lab in Getafe, Spain, TEO is a highly articulated robot that can climb stairs and open doors, and it has recently added ironing to its skill set.
Data from a depth-sensing camera in TEO’s head is combed over by an algorithm that breaks it down into thousands of points, each scored from 0 (smooth) to 1 (a defined line in the clothing). Comparing each point’s value to those of its neighbours allows TEO to identify wrinkles without any preexisting notion of what a freshly-pressed garment looks like.
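The neighbour-comparison idea can be shown in a few lines: score each point, then flag those that stand out sharply against the points around them. This toy sketch works on a small 2D grid of scores; the 4-neighbour scheme and the contrast threshold are our own illustrative assumptions, not details of TEO’s actual algorithm.

```python
# Toy wrinkle detector: each grid cell holds a score between 0 (smooth)
# and 1 (a defined line). A cell is flagged as a wrinkle when it scores
# well above the average of its 4 in-bounds neighbours.
def find_wrinkles(scores, contrast=0.5):
    """Return (row, col) of cells exceeding their neighbours' mean
    score by more than `contrast`."""
    rows, cols = len(scores), len(scores[0])
    wrinkles = []
    for r in range(rows):
        for c in range(cols):
            neighbours = [
                scores[nr][nc]
                for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                if 0 <= nr < rows and 0 <= nc < cols
            ]
            mean = sum(neighbours) / len(neighbours)
            if scores[r][c] - mean > contrast:
                wrinkles.append((r, c))
    return wrinkles

grid = [
    [0.1, 0.1, 0.1],
    [0.1, 0.9, 0.1],
    [0.1, 0.1, 0.1],
]
print(find_wrinkles(grid))  # the centre cell stands out -> [(1, 1)]
```

Because the test is purely relative, the same logic works on any garment: a crease is simply a point that disagrees with its surroundings, which is why no model of a pressed shirt is needed.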