An Integrated Electromagnetic Lifting Module For Robots

The usual way a robot moves an object is by grabbing it with a gripper or using suction, but [Mile] believes that electromagnets offer a lot of advantages worth exploring, and has designed the ELM (Electromagnetic Lifting Module) to make experimenting with electromagnetic effectors more accessible. The ELM is much more than just a breakout board for an electromagnet; [Mile] has put a lot of work into making a module that is easy to interface with and use. The ELM integrates a proximity sensor, power management, and LED lighting, and comes with 3D models for vertical or horizontal mounting. Early tests show that 220 mW is required to lift a 1 kg load, but it may be possible to manage power more efficiently by dynamically adjusting the drive voltage depending on the actual load.
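
The write-up doesn't detail how that adaptive drive would work, but one plausible scheme is to grab the load at full power and then back the holding power off until the part is only just held. The sketch below illustrates the idea in Python; set_coil_duty() and load_attached() are hypothetical stand-ins for whatever PWM driver and proximity-sensor interface the ELM actually exposes.

```python
import time

def set_coil_duty(duty):
    """Hypothetical hook: write a 0.0-1.0 PWM duty cycle to the coil driver."""
    print(f"coil duty -> {duty:.2f}")

def load_attached():
    """Hypothetical hook: True while the proximity sensor still sees the load."""
    return True  # replace with a real sensor read

def settle_holding_duty(pickup_duty=1.0, step=0.05, margin=0.10, dwell_s=0.2):
    """Grab at full power, then step the duty down until the load is barely
    held (or a floor is reached), and settle there plus a safety margin."""
    duty = pickup_duty
    set_coil_duty(duty)
    time.sleep(dwell_s)                      # let the load seat against the core
    while duty - step > 0.1 and load_attached():
        duty -= step
        set_coil_duty(duty)
        time.sleep(dwell_s)                  # give the sensor time to react
    duty = min(pickup_duty, duty + step + margin)
    set_coil_duty(duty)                      # hold with a little headroom
    return duty

if __name__ == "__main__":
    settle_holding_duty()
```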

[Mile]’s focus on creating an integrated, easy-to-use solution that others can readily implement is wonderful to see, and makes the ELM a great entry for The Hackaday Prize.

The Weedinator Returns

We are delighted to see The Weedinator as an entry for the 2018 Hackaday Prize! Innovations in agriculture are great opportunities to build something that improves our world. [TegwynTwmffat]’s Weedinator is an autonomous, electric platform aimed at small farms, designed to take care of cultivating, tilling, and weeding seedbeds. When that work has to be done by people, the labor cost can push smaller farms out of sustainability.

Greater efficiency in agriculture is traditionally all about multiplying the work a single person can do, and usually takes the form of bigger and heavier equipment that can do more at once and in less time. But an autonomous robotic platform doesn’t get tired or bored, so it doesn’t matter if a smaller machine needs to make multiple passes to cover a field or accomplish a task. In fact, smaller often means more maneuverable, more manageable, and more energy-efficient when it comes to a small farm.

The original Weedinator was a contender for the 2017 Hackaday Prize, and we’re deeply excited to see it return with an updated design and new people joining the team for 2018. Remember, there’s money set aside to help bootstrap promising concepts, and all you really need to get started is an idea, an image, and documentation. There’s no better opportunity to dust off that idea and see if it has legs.

See This Slick RC Strandbeest Zip Around

Bevel gears used to mount motors vertically.

Theo Jansen’s Strandbeest design is a favorite, and for good reason; the gliding gait is mesmerizing, and this RC version by [tosjduenfs] is wonderful to behold. The project first appeared on Thingiverse back in 2015, and was quietly updated last year with a zip file containing the full assembly details.

All Strandbeest projects — especially steerable ones — are notable because building one is never a matter of simply scaling parts up or down. For one thing, the classic Strandbeest design doesn’t provide any means of steering. Also, while motorizing the system is simple in concept, it’s less so in practice; there’s no obvious or convenient spot to mount a motor in a Strandbeest. In this project, bevel gears are used to mount the motors vertically in a central area, and the left and right sides are driven independently, like a tank. A motor driver that accepts RC signals allows an off-the-shelf RC transmitter and receiver to control the unit. There is a wonderful video of the machine zipping around smoothly, embedded below.
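
The project’s driver board takes RC signals directly, but the mixing behind tank-style steering is simple enough to sketch. Assuming a two-channel transmitter (one stick axis for throttle, one for steering), an arcade-to-tank mix turns those two values into independent left and right drive commands; the names and scaling here are illustrative rather than taken from the project:

```python
def mix_tank(throttle, steering):
    """Arcade-to-tank mixing: combine throttle and steering (each -1.0..1.0)
    into independent left/right drive commands, clamped to -1.0..1.0."""
    left = throttle + steering
    right = throttle - steering
    scale = max(1.0, abs(left), abs(right))  # preserve the ratio when clipping
    return left / scale, right / scale

# Full forward with a gentle right turn: the left side runs faster
print(mix_tank(1.0, 0.3))   # -> (1.0, 0.538...)
```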

Gorgeous Engineering Inside Wheels Of A Robotic Trail Buddy

Robots are great in general, and [taylor] is currently working on something a bit unusual: a 3D printed explorer robot named Rover, designed to autonomously follow outdoor trails. Rover is still under development, and [taylor] recently completed the drive system and body designs, all shared via OnShape.

Rover has 3D printed 4.3:1 reduction planetary gearboxes embedded into each wheel, along with off-the-shelf bearings and brushless motors. A Raspberry Pi sits in the driver’s seat, and the goal is to use a version of NVIDIA’s TrailNet framework for GPS-free navigation of paths. As a result, [taylor] hopes to end up with a robotic “trail buddy” that can be made with off-the-shelf components and 3D printed parts.
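
The tooth counts behind that 4.3:1 figure aren’t given here, but for a single planetary stage with the ring gear fixed, the sun as input, and the carrier as output, the reduction follows directly from the sun and ring tooth counts. A quick sketch with hypothetical numbers that happen to land exactly on 4.3:1:

```python
def planetary_reduction(sun_teeth, ring_teeth):
    """Reduction of a single planetary stage: ring gear fixed, sun gear as
    input, planet carrier as output."""
    return 1 + ring_teeth / sun_teeth

# Hypothetical tooth counts that land exactly on the quoted 4.3:1
sun, ring = 20, 66
print(planetary_reduction(sun, ring))   # 4.3
print((ring - sun) // 2)                # 23-tooth planets fit between them
```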

Moving the motors and gearboxes into the wheels themselves makes for a very small main body on the robot, and it’s more than a bit strange to see the wheel spinning in the opposite direction to its hub. Check out the video showcasing the latest development of the wheels, embedded below.

Hackaday Prize Entry: The Weedinator Project, Now With Flame

We like that the Weedinator Project is thinking big for this year’s Hackaday Prize! This ambitious project by [TegwynTwmffat] builds on a previous effort: a tractor-mounted weeding machine (shown above) that mercilessly shredded any weeds by tilling everything between the orderly rows of growing leeks. The system worked, but it really wasn’t accurate enough; we suspect it had a nasty habit of shredding the occasional leek as well. The new version takes a different approach.

The new Weedinator will be an autonomous robotic rover using a combination of GPS and colored markers for navigation. With an interesting-looking adjustable suspension system to help with fine positioning, the Weedinator will use various attachments to help with plant care. Individual weeds will be identified optically and sent to the big greenhouse in the sky via a precise flame from a small butane torch. It’s an ambitious project, but [TegwynTwmffat] is building on experience gained from the previous incarnation, and we’re excited to see where it goes.
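
The write-up doesn’t say how the colored markers will be picked out of the camera image, but a common approach for this kind of marker-assisted navigation is simple HSV thresholding with OpenCV. The sketch below uses a synthetic frame and an illustrative color range, so treat it as one possible starting point rather than the Weedinator’s actual vision code:

```python
import cv2
import numpy as np

# Synthetic frame standing in for a camera image: grey ground with one
# bright "marker" patch, so the example runs without any hardware.
frame = np.full((240, 320, 3), 128, dtype=np.uint8)
cv2.circle(frame, (200, 120), 15, (0, 0, 255), -1)     # red marker (BGR)

hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
# Illustrative HSV range for a red marker; real thresholds depend on the
# marker color and lighting (and red usually needs a second wrapped range).
mask = cv2.inRange(hsv, (0, 120, 120), (10, 255, 255))

# Treat the largest blob as the marker and report where it sits in the frame
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
if contours:
    biggest = max(contours, key=cv2.contourArea)
    m = cv2.moments(biggest)
    cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
    print(f"marker centred at ({cx}, {cy})")
```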

3D Printed Robotic Arms For Sign Language

A team of students in Antwerp, Belgium is responsible for Project Aslan, which is exploring the feasibility of using 3D printed robotic arms for assisting with and translating sign language. The idea came from the fact that sign language translators are few and far between, and translation is a task robots may be able to help with. In addition to translation, robots may be able to assist with teaching sign language as well.

The project set out to use 3D printing and other technology to explore whether low-cost robotic signing could be of any use. So far the team has an arm that can convert text into finger spelling and counting. It’s an interesting use for a robotic arm; signing is an application for which range of motion is important, but there is no real need to carry or move any payloads whatsoever.
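
The software side isn’t described in detail, but conceptually, single-hand finger spelling reduces to a lookup from characters to joint targets that the arm steps through. Here is a toy sketch along those lines, with invented servo angles and far fewer joints than the real Aslan hand:

```python
import time

# Toy mapping from characters to per-finger servo angles in degrees
# (thumb, index, middle, ring, pinky). The angles are invented for
# illustration; the real hand has many more joints to coordinate.
FINGER_POSES = {
    "a": (90, 0, 0, 0, 0),          # fist with the thumb alongside
    "b": (0, 180, 180, 180, 180),   # fingers extended, thumb tucked
    "c": (60, 90, 90, 90, 90),      # curved hand
}

def move_finger(index, angle):
    """Hypothetical hook for a real servo write (GPIO PWM, PCA9685, etc.)."""
    print(f"finger {index} -> {angle} deg")

def spell(word, hold_s=0.5):
    """Step through a word one letter at a time, holding each pose briefly."""
    for letter in word.lower():
        pose = FINGER_POSES.get(letter)
        if pose is None:
            continue                 # skip characters we have no pose for
        for i, angle in enumerate(pose):
            move_finger(i, angle)
        time.sleep(hold_s)

spell("cab")
```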

Closeup of hand actuators and design.

A single articulated hand is a good proof of concept, and these early results show promise, but there is still a long way to go. Sign language involves more than just the hands: it is performed with both hands, arms, and shoulders, and incorporates motion and facial expressions. Also, the majority of sign language is not finger spelling (which is reserved primarily for proper names or specific nouns), but a robot hand that can finger spell is an important first step toward everything else.

Future directions for the project include adding a second arm, adding expressiveness, and exploring the use of cameras for teaching new signs. The ability to learn different signs is important, because any project that aims to act as a translator or facilitator needs to be able to learn and update. There is a lot of diversity in sign languages across the world. For people unfamiliar with signing, it may come as a surprise that — for example — not only is American Sign Language (ASL) related to French Sign Language, but both are entirely different from British Sign Language (BSL). A video of the project is embedded below.

GuitarBot Brings Together Art And Engineering

Not only does the GuitarBot project show off some great design, but the care given to the documentation and directions is wonderful to see. The GuitarBot is an initiative by three University of Delaware professors, [Dustyn Roberts], [Troy Richards], and [Ashley Pigford], to introduce their students to ‘Artgineering’, a beautiful portmanteau of ‘art’ and ‘engineering’.

The GuitarBot is designed and documented so that its three major elements are compartmentalized: the strummer, the brains, and the chord mechanism are all independent modules wrapped up in a single device. Anyone is, of course, free to build the whole thing, but a lot of work has been done to make it easy for smaller, team-based groups to work on and bring together the individual elements.
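
One way to mirror that compartmentalization in software is to hide each mechanism behind a small interface, so a team can replace its module without touching the others. The class and method names below are invented for illustration and aren’t taken from the GuitarBot code:

```python
from abc import ABC, abstractmethod

class Strummer(ABC):
    @abstractmethod
    def strum(self):
        ...

class ChordMechanism(ABC):
    @abstractmethod
    def fret(self, chord):
        ...

class ConsoleStrummer(Strummer):
    def strum(self):
        print("strum!")

class ConsoleChords(ChordMechanism):
    def fret(self, chord):
        print(f"fretting {chord}")

class Brain:
    """The 'brains' module only ever talks to the two interfaces above,
    so either mechanism can be swapped out independently."""
    def __init__(self, strummer, chords):
        self.strummer = strummer
        self.chords = chords

    def play(self, progression):
        for chord in progression:
            self.chords.fret(chord)
            self.strummer.strum()

Brain(ConsoleStrummer(), ConsoleChords()).play(["G", "C", "D"])
```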

Some aspects of the GuitarBot are still works in progress, such as the solenoid-activated chord assembly, but everything else is ready to go, with bills of materials and build directions. An early proof-of-concept video of a strumming test on a ukulele is embedded below.
