Open-Source Farming Robot Now Includes Simulations

Farming is a challenge under even the best of circumstances. Almost all conventional farmers use some combination of tillers, combines, seeders, and plows to help get the difficult job done, but for those like [Taylor] who do not farm large industrial monocultures, more specialized tools are needed. While we’ve featured the Acorn open-source farming robot before, it’s back now with new and improved features and a simulation mode to help rapidly improve the platform’s software.

The first of the two new physical features is a fail-safe braking system. Since the robot uses electric geared hub motors for propulsion, the braking system consists of two normally closed relays which short the motor leads in emergency situations. With the leads shorted, the motors’ own back-EMF drives a braking current, so they see an extremely high load and stop turning. The robot has also been given advanced navigation facilities so that it can follow custom complex routes. And finally, [Taylor] created a simulation mode so that the robot’s entire software stack can be run in Docker and tested inside a simulation without using the actual robot.
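A simulation mode like that usually hinges on the control code talking to the hardware only through a thin interface, so the same stack can command either the real hub motors or a software model of them. As a rough sketch of the pattern, and emphatically not Acorn’s actual code (every name below is invented for illustration), a simulated drivetrain could stand in for the real one like this:

```python
# Illustrative only: a minimal hardware-abstraction pattern that lets the same
# navigation code drive either real hub motors or a pure-software simulation.
# All class and function names here are invented, not taken from the Acorn code.
import math
from dataclasses import dataclass


@dataclass
class Pose:
    x: float = 0.0
    y: float = 0.0
    heading: float = 0.0  # radians


class SimulatedDrivetrain:
    """Stands in for the real hub motors when the stack runs in a container."""

    def __init__(self):
        self.pose = Pose()

    def drive(self, speed, turn_rate, dt):
        # Simple kinematic update instead of commanding real hardware.
        self.pose.heading += turn_rate * dt
        self.pose.x += speed * math.cos(self.pose.heading) * dt
        self.pose.y += speed * math.sin(self.pose.heading) * dt
        return self.pose

    def emergency_stop(self):
        # On the real robot this would trip the normally closed relays that
        # short the hub-motor leads; in simulation we just note the event.
        print("e-stop: motor leads shorted (simulated)")


def follow_waypoints(drivetrain, waypoints, speed=0.5, dt=0.1, tolerance=0.3):
    """Tiny waypoint follower that is agnostic to simulated vs. real hardware."""
    pose = Pose()
    for wx, wy in waypoints:
        while math.hypot(wx - pose.x, wy - pose.y) > tolerance:
            bearing = math.atan2(wy - pose.y, wx - pose.x)
            error = math.atan2(math.sin(bearing - pose.heading),
                               math.cos(bearing - pose.heading))
            turn = max(-2.0, min(2.0, error))  # rad/s, clamped
            pose = drivetrain.drive(speed, turn, dt)
    drivetrain.emergency_stop()


if __name__ == "__main__":
    follow_waypoints(SimulatedDrivetrain(), [(2.0, 0.0), (2.0, 2.0)])
```

The point of the split is that the waypoint follower never knows whether it is driving metal or math, which is what makes whole-stack testing in Docker practical.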

For farmers who are looking to buck unsustainable modern agricultural practices while maintaining profitable farms, a platform like Acorn could be invaluable. With the ability to survey, seed, harvest, and even weed, it could perform every task of larger agricultural machinery. Of course, if you want to learn more about it, you can check out our earlier feature on this futuristic farming machine.

Tracked RC Vehicle Is (Mostly) 3D Printed

While wheels might seem like a foundational technology, they do have one major flaw: they typically need maintained roads in order to work well. Anyone who has experience driving a Jeep or truck off-road likely knows this first-hand. For those with extreme off-road needs, tracks are often the answer. [Let’s Print] is working on perfecting his RC tracked vehicle to take advantage of these perks using little more than 3D printed parts and aluminum stock.

This vehicle doesn’t just include the 3D printed tracks, but an entire 3D printed gearbox and drivetrain to drive them. Each track is driven by its own DC motor coupled to a planetary gearbox, giving each side plenty of torque to operate in snow or mud. The gearboxes are mated to a differential whose outputs currently share a single shaft, which means steering is not yet possible. The original plan was to have each motor drive its track independently, but a small mistake in the build meant the shafts had to be tied together. [Let’s Print] has several options for eventually adding steering, including an articulating body or redesigning the drivetrain so the shafts can be separated.
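For reference, and not taken from this build, the usual skid-steer arrangement mixes throttle and steering into two independent track speeds; a short Python sketch shows why tying both tracks to one shaft removes the ability to turn:

```python
# Illustrative sketch (not [Let's Print]'s code): classic skid-steer mixing.
# With independently driven tracks, throttle and steering mix into two track
# speeds; with the outputs tied to one shared shaft, both tracks are forced
# to the same speed and the steering term is lost.

def mix_tank_drive(throttle, steering):
    """throttle/steering in [-1, 1] -> (left_track, right_track) in [-1, 1]."""
    left = throttle + steering
    right = throttle - steering
    scale = max(1.0, abs(left), abs(right))  # keep outputs in range
    return left / scale, right / scale


def shared_shaft_drive(throttle, steering):
    """Current state of the build: one shaft drives both tracks identically."""
    return throttle, throttle  # the steering input has no effect


if __name__ == "__main__":
    print(mix_tank_drive(0.8, 0.3))      # turning: left track faster than right
    print(shared_shaft_drive(0.8, 0.3))  # goes straight regardless of steering
```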

While this vehicle currently has no wheels in order to improve traction, [Let’s Print] does point out that a pair of wheels could complement the tracks once he finishes the back half of the vehicle, since wheels have a major advantage over tracks when it comes to steering. A vehicle with both could combine their strengths, so we’re interested to see where this build eventually goes.

Thanks to [Joonas] for the tip!


Origami cranes (orizuru), 35 mm tall.

Bringing The Art Of Origami And Kirigami To Robotics And Medical Technology

Traditionally, there has been a dearth of effective, economical options when it comes to high-tech self-assembling microscopic structures for medicine delivery, or to the refined, delicate grippers needed in robotics. The options that do exist are rarely as effective as desired, with microscopic medicine delivery mechanisms, for example, lacking the optimal porosity. Similarly, in so-called soft robotics, many compromises have had to be made.

A promising technology here involves the manipulation of flat structures in a way that enables them either to auto-assemble into 3D structures, or to non-destructively transform into 3D structures with specific features, such as grippers, that might be useful in both micro- and macroscopic applications, including robotics.

Perhaps the most interesting part is how much of these technologies borrow from the Japanese art of origami, and the related kirigami.


Autonomous Mower Hits Snag

Interfacing technology and electronics with the real world is often fairly tricky. Complexity and edge cases work their way into every corner of a project like this; just ask anyone who has ever tried to operate a rover on Mars, make a hydroponics garden, or build almost any robotics project. Even those of us who simply own a consumer-grade printer are flummoxed by the ways it can fail when manipulating single sheets of paper. This robotic lawnmower is no exception, driving its creator [TK] to extremes to get it to mow his lawn.

[TK] actually had a platform for his autonomous mower ready to go thanks to a previous build using this solar-powered robot to explore the Australian outback. Adding another motor to handle the grass trimming seemed simple at first, and he set about wiring it all up and interfacing it to the robot. After the first iteration, he found the robot was moving too fast to effectively cut the grass, so he added a more powerful cutting motor and a gearbox to help the mower crawl more slowly over the lawn. Disaster struck when his 3D printed mount for the steel cutting blades shattered, but with [TK] uninjured, he pushed on with more improvements.

As it stands right now, the mower can effectively cut the grass while moving forward, even with the plastic-only cutting blades that [TK] is using for safety reasons. The mower stripped its reverse gear, though, so there are still some improvements to make before this robot is autonomously cutting the lawn without supervision. Normally we see lawnmowers retrofitted with robotics rather than robotics retrofitted with a lawnmower, but we’re excited to see any approach that lets us worry about one less household chore.

Thanks to [Rob] for the tip!


Prototype Robot For Omniwheel Bicycle

For all its ability to advance modern society in basically every appreciable way, science still has yet to explain some seemingly basic concepts. One thing in which our understanding still has a few holes is how a bicycle works. Surely, we know enough to build functional bicycles, but much like fitting gravity into the Standard Model, we have yet to figure out a set of equations that governs all bicycles in the universe. To push our understanding of bicycles further, however, some are performing experiments like this self-balancing omniwheel bicycle robot.

Functional steering is important for getting the bicycle going in the right direction, but it’s also critical for keeping the bike upright. This is where [James Bruton] is putting the omniwheel to the test. By placing it at the front of the bike, oriented perpendicular to the direction of travel, he can both steer the bicycle robot and keep it balanced. This does take the computational efforts of an Arduino Mega paired with an inertial measurement unit, but in the end [James] has a functional bicycle robot that he can use to experiment with the effects of different steering methods on bicycles.
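The balancing loop itself boils down to feedback control: lean angle in from the IMU, corrective omniwheel speed out. Here is a rough sketch of that idea rather than [James Bruton]’s actual Arduino code, with the gains, limits, and toy physics all invented for illustration:

```python
# Rough illustration of the balancing idea, not [James Bruton]'s firmware:
# a PD loop reads the lean (roll) angle from an IMU and commands the front
# omniwheel sideways to push the contact point back under the bike.
# Gains, limits, and the simple physics model are invented for this sketch.

KP, KD = 40.0, 6.0           # proportional and derivative gains (made up)
MAX_WHEEL_SPEED = 2.0        # m/s, saturation of the omniwheel drive


def omniwheel_command(roll, roll_rate):
    """Lateral omniwheel speed from lean angle (rad) and lean rate (rad/s)."""
    cmd = KP * roll + KD * roll_rate
    return max(-MAX_WHEEL_SPEED, min(MAX_WHEEL_SPEED, cmd))


def simulate(steps=200, dt=0.01):
    """Toy inverted-pendulum-style simulation to show the loop settling."""
    roll, roll_rate = 0.1, 0.0   # start leaning roughly 6 degrees
    for _ in range(steps):
        wheel = omniwheel_command(roll, roll_rate)
        # Gravity tips the bike further over; the wheel pushes it back upright.
        roll_accel = 9.81 * roll - 3.0 * wheel
        roll_rate += roll_accel * dt
        roll += roll_rate * dt
    return roll


if __name__ == "__main__":
    print(f"final lean angle: {simulate():.4f} rad")
```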

While he doesn’t have a working omniwheel bicycle for a human yet, we at least hope that the build is an important step on the way to [James] or anyone else building a real bike with an omniwheel at the front. Hopefully this becomes a reality soon, but in the meantime we’ll have to be content with bicycles with normal wheels that can balance and drive themselves.


Dummy The Robot Arm Is Not So Dumb

[Zhihui Jun] is a name you’re going to want to remember, because this Chinese maker has created quite probably one of the most complete open-source robot arms (video in Chinese with subtitles, embedded below) we’ve ever seen. This project has to be seen to be believed. Every aspect of the design, from concept and mechanical CAD to electronics and software covering embedded, the 3D GUI, and more, is the work of one maker, in just their spare time! Sound like we’re talking it up too much? Just watch the video and try to keep up!

After an initial review of toy robots versus more industrial units, it was quickly decided that servos weren’t going to cut it – too little torque and lacking in precision. BLDC motors offer great precision and torque when paired with a good controller, but they are tricky to make small enough, so an off-the-shelf compact harmonic drive was selected and paired with a stepper motor to get the required performance. This was multiplied by six and dropped into some slick CNC machined aluminum parts to complete the mechanics. A custom closed-loop stepper controller mounts directly to the rear of each motor. That’s really nice too.
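A closed-loop stepper driver typically pairs the motor with a shaft encoder, so the commanded step rate is derived from the measured position error instead of stepping blindly and hoping nothing slips. This compressed Python sketch shows the idea only; it is not [Zhihui Jun]’s firmware, and every number in it is a placeholder:

```python
# Not the project's firmware, just a sketch of what "closed-loop stepper"
# means: an encoder on the shaft reports the real angle, and the controller
# scales the step rate with the position error.

FULL_STEP_DEG = 1.8          # typical 200-step/rev motor
MICROSTEPS = 16
STEP_DEG = FULL_STEP_DEG / MICROSTEPS


def step_rate(target_deg, encoder_deg, kp=50.0, max_rate=8000.0):
    """Steps per second, proportional to the measured position error."""
    error = target_deg - encoder_deg
    rate = kp * error / STEP_DEG
    return max(-max_rate, min(max_rate, rate))


if __name__ == "__main__":
    # Toy run: the "encoder" just integrates the commanded steps.
    target, measured, dt = 90.0, 0.0, 0.001
    for _ in range(2000):
        measured += step_rate(target, measured) * STEP_DEG * dt
    print(f"settled at {measured:.2f} degrees")
```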

Stepper controller mounts on the motor rear – smart!

Control electronics are based around an STM32, with an ESP32 for Wi-Fi connectivity, but the pace of the video is so fast it’s hard to keep up with everything going on in the design. There is a brief mention that the controller runs the LiteOS kernel for Harmony OS, but no details we can find. The project GitHub has many of the gory details to pore over, perhaps a bit light in places, but the promise is made to expand that. For remote control, there’s a BLE-connected teaching device (called ‘Peak’) with a touch screen, again with details pending. Oh, did we mention there’s a force-feedback remote control unit (a PS5 Adaptive Trigger had to die for the cause) that uses binocular cameras to track motion, with an AHRS setup giving orientation, and that all this is powered by a Huawei Atlas edge AI processing system? This was greatly glossed over in the video as if it were just some side-note not worth talking about. We hope details of that get made public soon!

Threading a needle through a grape by remote control

The dedicated GUI, written in what looks like Unity, allows robot programming and motion planning, but since those harmonic drives are back-drivable, the robot can also be moved by hand and its movements recorded for replaying later. Some work with AR has been started, but that looks to be early in the process; the features just keep on coming!
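That record-and-replay trick is a classic teach-by-demonstration pattern: sample the joint angles on a timer while the arm is pushed around, then feed the same timestamped targets back to the joint controllers. A minimal sketch of the idea follows; it is not the project’s actual software, and the arm interface is stubbed out so it runs on its own:

```python
# Sketch of record-and-replay for a back-drivable arm, with the hardware
# interface passed in as plain callables so the example is self-contained.
import time


def record(read_joint_angles, duration_s=5.0, rate_hz=50):
    """Sample the joint angles while the user moves the arm by hand."""
    samples, dt = [], 1.0 / rate_hz
    start = time.monotonic()
    while (now := time.monotonic() - start) < duration_s:
        samples.append((now, read_joint_angles()))
        time.sleep(dt)
    return samples


def replay(samples, send_joint_targets):
    """Play the recorded trajectory back through the joint controllers."""
    start = time.monotonic()
    for t, angles in samples:
        while time.monotonic() - start < t:
            time.sleep(0.001)
        send_joint_targets(angles)


if __name__ == "__main__":
    # Stand-ins for the real six-joint arm interface.
    fake_angles = lambda: [0.0] * 6
    trajectory = record(fake_angles, duration_s=0.2)
    replay(trajectory, lambda angles: None)
    print(f"recorded {len(trajectory)} samples")
```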

Quite frankly there is so much happening that it’s hard to summarise here and do the project any sort of justice, so to that end we suggest popping over to YT and taking a look for yourselves.

We love robots ’round these parts, especially robot arms; here’s a big one by [Jeremy Fielding], and if you think stepper motors aren’t necessary because servo motors can be made to work just fine, you may be right.


“So Long,” Said All The Tank-Driving Fish

Though some of us are heavily assisted by smart phone apps and delivery, humans don’t need GPS to find food. We know where the fridge is. The grocery store. The drive-thru. And we don’t really need a map to find shelter, in the sense that shelter is easily identifiable in a storm. You might say that our most important navigation skills are innate, at least when we’re within our normal environment. Drop us in another city and we can probably still identify viable overhangs, cafes, and food stalls.

The question is, do these navigational skills vary by species or environment? Or are the tools necessary to forage for food, meet mates, and seek shelter more universal? To test the waters of this question, Israeli researchers built a fish-driven robot car and taught six fish to navigate it successfully toward a target for a food reward. This experiment is an exercise in domain transfer methodology, which explores whether a species can perform tasks outside its natural environment. Think of all the preparation that went into Vostok and Project Mercury.
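The translation layer between fish and wheels might look something like the sketch below, with the control scheme and every threshold invented here rather than taken from the researchers’ paper: a camera tracks the fish in the on-board water tank, and swimming toward one side of the tank nudges the vehicle in that direction.

```python
# Rough, invented illustration of a "fish drives the car" mapping:
# fish position in the tank (tracked by camera) becomes a drive command.
import math


def fish_to_drive(fish_x, fish_y, tank_radius=0.5, deadzone=0.2, gain=1.0):
    """Map fish position (meters from tank centre) to (speed, heading) commands."""
    offset = math.hypot(fish_x, fish_y)
    if offset < deadzone * tank_radius:
        return 0.0, 0.0                      # fish near the centre: stay put
    heading = math.atan2(fish_y, fish_x)     # direction the fish swam toward
    speed = gain * (offset / tank_radius)    # farther from centre, faster
    return min(speed, 1.0), heading


if __name__ == "__main__":
    # Assuming +x is forward and +y is left of the vehicle.
    print(fish_to_drive(0.35, 0.0))   # fish near the front wall: drive ahead
    print(fish_to_drive(0.0, -0.30))  # fish near the right wall: head right
```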
