Building an underwater remotely operated vehicle (ROV) is always a challenge, and making it waterproof is often a major hurdle. [Filip Buława] and [Piotr Domanowski] have spent four years and 14 prototypes iterating to create the CPS 5, a 3D printed ROV that can potentially reach a depth of 85 m.
FDM 3D prints are notoriously difficult to waterproof, thanks to all the microscopic holes between the layers. There are ways to mitigate this, but they all have limits. Instead of trying to make the printed exterior of the CPS 5 waterproof, the electronics and camera are housed in a pair of sealed acrylic tubes. The end caps are still 3D printed, but are effectively just thin-walled containers filled with epoxy resin. Passages for wiring are also sealed with epoxy, but [Filip] and [Piotr] learned the hard way that insulated wire can act as a conduit of its own, wicking water along the inside of its jacket. They solved the problem by adding an open solder joint for each wire in the epoxy-filled passages.
For propulsion, attitude, and depth control, the CPS 5 has five brushless drone motors with 3D printed propellers, which are inherently unaffected by water as long as you seal the connectors. The control electronics consist of a Pixhawk flight controller and a Raspberry Pi 4 for handling communication and the video stream to a laptop. An IMU and a water pressure sensor also enable auto-leveling and depth hold underwater. Like most ROVs, it uses a tether for communication, which in this case is an Ethernet cable with waterproof connectors.
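For the curious, the math behind a depth-hold loop starts with simple hydrostatics: depth is the pressure above ambient divided by water density times gravity. Here's a minimal sketch of that conversion — our own illustration, not anything from the CPS 5 firmware:

```python
# Minimal sketch (not the CPS 5's actual firmware): turning an absolute
# pressure reading into a depth estimate for a depth-hold loop.
RHO_WATER = 1000.0     # kg/m^3 for fresh water; use ~1025 for seawater
G = 9.81               # m/s^2
P_SURFACE = 101_325.0  # Pa, atmospheric pressure at the surface

def depth_from_pressure(p_abs_pa: float) -> float:
    """Hydrostatic depth in metres from absolute pressure in pascals."""
    return (p_abs_pa - P_SURFACE) / (RHO_WATER * G)

# At the CPS 5's 85 m rated depth, the sensor reads roughly 9.35 bar absolute:
print(round(depth_from_pressure(935_175.0), 1))  # → 85.0
```

At 85 m the hull sees over eight atmospheres of overpressure, which puts the epoxy-potted end caps in perspective.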
It should come as no surprise that the COVID-19 pandemic has sparked renewed interest in robotic deliveries. Amazon saying they would some day land Prime orders in your backyard with a drone sounded pretty fanciful a few years ago, but now that traditional delivery services are under enormous strain and people are looking to avoid as much human contact as possible, it’s starting to make a lot more sense.
Now to be clear, we don’t think you’ll be seeing this modified RC truck rolling up your driveway with a pizza in tow anytime soon. But the experiments that [Sean] has been doing with it are certainly interesting, and show just how far autonomous rover technology has progressed at the hobbyist level. Whether you need to move some sushi or a sensor package, his build is a great starting point for anyone interested in DIY robotic ground vehicles.
Especially if you want to take things off the beaten path once in a while. By combining the Pixhawk autopilot system with an off-road RC truck from Traxxas, [Sean] has created a delivery bot that’s not afraid of a little mud. Or even the occasional jump, should the need arise. Just don’t expect your shrimp cocktail and champagne to arrive in one piece after they’ve been given the Dukes of Hazzard treatment.
In the video after the break [Sean] goes over some of the lessons learned on this build, including how he managed to keep the electronics from cooking themselves in the Texas heat. He also goes over the realities of building an autonomous driving system that doesn’t actually have a camera onboard; sure you can plan a route for it in advance, but all bets are off if an unexpected obstacle blocks the path. It’s a pretty serious shortcoming he’s looking to address in the future, as well as upgrading to a far more accurate RTK-GPS receiver.
To adequately study a body of water such as a lake, readings and samples need to be taken from an array of depths and locations. Traditionally this is done by a few researchers on a small boat with an assortment of tools that can be lowered to the desired depth, which is naturally a very slow and expensive process. As the demand for ever more granular water quality analysis has grown, various robotic approaches have been suggested to help automate the process.
A group of students from Northeastern University in Boston have been working on Project Albatross, a unique combination of semi-autonomous vehicles that work together to provide nearly instantaneous data from above and below the water’s surface. By utilizing open source software and off-the-shelf components, their system promises to be affordable enough even for citizen scientists conducting their own environmental research.
The surface vehicle, assembled from five-gallon buckets and aluminum extrusion, uses a Pixhawk autopilot module to control a set of modified bilge pumps acting as thrusters. With ArduPilot, the team is able to command the vehicle to follow pre-planned routes or hold itself in one position as needed. Towed behind this craft is a sensor-laden submersible inspired by the Open-Source Underwater Glider (OSUG) that won the 2017 Hackaday Prize.
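If you're wondering how two fixed bilge-pump thrusters can both drive and steer a bucket boat, the trick is the same skid-steer mixing ArduPilot applies to rovers and twin-thruster boats. A rough sketch of the idea (ours, not the team's actual configuration):

```python
def skid_steer_mix(throttle: float, steering: float) -> tuple:
    """Mix normalized throttle and steering inputs (each -1..1) into
    left/right thruster outputs, clipped to -1..1 — the same basic idea
    ArduPilot uses for skid-steer rovers and twin-thruster boats."""
    left = max(-1.0, min(1.0, throttle + steering))
    right = max(-1.0, min(1.0, throttle - steering))
    return left, right

print(skid_steer_mix(0.5, 0.25))  # gentle right turn → (0.75, 0.25)
```

Holding position is then just the position controller feeding small throttle and steering corrections into this same mixer.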
Using an array of syringes operated by a NEMA 23 stepper motor, the glider is able to control its depth in the water by adjusting its buoyancy. The aluminum “wings” on the side of the PVC pipe body prevent the vehicle from rolling while moving through the water. As with the surface vehicle, many of the glider components were sourced from the hardware store to reduce its overall cost to build and maintain.
The tether from the surface vehicle provides power for the submersible, greatly increasing the amount of time it can spend underwater compared to running on internal batteries. It also allows readings from sensors in the tail of the glider to be transmitted to researchers in real-time rather than having to wait for it to surface. While the team says there’s still work to be done on the PID tuning that will give the glider finer-grained control over its depth, the results from a recent test run already look very promising.
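For reference, the depth loop being tuned is a standard PID controller. A textbook sketch looks something like this — the gains here are placeholders, not the glider's actual values:

```python
class PID:
    """Textbook PID loop — a stand-in for the depth controller the team
    is tuning; these gains are placeholders, not the glider's."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

pid = PID(kp=2.0, ki=0.1, kd=0.5, dt=0.1)
# Target 5 m, currently at 4 m: a positive output would drive the
# ballast toward "heavier" to descend.
cmd = pid.update(setpoint=5.0, measured=4.0)
```

Tuning is mostly about balancing these three gains so the glider settles at depth without oscillating or overshooting.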
Mowing the lawn is one of those repetitive tasks most of us really wish we had a robot for. [Kenny Trussell]’s mowing needs go a bit beyond the average backyard, so he hacked a ride-on mower to handle multi-acre fields all on its own.
The mower started out life as a standard zero-turn ride-on lawn mower. Its brains consist of a Pixhawk board running ArduRover, the ArduPilot firmware for ground vehicles. Navigation is provided by an RTK GPS module that receives error corrections from a fixed base station via an Adafruit LoRa Feather board, achieving centimetre-level accuracy. To control the mower, [Kenny] replaced the pneumatic shocks that centred the control levers with linear actuators.
So far [Kenny] has been using the mower to cut large fields of 5 to 18 acres, which would be a very time-consuming job for a human operator. A relay was added to the existing safety circuit that only allows the mower to run when there is weight on the seat. This relay is wired directly to the RC receiver and is controlled from the hand-held RC transmitter; it will also stop the mower if the signal from the transmitter is lost. To set up mowing missions, [Kenny] uses the ArduPilot Mission Planner, for which he wrote a custom command-line utility that creates a concentric route for the mower to follow to completely cover a defined area. He has made a whole series of videos on the process, which is very handy for anyone wanting to do the same. We’re looking forward to a new video with all the latest updates.
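To give a flavor of what such a utility does, here's a stripped-down sketch that generates concentric laps for a plain rectangular field. [Kenny]'s real tool works on arbitrary Mission Planner polygons, so treat this as an illustration only:

```python
def concentric_rect_route(width_m, height_m, cut_width_m):
    """Concentric rectangular laps, each inset one cutting width from the
    last, until the centre of the field is reached. A rectangle keeps the
    idea clear; the real utility handles arbitrary polygons."""
    waypoints = []
    inset = cut_width_m / 2
    while width_m - 2 * inset > 0 and height_m - 2 * inset > 0:
        x0, y0 = inset, inset
        x1, y1 = width_m - inset, height_m - inset
        waypoints += [(x0, y0), (x1, y0), (x1, y1), (x0, y1), (x0, y0)]
        inset += cut_width_m
    return waypoints

# A 100 m x 60 m field with a 1.5 m cutting deck: 20 laps, 100 waypoints.
route = concentric_rect_route(100.0, 60.0, 1.5)
print(len(route))  # → 100
```

Each lap shares its corner as start and end point; in practice you'd emit these as Mission Planner waypoints in local or GPS coordinates.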
This mower has been going strong for two years, but in terms of hours logged it’s got nothing on this veteran robotic mower that’s been at it for more than two decades and still runs off an Intel 386 processor.
The glider is a simple design built from foam board, controlled with two elevons, and fitted with a third servo to handle its release from the tow drone. It carries a Pixracer autopilot module and a Dragonlink telemetry link to the ground control laptop.
Initial testing was unsuccessful, with the drone ignoring return-to-home commands, and only responding to waypoints. After some further experimentation, performance improved. Testing and tweaking is the name of the game, and while the attempt to fly the glider into the back of the trailer failed, overall the project shows promise.
It’s impressive to see the glider tracing out perfect circles on the map under autopilot control. While it’s not officially supported, [rctestflight]’s work shows that it’s possible to run PX4 on a glider and have some success doing it. Future plans involve weather balloons and high altitude work, and we can’t wait to see the results.
That’s what [IzzyBrand] and his cohorts did, and we have to say we’re mightily impressed. The glider itself looks like nothing to write home about: in true Flite Test fashion, it’s just a flying wing made with foam core and Coroplast reinforced with duct tape. A pair of servo-controlled elevons lies on the trailing edge of the wings, while inside the fuselage are a Raspberry Pi and a Pixhawk flight controller along with a GPS receiver. Cameras point fore and aft, a pair of 5200 mAh batteries provide the juice, and handwarmers stuffed into the avionics bay prevent freezing.
After a long series of test releases from a quadcopter, flight day finally came. Winds aloft prevented a full 30-kilometer release, so the glider was set free at 10 kilometers. The glider then proceeded to a pre-programmed landing zone over 80 kilometers from the release point. At one point the winds were literally pushing the glider backward, but the little plane prevailed and eventually spiraled down to a perfect landing.
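A quick sanity check on those numbers: gliding 80 km from a 10 km release requires a still-air glide ratio of at least 8:1, and any headwind pushes the requirement higher — which is why the winds aloft mattered so much:

```python
def required_glide_ratio(altitude_m, distance_m):
    """Minimum still-air lift-to-drag ratio needed to cover the ground
    distance from the release altitude; headwinds raise the requirement,
    tailwinds lower it."""
    return distance_m / altitude_m

# Released at 10 km with a landing zone over 80 km away:
print(required_glide_ratio(10_000, 80_000))  # → 8.0
```

An 8:1 glide is well within reach of even a foam flying wing, but only if the autopilot flies close to its best-glide speed the whole way down.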
We’ve been covering space balloons for a while, but take a moment to consider the accomplishment presented here. On a shoestring budget, a team of amateurs hit a target the size of two soccer fields with an autonomous aircraft from a range of almost 200 kilometers. That’s why we’re impressed, and we can’t wait to see what they can do after a release from the edge of space.
Wouldn’t it be nice if you had a flying machine that could maneuver in any direction and rotate around any axis, all while maintaining both thrust and torque? Attach a robot arm and the machine could position itself anywhere and move objects around as needed. [Dario Brescianini] and [Raffaello D’Andrea] of the Institute for Dynamic Systems and Control at ETH Zurich have come up with their Omnicopter, which does just that using eight rotors in configurations that give it six degrees of freedom. Oh, and it plays fetch, as shown in the first video below.
Each propeller is reversible to provide thrust in either direction. Also on the vehicle itself are a PX4FMU Pixhawk flight computer, eight motors and motor controllers, a four-cell 1800 mAh LiPo battery, and communication radios. Radio communication is necessary because the calculations for position and attitude are done on a desktop computer, which then sends the desired force and angular rates to the vehicle. The desktop computer knows the vehicle’s position and orientation because they fly it in the Flying Machine Arena, a large room at ETH Zurich with an infrared motion-capture system.
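The interesting control problem here is mapping a desired six-component wrench (force plus torque) onto eight signed rotor thrusts. The article doesn't give the actual rotor geometry, so here's a sketch using a made-up cube layout that still has full six-degree-of-freedom authority:

```python
import numpy as np

# Hypothetical geometry, NOT the ETH Zurich layout: eight reversible rotors
# at the corners of a 0.3 m cube, thrust axes cycling through x, y, z.
# Propeller drag torque is neglected to keep the sketch short.
r = 0.15
corners = [(sx, sy, sz) for sx in (-1, 1) for sy in (-1, 1) for sz in (-1, 1)]
positions = r * np.array(corners, dtype=float)
axes = np.array([np.eye(3)[i % 3] for i in range(8)])

# 6x8 allocation matrix: [force; torque] = B @ thrusts
B = np.vstack([axes.T, np.cross(positions, axes).T])

def allocate(wrench):
    """Least-norm signed thrusts (the props are reversible) producing the
    desired wrench [Fx, Fy, Fz, Tx, Ty, Tz] via the pseudoinverse of B."""
    return np.linalg.pinv(B) @ wrench

# Hover for a 1 kg vehicle: 9.81 N straight up, zero net torque.
thrusts = allocate(np.array([0.0, 0.0, 9.81, 0.0, 0.0, 0.0]))
```

Because the allocation matrix has rank six, any combination of force and torque is achievable — which is exactly what lets the Omnicopter hold attitude independently of where it's accelerating.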
The result is a bit eerie to watch, as if gravity simply doesn’t apply to the Omnicopter. The flying machine can be just plain playful, as you can see in the first video below, where it plays fetch by using an attached net to catch a ball. When returning the ball, it actually rotates the net to dump the ball into the thrower’s hand. But you can see that in the video.