[Enza3D] shows off a surprisingly compact articulated animatronic eyeball that can be intuitively controlled with a Wii nunchuk controller. The design uses 3D printed parts and some tiny servos, and all of the necessary electronics can be easily purchased online. The mechanical design of the eye is very impressive, and [Enza3D] walks through several different versions of the design, the end result of which is a tidy little assembly that would fit nicely into masks, costumes, or other projects.
A Wii nunchuk is ideal for manual control of such a device, thanks to its ergonomic design and ease of interfacing (the nunchuk communicates over I2C, which is easily within the reach of even the most modest of microcontrollers). And since driving servos is nearly trivial nowadays, working this into an automated project shouldn't pose much of a challenge either.
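For the curious, talking to the nunchuk really is that simple. Here's a minimal MicroPython sketch of the idea; the pin assignments, the unencrypted init sequence, and the servo mapping are our own assumptions for an RP2040-style board, not [Enza3D]'s code:

```python
# Minimal sketch: read a Wii nunchuk over I2C and steer one hobby servo.
# Wiring and pin numbers are assumptions for an RP2040-style board.
from machine import I2C, Pin, PWM
import time

NUNCHUK_ADDR = 0x52

i2c = I2C(0, sda=Pin(0), scl=Pin(1), freq=100_000)
servo = PWM(Pin(15))
servo.freq(50)  # standard 50 Hz hobby-servo frame

# Unencrypted init sequence (works on most third-party nunchuks)
i2c.writeto(NUNCHUK_ADDR, bytes([0xF0, 0x55]))
i2c.writeto(NUNCHUK_ADDR, bytes([0xFB, 0x00]))

def servo_us(us):
    # Convert a pulse width in microseconds to a 16-bit duty value
    servo.duty_u16(int(us * 65535 / 20_000))

while True:
    i2c.writeto(NUNCHUK_ADDR, b'\x00')   # request a fresh 6-byte report
    time.sleep_ms(3)
    data = i2c.readfrom(NUNCHUK_ADDR, 6)
    joy_x = data[0]                      # 0..255, roughly 128 at center
    servo_us(1000 + joy_x * 1000 // 255) # map stick X to a 1-2 ms pulse
    time.sleep_ms(20)
```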
Robotic mowers are becoming a common sight in some places, enabled by the falling cost of motors and the control electronics they need. In many cases, though, they are still really rather dumb: little more than a jacked-up bump-and-go with a spinning blade. [Clemens Elflein] has taken a cheap, dumb mower and given it a brain transplant based around a Raspberry Pi 4, paired with a Raspberry Pi Pico for the real-time control side of things. [Clemens] is calling this OpenMower, with the goal of creating an open source robot mower controller with support for GPS navigation, using RTK for extra precision.
The donor robot was a YardForce Classic 500, and after inspecting the control PCB, it looks like many other robot mower models use the same controller and are thus likely to be compatible with the OpenMower platform. A custom mainboard houses the Pi 4 and Pico, an ArduSimple RTK GPS module (giving a reported navigational accuracy of 1 cm), as well as three BLDC motor drivers for the wheels and rotor. Everything is module-based and plugs into the mainboard, which reduces the complexity of the project significantly. For a cheap mower platform, the YardForce unit has good build quality, with connectors everywhere, making OpenMower a plug-and-play solution. Even the user interface on top of the mower could be reused, with a custom PCB below it presenting push buttons at the appropriate positions.
Motor control is courtesy of the xESC project, which provides low-cost FOC motor control and interfaces with the host controller via a serial link. It's worth looking into in its own right! On the software side, [Clemens] is using ROS, which implements the low-level robot control, path planning (using code taken from Slic3r), as well as kinematic constraints for obstacle avoidance. The video below shows how simple the machine is to operate: just drive it around the perimeter of the lawn with a handheld controller, show it where obstacles such as trees are, and then set it going. The mower is even capable of mowing multiple lawns, making the journey between them automatically!
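To give a flavor of what the ROS side of such a stack looks like, here's a toy rospy sketch of our own, not OpenMower's actual code: it streams velocity commands on the conventional `cmd_vel` topic to drive a square test pattern, the way a higher-level planner would command the base.

```python
#!/usr/bin/env python3
# Toy rospy sketch: stream velocity commands the way a ROS mower stack
# might drive a square test pattern. Topic name and message type are
# generic ROS conventions; OpenMower's real interfaces may differ.
import math
import rospy
from geometry_msgs.msg import Twist

def publish_for(pub, rate, seconds, linear=0.0, angular=0.0):
    """Publish a constant Twist for a fixed duration."""
    msg = Twist()
    msg.linear.x = linear
    msg.angular.z = angular
    end = rospy.Time.now() + rospy.Duration(seconds)
    while not rospy.is_shutdown() and rospy.Time.now() < end:
        pub.publish(msg)
        rate.sleep()

if __name__ == '__main__':
    rospy.init_node('toy_mower_drive')
    pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)
    rate = rospy.Rate(10)                    # 10 Hz command stream
    for _ in range(4):                       # four sides of a square
        publish_for(pub, rate, 5.0, linear=0.3)                  # straight leg
        publish_for(pub, rate, (math.pi / 2) / 0.5, angular=0.5) # 90 degree turn
```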
Today, we shall talk about how [Adam Bäckström] took a DS3225 servo and rebuilt it to improve its accuracy, then built a high-precision robot arm with those modified servos to show just how much of an improvement he got: up to 36 times better positional accuracy. If this gives you a feeling of déjà vu, that's because we've covered his servo modifications before, but now there's more. In the year since the last video came out, [Adam] has taken it to the next level, showing us how the modification is made, and how we can do it ourselves, in a newly released video embedded below.
After ordering replacement controller PCBs designed by [Adam] (assembled by your PCBA service of choice), you disassemble the servo, carefully setting the gearbox aside for now. Gutting the stock control board is the obvious next step, but from there, you don't just drop the new PCB in. There's more to getting a perfect servo than that: you have to add extra sensing, too. First, you print a spacer and a cover for the control board, as well as a new base for the motor. You also print (or perhaps laser-cut) two flat encoder disks, one black and one white, the white one being eccentric. It only escalates from here!
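We won't second-guess [Adam]'s exact optical arrangement, but the idea behind an eccentric target is that what the sensor sees varies smoothly with shaft angle. Purely as an illustration of the math involved (the two-sensor placement and the calibration values here are hypothetical, not his design), readings taken 90 degrees apart can be combined into an absolute angle:

```python
# Illustrative sketch only: recover shaft angle from two reflectance
# sensors viewing an eccentric disk. Sensor placement (90 degrees apart)
# and calibration limits are hypothetical, not [Adam]'s actual scheme.
import math

def angle_from_reflectance(a_raw, b_raw, a_min, a_max, b_min, b_max):
    """Normalize two roughly sinusoidal readings to [-1, 1] and combine
    them with atan2 to get an absolute angle in degrees."""
    a = 2 * (a_raw - a_min) / (a_max - a_min) - 1
    b = 2 * (b_raw - b_min) / (b_max - b_min) - 1
    return math.degrees(math.atan2(b, a)) % 360
```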
It’s always good to welcome a new hackerspace to the fold, and thus we’re pleased to hear about the upcoming opening of Hackerspace Drenthe, on the north-eastern edge of the Netherlands. Starting a new space during a global pandemic is something of a feat. As part of their opening, they needed a robot to demonstrate to the curious public, and what could be more accessible than a robot arm playing tic-tac-toe!
It would be correct to say that a robot moving blocks with precision is not a ground-breaking achievement, but in its purpose of providing eye-candy for a hackerspace opening, while also serving as an experiment for some of the students from the school adjacent to the space, it is a success. The interface is a pleasingly retro, War Games-style terminal, and the software is written in Python. For the curious, it can all be found in a GitHub repository, and should you be in that region of Europe, you can find Hackerspace Drenthe in the Dutch border town of Coevorden and attend their opening on the 2nd of April.
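The game logic itself is a classic exercise: a few dozen lines of minimax gives perfect play. Here's a minimal Python sketch of the idea; the hackerspace's actual implementation lives in their GitHub repository and may well differ.

```python
# Minimal minimax move chooser for tic-tac-toe. Board is a 9-cell list
# holding 'X', 'O', or None; this is a sketch, not the hackerspace's code.
WINS = [(0,1,2), (3,4,5), (6,7,8), (0,3,6),
        (1,4,7), (2,5,8), (0,4,8), (2,4,6)]

def winner(board):
    for a, b, c in WINS:
        if board[a] and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    """Return (score, move) from `player`'s perspective: 1 win, 0 draw, -1 loss."""
    w = winner(board)
    if w:
        return (1 if w == player else -1), None
    moves = [i for i, cell in enumerate(board) if not cell]
    if not moves:
        return 0, None              # board full: draw
    best = (-2, None)
    opponent = 'O' if player == 'X' else 'X'
    for m in moves:
        board[m] = player
        score, _ = minimax(board, opponent)
        board[m] = None
        if -score > best[0]:        # opponent's best outcome is our worst
            best = (-score, m)
    return best

# Example: ask for X's opening move on an empty board
_, move = minimax([None] * 9, 'X')
print("play cell", move)
```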
There are several projects you can imagine where it would be useful to have a robot follow you. For example, we’ve always wanted luggage that would trail us at the airport, and we’ve seen several coolers that will follow you. [Madmax95] apparently dreamed of having a medical cart that follows a patient, though, and that’s good too. But how do you do it? [Max’s] method was to strip down a Roomba and build a work table and electronics on top of it. An Arduino controls the motors and communicates with a PC. The PC reads video from a Kinect camera on the robot and uses special tracking software to follow the patient.
We could easily imagine every part of this project except the tracking. That depends on a service called Nuitrack. There is a free version that only works for 3 minutes at a time, but it costs money if you want to use it practically. Even so, it would still be cheaper than rolling your own if your time has value.
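The PC-side control loop, at least, is straightforward. Here's a rough Python sketch of the idea: the `get_target()` stub stands in for whatever skeleton tracker you use (Nuitrack or a free alternative), and the one-line serial protocol to the Arduino is made up for illustration.

```python
# Sketch of the PC side of a follow-me robot: turn a tracked person's
# position into differential-drive commands sent to an Arduino.
# get_target() is a stand-in stub; the serial protocol is invented here.
import time
import serial

port = serial.Serial('/dev/ttyACM0', 115200, timeout=0.1)

FOLLOW_DISTANCE_M = 1.0        # how far behind the person to stay
K_TURN, K_DRIVE = 0.8, 0.5     # proportional gains (tune on the robot)

def get_target():
    """Stand-in for the skeleton tracker: return (lateral_offset_m,
    distance_m) of the tracked person, or None if nobody is in view."""
    return None

def send_command(left, right):
    """Send clamped wheel speeds (-1..1) as a simple text line."""
    clamp = lambda v: max(-1.0, min(1.0, v))
    port.write(f"M {clamp(left):.2f} {clamp(right):.2f}\n".encode())

while True:
    target = get_target()
    if target is None:
        send_command(0, 0)                 # lost the person: stop
    else:
        offset, distance = target
        drive = K_DRIVE * (distance - FOLLOW_DISTANCE_M)  # close the gap
        turn = K_TURN * offset                            # steer toward them
        send_command(drive + turn, drive - turn)
    time.sleep(0.05)
```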
[Peng Zhihui] seems to have found some spare time and energy to crack out another sweet robot build, this time a much smaller and cuter emoji-bot (original GitHub link) with the usual production-ready levels of attention to detail. With a lot of fine detail in the 3D printed models, this is one for SLS printing in nylon, but that can be done for a reasonable outlay, in China at least. The electronics package consists of a few tiny, fully custom PCBs designed with Altium Designer, with off-the-shelf modules for the circular LCD and camera. The main board hosts an STM32F405 and deals with the display and SD card; this particular STM32 was chosen because it can connect to an external USB3300 high-speed USB PHY. A sensor PCB handles the gesture sensor, a USB hub, an MPU6050 6-axis IMU, and the USB camera module. This board attaches to the USB-C connector in the base via an FFC cable, allowing the robot to rotate on its base.
[Peng] clearly has exacting standards for how things should work, and we guess he wanted the arms back-drivable in a way that lets the host computer track and record the motor positions for replaying later on. The connection back to the controller is via I2C, allowing all five servos to hang on the same bus, saving precious resources. Smart! Getting a processor and motor driver into such a tiny space was a bit of a challenge, but a walk in the park for [Peng], as he demonstrates in the video embedded below (we believe English subtitles are pending!) The arm mechanism is particularly interesting, and rather elegantly executed, and he does seem rather proud of this part of the design, as well he should be! As with [Peng’s] other projects, there is a lot to see and plenty of scope for feature explosion. It was nice to see the ‘bot being used as an input device, not only with gesture sensing via the dedicated sensor, but also using the camera with OpenCV to track user posture and act accordingly. This thing could act as a genuinely useful AI device, as well as being darn cute at the same time!
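As a taste of what camera-based user tracking involves (we don't know [Peng]'s exact pipeline, so this uses OpenCV's stock face detector as a stand-in for posture tracking), here's a short sketch that turns the user's position in the frame into a steering signal for the base:

```python
# Rough sketch of camera-based user tracking: find the largest face in
# the frame and compute a normalized steering error for the base servo.
# This is a stand-in pipeline, not [Peng]'s actual code.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + 'haarcascade_frontalface_default.xml')
cam = cv2.VideoCapture(0)

while True:
    ok, frame = cam.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces):
        x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest face
        # Horizontal error from frame center in [-0.5, 0.5]: feed this to
        # the base-rotation servo so the robot keeps facing the user.
        error = (x + w / 2) / frame.shape[1] - 0.5
        print(f"turn command: {error:+.2f}")
```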
Farming is a challenge under even the best of circumstances. Almost all conventional farmers use some combination of tillers, combines, seeders and plows to help get the difficult job done, but for those like [Taylor] who do not farm large industrial monocultures, more specialized tools are needed. While we’ve featured the Acorn open source farming robot before, it’s back now with new and improved features and a simulation mode to help rapidly improve the platform’s software.
The first of the two new physical features is a fail-safe braking system. Since the robot uses electric geared hub motors for propulsion, the braking system consists of two normally closed relays which short the motor leads in emergency situations. Shorted out, the spinning motors act as heavily loaded generators, which quickly stops them from turning. The robot has also been given advanced navigation facilities so that it can follow custom, complex routes. And finally, [Taylor] created a simulation mode so that the robot’s entire software stack can be run and tested in Docker without involving the actual robot.
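The neat thing about normally closed brake relays is that the fail-safe comes for free: software has to actively hold the brake off, so any crash or power loss stops the robot. A sketch of that idea on a Raspberry Pi might look like the following; the pin number and gpiozero wiring are our assumptions, not Acorn's actual code.

```python
# Sketch of a fail-safe brake watchdog: the relays are normally closed,
# so the motor leads stay shorted (brake engaged) unless software
# actively energizes the relay coil. Pin choice is an assumption.
from gpiozero import OutputDevice
import time

brake_release = OutputDevice(17)   # energize to open relays = release brake

def heartbeat_ok():
    """Stand-in for real health checks (e-stop, comms link, nav watchdog)."""
    return True

try:
    while True:
        if heartbeat_ok():
            brake_release.on()     # hold relays open, motors free to turn
        else:
            brake_release.off()    # drop the coil: leads short, motors brake
        time.sleep(0.1)
finally:
    brake_release.off()            # any crash or exit engages the brake
```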
For farmers who are looking to buck unsustainable modern agricultural practices while maintaining profitable farms, a platform like Acorn could be invaluable. With the ability to survey, seed, harvest, and even weed, it could perform every task of larger agricultural machinery. Of course, if you want to learn more about it, you can check out our earlier feature on this futuristic farming machine.