Imagine. There you are, comfortable in your lounge pants. Lounging in your lounge. Suddenly in the distance you hear a buzzing. Quiet at first, then louder. A light bulb goes on in your head.
You’d forgotten that you’d scheduled an Amazon drone repair service, in partnership with The Home Depot and DeWalt. They break through the window, spraying you with shards. They paint over the spots on the walls. Snap photos of the brands in your closet. Change the light bulbs. Place a band-aid on your glass wounds. Pick up the shards and leave, repairing the window on their way out.
Horrible.
Of course, the first step before this dark future comes to be is to see if it can be done, which is what [Marek Baczynski] and a friend accomplished many broken light bulbs later. Using an off-the-shelf drone with three springy prongs glued to the top, they try time and time again to both unscrew and screw in a light bulb. They start with a lighter drone, but eventually switch to a more robust model.
After a while they finally manage it, so it’s possible. Next step, automate. Video after the break.
There are many kits available to today’s hobbyists who wish to try their hand at producing simple computer-controlled robots: small concoctions of servos and laser-cut acrylic, to which boards such as the Arduino, Raspberry Pi, or BeagleBone can easily be fitted.
In the 1980s, though, this was a market yet to be adequately served. The sheer size of the many 8-bit machines of the day meant they could not be incorporated into your robot, and interfacing to them was rather more challenging than the easy-to-use GPIOs of their modern counterparts. The mechanical hardware of a small robot, too, had not been easily and cheaply packaged for the constructor, making building a physical robotic platform a significant task in itself.
[Charlie] is a robot based on the Capsela construction system, a toy consisting of interlocking plastic spheres containing different functions of shafts, gears, and motors. There was a Robotic Workshop kit for Capsela that featured a Commodore 64 interface, and it is through this means that [Charlie]’s three motors are controlled. It includes a ROM that extends Commodore BASIC with extra commands, which allow the robot to be easily controlled.
Artie the robot, with Dacta box in foreground
Meanwhile [Artie] is a Lego robot, using the Dacta TC Logo, a kit sold for the educational market and available at the time with interfaces for the PC and the Apple II. They had a Dacta control box but not the Apple II card to go with it, so they had to make do with a functional replica built on a prototyping card. As the name suggests, this was programmed in Logo, and came with the appropriate interpreter software.
Both robots are reported to have been a success: they worked in the first place, demonstrated the 1980s technology, and provided entertainment and engagement for the faire’s visitors.
We have covered numerous Lego robots over the years, as a search of our site will confirm. But this is only the second time we’ve featured a Capsela project, the first being this Arduino rover from 2011. [Mike] mused about why we don’t see Capsela more often, and the same sentiment holds true today. Do you have a Capsela set gathering dust somewhere that could make a robotic project?
How do you get teenagers interested in science, technology, and engineering? [Erich]’s team at the Lucerne University of Applied Sciences makes them operate three robots to get a gumball. The entire demonstration was whipped together in a few days, and has been field-repaired at least once; a green-wire fix was a little heavy on the solder and would short out to a neighboring trace when mechanical force was applied.
It’s no secret that we love bizarre robot locomotion, so we are naturally suckers for BALLU (YouTube link, also embedded below), the Buoyancy-Assisted Lightweight Legged Unit. The project started with a simple observation: walking robots are constrained by having to hold themselves up, and removing that constraint makes success much easier. Instead of walking, BALLU almost floats and uses what little net weight it does have to push against the ground.
Press a button on the robot and it moves forward until it’s a certain distance from an object. It then takes a picture and sends it off to Google Cloud Vision along with a request to do face detection. The response that Google returns is in JSON format and, if it finds a face, includes the likelihood of the face being happy, angry, sorrowful, or surprised. The robot parses that response and gives an appropriate canned speech using the eSpeak text-to-speech software, e.g. “You seem happy! Tell me why you are so happy!”.
[Dexter] has made the source code available on GitHub. It’s written in Python and is easy to read by anyone with even just a little programming experience. The video after the break gives a number of demonstrations, including some with non-human subjects.
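To give a flavour of how little code the flow needs, here is a minimal Python sketch of the same idea. To be clear, this is not [Dexter]’s code: the API-key handling, the file name, and the canned lines are assumptions, but the request it builds and the likelihood fields it reads follow the Cloud Vision images:annotate API.

```python
# Minimal sketch of the snap -> Cloud Vision -> eSpeak flow (not [Dexter]'s code).
import base64
import os
import subprocess

import requests

# Assumption: an API key is kept in the environment rather than hard-coded.
ENDPOINT = ("https://vision.googleapis.com/v1/images:annotate?key="
            + os.environ["GOOGLE_API_KEY"])


def detect_mood(image_path):
    """Send one image for face detection and return the strongest mood found."""
    with open(image_path, "rb") as f:
        content = base64.b64encode(f.read()).decode("ascii")
    body = {"requests": [{
        "image": {"content": content},
        "features": [{"type": "FACE_DETECTION", "maxResults": 1}],
    }]}
    result = requests.post(ENDPOINT, json=body).json()
    faces = result["responses"][0].get("faceAnnotations", [])
    if not faces:
        return None
    # Likelihoods come back as strings such as "VERY_UNLIKELY" ... "VERY_LIKELY".
    for mood in ("joy", "anger", "sorrow", "surprise"):
        if faces[0][mood + "Likelihood"] in ("LIKELY", "VERY_LIKELY"):
            return mood
    return "neutral"


def speak(text):
    subprocess.run(["espeak", text])  # hand the canned speech to eSpeak


mood = detect_mood("snapshot.jpg")    # hypothetical file name for the captured photo
if mood == "joy":
    speak("You seem happy! Tell me why you are so happy!")
elif mood is None:
    speak("I do not see a face.")
else:
    speak("You seem " + mood + ".")
```

The robot-motion side (driving forward until the distance sensor trips, then snapping the photo) would sit in front of this, but the vision-and-speech half really is about this short.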
Most inexpensive 3D printers use a type of lead screw to move some part of the printer in the vertical direction. A motor turns a threaded rod and that causes a nut to go up or down. The printer part rides on the nut. This works well, but it is slower than other drive mechanisms (which is why you don’t often see them on the horizontal parts of a printer). Some cheap printers use common threaded rod, which is convenient, but prone to bad behavior since the rods are not always straight, the threads are subject to backlash, and the tolerances are not always the best.
More sophisticated printers use ACME or trapezoidal threaded rods. These are made for this type of service and have thread designs that minimize things like backlash. They are typically made to more exacting standards, too. Making the nut softer than the rod (for example, brass or Delrin) is another common optimization.
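To put a number on that speed penalty, here is a quick back-of-the-envelope calculation in Python. The motor, microstepping, and lead figures are typical assumptions (a 1.8-degree stepper driving a Tr8x8 trapezoidal rod), not specs pulled from any particular printer:

```python
# Back-of-the-envelope Z-axis math for a lead-screw drive.
# All figures are typical assumptions, not specs from a particular printer.
steps_per_rev = 200      # 1.8-degree stepper motor
microsteps = 16          # driver microstepping
lead_mm = 8.0            # Tr8x8 trapezoidal rod: 8 mm of travel per revolution

steps_per_mm = steps_per_rev * microsteps / lead_mm
print(f"steps per mm: {steps_per_mm:.0f}")       # 400 microsteps per mm of travel

# Even spinning the screw at a brisk 5 revolutions per second, the axis only moves:
max_feed = 5 * lead_mm
print(f"feed at 5 rev/s: {max_feed:.0f} mm/s")   # 40 mm/s
```

Compare that with a GT2 belt on a 20-tooth pulley, which covers 40 mm in a single revolution, and it is clear why screws usually end up on the slow-moving Z axis.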
However, when lead screws aren’t good enough, mechanical designers turn to ball screws. In principle, these are very similar to lead screws, but instead of a plain nut, a ball nut filled with recirculating ball bearings travels up and down the screw. The rolling balls mean far less friction.
Misumi recently posted a few blog articles about ball screws. Some of the information is basic, but it also covers preloading and friction. Plus they are promising future articles to expand on the topic. If you prefer to watch a video, you might enjoy the one below.
Australian roboticists from the Queensland University of Technology have developed a prototype agricultural robot that uses machine vision to identify both weed and crop plants before either uprooting or poisoning the weeds or applying fertiliser to the crop.
The machine is a wide platform designed to straddle a strip of the field upon which it is working, with electric wheel motors for propulsion. It is solar-powered, and it is envisaged that a farm could have several of them continuously at work.
At a superficial level there is nothing new in the robot, its propulsion, or even the plant husbandry and weeding equipment. The really clever technology lies in the identification and classification of the plants it will encounter. It is on the success or failure of this in real farm environments that the robot’s future will hinge. The university’s next step will be to take it on-farm, and the ABC report linked above has a wonderfully pithy quote from a farmer on the subject. You can see the machine in action in the video below the break.
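For anyone curious where that vision work starts, here is a toy Python/OpenCV sketch. It is emphatically not the QUT team’s pipeline, and the file name and threshold are invented; it only does the easy part, separating green plant matter from soil with the classic excess-green index, and hints at where the weed-versus-crop classification would have to take over:

```python
# Toy vegetation-segmentation sketch (not the QUT robot's actual pipeline).
import cv2
import numpy as np


def plant_mask(image_bgr):
    """Mark pixels that look like vegetation using the excess-green index."""
    b, g, r = cv2.split(image_bgr.astype(np.float32) / 255.0)
    exg = 2.0 * g - r - b                       # ExG = 2G - R - B per pixel
    return (exg > 0.1).astype(np.uint8) * 255   # threshold is a guess; tune per camera


image = cv2.imread("row_crop.jpg")              # hypothetical frame from the robot's camera
mask = plant_mask(image)
coverage = cv2.countNonZero(mask) / mask.size
print(f"vegetation covers {coverage:.1%} of the frame")

# The hard part comes next: each plant blob still needs shape, texture, or a trained
# classifier to decide whether it gets fertiliser or the weeding tool.
```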
Farming robots have a significant following among the hardware hacker community, but it is possible that the machine-vision and plant-identifying abilities of this one would be beyond most hackers. However it is still an interesting project to watch, marking as it does a determined attempt to take the robot out of the lab and into real farm settings.