Imagine trying to make a ball-shaped robot that rolls in any direction but with a head that stays on. When I saw the BB-8 droid doing just that in the first Star Wars: The Force Awakens trailer, it posed an engineering challenge I couldn't resist. All the details of how I made it would fill a book, so here are the highlights: the problems I ran into, how I solved them, and what I learned.
The HTC Vive’s Lighthouse localization system is one of the cleverest things we’ve seen in a while. It uses a synchronization flash followed by a swept beam to tell any device that can see the lights exactly where it is in space. Of course, the device has to understand the signals to figure it out.
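The decoding really is just timing. As a rough illustration (a minimal sketch, not any device's actual firmware; the 60 Hz rotor period is the Lighthouse convention, and the timestamps are assumed to come from a photodiode front end):

```python
# Minimal sketch: turn Lighthouse pulse timestamps (microseconds) into angles.
# Assumes a 60 Hz rotor, i.e. one full revolution every ~16667 microseconds.
ROTOR_PERIOD_US = 1_000_000 / 60

def sweep_angle_deg(t_sync_us: float, t_sweep_us: float) -> float:
    """Angle at which the swept beam crossed the sensor, measured
    from the sync flash at the start of the rotation."""
    dt = t_sweep_us - t_sync_us
    return 360.0 * dt / ROTOR_PERIOD_US

# One horizontal and one vertical sweep give a ray from the base station
# to the sensor; rays from two stations intersect at the 3D position.
print(sweep_angle_deg(0.0, 4167.0))  # a quarter-period after sync is ~90 degrees
```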
[Alex Shtuchkin] built a very well documented device that can use these signals to localize itself in your room. For now, the Lighthouse stations are still fairly expensive, but the per-device hardware requirements are quite reasonable. [Alex] has the costs down around ten dollars plus the cost of a microcontroller if your project doesn’t already include one. Indeed, his proof-of-concept is basically a breadboard, three photodiodes, op-amps, and some code.
His demo is awesome! Check it out in the video below. He uses it to teach a quadcopter to land itself back on a charging platform, and it’s able to get there with what looks like a few centimeters of play in any direction — more than good enough to land in the 3D-printed plastic landing thingy. That fixture has a rotating drum that swaps out the battery automatically, readying the drone for another flight.
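Closing the loop from a position fix to a landing is conceptually simple. A toy proportional controller might look like the following (our illustration, not [Alex]'s code; the pad coordinates and the velocity-setpoint interface are hypothetical):

```python
# Toy proportional lander: steer horizontal velocity toward the pad while
# descending slowly. All names and gains here are illustrative only.
KP = 0.8             # proportional gain, 1/s
DESCENT_RATE = -0.2  # vertical velocity setpoint, m/s

def landing_command(x, y, pad_x, pad_y):
    """Map a Lighthouse position fix to a velocity setpoint toward the pad."""
    vx = KP * (pad_x - x)
    vy = KP * (pad_y - y)
    return vx, vy, DESCENT_RATE
```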
If this is just the tip of the iceberg of upcoming Lighthouse hacks, we can’t wait!
Imagine. There you are, comfortable in your lounge pants. Lounging in your lounge. Suddenly in the distance you hear a buzzing. Quiet at first, then louder. A light bulb goes on in your head.
You'd forgotten that you'd scheduled an Amazon drone repair service in partnership with The Home Depot and DeWalt. They break through the window, spraying you with shards. They paint the spots on the walls. Snap photos of the brands in your closet. Change the light bulbs. Place a band-aid on your glass wounds. Pick up the shards and leave, repairing the window on their way out.
Horrible.
Of course, the first step before this dark future comes to be is to see if it can be done, which is what [Marek Baczynski] and a friend accomplished many broken light bulbs later. Using an off-the-shelf drone with three springy prongs glued to the top, they try time and time again to both unscrew and screw in a light bulb. They start with a lighter drone, but eventually switch to a more robust model.
After a while they finally manage it, so it’s possible. Next step, automate. Video after the break.
There are many kits available to today's hobbyists who wish to try their hand at producing simple computer-controlled robots: small concoctions of servos and laser-cut acrylic, to which boards such as the Arduino, Raspberry Pi, or BeagleBone can easily be fitted.
In the 1980s, though, this was a market that had yet to be adequately served. The sheer size of the many 8-bit machines of the day meant they could not be incorporated into your robot, and interfacing to them was rather more challenging than the easy-to-use GPIOs of their modern counterparts. Back then, the mechanical hardware for a small robot was also not something that had been easily and cheaply packaged for the constructor, making building a physical robotic platform a significant task in itself.
[Charlie] is a robot based on the Capsela construction system, a toy consisting of interlocking plastic spheres containing different functions such as shafts, gears, and motors. There was a Robotic Workshop kit for Capsela that featured a Commodore 64 interface, and it is through this means that [Charlie]'s three motors are controlled. It includes a ROM that extends Commodore BASIC with extra commands, which allow the robot to be easily controlled.
Meanwhile, [Artie] is a Lego robot using the Dacta TC Logo, a kit sold for the educational market and available at the time with interfaces for the PC and the Apple II. They had a Dacta control box but not the Apple II card to go with it, so they had to make do with a functional replica built on a prototyping card. As the name suggests, this was programmed in Logo, and came with the appropriate interpreter software.
Both robots are reported to have been a success: they worked in the first place, demonstrated 1980s technology, and provided entertainment and engagement for the faire's visitors.
We have covered numerous Lego robots over the years, as a search of our site will confirm. But this is only the second time we've featured a Capsela project, the first being this Arduino rover from 2011. [Mike] mused about why we don't see Capsela more often, and the same sentiment holds today. Do you have a Capsela set gathering dust somewhere that could make a robotic project?
How do you get teenagers interested in science, technology, and engineering? [Erich]’s team at the Lucerne University of Applied Sciences makes them operate three robots to get a gumball. The entire demonstration was whipped together in a few days, and has been field-repaired at least once; a green-wire fix was a little heavy on the solder and would short out to a neighboring trace when mechanical force was applied.
It's no secret that we love bizarre robot locomotion, so we are naturally suckers for BALLU (YouTube link, also embedded below), the Buoyancy-Assisted Lightweight Legged Unit. The project started with a simple observation: walking robots are constrained by having to hold themselves up, and removing that constraint makes success much easier. Instead of walking, BALLU almost floats and uses what little net weight it does have to push against the ground.
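The numbers make the trick clear. A back-of-the-envelope example (ours, with round sea-level figures for air and helium density; BALLU's actual envelope volume is our guess):

```python
# Net lift of a helium envelope: F = (rho_air - rho_he) * V * g.
RHO_AIR = 1.225  # kg/m^3, air at sea level
RHO_HE = 0.179   # kg/m^3, helium at sea level
G = 9.81         # m/s^2

volume_m3 = 0.5  # an assumed envelope size, for illustration
lift_n = (RHO_AIR - RHO_HE) * volume_m3 * G
print(f"{lift_n:.1f} N of lift cancels ~{lift_n / G * 1000:.0f} g of hardware")
```

At that scale the balloon offsets roughly half a kilogram, so legs that only need to supply the leftover net weight can be nearly weightless themselves.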
Press a button on the robot and it moves forward until it's a certain distance from an object. It then takes a picture and sends it off to Google Cloud Vision along with a request to do face detection. The response that Google returns is in JSON format and, if it finds a face, includes the likelihood of the face being joyful, sorrowful, angry, or surprised. The robot parses that response and gives an appropriate canned speech using the eSpeak text-to-speech software, e.g. "You seem happy! Tell me why you are so happy!".
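The whole loop condenses nicely. Here is a rough sketch of that request/response cycle (our approximation, not [Dexter]'s code; the endpoint and field names come from the public Cloud Vision REST API, while API_KEY and photo.jpg are placeholders):

```python
# Sketch of the pipeline: photo -> Cloud Vision face detection -> eSpeak.
import base64
import subprocess

import requests

API_KEY = "YOUR_API_KEY"  # placeholder
URL = f"https://vision.googleapis.com/v1/images:annotate?key={API_KEY}"

# Encode the captured photo for the JSON request body.
with open("photo.jpg", "rb") as f:
    img = base64.b64encode(f.read()).decode()

body = {"requests": [{"image": {"content": img},
                      "features": [{"type": "FACE_DETECTION", "maxResults": 1}]}]}

# The response carries per-face likelihoods: joy, sorrow, anger, surprise.
faces = requests.post(URL, json=body).json()["responses"][0].get("faceAnnotations", [])

if faces and faces[0]["joyLikelihood"] in ("LIKELY", "VERY_LIKELY"):
    subprocess.run(["espeak", "You seem happy! Tell me why you are so happy!"])
```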
[Dexter] has made the source code available on GitHub. It's written in Python and is easy to read for anyone with even a little programming experience. The video after the break gives a number of demonstrations, including some with non-human subjects.