Robotically-Tuned Tube Radio

Dubbed the “Robot Radio” by [Brek], this clinking-&-clunking project merges three generations of hackers’ favorite technologies: robots, vacuum tubes, and microcontrollers. After the human inputs the desired radio frequency, the machine chisels its way through the spectrum, trying its best to stay on target.

This build began its life as a junky old tube radio that [Brek] pulled out of a shed. The case was restored and then the hacking began. Inserted between the human and the radio, a PIC 16F628A keeps watch in both directions. On one side, the radio’s tank circuit is monitored to see what frequency the radio is currently playing. On the other, the human’s input sets a desired frequency. If the two do not match, the PIC tells a stepper motor to begin cranking a pair of gears until they do.
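
The control loop itself is easy to picture. Here’s a minimal Python sketch of the kind of logic the PIC firmware might implement; the helper names, dead-band value, and stepping scheme are our own assumptions for illustration, not [Brek]’s actual code.

```python
# Hypothetical sketch of the Robot Radio's tuning loop (not [Brek]'s firmware).
# read_tank_frequency(), read_dial_setting(), and step_motor() stand in for the
# PIC's frequency measurement, user-input, and stepper-driver routines.

DEADBAND_KHZ = 5  # assumed tolerance before the motor stops hunting

def tune_loop(read_tank_frequency, read_dial_setting, step_motor):
    while True:
        actual = read_tank_frequency()   # frequency the tank circuit is tuned to
        target = read_dial_setting()     # frequency the human asked for
        error = target - actual
        if abs(error) > DEADBAND_KHZ:
            # Crank the gears toward the target; direction follows the sign of the error.
            step_motor(direction=1 if error > 0 else -1)
```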

Another interesting feature is that as the tubes and other electronics warm up and drift in value, the frequency-matching loop keeps everything in line. [Brek] shows this in the video by deliberately sabotaging the gears and watching the robot adjust them back to where they belong.

As an afterthought, the Robot Radio was supplemented with a module that adds 100 kHz to the signal so that transmissions from a nearby airport can be received.

[Brek] styled the whole machine up with some copper framing and other bits, similar to his spectacular atomic clock build we featured last month.

See the video of the radio tuning after the break.

Continue reading “Robotically-Tuned Tube Radio”

Santa’s Autonomous Helping Hands Let The Jolly Ol’ Fellow Kick Back This Season

For those skeptical about the feasibility of Santa’s annual delivery schedule, here’s an autonomous piece of the puzzle that will bewilder even the most hard-hearted of non-believers.

The folks over at the Cluster of Excellence Cognitive Interaction Technology (CITEC) in Germany have whipped together a fantastic demo featuring Santa’s extra pair of helping hands. In the two-and-a-half minute video, the robot executes a suite of impressive autonomous stocking-stuffing maneuvers: from recognizing the opening of the stocking to grasping specific candies from the cluster of goodies available.

On the hardware side, the arms appear to be a KUKA variant, while on the software side, the visualizations are handled by RViz, the visualization tool from the open-source Robot Operating System (ROS).
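
For readers who haven’t played with RViz: it doesn’t generate graphics on its own, it simply renders whatever the ROS nodes publish. The snippet below is a generic ROS 1 / rospy example of that pattern (our own toy marker, not anything from the CITEC demo):

```python
#!/usr/bin/env python
# Generic example of feeding RViz a visualization marker; not from the CITEC demo.
import rospy
from visualization_msgs.msg import Marker

rospy.init_node("candy_marker_demo")
pub = rospy.Publisher("visualization_marker", Marker, queue_size=1)
rate = rospy.Rate(10)

while not rospy.is_shutdown():
    m = Marker()
    m.header.frame_id = "base_link"          # frame the marker is drawn in
    m.header.stamp = rospy.Time.now()
    m.ns, m.id = "demo", 0
    m.type, m.action = Marker.SPHERE, Marker.ADD
    m.pose.position.x = 0.5                  # half a meter in front of the robot
    m.pose.orientation.w = 1.0
    m.scale.x = m.scale.y = m.scale.z = 0.05
    m.color.r, m.color.a = 1.0, 1.0          # opaque red
    pub.publish(m)                           # RViz picks this up and draws it
    rate.sleep()
```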

If some of the props in the video look familiar, it’s because the researchers at CITEC have already explored some stellar related research topics in perception, classification, and grasping. Who knew this pair of hands would be so jolly as to clock some overtime this holiday season? The entire video is set to a crisp computer-voiced jingle that serves as a sneaky summary of their approach to the project.

Now, if only we could set these hands off to do our other dirty work….

Continue reading “Santa’s Autonomous Helping Hands Let The Jolly Ol’ Fellow Kick Back This Season”

Reverse Engineering A Robotic Arm

Not too many people will argue that robot arms aren’t cool. [Dan] thinks they are cool and purchased a LabVolt Armdroid robotic arm on eBay for a mere $150. Unfortunately, he did not get the power supply or the control unit. To most, this would be a serious hurdle to overcome, but not for [Dan]. He opened up the robot and started probing around the circuit board to figure out what was going on.

Since there was a DB9 connector on the outside of the robot arm, he assumed it was a standard RS-232 controlled device. Good thing he checked the internal circuitry, because this was not the case at all: there was no microcontroller or microprocessor to be found inside. [Dan] painstakingly reverse engineered the circuit board and documented his results. He found SN76537A chips that drive the six unipolar stepper motors and SN75HC259 latches that address each individual motor.

Now that he knew how the robot works, [Dan] had to figure out how to control it from his computer. He started by making a custom parallel port to DB9 cable to connect the computer to the arm. After several iterations of software, starting with simply moving a single joint, the latest version allows manual control of all of the joints from the computer keyboard. A big ‘Thanks’ goes out to [Dan] for all his work and documentation.
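
If you want to experiment with the same idea, the gist is just clocking bytes out of the parallel port so the addressable latches enable the right stepper phases. Here’s a rough Python sketch of that pattern using the pyparallel library; the bit layout and timing below are invented for illustration and won’t match [Dan]’s documented pinout.

```python
# Rough sketch of driving unipolar steppers through a PC parallel port.
# Illustrative only: the joint/phase bit assignments are assumptions,
# not [Dan]'s documented Armdroid protocol.
import time
import parallel  # pyparallel

HALF_STEP = [0b0001, 0b0011, 0b0010, 0b0110, 0b0100, 0b1100, 0b1000, 0b1001]

port = parallel.Parallel()          # opens /dev/parport0 on Linux by default

def step_joint(joint, steps, delay=0.005):
    """Send a half-step sequence for one joint; upper nibble selects the joint."""
    seq = HALF_STEP if steps > 0 else list(reversed(HALF_STEP))
    for i in range(abs(steps)):
        phase = seq[i % len(seq)]
        port.setData((joint << 4) | phase)   # joint address + coil phase pattern
        time.sleep(delay)

step_joint(joint=2, steps=200)      # e.g. nudge one joint 200 half-steps
```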

Gift Your Next Robot With The Brain Of A Roundworm

A group of developers called [OpenWorm] have mapped the 302 neurons of the Caenorhabditis elegans species of roundworm and created a virtual neural network that can be used to solve all the types of problems a worm might encounter. Which, when you think about it, aren’t much different from those a floor-crawling robot would be confronted with.

In a demo video released by one of the project’s founders, [Timothy Busbice], their network is used to control a small Lego rover equipped with a forward sonar sensor. The robot is able to sense a wall before hitting it and determine an appropriate response, which may be to stop, back up, or turn. This is all pretty fantastic when you compare these 302 neurons to any code you’ve ever written to accomplish the same task! It might be a much more complex route to the same outcome, but it’s uniquely organic… which makes watching the little Lego-bot fascinating; its stumbling around even looks more like thinking than executing.
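
To get a feel for how a connectome can “run,” here’s a heavily simplified Python sketch of the idea: each neuron sums weighted inputs and fires when it crosses a threshold, and a sensory neuron gets stimulated whenever the sonar sees something close. The network, weights, and threshold here are toys for illustration, not the OpenWorm code or the real C. elegans wiring.

```python
# Toy illustration of connectome-style control: weighted connections, threshold
# firing, and a sonar-driven sensory neuron. Not the OpenWorm code or real wiring.
CONNECTOME = {            # presynaptic -> {postsynaptic: weight}  (made-up values)
    "SONAR":  {"INTER1": 2},
    "INTER1": {"MOTOR_L": 3, "MOTOR_R": -1},
}
THRESHOLD = 4

def tick(activity, sonar_cm):
    """Propagate one time step of activity through the toy network."""
    nxt = {n: 0 for n in activity}
    if sonar_cm < 20:                       # something close ahead stimulates the sensor
        nxt["SONAR"] += 5
    for pre, level in activity.items():
        if level >= THRESHOLD:              # neuron fired: push charge downstream
            for post, w in CONNECTOME.get(pre, {}).items():
                nxt[post] = nxt.get(post, 0) + w
    return nxt

state = {"SONAR": 0, "INTER1": 0, "MOTOR_L": 0, "MOTOR_R": 0}
state = tick(state, sonar_cm=15)            # one step with an obstacle 15 cm away
```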

I feel obligated to bring up the implications of this project. Since we’re all thinking about it now, let’s all imagine the human brain similarly mapped and able to simulate complex thought processes. If we can pull this off one day, not only will we learn a lot more about how our squishy grey hard drives process information, artificial intelligence will also improve by leaps and bounds. An effort to do just that, called the connectome project, is already in the works; however, since there are a few more connections to map than in the C. elegans brain, it’s a feat that remains very much a work in progress.

The project is called “open”worm, which of course means you can download the code from their website and potentially dabble in neuro-robotics yourself. If you do, we want to hear about your wormy brain bot.

Continue reading “Gift Your Next Robot With The Brain Of A Roundworm”

Robot Vision: Detecting Obstacles With FPGAs And Line Lasers

Somewhere down the road, you’ll find that your almighty autonomous robot chassis is going to need some sensor feedback. Otherwise, that next small step down the road may end with a blind leap off the coffee table. The first low-cost sensors we might throw at this problem would be sonars or IR rangefinders, but there’s a problem: those sensors only really provide distance data back from the pinpoint view directly ahead of them.

Rest assured, [Jonathan] wrote in to let us know that he’s got you covered. Combining a line laser, camera, and an FPGA, he’s able to detect obstacles that fall within the field of view of the camera and laser.

If you thought writing algorithms in software was tricky, wait till you try hardware! (We know: division sucks!) [Jonathan] knows no fear, though; he’s performing gradient computation directly on the FPGA to detect the laser in the camera image at a wicked 30 frames per second. Why roll up your sleeves and take the hardware route, you might ask? [Jonathan] estimates that a CPU-based approach at this tiny embedded-robot scale would manage a mere 10 frames per second. With an FPGA, we’re able to process images about as fast as they’re received.
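
The core trick, finding the sharp intensity gradient the laser line makes in each image column and turning its row position into a distance, is easy to prototype on a PC before committing it to logic. Here’s a hedged numpy sketch of that column-wise peak search; the triangulation constants are placeholders, and the FPGA version obviously works very differently, streaming pixels in fixed-point rather than holding whole frames.

```python
# PC-side prototype of laser-line extraction: for each column, find the row with
# the strongest vertical intensity gradient. Illustrative only; constants are
# made up, and the FPGA streams pixels instead of using whole-frame arrays.
import numpy as np

def laser_rows(frame):
    """frame: 2-D grayscale array. Returns, per column, the row of the laser line."""
    grad = np.abs(np.diff(frame.astype(np.int16), axis=0))  # vertical gradient
    return grad.argmax(axis=0)                               # strongest edge per column

def rows_to_range(rows, focal_px=500.0, baseline_m=0.05, center_row=240):
    """Toy triangulation: nearer obstacles push the laser line farther from center."""
    offset = np.maximum(np.abs(rows - center_row), 1)        # avoid divide-by-zero
    return focal_px * baseline_m / offset                    # distance per column (m)

frame = np.random.randint(0, 255, (480, 640), dtype=np.uint8)  # stand-in image
print(rows_to_range(laser_rows(frame))[:5])
```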

[Jonathan] is using the Logi Board, a Kickstarter success we’ve visited in the past, and all of his code is up on the Githubs. If you crack it open, you’ll also find that many of his modules are Wishbone compliant, so developing your own projects with just some of these parts is much easier than trying to rip useful features out of a sea of hairy logic.

With computer-vision hardware keeping such a low profile in the hobbyist community, we’re excited to hear more about [Jonathan’s] FPGA-based robotics endeavors.

Continue reading “Robot Vision: Detecting Obstacles With FPGAs And Line Lasers”

Pico-Kubik Quadruped Fits In The Palm Of Your Hand

Most of the legged robots we see here are of the hexapod variety, and with good reason. Hexapods are very stable and can easily move even if one or more of the legs has been disabled. [Radomir] has taken this a step farther and has become somewhat of an expert on the more technically difficult quadruped robot, building smaller and smaller ones each time. He has been hard at work on his latest four-legged creation called the Pico-Kubik, and this one will fit in the palm of your hand.

The Pico-Kubik runs MicroPython on a VoCore board, which gives it a small software footprint to complement its small hardware footprint. It accomplishes the latter primarily through the use of HK-282A Ultra-Micro servos, an Arduino Pro Mini, and a tiny lithium-ion battery. It’s still a work in progress, but the robot can already crawl across the tabletop.
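
On a robot this size, a creep gait boils down to streaming servo angles on a schedule. Here’s a minimal MicroPython-flavored sketch of that idea; the set_servo() helper, channel numbers, and angle values are placeholders we made up, not the actual Pico-Kubik firmware.

```python
# Minimal sketch of a quadruped creep-gait loop in MicroPython style.
# set_servo(), the channel numbers, and the angle values are placeholders,
# not the actual Pico-Kubik firmware.
import time

LEGS = {                        # leg name -> (hip channel, knee channel)
    "front_left":  (0, 1),
    "front_right": (2, 3),
    "rear_left":   (4, 5),
    "rear_right":  (6, 7),
}

def set_servo(channel, degrees):
    """Placeholder for whatever actually drives the micro servos."""
    pass

def step(leg, swing=20):
    hip, knee = LEGS[leg]
    set_servo(knee, 60)           # lift the foot
    set_servo(hip, 90 + swing)    # swing it forward
    set_servo(knee, 90)           # put it back down
    time.sleep(0.15)

# Creep gait: move one leg at a time so the other three keep the body stable.
for leg in ("front_left", "rear_right", "front_right", "rear_left"):
    step(leg)
```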

This isn’t [Radomir]’s first time at the tiny quadruped rodeo, either. He has already built the Nano-Kubik and the µKubik, both of which followed the first (aptly-named) Kubik quadruped. Based on the use of SI prefixes, we can only assume the next one will be the hella-Kubik!

The Halfbug

[Alex] posted up build details of his robot, the Halfbug, on Tinkerlog. We’ve been big fans of his work ever since his Synchronizing Fireflies Instructable way back in the day. [Alex’s] work usually combines an unconventional idea with minimalistic design and precise execution, and the Halfbug is no exception.

You’ll have to watch the video (embedded below the break) to fully appreciate the way it moves. The two big front legs alternate with the small front pads to make an always-stable tripod with the caster wheel at the back. It lifts itself up, moves a bit forward, and then rests itself down on the pads again while the legs get in position for the next step. It’s not going to win any speed tournaments, but it’s a great-looking gait.
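
In rough pseudo-code terms, the walking cycle is just a repeating four-phase sequence. The sketch below is a hedged illustration of that cycle; the phase helpers and timing are invented for clarity, not [Alex]’s firmware.

```python
# Hedged sketch of the Halfbug's walking cycle as a state sequence; the phase
# helpers and timing are invented for illustration, not [Alex]'s firmware.
import time

def lift_on_legs():    pass  # big front legs push down, raising the body off the pads
def advance_body():    pass  # body (and rear caster) moves forward
def lower_onto_pads(): pass  # body settles back onto the small front pads
def reposition_legs(): pass  # unloaded legs swing forward for the next step

GAIT_CYCLE = [lift_on_legs, advance_body, lower_onto_pads, reposition_legs]

def walk(steps, dwell=0.3):
    for _ in range(steps):
        for phase in GAIT_CYCLE:
            phase()
            time.sleep(dwell)   # always resting on a stable tripod between phases

walk(steps=3)
```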

The head unit also has two degrees of freedom, allowing it to scan around with its ultrasonic rangefinder and adding a bit more personality to the whole affair.

[Alex] mentions that he’d recently gotten a lathe and then a CNC mill. So it’s no surprise that he made all the parts from scratch just to give the machines a workout. We think he did a great job with the overall aesthetics, and in particular the battery pack.

We’re excited to see how [Alex] adds new behaviors as he develops the firmware. No pressure!

Continue reading “The Halfbug”