Twisted String Actuators

[Travis] tells us about a neat actuator concept that’s as old as dirt. It’s capable of lifting 7kg when powered by a pager motor, and the only real component is a piece of string.

The concept behind the twisted string actuator, as it’s known to academia, is as simple as putting a motor on one end of a piece of string, tying the other end off to a load, and putting a few twists in the string. It’s an amazingly simple concept that has been known and used for thousands of years: ballistas and bow-string fire starters use the same theory.

Although the concept of a twisted string actuator is intuitively known by anyone over the age of six, there aren’t many studies, and even fewer projects, that use this extremely high-gear-ratio, low-power, and very cheap form of linear motion. A study from 2012 (PDF) put some empirical data behind this simple device. The takeaway is that tension on the string doesn’t matter, and that more strands or larger-diameter strands let the actuator contract in fewer turns; fewer or thinner strands take more turns to shrink to the same length.
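For a rough feel of why thicker strands shorten in fewer turns, the usual back-of-the-envelope model treats the twisted bundle as a helix: a string of untwisted length L and effective radius r, twisted n turns, contracts to roughly sqrt(L² − (2πnr)²). Here’s a minimal Python sketch of that relationship; the numbers are purely illustrative and not taken from the linked study:

```python
import math

def contracted_length(untwisted_len_m, radius_m, turns):
    """Helix model of a twisted string actuator: the bundle shortens
    as turns wrap string length around the twist axis."""
    wrapped = 2 * math.pi * turns * radius_m      # length "used up" by the helix
    if wrapped >= untwisted_len_m:
        raise ValueError("string fully wound up")
    return math.sqrt(untwisted_len_m**2 - wrapped**2)

# Example: 20 cm of string, 0.5 mm vs 1.0 mm effective radius, 30 turns
for r in (0.0005, 0.001):
    length = contracted_length(0.20, r, turns=30)
    print(f"radius {r*1000:.1f} mm -> {length*100:.1f} cm after 30 turns")
```

Running it shows the thicker bundle contracting far more for the same 30 turns, which matches the study’s observation.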

As for useful applications of these twisted string actuators, there are a few projects that have used these systems in anthropomorphic hands and elbows. No surprise there, really; strings don’t take up much space, and they work just like muscles and tendons do in the human body.

Thanks [ar0cketman] for the link.

Jamming Robot Will Destroy You At Beer Pong

Wandering the aisles of Eureka Park, the startup area of the Consumer Electronics Show, I spotted a mob of people and sauntered over to see what the excitement was all about. Peeking over this gentleman’s shoulder I realized he was getting spanked at Beer Pong… by a robot!

Those in the know will recognize that the bot has only 3 cups left, so the guy was definitely giving it a run for its money. But the bot’s ability to swish the ball on nearly every throw accounts for the scoreboard, which read Robot: 116, Humans: 11. Unlike the ping pong robot hoax from last March, we can vouch for this one being real!

If you’re trying to attract the geek demographic, this must be one of the best offerings ever shown at a trade show. Empire Robotics manufactures the VERSABALL gripper. We know this as a jamming gripper and have been following the technology’s progress for years now. Looking back to this Cornell research video from 2010, we realize it is based on the white paper that [John Amend, PhD] co-authored. He’s now CTO and Co-Founder of the company and was one of the people running the booth. We love it when trade show booths are staffed by the engineers!

Join me after the break for a rundown of how the system works along with a video clip of it hitting the target.

Continue reading “Jamming Robot Will Destroy You At Beer Pong”

Robotically-Tuned Tube Radio

Dubbed the “Robot Radio” by [Brek], this clinking-&-clunking project merges three generations of hackers’ favorite technologies: robots, vacuum tubes, and microcontrollers. After the human inputs the desired radio frequency, the machine chisels its way through the spectrum, trying its best to stay on target.

This build began its life as a junky old tube radio that [Brek] pulled out of a shed. The case was restored and then the hacking began. Inserted between the human and the radio, a PIC 16F628A keeps watch in both directions. On one side, the radio’s tank circuit is monitored to see what frequency the radio is currently tuned to. On the other, the human’s input sets the desired frequency. If the two do not match, the PIC tells a stepper motor to begin cranking a pair of gears until they do.

Another interesting feature is that as the tubes and other electronics warm up and change their values, the matching circuit will keep them in line. [Brek] shows this in the video by deliberately sabotaging the gears and seeing the robot adjust them back where they belong.
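In control-loop terms the PIC is doing nothing more exotic than a simple servo on measured frequency, which is also why warm-up drift gets corrected for free. Here’s a rough Python sketch of that logic with a simulated radio; the real firmware runs on the 16F628A, and the step sizes and deadband here are made up for illustration:

```python
import random

DEADBAND_HZ = 2_000   # "close enough" window (illustrative value)
STEP_HZ = 1_000       # rough frequency change per motor step (illustrative value)

tank_hz = 1_000_000   # simulated tank-circuit frequency

def read_tuned_frequency():
    """Stand-in for measuring the tank circuit; adds a little warm-up drift."""
    global tank_hz
    tank_hz += random.randint(-200, 200)
    return tank_hz

def step_motor(direction):
    """Stand-in for pulsing the stepper that cranks the tuning gears."""
    global tank_hz
    tank_hz += direction * STEP_HZ

def tune_to(target_hz, iterations=500):
    for _ in range(iterations):
        error = target_hz - read_tuned_frequency()
        if abs(error) > DEADBAND_HZ:            # outside the window: crank toward target
            step_motor(+1 if error > 0 else -1)
    return tank_hz

print(tune_to(1_040_000))   # settles near 1040 kHz despite the simulated drift
```

Because the loop never stops watching the measured frequency, deliberately knocking the gears out of place (or letting the tubes drift) just produces a new error for it to chase back down.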

As an afterthought, the Robot Radio was supplemented with a module that adds 100 kHz to the signal so that the information from a nearby airport can be received.

[Brek] styled the whole machine up with some copper framing and other bits, similar to his spectacular atomic clock build we featured last month.

See the video of the radio tuning after the break.

Continue reading “Robotically-Tuned Tube Radio”

Santa’s Autonomous Helping Hands Let The Jolly Ol’ Fellow Kick Back This Season

For those skeptical about the feasibility of Santa’s annual delivery schedule, here’s an autonomous piece of the puzzle that will bewilder even the most hard-hearted of non-believers.

The folks over at the Center of Excellence Cognitive Interaction Technology (CITEC) in Germany have whipped together a fantastic demo featuring Santa’s extra pair of helping hands. In the two-and-a-half minute video, the robot executes a suite of impressive autonomous stocking-stuffing maneuvers: from recognizing the open hole in the stocking, to grasping specific candies from the cluster of goodies available.

On the hardware side, the arms appear to be a KUKA variant, while on the software side, the visualizations are handled by RViz, the visualization tool of the open-source Robot Operating System (ROS).
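RViz itself just renders whatever marker and state messages the nodes publish. As a taste of how little it takes, here’s a minimal rospy sketch that puts a sphere marker into RViz; this is generic ROS 1 boilerplate, not the CITEC team’s code:

```python
#!/usr/bin/env python
import rospy
from visualization_msgs.msg import Marker

rospy.init_node("candy_marker_demo")
pub = rospy.Publisher("visualization_marker", Marker, queue_size=1)
rate = rospy.Rate(1)

while not rospy.is_shutdown():
    m = Marker()
    m.header.frame_id = "base_link"          # frame the marker is drawn in
    m.header.stamp = rospy.Time.now()
    m.ns, m.id = "demo", 0
    m.type, m.action = Marker.SPHERE, Marker.ADD
    m.pose.position.x = 0.5                  # half a metre in front of the robot
    m.pose.orientation.w = 1.0
    m.scale.x = m.scale.y = m.scale.z = 0.05
    m.color.r, m.color.a = 1.0, 1.0          # opaque red blob
    pub.publish(m)
    rate.sleep()
```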

If some of the props in the video look familiar, it’s because the researchers at CITEC have already explored some stellar related research topics in perception, classification, and grasping. Who knew this pair of hands would be jolly enough to clock some overtime this holiday season? The entire video is set to a crisp computer-voiced jingle that serves as a sneaky summary of their approach to the project.

Now, if only we could set these hands off to do our other dirty work….

Continue reading “Santa’s Autonomous Helping Hands Let The Jolly Ol’ Fellow Kick Back This Season”


Reverse Engineering A Robotic Arm

Not too many people will argue that robot arms aren’t cool. [Dan] thinks they are cool and purchased a LabVolt Armdroid robotic arm on eBay for a mere $150. Unfortunately, it did not come with the power supply or the control unit. To most, this would be a serious hurdle to overcome, but not for [Dan]. He opened up the robot and started probing around the circuit board to figure out what was going on.

Since there was a DB9 connector on the outside of the robot arm, he assumed it was a standard RS-232 controlled device. Good thing he checked the internal circuitry, because this was not the case at all: there was no microcontroller or microprocessor to be found inside. [Dan] painstakingly reverse engineered the circuit board and documented his results. He found SN76537A chips driving the six unipolar stepper motors and SN75HC259 latches addressing each individual motor.

Now that he knew how the robot works, [Dan] had to figure out how to control it from his computer. He started by making a custom parallel-port-to-DB9 cable to connect the computer to the arm. After a series of programs, starting with one that simply moved a single joint, the latest iteration allows manual control of all the joints from the computer keyboard. A big ‘Thanks’ goes out to [Dan] for all his work and documentation.
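The gist of driving hardware like this over a parallel port is plain bit-banging: put a motor address and a coil pattern on the data lines, latch it, wait, repeat. Here’s a hedged Python sketch of that idea using the pyparallel library; the pin layout, nibble assignments, and step table are made up for illustration and are not [Dan]’s actual mapping:

```python
import time
import parallel   # pyparallel; needs a real LPT port and suitable permissions

# Hypothetical layout: low nibble = unipolar half-step pattern, high nibble = motor address
HALF_STEPS = [0x1, 0x3, 0x2, 0x6, 0x4, 0xC, 0x8, 0x9]

port = parallel.Parallel()   # opens the first parallel port

def step_motor(motor, steps, delay=0.005):
    """Clock a half-step sequence out to one motor selected by the latch address."""
    for i in range(steps):
        pattern = HALF_STEPS[i % len(HALF_STEPS)]
        port.setData((motor << 4) | pattern)   # address in high nibble, coils in low nibble
        time.sleep(delay)

step_motor(motor=2, steps=200)   # e.g. nudge one joint a couple hundred half-steps
```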


Gift Your Next Robot With The Brain Of A Roundworm

A group of developers called [OpenWorm] have mapped the 302 neurons of the roundworm Caenorhabditis elegans and created a virtual neural network that can be used to solve all the types of problems a worm might encounter. Which, when you think about it, aren’t much different from those a floor-crawling robot would be confronted with.


In a demo video released by one of the project’s founders, [Timothy Busbice], the network is used to control a small Lego rover equipped with a forward-facing sonar sensor. The robot is able to sense a wall before it hits it and determine an appropriate response, which may be to stop, back up, or turn. This is all pretty fantastic when you compare this network of 302 neurons to any code you’ve ever written to accomplish the same task! It might be a much more complex route to the same outcome, but it’s uniquely organic… which makes watching the little Lego-bot fascinating; its stumbling around even looks more like thinking than executing.
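At its core the demo is a sensor-to-neuron-to-motor loop: sonar readings excite the sensory neurons, activation propagates through the fixed connectome weights, and whatever arrives at the motor neurons gets mapped onto the wheels. Here’s a toy Python sketch of that idea with a handful of made-up neurons and weights, nothing like the real 302-neuron wiring:

```python
import numpy as np

# Toy "connectome": 4 neurons (0 = sonar sensory, 1-2 = interneurons, 3 = motor)
WEIGHTS = np.array([
    [0.0, 0.8, 0.6, 0.0],    # sensory neuron drives the interneurons
    [0.0, 0.0, 0.0, 0.9],    # interneurons drive the motor neuron
    [0.0, 0.0, 0.0, -0.7],
    [0.0, 0.0, 0.0, 0.0],
])
LEAK = 0.5

def tick(activation, sonar_distance_m):
    """One update: inject sensor drive, propagate along connections, apply leak."""
    stimulus = np.zeros(4)
    stimulus[0] = max(0.0, 1.0 - sonar_distance_m)    # closer wall -> stronger drive
    activation = LEAK * activation + WEIGHTS.T @ activation + stimulus
    return np.clip(activation, 0.0, 1.0)

act = np.zeros(4)
for dist in (1.5, 1.0, 0.5, 0.2, 0.1):                # robot approaching a wall
    act = tick(act, dist)
    print(f"distance {dist:.1f} m -> motor neuron {act[3]:.2f}")
```

Even in this toy, the motor neuron only lights up a few ticks after the wall appears on the sonar, which is part of why the real thing looks so organic.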

I feel obligated to bring up the implications of this project. Since we’re all thinking about it now, let’s all imagine the human brain similarly mapped and able to simulate complex thought processes. If we can pull this off one day, not only will we learn a lot more about how our squishy grey hard drives process information, but artificial intelligence will also improve by leaps and bounds. An effort to do this is already underway, called the connectome project; however, since there are a few more connections to map than in C. elegans’ brain, it’s a feat that will take a while yet.

The project is called “open”worm, which of course means you can download the code from their website and potentially dabble in neuro-robotics yourself. If you do, we want to hear about your wormy brain bot.

Continue reading “Gift Your Next Robot With The Brain Of A Roundworm”

Robot Vision: Detecting Obstacles With FPGAs And Line Lasers

Somewhere down the road, you’ll find that your almighty autonomous robot chassis is going to need some sensor feedback. Otherwise, that next small step down the road may end with a blind leap off the coffee table. The first low-cost sensors we might throw at this problem would be sonars or IR rangefinders, but there’s a problem: those sensors only really provide distance data back from the pinpoint view directly ahead of them.

Rest assured, [Jonathan] wrote in to let us know that he’s got you covered. Combining a line laser, camera, and an FPGA, he’s able to detect obstacles that fall within the field of view of the camera and laser.

If you thought writing algorithms in software was tricky, wait till you try hardware! (We know: division sucks!) [Jonathan] knows no fear, though; he’s performing gradient computation directly on the FPGA to detect the laser in the camera image at a wicked 30 frames per second. Why roll up your sleeves and take the hardware route, you might ask? With a CPU-based approach at this tiny embedded-robot scale, [Jonathan] estimates a mere 10 frames per second. With an FPGA, the images are processed about as fast as they’re received.
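The vision step itself is simple enough to mock up on a PC before worrying about HDL: in each image column, find the row where the brightness changes most sharply and call that the laser line; wherever that row strays from its flat-floor baseline, something is in the way. Here’s a hedged NumPy sketch of that gradient-based detection; it’s the general idea, not [Jonathan]’s actual FPGA pipeline:

```python
import numpy as np

def laser_rows(gray_frame):
    """For each column, return the row with the strongest vertical brightness gradient."""
    grad = np.abs(np.diff(gray_frame.astype(np.int16), axis=0))   # vertical gradient
    return np.argmax(grad, axis=0)                                # one row per column

def obstacle_columns(gray_frame, baseline_rows, tolerance_px=5):
    """Columns where the laser line strays from its flat-floor baseline are obstacles."""
    rows = laser_rows(gray_frame)
    return np.flatnonzero(np.abs(rows - baseline_rows) > tolerance_px)

# Tiny synthetic example: 40x64 frame, laser on the floor at row 30, bumped to row 20
frame = np.zeros((40, 64), dtype=np.uint8)
frame[30, :] = 255
frame[20, 24:40] = 255
frame[30, 24:40] = 0
baseline = np.full(64, 30)
print(obstacle_columns(frame, baseline))    # columns 24..39 get flagged
```

The FPGA version computes essentially the same per-column gradient, just as pixels stream in rather than on a stored frame.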

[Jonathan] is using the Logi Board, a Kickstarter success we’ve visited in the past, and all of his code is up on the Githubs. If you crack it open, you’ll also find that many of his modules are Wishbone compliant, so developing your own projects with just some of these parts is much easier than trying to rip useful features out of a sea of hairy logic.

With computer-vision hardware keeping such a low profile in the hobbyist community, we’re excited to hear more about [Jonathan’s] FPGA-based robotics endeavors.

Continue reading “Robot Vision: Detecting Obstacles With FPGAs And Line Lasers”