When I read old books, I like to look for predictions of the future. Since we are living in that future, it is fun to see how they did. Case in point: I have a copy of “The New Wonder Book of Knowledge”, an anthology from 1941. This was the kind of book you wanted before there was a Wikipedia to read in your spare time. There are articles about how coal is mined, how phonographs work, and the inner workings of a beehive. Not the kind of book you’d grab to look up something specific, but a great book to read if you just want to learn something interesting. In it there are a few articles about technology that seemed ready to take us to the future. One of those is the Televox — a robot from Westinghouse poised to usher in an age of home and industrial mechanical servants. Robots in 1941? Actually, Televox came into being in 1927.
If you were writing about the future in 2001, you might have pictured city sidewalks congested with commuters riding Segways. After all, in 2001, we were told that something was about to hit the market that would “change everything.” It had a known inventor, Dean Kamen, and a significant venture capitalist behind it. While it has found a few niche markets, it isn’t the billion dollar personal transportation juggernaut that was predicted.
But technology is like that. Sometimes things seem poised for greatness and disappear — bubble memory comes to mind. Sometimes things have a few years of success and get replaced by something better. Fax machines or floppy drives, for example. The Televox was a glimpse of what was to come, but not in any way that people imagined in 1941.
Sensors are critical in robotics. A robot relies on its sensor package to perform its programmed duties. If sensors are damaged or non-functional, the robot can perform unpredictably, or even fail entirely. [Dheera Venkatraman] has been working to make debugging sensor issues easier with the rosshow package for Robot Operating System.
Normally, if you want to be certain a camera feed is working on a robot, you’d have to connect a monitor and other peripherals, check manually, then put everything away again when you’re finished. [Dheera] considered this altogether too much of a pain for basic sensor checks.
Instead, rosshow uses the power of SSH to speed things along. Log in to the robot, fire off a few command line instructions, and rosshow will start displaying sensor data in the terminal on your remote machine. It’s achieved through the use of Unicode Braille art in the terminal. Sure, you won’t get a full-resolution feed from your high-definition camera, and the display from the laser scanner isn’t exactly perfect. But it’s enough to provide an instant verification that sensors are connected and working, and will speed up those routine is-it-connected checks by an order of magnitude.
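The Braille trick is worth a closer look. Each Braille character encodes a 2×4 grid of dots, so a terminal can act as a coarse bitmap display with eight "pixels" per character cell. The sketch below is an illustrative re-implementation of that general idea, not rosshow's actual rendering code:

```python
# Sketch of how a terminal tool can draw images with Unicode Braille
# characters: each character cell packs a 2x4 grid of on/off "pixels".
BRAILLE_BASE = 0x2800  # U+2800 is the blank Braille pattern

# Bit value for each (row, col) dot inside one 2x4 Braille cell,
# per the Unicode Braille Patterns block layout
DOT_BITS = {
    (0, 0): 0x01, (1, 0): 0x02, (2, 0): 0x04, (3, 0): 0x40,
    (0, 1): 0x08, (1, 1): 0x10, (2, 1): 0x20, (3, 1): 0x80,
}

def braille_render(pixels):
    """Render a 2D list of booleans as lines of Braille characters."""
    rows = len(pixels)
    cols = len(pixels[0]) if rows else 0
    lines = []
    for top in range(0, rows, 4):          # 4 pixel rows per text line
        line = []
        for left in range(0, cols, 2):     # 2 pixel columns per character
            bits = 0
            for dr in range(4):
                for dc in range(2):
                    r, c = top + dr, left + dc
                    if r < rows and c < cols and pixels[r][c]:
                        bits |= DOT_BITS[(dr, dc)]
            line.append(chr(BRAILLE_BASE + bits))
        lines.append("".join(line))
    return lines
```

With this mapping, an all-on 2×4 block becomes the fully-filled character ⣿, which is how a camera frame or laser scan can be roughly sketched over a plain SSH session.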
Every few months or so, a new video from Boston Dynamics will make the rounds on the Internet. This is their advertising, because unless the military starts buying mechanical mules, Boston Dynamics is going to be out of business pretty soon. You’ll see robots being kicked down the stairs, robots walking through doors, and robots acting like dogs. If a hundred or so highly skilled and highly educated roboticists, technologists, and other experts can put together a walking dog robot in a decade, obviously one person can cut through the cruft and build one in a basement. That’s what [Misha] is doing. It’s the Dizzy Wolf, a robotic wolf, or dog, or cat, we don’t actually know because there’s no fur (or head) yet. But it is interesting.
The key component for any quadruped robot is a high-torque, low-noise servo motor. This isn’t a regular ol’ brushless motor, and for this application nine gram servos go in the trash. This means custom-made motors, or DizzyMotors. You’re looking at a big brushless motor with a planetary gearset, all squished into something that could actually fit into the joint of a robotic wolf’s leg.
There’s a driver for these motors, strangely not called the DizzyDriver, that turns a BLDC into a direct drive servo motor. It is effectively a smart servo that will move to a specific rotation, receive commands over RS-485, and report back the angular position. It also applies constant torque. Of course, there is a video of the DizzyMotor and servo driver below.
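A smart servo on an RS-485 bus typically speaks a small framed protocol: an ID so multiple joints can share the bus, a command byte, a payload, and a checksum. The DizzyMotor's actual wire format isn't documented here, so the header byte, field layout, and checksum below are purely hypothetical, just to show the shape of such a protocol:

```python
# Purely illustrative sketch of a smart-servo command frame over RS-485.
# The 0xAA header, field layout, and additive checksum are assumptions,
# not the DizzyMotor's real protocol.
import struct

def make_position_frame(servo_id, angle_deg):
    """Build a 'move to angle' frame: header, id, command, float32 angle, checksum."""
    payload = struct.pack("<BBBf", 0xAA, servo_id, 0x01, angle_deg)
    checksum = sum(payload) & 0xFF  # simple additive checksum, assumed
    return payload + bytes([checksum])

def parse_position_frame(frame):
    """Inverse of make_position_frame; raises on a bad checksum."""
    payload, checksum = frame[:-1], frame[-1]
    if sum(payload) & 0xFF != checksum:
        raise ValueError("checksum mismatch")
    _header, servo_id, _command, angle = struct.unpack("<BBBf", payload)
    return servo_id, angle
```

The same framing works in the other direction for the servo reporting its measured angle back to the controller, which is what makes closed-loop leg coordination possible over a single two-wire bus.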
Building a robotic dog that will walk around the house is one of the hardest engineering challenges out there. You’ve got fairly crazy kinematics, you’ll need to think about the strength of the frame, control systems, and eventually how to fit everything in a compact design. This project is hitting all the marks, and we can’t wait to see the Dizzy Wolf do a backflip or chase a ball.
While robotic arms can handle a wide variety of tasks, the specific job at hand will have a major influence on the type of end effector used. For sorting ferromagnetic parts an electromagnet might be enough, while for more accurate location a mechanical gripper could be employed. If you’re working with particularly delicate objects or in concert with human beings, it may be desired to have a force controlled gripper to avoid damage. [James Bruton] has been whipping up a design of his own for just this purpose.
The basic gripper is 3D printed, with three fingers consisting of two joints each. Retraction of each finger is courtesy of bungee cord, while extension is via a servo attached to the finger through a spring. The position of each finger is measured with a resistive flex sensor. An Arduino Uno is employed to run the servos and read the attached sensors.
As force is applied by the servo, the spring begins to stretch. This leads to a greater difference between the servo position and the finger position as the applied force increases. By calculating this difference, it’s possible to determine the force applied by the fingers. This can then be used to limit the applied force of the gripper, to avoid breaking delicate objects or crushing soft, fleshy humans.
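This is essentially a series-elastic actuator: force follows from spring deflection via Hooke's law. A minimal sketch of the idea looks like the following, where the spring constant and the angle-to-displacement factor are made-up values rather than anything measured from [James]'s build:

```python
# Hypothetical series-elastic force estimation: the servo pulls the finger
# through a spring, so force is proportional to the difference between the
# commanded servo angle and the measured finger angle (Hooke's law).
SPRING_CONSTANT = 120.0   # N/m, assumed value
RADIANS_TO_METERS = 0.02  # effective lever arm in metres, assumed value

def estimated_force(servo_angle, finger_angle):
    """Return the approximate fingertip force in newtons."""
    deflection = (servo_angle - finger_angle) * RADIANS_TO_METERS
    return SPRING_CONSTANT * deflection

def clamp_servo(servo_angle, finger_angle, max_force):
    """Back the servo off so the estimated force never exceeds max_force."""
    if estimated_force(servo_angle, finger_angle) > max_force:
        # Solve Hooke's law for the servo angle that gives exactly max_force
        servo_angle = finger_angle + max_force / (SPRING_CONSTANT * RADIANS_TO_METERS)
    return servo_angle
```

Running something like `clamp_servo` in the control loop is what lets the gripper close firmly on a rigid object yet stop short of crushing a soft one.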
[James] notes that there are some drawbacks to the current design. The force required to move the fingers is inconsistent along their travel, and this interferes somewhat with accurate measurement. Overall though it’s a solid proof of concept and a good base for further revisions. Files are on Github for those who wish to tinker at home.
Social media has become pervasive in modern life. It can be impossible to get so much as an invite to a party without offering up your personal data at the altar of the various tech companies. [David] wanted to avoid the pressures of seeing countless photos of people climbing mountains and eating tacos, but also didn’t want to ostracize himself by avoiding social media altogether. Naturally, automation was the answer.
[David] aptly named his robot Telephone Operator, and that’s precisely what it does. Stepper motors and a servo allow the robot’s capacitive appendage to interact with the touch screen on [David]’s iPhone. A camera is fitted, and combined with OpenCV, the robot is capable of a great many important tasks.
Liking Instagram posts? Done. Reposting inane tweets? Easy. Asking your pal Mike what’s up? Yep, Telephone Operator has it covered. Given the low quality of human interaction on such platforms, it’s entirely possible [David] has the Turing Test beat without even trying. The robot even has that lazy continuous Sunday morning scroll down pat. It’s spooky stuff.
Robotic arms are fascinating devices, capable of immense speed and precision when carrying out their tasks. They’re also capable of carrying great loads, and a full-sized industrial robot in operation at maximum pace is a sight to behold. However, while it’s simple to design grippers to move strong metal objects, picking up delicate or soft objects can be much harder. A team at MIT CSAIL has been working on a solution to this problem, which they call the Origami gripper.
The gripper consists of a flexible, folding skeleton surrounded by an airtight skin. When vacuum is applied, the skeleton contracts around the object to be picked up. The gripper is capable of grasping objects sized up to 70% of its diameter, and over 100 times its weight.
Fabrication of the device involved the creation of 3D printed molds to produce the silicone rubber skeleton. Combined with precise laser cutting and advanced layering techniques, this created a part that folds itself into shape under the right conditions. The structure was inspired by a “magic ball” origami design. The outer skin is remarkably simple in comparison – consisting of a regular latex balloon.
When reading textual communications, it can be difficult to accurately ascertain emotional intent. Individual humans can be better or worse at this, with sometimes hilarious results when it goes wrong. Regardless, there’s nothing a human can do that a machine won’t eventually do better. For just this purpose, Tweetbot is here to emotionally react to Twitter so you don’t have to.
The ‘bot receives tweets over a Bluetooth link, handled by a PIC32, which also displays them on a small TFT screen. The PIC then analyses the tweet for emotional content before sending the result to a second PIC32, which displays emotes on a second TFT screen, creating the robot’s face. Varying LEDs are also flashed depending on the emotion detected – green for positive emotions, yellow for sadness, and red for anger.
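The sentiment-to-LED mapping can be sketched with a simple keyword-scoring approach. To be clear, the word lists and scoring below are placeholders for illustration; the project's actual classifier on the PIC32 may work quite differently:

```python
# Minimal sketch of the sentiment-to-LED mapping described above.
# The keyword sets and bag-of-words scoring are assumptions, not the
# project's real emotion model.
POSITIVE = {"great", "love", "happy", "awesome"}
SAD = {"sad", "miss", "lonely", "sorry"}
ANGRY = {"angry", "hate", "outrage", "furious"}

def led_for_tweet(text):
    """Return which LED colour to flash for a tweet's dominant emotion."""
    words = set(text.lower().split())
    scores = {
        "green": len(words & POSITIVE),   # positive emotions
        "yellow": len(words & SAD),       # sadness
        "red": len(words & ANGRY),        # anger
    }
    colour, score = max(scores.items(), key=lambda kv: kv[1])
    return colour if score > 0 else None  # no LED for a neutral tweet
```

Even a crude classifier like this is enough to drive a three-colour LED display, which is all the hardware described above needs.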
The final bot is capable of demonstrating 8 unique emotional states, far exceeding the typical Facebook commenter who can only express unbridled outrage. With the ‘bot packing displays, multiple microcontrollers, and even motor drives, we imagine the team learned a great deal in the development of the project.