[T-Kuhn]’s Octo-Bouncer platform has learned some new tricks since we saw it last. If you haven’t seen it before, this device uses computer vision from a camera mounted underneath its thick, clear acrylic platform to track a ball in 3D space, and makes the minute adjustments needed to control the ball’s movements with a robotic platform in real time.
We loved the Octo-Bouncer’s mesmerizing action when we saw it last, and it’s only gotten better. Not only is there a whole new custom ball detection algorithm that [T-Kuhn] explains in detail, there are also now visualizations of both the ball’s position and the plate movements. There’s still one small mystery, however. Every now and again, [T-Kuhn] says that the ball will bounce in an unexpected direction. It doesn’t seem to be a bug related to the platform itself, but [T-Kuhn] has a suspicion. Since contact between the ball and platform is where all the control comes from, and the ball and platform touch only very little during a bounce, it’s possible that bits of dust (or perhaps even tiny imperfections on the ball’s surface itself) might be to blame. Regardless, it doesn’t detract from the device’s mesmerizing performance.
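The write-up covers the details of [T-Kuhn]’s custom detector; as a rough illustration of the general idea (and not his actual code), tracking a bright ball against a dark background can be as simple as thresholding each camera frame and taking the centroid of the lit pixels:

```python
import numpy as np

def find_ball(gray, threshold=200):
    """Locate a bright ball in a grayscale frame by thresholding
    and averaging the coordinates of the bright pixels.
    Returns (x, y) in pixel coordinates, or None if nothing is found."""
    ys, xs = np.nonzero(gray > threshold)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Synthetic 8x8 frame with a single bright "ball" pixel at (x=5, y=2)
frame = np.zeros((8, 8), dtype=np.uint8)
frame[2, 5] = 255
print(find_ball(frame))  # (5.0, 2.0)
```

A real implementation has to contend with reflections, lens distortion, and depth estimation, which is presumably why a custom algorithm was worth writing.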
Design files and source code are available on the project’s GitHub repository for those who’d like a closer look. It’s pretty trippy watching the demonstration video because there is so much going on at once; you can check it out just below the page break.
Continue reading “Robotic Ball-Bouncing Platform Learns New Tricks”
People rightly marvel at modern surgical techniques that let surgeons leverage the power of robotics to repair the smallest structures in the human body through wounds that can be closed with a couple of stitches. Such techniques can even be applied remotely, linking surgeon and robot through a telesurgery link. It can be risky, but it’s often a patient’s only option.
NASA has arrived at a similar inflection point, except that their patient is the Mars InSight lander, and the surgical suite is currently about 58 million kilometers away. The lander’s self-digging “mole” probe needs a little help getting started, so they’re planning a high-stakes rescue attempt that would make the most seasoned telesurgeon blanch: they want to use the lander’s robotic arm to press down on the mole to help it get back on track.
Continue reading “Interplanetary Whack-A-Mole: NASA’s High-Stakes Rescue Plan For InSight Lander’s Science Mission”
Direct from the “Just Because I Can” department, this blog post by [Eddie Zhang] shows us how easy it is to get the Xiaomi robotic vacuum cleaner working as what might be the world’s most unnecessary Spotify Connect speaker. Will your home be the next to play host to an impromptu performance by DJ Xiaomi? Judging by the audio quality demonstrated in the video after the break, we doubt it. But this trick does give us a fascinating look at the current state of vacuum hacking.
For the first phase of this hack, [Eddie] makes use of Dustcloud, an ongoing project to document and reverse engineer various Xiaomi smart home gadgets. Using the information provided there you can get root-level SSH access to your vacuum cleaner and install your own software. There’s a sentence you never thought you’d read, right?
With the vacuum rooted, [Eddie] then installs a Spotify Connect client intended for the Raspberry Pi. As they’re both ARM devices, the software will run on the Xiaomi bot well enough, but the Linux environment needs a little tweaking. Namely, you need to manually create an Upstart .conf file for the service, as the vacuum doesn’t have systemd installed. There goes another one of those unexpected sentences.
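The exact job definition depends on where the client binary ends up on the vacuum’s filesystem, but a hypothetical Upstart .conf for a Spotify Connect client might look something like this (the paths and names here are illustrative, not taken from [Eddie]’s build):

```
# /etc/init/spotify-connect.conf (illustrative example)
description "Spotify Connect client"

start on filesystem and net-device-up IFACE!=lo
stop on shutdown

respawn
exec /opt/spotify-connect/client --name "DJ Xiaomi"
```

The `respawn` stanza tells Upstart to restart the client if it crashes, which is handy on a device you can’t easily get a console on.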
We’re certainly no stranger to robotic vacuum hacking, though historically the iRobot Roomba has been the target platform for such mischief. Other players entering the field can only mean good things for those of us who get a kick out of seeing home appliances pushed outside of their comfort zones.
Continue reading “DJ Xiaomi Spins Beats And Brushes At The Same Time”
Light painting is the process of moving a light while taking a long-exposure photograph, which creates a sort of drawing from the path of the light source. It’s been done in one way or another since at least the early-to-mid 1900s, but modern hardware and methods have allowed for all kinds of new spins on this old idea. [Josh Sheldon] demonstrates just how true this is with the light painting he did for a gum ad, showing what’s possible with a single multicolor LED under CNC control combined with stop-motion animation techniques. The rest of the magic comes from the software. [Josh] designs the animations in Blender, and the paths are then exported and used as the instructions for his self-made Light Painting Machine. The machine therefore recreates the original animation with lights and camera and not a single computer-generated graphic.
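[Josh]’s exporter and file format are his own, but the general shape of such a pipeline is easy to picture: sample the animated light’s position and color on every frame in Blender, then emit one move-and-color instruction per sample for the machine to replay. A purely hypothetical sketch:

```python
def export_path(samples):
    """Turn per-frame (x, y, z, (r, g, b)) samples into simple
    move/color instructions a light-painting machine could replay.
    Illustrative only -- not [Josh]'s actual export format."""
    lines = []
    for x, y, z, (r, g, b) in samples:
        lines.append(f"MOVE {x:.2f} {y:.2f} {z:.2f}")
        lines.append(f"COLOR {r} {g} {b}")
    return "\n".join(lines)

# Two animation frames: a red point moving to a new position and turning green
frames = [(0.0, 0.0, 0.0, (255, 0, 0)), (1.0, 0.5, 0.0, (0, 255, 0))]
print(export_path(frames))
```

Because the camera shutter stays open for the whole pass, each replayed frame of motion becomes one glowing stroke in the final photograph.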
[Josh] is no stranger to light painting in this way. We’ve seen his fantastic machine at work before and we’re glad he shared the details behind his latest work. Embedded below is a concise video that shows the whole process, but if you’re in a hurry and just want to see the end product, here’s a shortcut to the results.
For those of you who would like to know more, there are plenty of details on [Josh]’s Light Painting Machine on GitHub along with a more in-depth description of the workflow and software, so check it out.
Continue reading “Utterly Precise Light Painting, Thanks To CNC And Stop Motion”
The House of Mouse has been at the forefront of entertainment technology from its very beginnings in an old orange grove in Anaheim. Disney Imagineers invented the first modern animatronics in the 1960s and they’ve been improving the technology ever since, often to the point of being creepy.
But the complicated guts of an animatronic are sometimes too much for smaller characters, so in the spirit of “cheaper, faster, better”, Disney has developed some interesting techniques for animated characters made from wire. Anyone who has ever played with a [Gumby] or other posable wireframe toys knows that the wire will eventually break, and even before that it will plastically deform and fail to return to its original shape.
Wires used as the skeletons of animated figures can avoid that fate if they are preloaded with special shapes, or “templates,” that redirect the forces of bending. The Disney team came up with a computational model to predict which template shapes could be added to each wire to make it bend to fit the animation needs without deforming. A commercially available CNC wire bender installs the templates that lie in the plane of the wire, while coiled templates are added later with a spring-bending jig.
The results are impressive — the wire skeleton of an animated finger can bend completely back on itself with no deformation, and the legs of an animated ladybug can trace complicated paths and propel the beast with only servos pulling cables on the jointless legs. The video below shows the method and the animated figures; we can imagine that figures animated using this technique will start popping up at Disney properties eventually.
From keeping guests safe from robotic harm to free-flying robotic aerialists, it seems like the Disney Imagineers have a hardware hacker’s paradise at the Happiest Place on Earth.
Continue reading “Kinetic Wire Animatronics Bend It Like Disney”
If you have ever had to complete a task such as building a LEGO model over a remote connection, you will know that the challenges are like an absurd grade school group project. The person giving directions often has trouble describing what they are thinking, and the person doing the work has trouble interpreting what the instructor wants. “Turn the blue block over. No, only half way. Go back. Now turn it. No, the other way. NO! Not clockwise, downward. That’s upward! Geez. Are you even listening‽” Good times.
While you may not be in this situation every day, researchers at Keio University in Japan have developed an intuitive way for an instructor to physically interact with an instructee through a Moore/Swayze experience. The instructor has a camera in typical pirate parrot placement over the shoulder. Two arms are controlled by the instructor, who sees through stereoscopic cameras for a first-person view from across the globe. This natural way to interact with the user’s environment allows muscle memory to pass from the instructor to the wearer.
For some of the other styles of telepresence, see this deep-sea bot and a cylindrical screen that looks like someone is beaming up directly from the holodeck.
Continue reading “Robots Invade Your Personal Space”
Rover V2 is an open-source, 3D-printable robotic rover platform that has seen a lot of evolution and development from its creator, [tlalexander]. There are a number of interesting things about Rover V2’s design, such as the way the wheel hubs themselves contain motors and custom planetary gearboxes. This system is compact and keeps weight down low to the ground, which helps keep a rover stable. The platform is all wheel drive, and moving parts like the suspension are kept high up, as far away from the ground as possible. Software is a custom Python stack running on a Raspberry Pi that provides basic control.
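The project’s Python stack is its own, but the basic control problem for an all-wheel-drive platform like this is familiar: mix a throttle command and a turn command into per-side wheel speeds. A minimal skid-steer sketch (an assumption about the general approach, not Rover V2’s actual code):

```python
def skid_steer(throttle, turn):
    """Mix throttle and turn inputs (-1..1) into left/right wheel
    speeds, clamped to [-1, 1]. A common scheme for all-wheel-drive
    rovers; illustrative only."""
    left = max(-1.0, min(1.0, throttle + turn))
    right = max(-1.0, min(1.0, throttle - turn))
    return left, right

print(skid_steer(1.0, 0.0))   # straight ahead: (1.0, 1.0)
print(skid_steer(0.0, 0.5))   # spin in place: (0.5, -0.5)
```

With in-hub motors on every wheel, each side’s speed would then be fanned out to its wheels’ motor drivers.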
The Rover V2 is a full mechanical redesign of the previous version, which caught our attention with its intricate planetary gearing inside the wheel hubs. [tlalexander]’s goal is to create a robust, reliable rover platform for development that, thanks to its design, can be mostly 3D printed and requires a minimum of specialized hardware.