BB-8, the new droid in the Star Wars franchise, made his first public appearance (YouTube link) at Star Wars Celebration last week. While the cast and crew of the movie have long said that BB-8 is real, seeing it up on stage, driving circles around R2-D2, takes things to a whole new level. The question remains: how exactly does it work?
Our (and probably any other tech geek worth their salt’s) immediate reaction was to think of xkcd’s “New Pet” comic. All the way back in 2008, [Randall Munroe] suggested omnidirectional wheels and magnets could be used to create exactly this kind of ‘bot. Is this what’s going on inside BB-8? No one knows for sure, but that won’t stop us from trying to figure it out!
BB-8’s family tree may actually start with Sphero. Fortune reports that Sphero was part of Disney’s accelerator program in 2014. Each company in the accelerator program gets a mentor from Disney. Sphero’s mentor was Disney CEO Bob Iger himself.
So if BB-8’s body is based on a Sphero, how does the head work? The Disney crew has been mum on this so far, but there is plenty of speculation! If you watch the video in HD, several flashes can be seen in the gap between the body and head. These might be status LEDs on BB-8’s electronics, but they could also be IR LEDs – possibly part of an optical-mouse-style sensor. Sensor fusion between gyroscopes, accelerometers, and optical-flow sensors would make for a robust solution to the inverted pendulum problem presented by BB-8’s head.
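To give a flavor of what that sensor fusion might look like, here’s a minimal complementary-filter sketch: the gyro is trusted over short timescales and the accelerometer’s gravity reading corrects long-term drift. The function names, gains, and simulated readings are our own illustration, not anything from Disney’s actual design.

```python
import math

def complementary_filter(angle, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Return a new tilt-angle estimate (radians).

    Blends the gyro integral (smooth, but drifts) with the
    accelerometer's gravity angle (noisy, but drift-free).
    """
    gyro_angle = angle + gyro_rate * dt          # integrate gyro rate
    accel_angle = math.atan2(accel_x, accel_z)   # angle from gravity vector
    return alpha * gyro_angle + (1 - alpha) * accel_angle

# Simulated stationary head tilted 0.1 rad: gyro reads zero,
# accelerometer sees gravity split between its axes.
angle = 0.0
for _ in range(200):
    angle = complementary_filter(angle, 0.0,
                                 math.sin(0.1), math.cos(0.1), dt=0.01)
print(round(angle, 3))  # estimate converges toward the true 0.1 rad
```

A real balancer would feed this estimate into a control loop driving the drive motors, but the filter is the fusion step the speculation above hinges on.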
How do you think BB-8 works? Is it magnets, motors, or The Force? Let us know in the comments!
Continue reading “BB-8 is real! But how did they do it?”
Is your landscape congested with toxic waste, parched, or otherwise abandoned? The Terra Spider may be your answer to new life in otherwise barren wastelands.
Bred in the Digital Craft Lab at the California College of the Arts, the project so far demonstrates the principle of deploying multiple eight-legged drones that can drill into a site and release a liquid payload, intended to “repair or maintain” the landing site.
To deliver their project, students [Manali Chitre], [Anh Vu], and [Mallory Van Ness] designed and assembled a laser-cut octopod chassis, an actuated drilling mechanism, and a liquid deployment system, all from easily available stock components and raw materials. While project details are sparse, the comprehensive bill-of-materials gives us a window into the process of putting together the pieces of a Terra Spider. Servos actuate the legs, a SparkFun gear-reduced motor drives the drill, and a peristaltic pump handles payload deployment.
It’s not every day that flying robots deploy drill-wielding spider drones. Keep in mind, though, that the Terra Spider is a performance piece, a hardware-based demonstration of a bigger idea, in this case remote coverage and sample deployment in a barren wasteland. While this project is still a work in progress, the bill-of-materials and successful deployment demos both testify to its extensive development.
With the earnest intent of repairing withering environments, perhaps this project has a future as an entry into this year’s Earth-saving Hackaday Prize….
Coming soon to a galaxy near you!
Continue reading “Terra Spider Repairs and Resurfaces new Frontiers”
For anyone looking for a capable robotic arm for automation of an industrial process, education, or just a giant helping hand for a really big soldering project, most options available can easily break the bank. [Mads Hobye] and the rest of the folks at FabLab RUC have tackled this problem, and have come up with a very capable, inexpensive, and open-source industrial arm robot that can easily be made by anyone.
The robot itself is Arduino-based and has the option to attach any end effector that might be needed for a wide range of processes. The schematics for all of the parts are available on the project site along with all of the Arduino source code. [Mads Hobye] notes that they made this robot during a three-day sprint, so it shouldn’t take very long to get your own up and running. There’s even a virtual robot that can be downloaded and used with the regular robot code, which can be used for testing or for simply getting the feel for the robot without having to build it.
This is a great project, and since it’s open source it will be great for students, small businesses, and hobbyists alike. The option to attach any end effector is also a perk, and we might suggest trying out [Yale]’s tendon-driven robotic hand. Check after the break for a video of this awesome robot in action.
Continue reading “Open-Source Robotic Arm Now Within Reach”
You’re gunna love my cuts.
KUKA robots are cool. They’re both elegant and terrifying to watch in action as they move unyieldingly to perform tasks. Not many of us get to use industrial tools like this because they aren’t exactly trivial to wield (or cheap!). Artists [Clemens Weisshaar] and [Reed Kram] however created an installation that allows anyone to potentially control one of these orange beauties to do their bidding… all from the safety and comfort of a computer chair.
For their piece, “ROBOCHOP”, the artists developed a web app that allows you to easily manipulate the surface of a virtual cube. You can rotate for positioning and then use a straight or curved line tool to draw vectors through its surface and subtract material. Once you’re finished sculpting your desired masterpiece, one of the four KUKA robots in the installation will retrieve a 40 x 40 x 40 cm block of foam and shape it into a real-life version of whatever you created in the app.
Starting today you can visit the project’s website and upload your own mutilated cube designs. If your design is selected by the artists, it will be among the 2000 pieces carved by the robots throughout their installation during CeBit in Hanover. After the show, your cube spawn will then be mailed to you free of charge! The only way I could see this being cooler is if they filmed the process so you could watch your shape being born.
Anyhow, I personally couldn’t resist the invitation to sculpt Styrofoam remotely with an industrial grade robot arm and came up with this gem.
You can go to their page if you want to give the app a go, and really… why wouldn’t you?
Continue reading “ROBOCHOP! It Slices, Dices, But Wait! There’s More…”
Many of the early applications for the much-anticipated Oculus Rift VR rig have been in gaming. But it’s interesting to see some more useful applications besides gaming before its commercial release sometime this year. [JoLau] at the Institute i4Ds of FHNW School of Engineering wanted to go a step beyond rendering virtual worlds, so he built the Intuitive Rift Explorer, a.k.a. IRE. The IRE is a moving-reality system consisting of a gimbaled stereo-vision camera rig that transmits video to the Rift and follows the head movements received from the headset. The vision platform is mounted on a remote-controlled robot and is completely wireless.
One of the big challenges with using VR headsets is lag, which causes motion sickness in some cases. He had to tackle the problem of latency – reducing the time from moving the head to getting a matching image on the headset – which the Oculus Rift team specifies should be less than 20 ms. The other important requirement is a high frame rate, in this case 60 frames per second. [JoLau] succeeded in overcoming most of the problems, although in his conclusion he does mention a couple of enhancements that he would like to add, given more time.
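To see why 20 ms is such a tight budget, a bit of back-of-the-envelope arithmetic helps: a single frame at 60 fps already eats about 16.7 ms, and the whole camera-to-headset pipeline has to fit under the target. The per-stage numbers below are our own illustrative guesses, not [JoLau]’s measurements.

```python
# Rough motion-to-photon latency budget for a telepresence rig.
TARGET_MS = 20.0           # target specified by the Oculus Rift team
frame_ms = 1000.0 / 60     # one frame at 60 fps, about 16.7 ms

# Illustrative (assumed) stage estimates for a wireless video pipeline.
stages = {
    "camera exposure + readout": 8.0,
    "video encode + RF link":    6.0,
    "decode + render to Rift":   5.0,
}
total = sum(stages.values())
print(f"frame time: {frame_ms:.1f} ms, pipeline: {total:.1f} ms, "
      f"{'within' if total <= TARGET_MS else 'over'} the {TARGET_MS:.0f} ms target")
```

Even with optimistic numbers, there are only a few milliseconds of slack, which is why [JoLau] had to sweat every stage of the chain.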
[JoLau] provides a detailed description of the various sub-systems that make up IRE – the Stereo camera, audio and video transmission, media processing, servo driven gimbal for the stereo camera, and control system code. [JoLau]’s reasoning on some of the interesting hardware choices for several components used in the project makes for interesting reading. Watch a video of the IRE in action below.
Continue reading “Putting Oculus Rift on a Robot”
There are robots that will vacuum your house, mow your lawn, and keep their unblinking electronic eyes on you at all times while hovering hundreds of feet in the air. How about a robot that plays a violin? That’s what [Seth Goldstein] built. He calls it a ‘kinetic sculpture’, but there’s more than enough electronics and mechatronics to keep even the most discerning tinkerer interested.
There are three main parts to [Seth]’s violin-playing kinetic sculpture. The first is a bow carriage that draws the bow across the strings, using an electromagnet to press the bow against them. The individual strings are fingered by four rubber disks, and a tilting mechanism rotates the violin so the desired string is always underneath the bow and mechanical fingers.
As far as software goes, the Ro-Bow transforms MIDI files into the robotic motions that make the violin sing. From what we can tell, it’s not quite as good as a human player; it can only play one string at a time. It is, however, great at what it does and is an amazing mechanical sculpture.
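A one-string-at-a-time player has to decide, for every MIDI note, which string to tilt under the bow and how far up that string to finger. Here’s a hedged sketch of how that mapping could work; the open-string MIDI numbers are standard violin tuning (G3 = 55, D4 = 62, A4 = 69, E5 = 76), but the selection logic is our guess, not [Seth]’s actual code.

```python
# Map a MIDI note number to (string, semitones above the open string),
# preferring the highest string that can reach the note so the finger
# offset stays small. Illustrative only -- not the Ro-Bow's real logic.
OPEN_STRINGS = {"G": 55, "D": 62, "A": 69, "E": 76}

def choose_string(note):
    """Return (string_name, fret_offset) for a MIDI note number."""
    candidates = [(name, note - open_note)
                  for name, open_note in OPEN_STRINGS.items()
                  if open_note <= note]
    if not candidates:
        raise ValueError("note below the violin's range")
    return min(candidates, key=lambda c: c[1])  # smallest finger offset

print(choose_string(60))  # middle C: ('G', 5) -- G string, 5 semitones up
```

With only four fingering disks per string, the real machine presumably also has to reject or transpose notes its mechanism can’t reach.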
Continue reading “Ro-Bow, The Violin Playing Robot”
Yale University brings us quite a treat with their Openhand Project.
If you’ve ever operated a robotic arm, you know that one of the most cumbersome parts is always the end effector. It will quickly make you realize what an amazing work of engineering the human hand really is, and what a poor imitation a simple open-close gripper ends up being.
[Yale] is working to bring tendon-driven robotic hands to the masses with an interesting technique combining 3D printing and resin/urethane casting. Known as Hybrid Deposition Manufacturing (HDM), it allows the team to 3D print robotic fingers with the molds for the finger pads and joints built right into the printed part. The tendon-driven fingers allow for a very simple design that is not only easy to make but also has a low parts count. Because of the human-like tendons, the fingers naturally curl around an object, distributing their force much more evenly and naturally, much like a human hand would. In the videos after the break, you can see the building process as well as the hand in action.
The best news is that it’s all open source. They also include some Python libraries so you can customize the CAD files to fit your needs.
Continue reading “Openhand Combines 3D Printing with Urethane Casting”