Soft robotics is an exciting field. Mastering the pneumatic control of pliable materials has enormous potential, from the handling of delicate objects to creating movement with no moving parts. However, pneumatics has long been overlooked by the hacker community as a mode of actuation. There are thousands of tutorials, tools and products that help us work with motor control and gears, but precious few for those of us who want to experiment with movement using air pressure, valves and pistons.
Physicist and engineer [tinkrmind] wants to change that. He has been developing an open source soft robotics tool called Programmable Air for the past year with the aim of creating an accessible way for the hacker community to work with pneumatic robotics. We first came across [tinkrmind]’s soft robotics modules at World Maker Faire in New York City in 2018, but fifty beta testers and a wide range of interesting projects later — from a beating silicone heart to an inflatable bra — they are now being made available on Crowd Supply.
We had the chance to play with some of the Programmable Air modules after this year’s Maker Faire Bay Area at Bring A Hack. We can’t wait to see what squishy, organic creations they will be used for now that they’re out in the wild.
If you need more soft robotics inspiration, take a look at this robotic skin that turns teddy bears into robots from Yale or these soft rotating actuators from Harvard.
See a video of the Programmable Air modules in action below the cut. Continue reading “Bringing Pneumatics To The Masses With Open Source Soft Robotics”
A simple robot that performs line-following or obstacle avoidance can fit all of its logic inside a single Arduino sketch. But as a robot’s autonomy increases, its software grows complicated very quickly. It won’t be long before diagnostic monitoring and logging come in handy, or before you want to encapsulate feature areas and orchestrate how they work together. This is where tools like the Robot Operating System (ROS) come in, so we don’t have to keep reinventing these same wheels. And Open Robotics just released ROS 2 Dashing Diademata for all of us to use.
ROS is an open source project that’s been underway since 2007 and is updated regularly, with each release named after a turtle species. What makes this one worthy of extra attention? Dashing marks the first long term support (LTS) release of ROS 2, a refreshed second generation of ROS. All the high level concepts stayed the same, meaning almost everything in our ROS orientation guide is still applicable in ROS 2. But there were big changes under the hood reflecting technical advances over the past decade.
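At its heart, ROS structures software as independent nodes that publish and subscribe to named topics, so a diagnostics logger and an obstacle monitor never need to know about each other, only about the topic they share. As a toy illustration of that decoupling (a plain-Python sketch of the idea, not the actual rclpy API):

```python
# Toy publish/subscribe bus illustrating the decoupling ROS provides:
# nodes agree only on a topic name and message shape, never on each other.
from collections import defaultdict
from typing import Any, Callable

class Bus:
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[Any], None]) -> None:
        self._subs[topic].append(callback)

    def publish(self, topic: str, msg: Any) -> None:
        for callback in self._subs[topic]:
            callback(msg)

bus = Bus()
log = []
# Two independent "nodes" listening to the same fake lidar topic:
bus.subscribe("/scan", lambda ranges: log.append(min(ranges)))  # obstacle monitor
bus.subscribe("/scan", lambda ranges: log.append(len(ranges)))  # diagnostics logger
bus.publish("/scan", [2.5, 0.8, 1.1])                           # fake lidar message
print(log)  # [0.8, 3]
```

In real ROS the bus spans processes and machines, with typed messages and discovery handled for you, which is exactly the wheel you stop reinventing.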
ROS was built in an age when a Unix workstation cost thousands of dollars, XML was going to be how we communicate all data online, and an autonomous robot cost more than a high-end luxury car. Now we have $35 Raspberry Pi boards running Linux, XML has fallen out of favor due to processing overhead, and some autonomous robots are high-end luxury cars. For these and many other reasons, the people of Open Robotics decided it was time to make a clean break from legacy code.
The break has its detractors, as it meant leaving behind the vast library of freely available robot intelligence modules released by researchers over the years. Popular ones were (or will be) ported to ROS 2, and there is a translation bridge sufficient to work with some, but the rest will be left behind. However, this update also resolved many of the deal-breakers preventing adoption outside of research, making ROS more attractive for commercial investment, which should help bring more robots mainstream.
Judging by responses to the release announcement, there are plenty of people eager to put ROS 2 to work, but it is not the only freshly baked robotics framework around. We just saw Nvidia release their Isaac Robot Engine tailored to make the most of their Jetson hardware.
Putting a 3D printer on a mobile robotic platform is one thing, but two robots cooperatively printing a large object together is even more impressive. AMBOTS posted the video on Twitter and we’ve embedded it below.
The robots sport omnidirectional wheels and SCARA format arms, and appear to interact with some kind of active tabletop to aid positioning. The AMBOTS website suggests that the same ideas could be used for other tasks such as pick-and-place style assembly work, and the video below of cooperative 3D printing is certainly a neat proof of concept.
As a side note: most omni wheels we see (such as the ones on these robots) are of the Mecanum design but there are other designs out there you may not have heard of, such as the Liddiard omnidirectional wheel.
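What makes Mecanum wheels fun is how compact the math is: each wheel’s angled rollers contribute a mix of forward, sideways, and rotational motion. Here’s a sketch of the usual inverse kinematics in Python; the exact signs depend on how the rollers are mounted, and the dimensions below are made-up placeholders:

```python
# Inverse kinematics for a four-wheel Mecanum platform (one common sign
# convention). lx and ly are half the wheelbase and half the track width
# in meters, r is the wheel radius; all values here are illustrative.
def mecanum_wheel_speeds(vx, vy, wz, lx=0.1, ly=0.1, r=0.03):
    """Return wheel angular velocities (rad/s) for a desired body twist:
    vx forward (m/s), vy left (m/s), wz counter-clockwise (rad/s)."""
    k = lx + ly
    return {
        "front_left":  (vx - vy - k * wz) / r,
        "front_right": (vx + vy + k * wz) / r,
        "rear_left":   (vx + vy - k * wz) / r,
        "rear_right":  (vx - vy + k * wz) / r,
    }

# Driving straight ahead spins all four wheels at the same rate...
print(mecanum_wheel_speeds(0.3, 0.0, 0.0))
# ...while pure sideways motion spins the diagonal pairs in opposition.
print(mecanum_wheel_speeds(0.0, 0.3, 0.0))
```

Strafing only works because the diagonal roller forces cancel in one axis and add in the other, which is also why Mecanum platforms are so sensitive to wheel slip.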
Continue reading “Watch These Two Robots Cooperate On A 3D Print”
One of the unfortunate things about Hackaday’s globe-spanning empire is that you often don’t get to meet the people you work with in person. Since I was in China and it’s right next door, I really wanted to pop over to Vietnam and meet Sean Boyce, who has been writing for Hackaday for a couple of years, yet we had never met. I suggested we could make this happen if we put together a meetup or unconference. Sean was immediately confident that the Ho Chi Minh City hardware hackers would turn out in force, and boy was he right! On Sunday night we had a full house for the first ever Hackaday Vietnam Meetup.
Continue reading “Hacker Abroad: Vietnam’s Hardware Hackers”
My first full day in China was spent at Electronica, an absolutely massive conference showcasing companies involved in electronics manufacturing and distribution. It’s difficult to comprehend how large this event is, filling multiple halls at the New International Expo Center in Shanghai.
I’ve seen the equipment used for PCB assembly many times before. But at this show you get to see another level below that, machines that build components and other items needed to build products quickly and with great automation. There was also big news today as the 2019 Hackaday Prize China was launched. Join me after the break for a look at this equipment, and more about this new development for the Hackaday Prize.
Continue reading “Hacker Abroad: Massive Conference Brings Big News Of Hackaday Prize China”
In a complete surprise, Sony has moved to release the latest version of their robotic dog series, Aibo, in North America. The device is already out in Japan, where there are a number of owners’ clubs that would rival any dedicated kennel club. Thanks to the [Robot Start] team, we now have a glimpse of what goes into making the robotic equivalent of man’s best friend in their teardown of an Aibo ERS-1000.
According to [Yoshihiro] of Robot Start, Aibo looks to be using a proprietary battery reminiscent of the ones in Handycam camcorders. Those three gold contacts are used for charging on the rug-shaped power base that Aibo periodically returns to in order to take a “nap”. Behind those puppy dog eyes are a couple of square full-color OLED screens somewhere in the one-inch ballpark. Between the screens is a capacitive touch sensor that wraps around to the top of the head and is also pressure sensitive.
According to Sony’s press release, the fish-eye camera housed in Aibo’s snout is used to identify faces as well as to navigate spaces.
Laying all the major parts out together certainly drives home the complexity of the latest Aibo. It’ll be interesting to see the progression of this device, as all of them come equipped with 4G LTE and 802.11 b/g/n WiFi that connect to Sony’s servers for deep learning.
New behaviors are supposed to download automatically as long as the device is under the subscription plan. While Sony has no current plans to integrate with any voice-activated virtual assistant, we can still look forward to the possibility of some expanded functionality from the Hackaday community.
For the rest of the teardown photos, make sure to head over to [Yoshihiro]’s write-up on Robot Start. Also, just in case anybody cares to see what happens when the first generation Aibo ERS-111 from 1999 meets the 2018 Aibo ERS-1000, you’ll find the answer in the video below:
Continue reading “Teardown: Sony’s New Aibo Goes Under The Knife”
[Moritz Simon Geist]’s experience as both a classically trained musician and a robotics engineer is clearly what makes his Techno Music Robots project so stunningly executed. The robotic electronic music he has created involves no traditional instruments of any kind. Instead, the robots themselves are the instruments, and every sound comes from some kind of physical element.
A motor might smack a bit of metal, a hard drive arm might tap out a rhythm, and odder sounds come from stranger devices. If it’s technological and can make a sound, [Moritz Simon Geist] has probably carefully explored whether it can be turned into one of his Sonic Robots. The video embedded below is an excellent example of his results, which is electronic music without a synthesizer in sight.
We’ve seen robot bands before, and they’re always the product of some amazing work. The Toa Mata Lego Band are small Lego units, and Compressorhead plays full-sized instruments on stage, but robots that are the instruments is a different direction that still keeps the same physical element to the music.
Continue reading “Sonic Robots Don’t Play Instruments, They Are The Instruments”