33C3: Edible Soft Robotics

Certainly one of the more entertaining talks of the 33rd Chaos Communication Congress was [Kari Love]’s talk on her experiments in mixing food with function. In [Kari]’s talk at the 2016 Hackaday Supercon, she spoke extensively about working on soft robotics for NASA. At the 33C3, her focus was twofold: a fun side project to make mobile robots out of stuff that you can eat, and an examination of the process of creative engineering through the lens of a project like this.

If you look up edible robotics, you get a lot of medical literature about endoscopes that you can swallow, or devices that take samples while they’re inside you. That’s not what [Kari]’s after at all. She’s after a robot that’s made of candy, a yummy machine. And while this is still a work in progress, she demonstrated a video of an all-licorice cable-based actuator.

But more than that, she demonstrated all of the materials she’s looked at so far, and the research she’s done. To some extent, the process is the substance of this project, but there’s nothing wrong with some tasty revelations along the way.

This talk was a potpourri of helpful tips and novel facts. For instance, if you’re working in candy robotics, don’t eat your mistakes. That stomach ache your mom always said you’d get? You will. Did you know that the gummi in gummi bears is re-heatable and re-moldable? It was also the most delicious of the gels she made. And finally, Pop Rocks don’t have enough CO2 in them to drive pneumatics. Who knew? [Kari] knows. And now you do too.

Continue reading “33C3: Edible Soft Robotics”

AI And The Ghost In The Machine

The concept of artificial intelligence dates back far before the advent of modern computers — even as far back as Greek mythology. Hephaestus, the Greek god of craftsmen and blacksmiths, was believed to have created automatons to work for him. Another mythological figure, Pygmalion, carved a statue of a beautiful woman from ivory and proceeded to fall in love with it. Aphrodite then imbued the statue with life as a gift to Pygmalion, who married the now-living woman.

Pygmalion by Jean-Baptiste Regnault, 1786, Musée National du Château et des Trianons

Throughout history, myths and legends of artificial beings that were given intelligence were common. These ranged from beings with simple supernatural origins (such as the Greek myths) to ones created through more scientifically reasoned methods as the idea of alchemy grew in popularity. In fiction, particularly science fiction, artificial intelligence became more and more common beginning in the 19th century.

But it wasn’t until mathematics, philosophy, and the scientific method advanced enough in the 19th and 20th centuries that artificial intelligence was taken seriously as an actual possibility. It was during this time that mathematicians such as George Boole, Bertrand Russell, and Alfred North Whitehead began presenting theories formalizing logical reasoning. With the development of digital computers in the second half of the 20th century, these concepts were put into practice, and AI research began in earnest.

Over the last 50 years, interest in AI development has waxed and waned with public interest and with the successes and failures of the industry. Predictions made by researchers in the field, and by science fiction visionaries, have often fallen short of reality. Generally, this can be chalked up to computing limitations. But a deeper problem, our understanding of what intelligence actually is, has been a source of tremendous debate.

Despite these setbacks, AI research and development has continued. Currently, this research is being conducted by technology corporations who see the economic potential in such advancements, and by academics working at universities around the world. Where does that research currently stand, and what might we expect to see in the future? To answer that, we’ll first need to attempt to define what exactly constitutes artificial intelligence.

Continue reading “AI And The Ghost In The Machine”

I’m BatBot

How would you like a bat bot for your next pet drone? Researchers from the University of Illinois at Urbana-Champaign’s Coordinated Science Laboratory and the California Institute of Technology created a bat drone. This is not your regular drone; it’s not a styrofoam, bat-shaped, four-propeller kind of drone. It’s a drone that mimics not only the shape but also the movement of a bat’s wings to achieve flight.

The biomimetic robotic platform, dubbed Bat Bot B2, is an autonomous flying robot. Wing flapping is driven by a brushless DC motor, while four wing actuators provide the linear motion that lets the wings further change shape in flight. The wings are a 56-micron, silicone-based membrane (thinner than an average condom), which helps with their elasticity and keeps the overall weight down to just 93 grams.

The bat has only made twenty flights so far, ranging up to 30 meters with some rough landings. It’s not much yet, but the prototype looks pretty slick. We covered another bat bot back in 2012, but the original information is no longer available, and we don’t know what happened to that project. There was also no video. In contrast, you can watch Bat Bot B2 glide.

Continue reading “I’m BatBot”

The Ultimate FPV Cleans House

With much of the world in the doldrums of winter, hackers are getting a bit stir-crazy. [Notamed Closed] would much rather be outside flying his First Person View (FPV) quadcopters. Sure, there are indoor drones, but [Notamed] wanted to stay grounded. He grabbed his R/C equipment, his Roomba, and of course an Arduino to build the ultimate FPV experience.

There aren’t many details on this build, but it’s not too hard to deduce what [Notamed] has done. He’s using a standard R/C transmitter and receiver. Instead of driving servos, the receiver plugs into an Arduino Uno. The Uno translates the PPM R/C signals into serial commands. Most Roombas include a serial port made especially for hackers. [Notamed] simply sends the proper iRobot Serial Command Interface (SCI) messages, and the robot is his to control.
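
For a sense of what that translation layer involves, here’s a minimal sketch of the idea, not [Notamed]’s actual code. It reads two receiver channels as standard servo pulses (rather than decoding a combined PPM stream) and forwards them to the Roomba as SCI Drive commands. The pin assignments, channel mapping, use of SoftwareSerial, and 57600 baud rate are assumptions for illustration; the opcodes (128 Start, 130 Control, 137 Drive) come from iRobot’s published SCI documentation.

```cpp
// Hypothetical R/C-to-Roomba bridge: read throttle and steering pulses,
// map them to the SCI Drive command's velocity and radius fields.
#include <SoftwareSerial.h>

const int THROTTLE_PIN = 2;      // R/C receiver channel 1 (assumed wiring)
const int STEERING_PIN = 3;      // R/C receiver channel 2 (assumed wiring)
SoftwareSerial roomba(10, 11);   // RX, TX to the Roomba's mini-DIN port (assumed pins)

void driveRoomba(int velocity, int radius) {
  // SCI "Drive": opcode 137, then velocity (mm/s) and turn radius (mm),
  // each as a signed 16-bit value sent high byte first.
  roomba.write(137);
  roomba.write((velocity >> 8) & 0xFF);
  roomba.write(velocity & 0xFF);
  roomba.write((radius >> 8) & 0xFF);
  roomba.write(radius & 0xFF);
}

void setup() {
  pinMode(THROTTLE_PIN, INPUT);
  pinMode(STEERING_PIN, INPUT);
  roomba.begin(57600);           // SCI-era Roombas default to 57600 baud
  delay(2000);                   // give the Roomba time to wake up
  roomba.write(128);             // Start: open the SCI
  delay(50);
  roomba.write(130);             // Control: allow actuator commands (safe mode)
}

void loop() {
  // Servo pulses are nominally 1000-2000 us, centered around 1500 us.
  unsigned long throttle = pulseIn(THROTTLE_PIN, HIGH, 25000);
  unsigned long steering = pulseIn(STEERING_PIN, HIGH, 25000);

  if (throttle == 0 || steering == 0) {  // no pulse seen: stop the robot
    driveRoomba(0, 0);
    return;
  }

  int velocity = map(throttle, 1000, 2000, -500, 500);    // mm/s
  int radius   = map(steering, 1000, 2000, -2000, 2000);  // mm, sign sets turn direction
  driveRoomba(velocity, radius);
  delay(20);                     // roughly one R/C frame
}
```

A real build would also want a failsafe timeout and a bit of deadband around center stick, but the protocol really is that simple.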

The FPV side of things is a bog-standard FPV camera and transmitter, sending standard-definition video to his goggles. A GoPro is along for the ride to capture high-quality video.

Sure, this is a quick, hacked-together build. All the parts are taped onto the Roomba. We’re sure this is on purpose. When the weather warms up, the R/C equipment goes back in the air, and the Roomba becomes just another vacuuming robot – once again a danger to pet messes everywhere.

Check out the video after the break.

Continue reading “The Ultimate FPV Cleans House”

Robot Leaps Uncanny Valley On Backward Knees

We’ve covered a ton of Boston Dynamics robots, but this is the second one in a row that has departed from a lot of people’s notion of what an ‘advanced’ robot should look like. It’s a cellphone camera clip of a video played at a conference, but at least it isn’t vertical video — kudos to [juvertson]. At about 3:40 into the video you get a good look at “Handle”, a four-limbed robot with backwards joints and wheels.

This design makes a lot of sense and it’s good to see Boston Dynamics thinking about unique robot kinematics alongside the realities of motion. The result is something that appears neither human nor animal — it’s definitely not natural. Despite the presenter’s assertion that this will be nightmare-inducing, we think it’s the opposite, since it doesn’t tweak that string in your brain that cries “predator”.

Obviously this is what we’d call a self-balancer. But two-wheels-plus-rigid-frame it is not. The articulated lower limbs allow it to shift its mass over the wheels. The upper limbs play their part in balancing, at one point acting in the same way a figure skater’s arms would during a spin. And its dexterity in hopping over an obstacle is only made better by [juvertson’s] commentary. This is a really good balance between purely wheeled and purely humanoid designs and a nice addition to the evolution of robotics.
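
To make the “self-balancer” part concrete: the core trick is the classic inverted pendulum, where the wheels are driven toward the direction of the fall so the contact point stays under the center of mass. The toy simulation below is purely our own illustration, with made-up mass, height, and gains, and has nothing to do with Boston Dynamics’ actual controller; it just shows that feedback loop doing its job.

```cpp
// Toy inverted-pendulum balancer: a PD controller accelerates the wheeled
// base toward the lean so the simulated body settles back upright.
// All constants are invented for the demo.
#include <cmath>
#include <cstdio>

int main() {
  double theta = 0.15;       // lean angle in radians (start with a push)
  double theta_dot = 0.0;    // lean rate, rad/s

  const double g = 9.81;     // gravity, m/s^2
  const double l = 1.0;      // effective height of the center of mass, m (assumed)
  const double dt = 0.001;   // 1 kHz simulation step

  const double kP = 60.0;    // restoring gain on lean angle (assumed)
  const double kD = 12.0;    // damping gain on lean rate (assumed)

  for (int step = 0; step <= 3000; ++step) {
    // Controller: accelerate the base in the direction of the fall.
    double base_accel = kP * theta + kD * theta_dot;

    // Simplified dynamics of an inverted pendulum on an accelerating base.
    double theta_ddot = (g * std::sin(theta) - base_accel * std::cos(theta)) / l;
    theta_dot += theta_ddot * dt;
    theta += theta_dot * dt;

    if (step % 500 == 0)
      std::printf("t=%.1fs  lean=%+.4f rad\n", step * dt, theta);
  }
  return 0;
}
```

The articulated legs give Handle a second knob on top of this loop: rather than relying on wheel torque alone, it can shift its mass directly over the contact point, which is what you see it doing in the clip.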

Continue reading “Robot Leaps Uncanny Valley On Backward Knees”

Tiny Robot Clings To Leaves With Static Electricity

Flying is an energy-intensive activity. The birds and the bees don’t hover around incessantly like your little sister’s quadcopter. They flit to and fro, perching on branches and leaves while they plan their next move. Sure, a quadcopter can land on the ground, but then it has to spend more energy getting back to altitude. Researchers at Harvard decided to try developing flying robots that can perch on various surfaces the way insects do.

Perching on surfaces happens electrostatically. The team attached an electrode patch to the robot on a foam mount, which lets the patch make good contact with a surface even if the approach is a few degrees off. This is particularly important for a tiny robot that is easily affected by even the slightest draft. The robots were designed to be as light as possible — just 84 mg — as the electrostatic force is not particularly strong.
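
As a quick sanity check on why that weight figure matters, the patch only has to supply the robot’s own weight to hold it in place, which works out to roughly

$$F = mg \approx 84\times10^{-6}\,\text{kg}\times 9.81\,\text{m/s}^2 \approx 0.82\,\text{mN},$$

a tiny force in absolute terms, but every extra milligram of airframe raises the bar for an adhesion mechanism that is already weak.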

It’s estimated that perching electrostatically uses approximately 1,000 times less power than flying for a robot of this size. This would be of great use for surveillance robots that could take up a vantage point at altitude without having to continually expend a great deal of energy to stay airborne. The abstract of the research paper notes that this method of perching was successful on wood, glass, and a leaf. It appears testing was done with tethers; it would be interesting to see if this technique would be powerful enough for a robot that carries its own power source. It makes us wonder if we’ll ever end up with tiny flyers that recharge from power lines.

We’re seeing more tiny flying robots every day now – the IMAV 2016 competition was a great example of the current state of the art.

Continue reading “Tiny Robot Clings To Leaves With Static Electricity”

Ping Pong Ball-Juggling Robot

There aren’t too many sports named for the sound produced during the game. Even though it’s properly referred to as “table tennis” by serious practitioners, ping pong is probably the most obvious. To that end, [Nekojiru] built a ping pong ball-juggling robot that used those very acoustics to pinpoint the location of the ball relative to the robot. Not satisfied with his efforts there, he moved on to a visual solution and built a new juggling rig that uses computer vision instead of sound to keep a ping pong ball aloft.

The main controller is a Raspberry Pi 2 with a Pi camera module attached. After some mishaps with the planned IR vision system, [Nekojiru] decided to use green light to illuminate the ball. He notes that OpenCV probably wouldn’t have worked for him because it isn’t fast enough for the 90 fps required to bounce the ping pong ball. An algorithm looks at the incoming data from this system, extracts 3D information about the ball, and directs the paddle to strike it in a particular way.

If you’ve ever wanted to get into real-time object tracking, this is a great project to look over. The control system is well polished and the robot itself looks almost professionally made. Maybe it’s possible to build something similar to test [Nekojiru]’s hypothesis that OpenCV isn’t fast enough for this. If you want to get started in that realm of object tracking, there are some great projects that make use of that piece of software as well.
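
If you want to test that OpenCV hypothesis on your own hardware, a minimal benchmark is easy to throw together. The sketch below is our own, not part of [Nekojiru]’s build: it thresholds a brightly lit green ball in HSV space, grabs its centroid, and reports the frame rate the loop actually achieves. The HSV range, camera index, and resolution are assumptions you would tune for your own lighting; whether it reaches the 90 fps the juggler needs depends mostly on your camera and CPU.

```cpp
// Rough frame-rate benchmark for color-threshold ball tracking with OpenCV.
#include <opencv2/opencv.hpp>
#include <cstdint>
#include <cstdio>

int main() {
  cv::VideoCapture cap(0);                         // default camera (assumed index)
  if (!cap.isOpened()) { std::fprintf(stderr, "no camera\n"); return 1; }
  cap.set(cv::CAP_PROP_FRAME_WIDTH, 320);          // small frames help frame rate
  cap.set(cv::CAP_PROP_FRAME_HEIGHT, 240);

  cv::Mat frame, hsv, mask;
  const cv::Scalar lo(40, 80, 80), hi(85, 255, 255);  // rough "green" band in HSV

  int frames = 0;
  int64_t t0 = cv::getTickCount();
  while (frames < 300) {                           // time ~300 frames, then report
    if (!cap.read(frame)) break;
    cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
    cv::inRange(hsv, lo, hi, mask);

    // The centroid of the thresholded blob stands in for the ball position.
    cv::Moments m = cv::moments(mask, true);
    if (m.m00 > 100) {                             // ignore near-empty masks
      double cx = m.m10 / m.m00, cy = m.m01 / m.m00;
      (void)cx; (void)cy;                          // a real rig would feed these to the control loop
    }
    ++frames;
  }
  double secs = (cv::getTickCount() - t0) / cv::getTickFrequency();
  std::printf("%d frames in %.2f s = %.1f fps\n", frames, secs, frames / secs);
  return 0;
}
```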