On the left, a translucent, yellowish-tan android head with eyes set behind holes in the face. On the right, a bright pink circle with small green eyes, its surface contoured into the shape of a smiling face.

A Robot Face With Human Skin

Many sci-fi robots have taken the form of their creators. In the increasingly blurry space between the biological and the mechanical, researchers have found a way to affix human skin to robot faces. [via New Scientist]

Previous attempts at affixing skin equivalent, “a living skin model composed of cells and extracellular matrix,” to robots worked, even on moving parts like fingers, but they typically relied on protrusions that limited range of motion and spoiled the aesthetics, both of which are high on the list of concerns for robots designed predominantly to interact with humans. Inspired by skin ligaments, the researchers have developed “perforation-type anchors” that use V-shaped holes in the underlying 3D-printed surface to keep the skin equivalent taut and pliable like the real thing.

The researchers then designed a face that takes advantage of the attachment method to give their robot a convincing smile. Combined with other research, this could soon give robots skin with touch, sweat, and self-repair capabilities, like Data’s partial transformation in Star Trek: First Contact.

We wonder what this extremely realistic humanoid hand might look like with this skin on the outside. Of course, that raises the question of whether we even need humanoid robots at all. If you want something less uncanny, maybe try animating your stuffed animals with this robotic skin instead?

Flexures Make Robotic Fingers Simpler To Print

Designing an anthropomorphic robotic hand seems to make a lot of sense — right up until the point that you realize just how complex the human hand is. What works well in bone and sinew often doesn’t translate well to servos and sensors, and even building a single mechanical finger can require dozens of parts.

Or, if you’re as clever about things as [Adrian Perez] is, only one part. His print-in-place robotic finger, adorably dubbed “Fingie,” is a huge step toward simplifying anthropomorphic manipulators. Fingie is printed in PLA and uses flexures for the three main joints of the finger, each of which consists of two separate and opposed coil springs. The flexures allow the phalanges to bend relative to each other in response to the motion of three separate tendons that extend through a channel on the palmar aspect of the finger, very much like the real thing.
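If you want a feel for how tendon pull turns into joint motion in a finger like this, a quick back-of-the-envelope model helps. The sketch below treats each flexure as a simple linear torsion spring driven by its own tendon; the moment arms and stiffnesses are made-up illustrative numbers, not anything measured from [Adrian]’s design, and the real coil-spring flexures will behave less linearly than this.

```python
# Rough model: each flexure joint acts as a linear torsion spring pulled by its
# own tendon. All numbers below are illustrative guesses, not Fingie's specs.

MOMENT_ARM_MM = [8.0, 6.0, 4.0]              # tendon offset from each joint axis
STIFFNESS_NMM_PER_RAD = [120.0, 80.0, 50.0]  # torsional stiffness of each flexure

def joint_angles(tensions_n):
    """Angles (radians) where tendon torque balances the flexure spring:
    angle = (tension * moment_arm) / stiffness, one joint per tendon."""
    return [t * r / k for t, r, k in
            zip(tensions_n, MOMENT_ARM_MM, STIFFNESS_NMM_PER_RAD)]

def tendon_travel_mm(tensions_n):
    """How far each tendon gets pulled in (arc length, r * theta)."""
    return [r * a for r, a in zip(MOMENT_ARM_MM, joint_angles(tensions_n))]

if __name__ == "__main__":
    tensions = [2.0, 2.0, 2.0]  # newtons on each of the three tendons
    degrees = [round(a * 57.3, 1) for a in joint_angles(tensions)]
    travel = [round(d, 2) for d in tendon_travel_mm(tensions)]
    print("joint angles (deg):", degrees)
    print("tendon travel (mm):", travel)
```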

The flexures eliminate the need for bearings at each joint and greatly decrease the complexity of the finger, but the model isn’t perfect. As [Adrian] points out, the off-center attachment for the tendons makes the finger tend to curl when the joints are in flexion, which isn’t how real fingers work. That should be a pretty easy fix, though. And while we appreciate the “one and done” nature of this print, we’d almost like to see the strap-like print-in-place tendons replaced with pieces of PLA filament added as a post-processing step, to make the finger more compact and perhaps easier to control.

Despite the shortcomings, and keeping in mind that this is clearly a proof of concept, we really like where [Adrian] is going with this, and we’re looking forward to seeing a hand with five Fingies, or four Fingies and a Thumbie. It stands to be vastly simpler than something like [Will Cogley]’s biomimetic hand, which, while an absolute masterpiece of design, is pretty daunting for most of us to reproduce.

Continue reading “Flexures Make Robotic Fingers Simpler To Print”

Twelve pink tentacles are wrapped around a small, green succulent plant. The leaves seem relatively undisturbed. The tentacles dangle from brass and white plastic pressure fittings attached to a brass circle.

Tentacle Robot Wants To Hold You Gently

Human hands are remarkable pieces of machinery, so it’s no wonder many robots are designed after their creators. The amount of computation required to properly modulate a hand’s grip strength and position is no joke, though, so what if you took a tentacular approach to grabbing things instead?

Inspired by ocean creatures, researchers found that by using a set of pneumatically-controlled tentacles, they could grasp irregular objects reliably and gently without having to faff about with machine learning or oodles of sensors. The tentacles can wrap around the object itself or intertwine with each other to encase parts of an object in a gentle grasp.

The basic component of the device is a set of twelve sections of “slender elastomeric filament” which dangle limply at zero gauge pressure but begin to curl as pressure is applied, up to 172 kPa. All of the 300 mm long segments run on the same pressure source and are the same size, but adding filaments of different sizes or multiple pressure sources might be useful for certain applications.
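Because all of the filaments share a single pressure source, the control side boils down to managing one setpoint. Here’s a minimal sketch of what a gentle grasp and release routine could look like; the 172 kPa ceiling comes from the research, but the ramp rate and the regulator object with its read_kpa()/set_kpa() methods are hypothetical stand-ins for whatever electro-pneumatic regulator is actually on hand.

```python
import time

MAX_KPA = 172.0        # filaments are fully curled at this pressure (from the paper)
RAMP_KPA_PER_S = 20.0  # assumed: ramp slowly so the filaments wrap rather than whip
STEP_S = 0.05

def grasp(regulator, target_kpa):
    """Ramp the shared line toward target_kpa, never exceeding MAX_KPA."""
    target = min(target_kpa, MAX_KPA)
    current = regulator.read_kpa()
    while current < target:
        current = min(current + RAMP_KPA_PER_S * STEP_S, target)
        regulator.set_kpa(current)
        time.sleep(STEP_S)

def release(regulator):
    """Vent back to zero gauge pressure so the filaments dangle again."""
    regulator.set_kpa(0.0)
```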

We wonder how it would do feeding a fire or loading a LEGO train with candy. We’ve also covered how to build mechanical tentacles and soft robots, if that’s more your thing.

Continue reading “Tentacle Robot Wants To Hold You Gently”

An illustration of jellyfish swimming in the ocean by Rebecca Konte. The jellyfish wear cones on their "heads" that streamline their swimming and contain some sort of electronics.

The Six Million Dollar Jellyfish

What if you could rebuild a jellyfish: better, stronger, faster than it was before? Caltech now has the technology to build bionic jellyfish.

Given its influence on the rest of the climate, studying the ocean is an important scientific task, but the wild pressure differences as you descend into the eternal darkness make it a non-trivial engineering problem. While we’ve sent people to the deepest parts of the ocean, submersibles are much too expensive and risky to use for widespread data acquisition.

The researchers found in previous work that making a cyborg jellyfish was more effective than biomimetic jellyfish robots, and have now given the “biohybrid robotic jellyfish” a 3D-printed, neutrally buoyant, swimming cap. In combination with the previously-developed “pacemaker,” these cyborg jellyfish can explore the ocean (in a straight line) at 4.5x the speed of a conventional moon jelly while carrying a scientific payload. Future work hopes to make them steerable like the well-known robo-cockroaches.

If you’re interested in some other attempts to explore Earth’s oceans, how about drift buoys, an Open CTD, or an Open ROV? Just don’t forget to keep the noise down!

Continue reading “The Six Million Dollar Jellyfish”

A human hand holds a stack of several plexiglass sheets with needles glued into the ends. Very faint lines can be seen in the transparent stackup.

Biomimetic Building Facades To Reduce HVAC Loads

Buildings currently consume about 50% of the world’s electricity, so finding ways to reduce the loads they place on the grid can save money and reduce carbon emissions. Scientists at the University of Toronto have developed an “optofluidic” system for tuning light coming into a building.

The researchers devised a biomimetic system, inspired by the multi-layered skins that squid and chameleons use for active camouflage, that can control light intensity, spectrum, and scattering independently. While there are plenty of technologies that can regulate these properties, doing so independently has been too complicated a task for current window shades or electrochromic devices.

To make the prototype devices (15 × 15 × 2 cm), millifluidic channels (1.5 mm deep and 6.35 mm wide) were CNC milled into 3 mm PMMA sheets, which were then stacked. Fluids could be injected and removed through needles glued into the ends of the channels. By using different fluids in the channels, the researchers were able to tune various aspects of the incoming light. Scaled up, one application of the system could be keeping buildings cooler on hot days without blocking IR on colder days, which is one disadvantage of the static window coatings currently in use.

If you want to control some of the light going OUT of your windows, maybe you should try building this smart LED curtain instead?

Continue reading “Biomimetic Building Facades To Reduce HVAC Loads”

Mechatronic Hand Mimics Human Anatomy To Achieve Dexterity

Behold the wondrous complexity of the human hand. Twenty-seven bones working in concert with muscles, tendons, and ligaments extending up the forearm to produce a range of motions that gave us everything from stone tools to symphonies. Our hands are what we use to interface with the physical world on a fine level, and it’s understandable that we’d want mechanical versions of ourselves to include hands that were similarly dexterous.

That’s a tall order to fill, but this biomimetic mechatronic hand is a pretty impressive step in that direction. It’s [Will Cogley]’s third-year university design project, which he summarizes in the first video below. There are two parts to this project; the mechanical hand itself and the motion-capture glove to control it, both of which we find equally fascinating. The control glove is covered with 3D-printed sensors for each joint in the hand. He uses SMD potentiometers to measure joint angles, with some difficulty due to breakage of the solder joints; perhaps he could solve that with finer wires and better strain relief.

The hand that the glove controls is a marvel of design, like something on the end of a Hollywood android’s arm. Each finger joint is operated by a servo in the forearm pulling on cables; the joints are returned to the neutral position by springs. The hand is capable of multiple grip styles and responds fairly well to the control glove inputs, although there is some jitter in the sensors for some joints.
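For a rough idea of what the glove-to-hand plumbing might look like in firmware, here’s a minimal sketch: read one potentiometer per joint, low-pass filter it to tame the jitter mentioned above, and map the result to a servo pulse width. The channel count, ADC resolution, and the read_adc()/set_servo_us() stubs are all assumptions for illustration; this isn’t [Will]’s actual code.

```python
NUM_JOINTS = 15          # assumed: three joints on each of five fingers
ADC_MAX = 1023           # assumed 10-bit ADC on the glove's microcontroller
SERVO_MIN_US, SERVO_MAX_US = 1000, 2000  # typical hobby-servo pulse widths
ALPHA = 0.2              # exponential smoothing factor (0 = frozen, 1 = raw)

smoothed = [0.0] * NUM_JOINTS

def read_adc(channel):
    """Placeholder for however the glove's potentiometers are actually read."""
    raise NotImplementedError

def set_servo_us(channel, pulse_us):
    """Placeholder for the servo driver (e.g. a multi-channel PWM board)."""
    raise NotImplementedError

def update():
    for ch in range(NUM_JOINTS):
        raw = read_adc(ch)
        # Low-pass filter: jittery pot readings would otherwise make servos buzz.
        smoothed[ch] += ALPHA * (raw - smoothed[ch])
        # Linear map from ADC counts to servo pulse width.
        span = SERVO_MAX_US - SERVO_MIN_US
        set_servo_us(ch, SERVO_MIN_US + int(smoothed[ch] / ADC_MAX * span))
```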

The second video below gives a much more detailed overview of the project and shows how [Will]’s design has evolved and where it’s going. Anthropomorphic hands are far from rare projects hereabouts, but we’d say this one has a lot going for it.

Continue reading “Mechatronic Hand Mimics Human Anatomy To Achieve Dexterity”

I’m BatBot

How would you like a bat bot for your next pet drone? Researchers from the University of Illinois at Urbana-Champaign’s Coordinated Science Laboratory and the California Institute of Technology created a bat drone. This is not your regular drone; it’s not a styrofoam, bat-shaped, four-propeller kind of drone. It’s a drone that mimics not only the shape but also the movement of a bat’s wings to achieve flight.

The biomimetic robotic platform, dubbed Bat Bot B2, is an autonomous flying robot. Wing flapping is driven by a brushless DC motor, while four wing actuators provide the linear motion that lets the wings further change shape in flight. The wings are made of a 56-micron, silicone-based membrane (thinner than an average condom), which certainly helps with their elasticity and keeps the overall weight down to just 93 grams.
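To picture how a single flapping motor and a few wing-shape actuators might be coordinated, here’s a toy cycle generator. The frequency, throttle, fold amplitude, and the set_flap_motor()/set_wing_actuator() stubs are all placeholders for illustration, not values or interfaces from the Bat Bot B2 paper.

```python
import math
import time

FLAP_HZ = 8.0      # assumed flapping frequency, not the robot's real value
N_ACTUATORS = 4    # the robot uses four wing actuators

def set_flap_motor(throttle):
    """Placeholder for commanding the brushless flapping motor (0..1)."""
    raise NotImplementedError

def set_wing_actuator(idx, extension):
    """Placeholder for commanding one linear wing-shape actuator (0..1)."""
    raise NotImplementedError

def run(duration_s=5.0, dt=0.01):
    t = 0.0
    while t < duration_s:
        phase = 2 * math.pi * FLAP_HZ * t
        set_flap_motor(0.6)  # steady throttle drives the flap stroke
        # Fold the wings slightly on the upstroke, extend them on the downstroke.
        fold = 0.5 + 0.3 * math.sin(phase)
        for i in range(N_ACTUATORS):
            set_wing_actuator(i, fold)
        time.sleep(dt)
        t += dt
```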

The bat has only made twenty flights so far, ranging up to 30 meters with some rough landings. It’s not much yet, but the prototype looks pretty slick. We covered another bat bot back in 2012 but the original information is no longer available, and we don’t know what happened to that project. There was also no video. In contrast, you can watch Bat Bot B2 glide.

Continue reading “I’m BatBot”