Some plants react quickly enough for our senses to notice, such as a Venus flytrap or Mimosa pudica. Most of the time, though, we need time-lapse photography at a minimum to see anything happen, while more exotic sensors can measure things like microscopic pores opening and closing. As with any sensor reading, those measurements can be turned into action through a little trick we call automation. [Harpreet Sareen] and [Pattie Maes] at MIT brought these two ideas together in a way we haven't seen before: a plant takes the driver's seat in a project called Elowan. Details are sparse, but the concept is easy enough to grasp.
We are not sure if this qualifies as a full-fledged cyborg or if this is a case of a robot using biological sensors. Maybe it all depends on the angle from which you view this mixture of plant and machine. Perhaps it truly is the symbiotic relationship the project claims it to be: the robot would not receive any instructions without the plant, and the plant would receive sub-optimal light without the robot. What other ways could plants be integrated into robotics to make it a bona fide cyborg?
When we think of pneumatic actuators, we typically consider the standard varieties of pneumatic cylinder, capable of linear motion. These can be referred to as “hard” actuators, made of rigid components and capable of great accuracy and force delivery. However, “soft” actuators have their own complementary abilities – such as being able to handle more delicate tasks and being less likely to injure human operators when used in collaborative operations. The Whitesides Research Group at Harvard University has undertaken significant research in this field, and released a paper covering a novel type of soft pneumatic actuator.
The actuator consists of a series of soft, flexible sealed chambers which surround a wooden dowel in the center. By applying vacuum to these various chambers, the dowel in the center can be pulled into up to eight different positions. It’s a unique concept, and one we can imagine could have applications in various material processing scenarios.
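As a back-of-the-envelope illustration (our own sketch, not from the paper), if we assume four chambers spaced at 90° around the dowel, activating a single chamber pulls the dowel toward it, while activating two adjacent chambers pulls it toward their bisector, which is one way eight distinct positions could arise:

```python
import math

# Hypothetical model: four vacuum chambers at 90 degree spacing around the
# dowel (the paper's exact chamber count and geometry may differ).
CHAMBER_ANGLES = [0, 90, 180, 270]  # degrees, chamber locations around the dowel

def dowel_direction(active):
    """Return the unit-vector direction the dowel is pulled, given a list
    of active chamber indices (vector sum of per-chamber pulls)."""
    x = sum(math.cos(math.radians(CHAMBER_ANGLES[i])) for i in active)
    y = sum(math.sin(math.radians(CHAMBER_ANGLES[i])) for i in active)
    mag = math.hypot(x, y)
    if mag < 1e-9:
        return (0.0, 0.0)  # opposing chambers cancel out
    return (x / mag, y / mag)

# Single chambers give the four cardinal directions...
singles = [dowel_direction([i]) for i in range(4)]
# ...and adjacent pairs give the four diagonals, for eight positions total.
pairs = [dowel_direction([i, (i + 1) % 4]) for i in range(4)]
positions = {tuple(round(c, 3) for c in p) for p in singles + pairs}
print(len(positions))  # 8 distinct pull directions
```

The same vector-sum reasoning extends to any chamber count: n chambers give n single-chamber directions plus n adjacent-pair diagonals.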
The actuator was built by moulding elastomers around 3D printed components, so this is a build that could theoretically be tackled by the DIYer. The paper goes into great detail to quantify the performance of the actuator, and workshops several potential applications. Testing is done on a fluid delivery and stirring system, and a tethered robotic walker was built. The team uses the term cVAMS – cyclical vacuum-actuated machine – to describe the actuator technology.
Twenty years ago, PCB production was expensive and required a multitude of phone calls and emails to a fab with significant minimum order restrictions. Now, it's cheap and accessible online, which, in addition to curtailing the home etching market, has created significant new possibilities for home projects. Now that flexible PCBs are also readily available, it's possible to experiment with some cool concepts – and that's precisely what [Carl] has been doing.
The aim is to build a walking robot that uses actuators made from flexible PCBs. The flexible PCB is printed with a coil, capable of generating a small magnetic field. This then interacts with a strong permanent magnet, causing the flexible PCB to move when energised.
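For a rough sense of the forces involved (our own order-of-magnitude estimate, not [Carl]'s numbers), the spiral trace can be modelled as a magnetic dipole of moment m = N·I·A, with the force coming from the field gradient of the permanent magnet, F ≈ m·dB/dz:

```python
# Rough order-of-magnitude sketch; every number below is an assumption,
# not a measurement from the build.
N_TURNS = 20          # assumed number of spiral turns on the flex PCB
CURRENT_A = 0.25      # assumed drive current, amps
COIL_AREA_M2 = 1e-4   # assumed mean turn area, about 1 cm^2

# Assumed field gradient near a small neodymium magnet, tesla per metre.
FIELD_GRADIENT_T_PER_M = 10.0

moment = N_TURNS * CURRENT_A * COIL_AREA_M2    # dipole moment, A*m^2
force_n = moment * FIELD_GRADIENT_T_PER_M      # force, newtons
print(f"dipole moment: {moment:.2e} A*m^2, force: {force_n:.2e} N")
```

Numbers in this range work out to a few millinewtons, which is why every gram of chassis mass matters for a robot actuated this way.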
Initial attempts with four actuators mounted to a 3D printed frame were unsuccessful, but [Carl] has persevered. With a focus on weight saving, the MK II prototype has shown some promise, gently twitching its way across a desk in testing. Future steps will involve building an untethered version. This will replace the 3D printed chassis with a standard fibreglass PCB acting as both control board and the main chassis to minimise weight, similar to PCB quadcopter designs we’ve seen in the past.
When it rains, it pours (wonderful electronic sculpture!). The last time we posted about freeform circuit sculptures there were a few eye-catching comments mentioning other fine examples of the craft. One such artist is [Eirik Brandal], who has a large selection of electronic sculptures. Frankly, we’re in love.
A common theme of [Eirik]'s work is that each piece is a functional synthesizer or a component piece of a larger one. For instance, when installed, the ihscale series uses PIR sensors to react together to motion in different quadrants of a room. And the es #17 – #19 pieces use ESP8266s to feed the output of their individual signal generators into each other to generate one connected sound.
Even when a single sculpture is part of a series there is still striking variety in [Eirik]'s work. Some pieces are neat and rectilinear and obviously functional, while others almost look like a jumble of components. Whatever the style, we've really enjoyed poring over the pages of [Eirik]'s portfolio. Most pieces have demo videos, so give them a listen!
Getting people to space is extremely difficult, and while getting robots to space is still pretty challenging, it’s much easier. For that reason, robots and probes have been helping us explore the solar system for decades. Now, though, a robot assistant is on board the ISS to work with the astronauts, and rather than something impersonal like a robot arm, this one has a face, can navigate throughout the ship, and can respond to voice inputs.
The robot is known as CIMON, the Crew Interactive Mobile Companion. Built by Airbus, this interactive helper will fly with German astronaut Alexander Gerst to test the concept of robotic helpers such as this one. It is able to freely move about the cabin and can learn about the space it is in without being specifically programmed for it. It processes voice inputs similarly to a smartphone, but the requests are still handled on Earth via the IBM Watson AI. This means that it's not exactly untethered, and future implementations of this technology might need to be more self-contained for missions outside of low Earth orbit.
While the designers have listened to the warnings of 2001 and not given it complete control of the space station, they also learned that it's helpful to create an interactive robot that isn't something as off-putting as a single creepy red eye. This robot can display an interactive face on the screen, as well as use the same screen to show schematics, procedure steps, or anything else the astronauts need. If creepy design is more your style though, you can still have HAL watching you in your house.
The technique of assembling circuits without substrate goes by many names; you may know it as flywiring, deadbugging, point-to-point wiring, or freeform circuits. Sometimes this technique is used for practical purposes like fixing design errors post-production or escaping tiny BGA components (ok, that one might be more cool than practical). Perhaps our favorite use is to create art, and [Mohit Bhoite] is an absolute genius of the form. He's so prolific that it's difficult to point to a particular one of his projects as an exemplar. Though he has a dusty blog, we might recommend digging through [Mohit]'s Twitter feed and marveling at the intricate works of LEDs and precision-bent brass he produces with impressive regularity.
So where to begin? Very recently [Mohit] put together a small wheeled vehicle for persistence of vision drawing (see photo above). We're pretty excited to see some more photos and videos he takes as this adorable little guy gets some use! Going a little farther back in time there's this microcontroller-free LED scroller cube which does a great job showing off his usual level of fit and finish (detail here). If you prefer more LEDs there's also this hexagonal display he whipped up. Or another little creature with seven segment displays for eyes. Got those? That covers (most of) his last month of work. You may be starting to get a sense of the quality and quantity on offer here.
It’s a build that relies on an assemblage of off-the-shelf parts to quickly put together a telepresence robot. Real-time video and audio communications are easily handled by a Huawei smartphone running Skype, set up to automatically answer video calls at all times. The phone is placed onto the robotic chassis using a car cell phone holder, attached to the body with a suction cup. The drive is a typical two-motor skid steer system with rear caster, controlled by a microcontroller connected to the phone.
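A two-motor skid-steer base like this typically mixes a forward command and a turn command into individual motor speeds. A minimal sketch of that mixing logic (our own illustration, not the firmware from this build):

```python
def skid_steer_mix(throttle, turn):
    """Mix a forward command and a turn command (each -1.0..1.0) into
    (left, right) motor speeds for a two-motor skid-steer base.
    Hypothetical sketch, not the actual firmware from the build."""
    left = throttle + turn
    right = throttle - turn
    # If either channel exceeds full scale, scale both down together so
    # the requested turn ratio is preserved.
    peak = max(abs(left), abs(right), 1.0)
    return left / peak, right / peak

# Full throttle with a slight right turn: the left motor leads.
print(skid_steer_mix(1.0, 0.2))
# Spin in place: the motors run in opposite directions.
print(skid_steer_mix(0.0, 1.0))  # -> (1.0, -1.0)
```

The microcontroller would feed these two values to the motor driver as signed PWM duty cycles, with the sign selecting direction.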
Operation is simple. The user runs a custom app on a remote phone, which handles video calling of the robot’s phone, and provides touchscreen controls for movement. While the robot is a swift mover, it’s really only sized for tabletop operation — unless you wish to talk to your contact’s feet. However, we can imagine there has to be some charm in driving a pint-sized ‘bot up and down the conference table when Sales and Marketing need to be whipped back into shape.
It’s a build that shows that not everything has to be a 12-month process of research and development and integration. Sometimes, you can hit all the right notes by cleverly lacing together a few of the right eBay modules. Getting remote video right can be hard, too – as we’ve seen before.