Flying is an energy-intensive activity. The birds and the bees don’t hover around incessantly like your little sister’s quadcopter. They flit to and fro, perching on branches and leaves while they plan their next move. Sure, a quadcopter can land on the ground, but then it has to spend more energy getting back to altitude. Researchers at Harvard set out to develop flying robots that can perch on various surfaces like insects can.
Perching happens electrostatically. The team mounted an electrode patch to the robot on a compliant foam standoff, which lets the patch make good contact with a surface even when the approach is a few degrees off. That compliance matters for a tiny robot that’s easily thrown off course by the slightest draft. The robots were kept as light as possible — just 84 mg — because the electrostatic force is not particularly strong.
It’s estimated that perching electrostatically costs a robot of this size roughly 1000 times less power than flying. That would be a boon for surveillance robots, which could take up a vantage point at altitude without continually burning energy to stay airborne. The abstract of the research paper notes that this method of perching was successful on wood, glass, and a leaf. It appears testing was done on a tether; it would be interesting to see whether the technique is strong enough for a robot carrying its own power source. Makes us wonder if we’ll ever end up with tiny flyers that recharge from power lines.
We’re seeing more tiny flying robots every day now – the IMAV 2016 competition was a great example of the current state of the art.
Continue reading “Tiny Robot Clings To Leaves With Static Electricity”
Well, that’s it. If SkyNet goes live once this 4-meter tall Avatar-style mech suit is in production, we’re all doomed.
Named [Method-2], the bipedal giant towers over the engineers testing it at Korea’s Hankook Mirae Technology, where they appear to have done everything possible to make this thing look terrifyingly awesome. The first video below shows the mech with a pilot on board, putting the arms through their paces. We count at least six degrees of freedom on each arm, not including the five digits on each hand that look like they could punch through a brick wall. Later in the video we see a tethered walking test with no pilot, but we also found a webcam video that purports to be the first walk with a pilot. Either way, the 1.5-ton machine shakes the floor with every step.
This is still a development phase project, as evidenced by the fact that the mech seems to be getting its power from an umbilical. But this company has dumped a lot of money into this thing, and we’d bet they intend to capitalize on it. Once it can run untethered, though, watch out. Until then, we’ll settle for this mecha-baby costume.
Continue reading “Say Hello to our New Robot Overlords”
If you take an object and turn it into something else, does that constitute a hack? Can a musical robot call into question the ethics of firearms exports? If you take a disabled shotgun and turn it into a flute, does it become an art piece? Deep questions indeed — and deliberately posed by [Constantine Zlatev] along with his collaborators [Kostadin Ilov] and [Velina Ruseva].
The Last Gun — a mechano-robotic flute, as [Zlatev] calls it — is built from recovered industrial parts, played using compressed air, and controlled by an Arduino and Raspberry Pi. After graphing the annual arms exports from the United States, the installation plays a mournful tune for each year that they rise, and a jubilant theme for each year they fall.
Continue reading “Mechano-Robotic Flute Made From An Old Shotgun”
Every December and May the senior design projects from engineering schools start to roll in. Since the students aren’t yet encumbered with real-world distractions (like management) the projects are often exceptional, unique, and solve problems we never even thought we had. Such is the case with [Mark] and [Peter]’s senior design project: a pick and place machine that promises to solve all of life’s problems.
Of course we’ve seen pick-and-place machines before, but this one is different. Rather than identifying resistors and capacitors to set on a PCB, this machine is able to identify and sort candies. The robot — a version of the MeARM — has three degrees of freedom and a computer vision system to alert the arm as to what it’s picking up and where it should place it. A Raspberry Pi handles the computer vision and feeds data to a PIC32 which interfaces with the hardware.
One of the requirements for the senior design class was to keep the budget under $100, which they accomplished by using pre-built solutions wherever possible. Robot arms with dependable precision can’t come close to that price constraint, but this project overcomes the MeArm’s lack of precision by using incremental correcting steps to reach proper alignment. This is covered in the video demo below.
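We don’t have the students’ source, but the incremental-correction idea is straightforward to sketch: observe the gripper through the camera, command a partial move toward the target, and repeat until close enough. The function names, gain, and simulated arm below are our own illustration, not code from the project:

```python
def correct_toward(target_xy, get_observed_xy, move_by,
                   tolerance=2.0, gain=0.5, max_steps=20):
    """Nudge the arm until the camera-observed position is within
    `tolerance` pixels of the target. `get_observed_xy` would come from
    the vision system; `move_by` commands a small relative arm move."""
    for _ in range(max_steps):
        ox, oy = get_observed_xy()
        ex, ey = target_xy[0] - ox, target_xy[1] - oy
        if (ex * ex + ey * ey) ** 0.5 <= tolerance:
            return True
        # Take only a partial step: a gain below 1 soaks up the arm's
        # positional error, converging instead of overshooting.
        move_by(gain * ex, gain * ey)
    return False

# Simulated arm standing in for the real MeArm + camera:
pos = [0.0, 0.0]
def move(dx, dy):
    pos[0] += dx
    pos[1] += dy

reached = correct_toward((10.0, 10.0), lambda: tuple(pos), move)
print(reached)  # True
```

The key trick is that each step only closes part of the gap, so a sloppy actuator still homes in on the target after a few iterations.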
Senior design classes are a great way to teach students how to integrate all of their knowledge into a final project, and the professors often impose limits they’d find in the real world (like the budget cap in this project). The requirement to thoroughly document the build process is also a lesson that more people could stand to learn. Senior design classes have attempted to solve a lot of life’s other problems, too; from autonomous vehicles to bartenders, there’s been a solution for almost every problem.
Continue reading “Pick-And-Place Machine for Candy”
(Bipolar Junction) Transistors versus MOSFETs: both have their obvious niches. FETs are great for relatively high-power applications because they have such a low on-resistance, but BJTs are often easier to drive from low-voltage microcontrollers because all they require is a current. It’s uncanny, though, how often we find ourselves in the middle between these extremes. What we’d really love is a part that has the virtues of both.
The ask in today’s Ask Hackaday is for your favorite part that fills a particular gap: a MOSFET that can move a handful of amps of low-voltage current without losing too much to heat, is still drivable from a 3.3 V microcontroller, with bonus points for PWM ability at a frequency above human hearing. Imagine driving a moderately robust small DC robot motor forward with a microcontroller, all running on a LiPo — a simple application that doesn’t need a full motor driver IC, but does need a high-efficiency, moderate-current, logic-level-compatible transistor. If you’ve been there and done that, what did you use?
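For back-of-the-envelope purposes (our numbers, not from the question), the “losing too much to heat” part is dominated by conduction loss, which is just I² times the FET’s on-resistance:

```python
def conduction_loss_w(current_a, rds_on_ohm):
    """Steady-state conduction loss (watts) in a fully-enhanced MOSFET:
    P = I^2 * R_DS(on)."""
    return current_a ** 2 * rds_on_ohm

# e.g. 3 A through a logic-level FET with 50 milliohms of on-resistance
print(conduction_loss_w(3.0, 0.050))  # roughly 0.45 W of heat
```

Half a watt is manageable without a heatsink in many packages; the same 3 A through a saturated BJT dropping ~0.2 V would dissipate 0.6 W, which is part of why the FET wins at these currents.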
Continue reading “Ask Hackaday: Dude, Where’s My MOSFET?”
[Massimiliano Patacchiola] writes this handy guide on using a histogram intersection algorithm to identify different objects — in this case, Lego superheroes. All you need to follow along are eyes, Python, a computer, and a bit of machine learning magic.
He gives a good introduction to the idea. You take a histogram of the colors in a properly cropped and filtered photo of the object you want to identify. You then feed that into a neural network and train it to identify the different superheroes by color. When you feed it a new image later, it compares the new image’s histogram against its models and outputs a confidence for which set the image belongs to.
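The comparison step itself is simple enough to sketch: histogram intersection sums the element-wise minima of two normalized histograms, so identical distributions score 1.0 and disjoint ones score 0. Here’s a minimal pure-Python illustration with made-up four-bin histograms — a toy of ours, not code from [Patacchiola]’s guide:

```python
def histogram_intersection(h1, h2):
    """Similarity of two normalized color histograms: the sum of
    element-wise minima (1.0 means identical, 0.0 means disjoint)."""
    return sum(min(a, b) for a, b in zip(h1, h2))

def classify(query, models):
    """Return the model name whose histogram overlaps the query most."""
    return max(models, key=lambda name: histogram_intersection(query, models[name]))

# Toy 4-bin color histograms, each normalized to sum to 1:
models = {
    "hero_red":  [0.7, 0.1, 0.1, 0.1],
    "hero_blue": [0.1, 0.1, 0.1, 0.7],
}
print(classify([0.6, 0.2, 0.1, 0.1], models))  # hero_red
```

A real pipeline would build the histograms from cropped images (e.g. in HSV space) with many more bins, but the scoring logic is exactly this small.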
This is a useful thing to know. While a lot of vision algorithms try to make geometric assertions about the things they see, adding color to the mix can certainly help your friendly robot project recognize friend from foe.
[Basti] was playing around with Artificial Neural Networks (ANNs), and decided that a lot of the “hello world” type programs just weren’t zingy enough to instill his love for the networks in others. So he juiced it up a little bit by applying a reasonably simple ANN to teach a four-legged robot to walk (in German, translated here).
While we think it’s awesome that postal systems the world over have been machine sorting mail based on similar algorithms for years now, watching a squirming quartet of servos come to forward-moving consensus is more viscerally inspiring. Job well done! Check out the video embedded below.
Continue reading “Train Your Robot To Walk with a Neural Network”