Quadcopter With Stereo Vision

Flying a quadcopter or other drone can be pretty exciting, especially when using the video signal to do the flying. It’s almost like a video game or flight simulator, except the aircraft is physically real. To bring this experience even closer to the reality of flying, [Kevin] implemented stereo vision on his quadcopter, which also adds an impressive amount of functionality to his drone.

While he doesn’t use this particular setup for drone racing or virtual reality, there are some other interesting things that [Kevin] is able to do with it. The two cameras, both ESP32 camera modules, work together as a stereo pair to determine distances to objects. By offloading some of the processing demands to Amazon’s cloud computing services, the quadcopter can recognize faces and hold a fixed distance from a face, all without power-hungry computing onboard.
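For the curious, the distance part of the trick is the classic depth-from-disparity calculation. Here is a minimal Python sketch of that idea using OpenCV; it is not [Kevin]’s code, and the baseline, focal length, and image filenames are placeholder assumptions (a real ESP32 rig would also need proper stereo calibration and rectification first).

```python
import cv2
import numpy as np

# Placeholder stereo geometry -- real values come from calibrating the cameras.
BASELINE_M = 0.10      # distance between the two camera centers, in meters (assumed)
FOCAL_PX = 600.0       # focal length in pixels (assumed)

# Load already-rectified grayscale frames from the two cameras (hypothetical filenames).
left = cv2.imread("left.jpg", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.jpg", cv2.IMREAD_GRAYSCALE)

# Block-matching stereo: disparity is how far a feature shifts between the two views.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0  # StereoBM returns fixed-point values

# Depth (meters) = focal_length * baseline / disparity, valid only where disparity > 0.
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = FOCAL_PX * BASELINE_M / disparity[valid]

# Distance to whatever sits at the center of the frame, e.g. a detected face.
h, w = depth.shape
print(f"Estimated distance at image center: {depth[h // 2, w // 2]:.2f} m")
```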

There are a lot of other abilities that this drone unlocks by offloading its resource-hungry tasks to the cloud. It can be flown using a smartphone or tablet, and it has its own web client where the user can watch the facial recognition being performed. Presumably it wouldn’t be too difficult to adapt this drone for other tasks where stereoscopic vision is a requirement.

Thanks to [Ilya Mikhelson], a professor at Northwestern University, for this tip about a student’s project.

Shoelace-Tying Robot With Only Two Motors

Many things that humans do are very difficult for machines. Case in point: tying shoelaces. Think of the intricate dance of fingers crossing over fingers that it takes to pass a lace from one hand to the other. So when a team of five students from UC Davis got together and built a machine that got the job done with two hooks, some very clever gears, and two motors, we have to say that we’re impressed. Watch it in action on YouTube (also embedded below).

The two-motor constraint would seem at first to be a show-stopper, but now that we’ve watched the video about a hundred times, we’re pretty convinced that a sufficiently clever mechanical engineer could do virtually anything with two motors and enough gears. You see, the secret is that one motor is dedicated to moving a drive gear back and forth to multiple destinations, and the other motor provides the power.

This being Hackaday, I’m sure that some of you are saying “I could do that with one motor!” Consider that a challenge.

Meanwhile, if you need to see more gear-porn, check out this hummingbird automaton. Or for the miracles of cam-driven machines, check out [Fran Blanche]’s work with the Maillardet Automaton.

Continue reading “Shoelace-Tying Robot With Only Two Motors”

Laser Cut Enclosures From Eagle Files

Once a project is finished, it might still need a decent enclosure. While it’s possible to throw a freshly soldered PCB in a standard enclosure, a piece of Tupperware, or a cardboard box, these options don’t have the fit and finish of something custom-made. If you have a laser cutter sitting around, it’s a simple matter to cut your own enclosure, but now that process is much easier thanks to [Ray]’s latest project.

Since [Ray] was already using Eagle to design his PCBs, it seemed like a short step to using the Eagle files to design the enclosure as well. His script works from those files and generates everything necessary to send to the laser cutter for manufacturing. Right now, [Ray] points out, the assembly time for each enclosure can be high, so this method might not be suited for producing enclosures in volume. Some of the calculations also still need to be done by hand, but he plans to automate everything in the future.
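While the details of [Ray]’s script are his own, the general approach is easy to sketch because modern Eagle .brd files are plain XML. The Python snippet below, purely as an illustration and not his actual code, pulls the board outline from Eagle’s Dimension layer (layer 20, by convention) and writes one enclosure face as an SVG sized for the laser cutter, with an assumed clearance around the board.

```python
import xml.etree.ElementTree as ET

BRD_FILE = "project.brd"    # hypothetical Eagle board file
CLEARANCE = 2.0             # extra millimeters around the board on each side (assumed)

# Eagle .brd files (v6 and later) are XML; the board outline normally lives on
# layer 20 ("Dimension") as <wire> elements with x1/y1/x2/y2 coordinates in mm.
tree = ET.parse(BRD_FILE)
xs, ys = [], []
for wire in tree.iter("wire"):
    if wire.get("layer") == "20":
        xs += [float(wire.get("x1")), float(wire.get("x2"))]
        ys += [float(wire.get("y1")), float(wire.get("y2"))]

width = max(xs) - min(xs) + 2 * CLEARANCE
height = max(ys) - min(ys) + 2 * CLEARANCE

# Emit one face of the enclosure as an SVG rectangle sized in millimeters;
# a full script would also generate side panels, tabs, and connector cutouts.
svg = (f'<svg xmlns="http://www.w3.org/2000/svg" width="{width}mm" height="{height}mm" '
       f'viewBox="0 0 {width} {height}">'
       f'<rect width="{width}" height="{height}" fill="none" stroke="red" stroke-width="0.1"/>'
       f'</svg>')
with open("enclosure_top.svg", "w") as f:
    f.write(svg)
print(f"Top panel: {width:.1f} x {height:.1f} mm")
```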

For single projects, though, this script could cut a lot of time off designing an enclosure and building it from scratch, and it could also help improve aesthetics over other options like 3D-printed enclosures. Of course, if you have a quality 3D printer around but no laser cutter, there are options for custom enclosures as well.

RadarCat Gives Computers A Sense Of Touch

So far, humans have had the edge in the ability to identify objects by touch, but not for long. Using Google’s Project Soli, a miniature radar that detects the subtlest of gesture inputs, the [St. Andrews Computer Human Interaction group (SACHI)] at the University of St. Andrews has developed a new platform, named RadarCat, that uses the chip to identify materials, as if by touch.

Realizing that different materials return unique radar signatures to the chip, the [SACHI] team combined it with their recognition software and machine learning pipeline, enabling RadarCat to accurately identify a range of materials in real time! It can also display additional information about the object, such as nutritional information in the case of food, or product information for consumer electronics. The video shows how RadarCat has already learned an impressive range of materials, and even specific body parts. Can Skynet be far behind?
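The SACHI paper covers their actual pipeline, but the machine-learning recipe itself is simple enough to sketch: record radar returns against known materials, turn each return into a feature vector, and train a classifier. The Python example below uses scikit-learn with randomly generated stand-in data purely to illustrate the shape of that process; it is not the RadarCat code.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Stand-in data: in reality each sample would be a feature vector derived from the
# Soli chip's radar return, recorded while the sensor touches a known material.
rng = np.random.default_rng(0)
materials = ["air", "wood", "steel", "glass", "apple"]
X, y = [], []
for label, material in enumerate(materials):
    center = rng.normal(size=64)                          # a fake "signature" per material
    X.append(center + 0.1 * rng.normal(size=(200, 64)))   # 200 noisy samples each
    y.append(np.full(200, label))
X, y = np.vstack(X), np.concatenate(y)

# Train on most of the samples, hold some back to estimate accuracy.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

print(f"Held-out accuracy: {clf.score(X_test, y_test):.2%}")
print("Predicted material:", materials[clf.predict(X_test[:1])[0]])
```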

Continue reading “RadarCat Gives Computers A Sense Of Touch”

Rita’s Dolls Probably Live Better Than You Do

If it weren’t for the weird Dutch-Norwegian techno you’d presumably have to listen to forever, [Gianni B.]’s dollhouse for his daughter [Rita] makes living in a Barbie World seem like a worthwhile endeavor. True to modern form, it’s got LED lighting. It’s got IoT. It’s got an app and an elevator. It even has a tiny, working, miniature television.

It all started with a Christmas wish. [Rita] could no longer bear the thought of her Barbie dolls living a homeless lifestyle on her floor, begging passing toys for enough Monopoly money to buy a sock to sleep under. However, when [Gianni] visited the usual suspects to purchase a dollhouse, he found them disappointing and expensive.

So, going with the traditional collaborating-with-Santa ruse, he and his family got to work together on a dollhouse development project. Each room is lit by four ultra-bright LEDs. There is an elevator controlled by an H-bridge module, modified to have electronic braking. [Rita] doesn’t own a Dr. Barbie yet, so safety is paramount.

The brain of the home automation is a PIC micro with a Bluetooth module. He wrote some code for it, available here. He also went an extra step and used MIT’s Scratch to make an app interface for the dollhouse. You can see it work in the video after the break. The last little hack was the TV. An old Arduino, an SD card shield, and a tiny 2.4-inch TFT combine to make what’s essentially a tiny digital picture frame.
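The linked code runs on the PIC; on the app side, talking to a classic Bluetooth serial module mostly boils down to opening a serial port and sending short commands. The Python sketch below is purely illustrative: the one-character command set, port name, and baud rate are assumptions rather than [Gianni]’s actual protocol.

```python
import time
import serial  # pyserial; a classic Bluetooth module typically appears as a serial port

PORT = "/dev/rfcomm0"   # assumed port name after pairing (COMx on Windows)
BAUD = 9600             # common default for HC-05/HC-06 style modules (assumed)

# Hypothetical one-character commands the dollhouse firmware might understand.
LIGHT_ON, LIGHT_OFF = b"L", b"l"
ELEVATOR_UP, ELEVATOR_STOP = b"U", b"S"

with serial.Serial(PORT, BAUD, timeout=1) as link:
    link.write(LIGHT_ON)       # turn on a room's LEDs
    link.write(ELEVATOR_UP)    # start the elevator motor via the H-bridge
    time.sleep(2.0)            # let it travel for a couple of seconds
    link.write(ELEVATOR_STOP)  # electronic braking keeps Barbie safe
    link.write(LIGHT_OFF)
```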

His daughters are overjoyed with the elevation of their dolls’ economic class, and a proud father even got to show it off at a Maker Faire. Very nice!

Continue reading “Rita’s Dolls Probably Live Better Than You Do”

Add Robotic Farming To Your Backyard With Farmbot Genesis

Growing your own food is a fun hobby and generally as rewarding as people say it is. However, it does have its quirks, and it definitely requires quite a time investment. That’s why it was so satisfying to watch Farmbot push a weed underground. Take that!

Farmbot is a project that has been going on for a few years now (it was a semifinalist in the 2014 Hackaday Prize), and that development time shows. The robot can plant, water, analyze, and weed a garden filled with arbitrarily chosen plant life. It’s low power and low maintenance. On top of that, every single bit is documented on their website, and the documentation is really well done and thorough. They are gearing up to sell kits, but if you want one now, just build it yourself.

The bot itself is exactly what you’d expect if you were to pick out the cheapest, most accessible way to build a robot: aluminum extrusions, plate metal, and 3D printer parts make up the frame. The brain is a Raspberry Pi hooked to its regular companion, an Arduino. On top of all this is a fairly comprehensive software stack.

The user can lay out the garden graphically. They can get as macro or micro as they’d like about the routines the robot uses. The robot will happily come to life at intervals and manage the garden. They hope that by selling kits they’ll interest a whole slew of hackers who can contribute back to the problem of small-scale robotic farming.
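To give a flavor of what a graphical garden layout boils down to once the robot runs it, here is a toy Python sketch of a plant map and a periodic watering pass. This is not Farmbot’s actual software stack; the coordinates, interval, and the move/water functions are hypothetical stand-ins for the commands the Raspberry Pi would send down to the Arduino.

```python
import time

# Hypothetical garden layout: plant name -> (x, y) position on the bed, in millimeters.
GARDEN = {
    "lettuce": (100, 200),
    "radish": (300, 200),
    "basil": (500, 200),
}

WATER_SECONDS = 3          # how long to run the watering tool per plant (assumed)
PASS_INTERVAL = 6 * 3600   # run a watering pass every six hours (assumed)

def move_to(x, y):
    # Stand-in for the Pi telling the Arduino to drive the gantry to (x, y).
    print(f"moving gantry to ({x}, {y})")

def water(seconds):
    # Stand-in for switching the watering tool on, waiting, then off.
    print(f"watering for {seconds} s")

def watering_pass():
    for plant, (x, y) in GARDEN.items():
        print(f"-- {plant} --")
        move_to(x, y)
        water(WATER_SECONDS)

if __name__ == "__main__":
    while True:
        watering_pass()
        time.sleep(PASS_INTERVAL)
```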

JumpRoaCH Is Kind Of Cute, Kind Of Creepy

There’s a theory that the fear of scurrying things is genetic. A similar theory holds for our tendency to find helpless things cute. After all, our useless babies do best in a pest-free environment. This all could explain why we found this robotic roach to be both a little cute and a little creepy.

The university-sponsored project, JumpRoaCH, is a collaboration between South Korea’s SNU Biorobotics Lab and Berkeley’s Biomimetic Millisystems Lab. Imitating insects has been a popular avenue for robotics research, and it often results in very interesting experiments.

This robot looks like a ladybug going through its rebellious teen phase. It runs on six hook-shaped legs, which allow it to traverse a wider array of surfaces than wheels would, at the expense of speed and with more vibration. The robot does a very convincing, if wobbly, scurry across the surface of its test table.

It also has a secret attack in the form of a single Rock ’Em Sock ’Em Robots arm located on its belly. With a powerful burst, the arm can launch the robot up a few feet onto a higher surface. If the robot lands on its feet, the researchers high-five. If it lands on its back, it can use its “wings” to flip itself right-side up again.

The resulting paper (PDF file) has a nice description of the robot and its clever jumping mechanism. At least if these start multiplying like roaches, hackers will never be short of tiny motors for their projects. Video after the break.

Continue reading “JumpRoaCH Is Kind Of Cute, Kind Of Creepy”