Add Robotic Farming to Your Backyard with Farmbot Genesis

Growing your own food is a fun hobby and generally as rewarding as people say it is. However, it does have its quirks and it definitely requires quite a time investment. That’s why it was so satisfying to watch Farmbot push a weed underground. Take that!

Farmbot is a project that has been going on for a few years now (it was a semifinalist in the 2014 Hackaday Prize), and that development time shows. The robot can plant, water, analyze, and weed a garden filled with arbitrarily chosen plant life. It’s low power and low maintenance. On top of that, every single bit is documented on their website, and the documentation is really well done and thorough. They are gearing up to sell kits, but if you want it now, just do it yourself.

The bot itself is exactly what you’d expect if you were to pick out the cheapest, most accessible way to build a robot: aluminum extrusions, plate metal, and 3D-printed parts make up the frame. The brain is a Raspberry Pi hooked to its regular companion, an Arduino. On top of all this is a fairly comprehensive software stack.
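In practice the Pi handles scheduling and the interface while the Arduino drives the motors. A minimal sketch of what the Pi-side link could look like (the serial port, baud rate, and the G-code-style commands here are assumptions for illustration, not FarmBot’s documented protocol):

```python
# Minimal sketch of a Pi-to-Arduino link for gantry moves.
# The serial port and the G-code-style commands are assumptions
# for illustration, not FarmBot's documented protocol.
import serial
import time

arduino = serial.Serial("/dev/ttyACM0", 115200, timeout=2)
time.sleep(2)  # give the Arduino a moment to reset after the port opens

def send(cmd):
    """Send one command line and wait for an acknowledgement."""
    arduino.write((cmd + "\n").encode())
    return arduino.readline().decode().strip()

# Move the gantry over a plant at (x, y) in millimeters, then water it.
send("G00 X300 Y450 Z0")    # rapid move above the plant
send("G00 X300 Y450 Z-80")  # lower the watering nozzle
send("F01 T5")              # hypothetical command: open the valve for 5 s
send("G00 X300 Y450 Z0")    # retract
```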

The user can lay out the garden graphically. They can get as macro or micro as they’d like about the routines the robot uses. The robot will happily come to life at intervals and manage the garden. They hope that by selling kits they’ll interest a whole slew of hackers who can contribute back to the problem of small-scale robotic farming.

Hackaday Prize Entry: Harmonicas, Candy, And Van Halen

Watch enough How It’s Made, and you’ll soon become very enthusiastic about computer vision and compressed air. In factories all around the world, production lines automatically sort the wheat from the chaff by running a product underneath a camera and blowing defective product off the line.

For his Hackaday Prize entry, [Fabien] is attempting this same task. He’s building a machine that will rapidly sort candy with computer vision and precisely controlled jets of air. He’s also planning for the Van Halen reunion and building a CNC harmonica.

Right now, the design has a hopper full of M&Ms dropping through a channel where a camera looks at each individual piece of candy. A Raspberry Pi, camera, and OpenMV detect all the red, yellow, brown, and blue M&Ms, and send that information to a computer controlling a suite of pneumatic valves. When these valves open, each color of candy is shuffled off into its own bin. It’s the perfect device for someone responsible for reading Van Halen’s rider.
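As a loose sketch of the sorting logic (the HSV thresholds, GPIO pins, and valve pulse width below are guesses rather than [Fabien]’s values, and everything is collapsed onto one board for brevity), the classify-and-fire loop could look like this:

```python
# Rough sketch of color-sorting logic: classify an M&M by hue, then
# pulse the matching pneumatic valve. Thresholds, GPIO pins, and the
# 30 ms pulse width are illustrative guesses, not the project's values.
import cv2
import time
import RPi.GPIO as GPIO

VALVE_PINS = {"red": 17, "yellow": 27, "brown": 22, "blue": 23}
HUE_RANGES = {"red": (0, 10), "brown": (10, 20),
              "yellow": (20, 35), "blue": (100, 130)}

GPIO.setmode(GPIO.BCM)
for pin in VALVE_PINS.values():
    GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)

def classify(frame):
    """Return the dominant candy color seen in the frame, or None."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    best, best_count = None, 500  # ignore tiny specks of color
    for color, (lo, hi) in HUE_RANGES.items():
        mask = cv2.inRange(hsv, (lo, 80, 80), (hi, 255, 255))
        count = cv2.countNonZero(mask)
        if count > best_count:
            best, best_count = color, count
    return best

def fire_valve(color, pulse=0.03):
    """Open the valve just long enough to deflect one piece."""
    GPIO.output(VALVE_PINS[color], GPIO.HIGH)
    time.sleep(pulse)
    GPIO.output(VALVE_PINS[color], GPIO.LOW)

cam = cv2.VideoCapture(0)
while True:
    ok, frame = cam.read()
    if not ok:
        break
    color = classify(frame)
    if color:
        fire_valve(color)
```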

In an interesting little side project, [Fabien] needed a way to test the pneumatic valves before building the color sensor and candy chute. He had a harmonica lying around, and built something we’re surprised we’ve never seen before. It’s a CNC harmonica, capable of belting out a few tunes. You can check out that testing video after the break.

Continue reading “Hackaday Prize Entry: Harmonicas, Candy, And Van Halen”

Eye Tracking Makes the Musical Eye Conductor for Everyone!

For his final project at the Copenhagen Institute of Interaction Design, [Andreas Refsgaard] decided to make something that matters: a system that allows anyone to control a musical instrument using only their eyes and facial expressions. Someone should enter this into a certain contest that’s running…

Dubbed the Eye Conductor, the system [Andreas] created is highly customizable, with a control interface that can be operated using only your eyes and some facial expressions. With the intent of allowing everyone to enjoy playing music, [Andreas] user-tested the system at schools, at housing communities for people with physical disabilities, and with anyone he could find in a wheelchair. He intends to continue the project so that all people can enjoy playing music.

The system is open, designed for inclusion, and can be customized to fit the physical abilities of whoever is using it.
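To get a feel for the idea, here is a purely illustrative sketch of one possible mapping: quantize where the player is looking onto a musical scale and fire a MIDI note on a trigger gesture. The tracker inputs are stand-ins and the blink trigger is an assumption, not [Andreas]’s actual design.

```python
# Illustrative sketch only: quantize a gaze point onto a musical scale and
# send a MIDI note when a blink is detected. The gaze_y/blinked inputs
# stand in for whatever eye tracker API is actually in use.
import mido

C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]  # MIDI note numbers
port = mido.open_output()                    # default MIDI output port

def note_for_gaze(y, screen_height=1080):
    """Map vertical gaze position to one of eight notes, top = highest."""
    row = min(int(y / screen_height * len(C_MAJOR)), len(C_MAJOR) - 1)
    return C_MAJOR[len(C_MAJOR) - 1 - row]

def on_frame(gaze_y, blinked):
    """Call once per tracker frame; play a note on the trigger gesture."""
    if blinked:
        note = note_for_gaze(gaze_y)
        port.send(mido.Message("note_on", note=note, velocity=100))
        # a real instrument would hold the note; here we release it right away
        port.send(mido.Message("note_off", note=note))
```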

Continue reading “Eye Tracking Makes the Musical Eye Conductor for Everyone!”

Robot Cheats at Rock Paper Scissors

It is hard enough to beat computers at games like chess. Now robotics engineers at the Ishikawa Watanabe Laboratory in Japan have created a janken robot that wins every time (if you didn’t know, janken is the Japanese name for rock-paper-scissors). How can it win every time? Easy. It cheats.

The janken robot evolved through three different versions. In the first version, the robotic hand would recognize the human player’s throw with a high-speed camera and then move to a winning counter play with about a 20 millisecond delay. In the second version, that delay was greatly reduced.

However, in the third version, the robot uses a scanning technique to capture an entire field of view and determines what play the human is making. Again, a winning counter play is instantly produced by the robotic hand.
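The decision logic itself is trivial; all the difficulty is in recognizing the throw within a few milliseconds. A minimal sketch of the easy half, with a stand-in for the vision system’s output:

```python
# The decision logic a cheating janken robot needs: see the human's play,
# return the throw that beats it. The real work is doing the "see" part
# in a few milliseconds; this lookup table is the easy bit.
BEATS = {"rock": "paper", "paper": "scissors", "scissors": "rock"}

def counter_play(human_throw):
    return BEATS[human_throw]

# e.g. the vision system reports "scissors" mid-throw -> robot forms "rock"
assert counter_play("scissors") == "rock"
```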

Continue reading “Robot Cheats at Rock Paper Scissors”

Hackaday Prize Semifinalist: Picking Up Litter With Robots

On beaches, in parks, and in [BDM]’s back yard, there’s a lot of litter everywhere. The normal solution to this problem is to hire someone or find some volunteers to pick up all this trash. We’re living in the future, though, and that means robots. For his Hackaday Prize entry, [BDM] is building a robot that picks up trash.

A robot that picks up litter is a very, very interesting problem. It can’t be controlled by a person, or else it would be more efficient to just get out there and kill your back picking up bottles. This means it must work autonomously, and that means identifying litter, picking it up, and disposing of it.

For the identification part of the problem, [BDM] is using computer vision that captures an RGB image and discriminates litter from natural objects. Right now the computer vision is far from perfect, but it does a very good job, all things considered.
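One loose illustration of that kind of color-based discrimination (the thresholds and minimum blob size below are invented, not [BDM]’s values): mask out the greens and browns of grass and soil, and treat any sizable leftover blob as candidate litter.

```python
# Loose sketch: flag blobs that are neither grass-green nor soil-brown as
# candidate litter. Threshold values and the minimum blob area are invented
# for illustration; the project's actual classifier differs.
import cv2

def find_litter(frame_bgr, min_area=800):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    grass = cv2.inRange(hsv, (30, 40, 40), (90, 255, 255))  # greens
    soil = cv2.inRange(hsv, (5, 40, 20), (25, 255, 200))    # browns
    natural = cv2.bitwise_or(grass, soil)
    candidates = cv2.bitwise_not(natural)                   # everything else
    contours, _ = cv2.findContours(candidates, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # keep only blobs big enough to be a bottle or a wrapper
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) > min_area]
```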

The next biggest problem is picking the trash up and disposing of it. For this, [BDM] has repurposed a Power Wheels and attached a DIY robot arm. It’s not a very powerful arm, and a children’s toy probably isn’t the best platform, but it is the start of something very, very cool.

You can check out [BDM]’s video for the project below.


Continue reading “Hackaday Prize Semifinalist: Picking Up Litter With Robots”

A Handheld CNC Router

Over the last few years, the state of the art in handheld routers has been tucked away in the back of our minds. It was at SIGGRAPH in 2012, and we caught up with it at Maker Faire last year. Now, it’s getting ready for production.

Originally called Taktia, the Shaper router looks a lot like a normal handheld router. This router is smart, though, with the ability to look at a workpiece marked with tape designed for computer vision and slightly reposition the cutter in response to how the user is moving it. A simple description doesn’t do this tool justice, so check out the video the Shaper team recently uploaded.

With the user moving the Shaper router over a workpiece and motors moving the cutter head, this tool is able to make precision cuts – wooden gears and outlines of the United States – quickly, easily, and accurately. Cutting any shape is as easy as loading a file into the Shaper, calling that file up on a touch screen display, and turning on the cutter. Move the router around the table, and the Shaper takes care of the rest.
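The interesting part is the correction loop: the camera tells the tool where it actually sits on the workpiece, and the cutter is offset toward the nearest point on the intended path, provided the error stays within the small range the internal motors can reach. A hand-wavy sketch of that loop follows; the helper names and the 10 mm correction range are assumptions, not Shaper’s numbers.

```python
# Hand-wavy sketch of a position-correction loop for a "smart" handheld
# router: compare where the camera says the tool is with the nearest point
# on the programmed path, and offset the cutter to close the gap. The
# function names and correction range are assumptions for illustration.
import math

CORRECTION_RANGE_MM = 10.0  # how far the cutter can move relative to the body

def nearest_path_point(pos, path):
    """Return the path point closest to the tool's current position."""
    return min(path, key=lambda p: math.dist(p, pos))

def correction(pos, path):
    """Offset (dx, dy) to apply to the cutter, or None if the user is too far off."""
    target = nearest_path_point(pos, path)
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    if math.hypot(dx, dy) > CORRECTION_RANGE_MM:
        return None  # out of range: retract the bit and wait for the user
    return dx, dy
```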

Accuracy, at least in earlier versions, is said to be on the order of a hundredth of an inch. That’s good enough for wood, like this very interesting bit of joinery that would be pretty hard with traditional tools. Video below.

Thanks [martin] for the tip.

Continue reading “A Handheld CNC Router”

What You See Is What You (Laser) Cut

WYSIWYG editors revolutionized content management systems; will WYSIWYC interfaces do the same for laser cutters? Unlikely, but we still appreciate the concepts shown here. Chalkaat uses computer vision to trace lines drawn in ink with the cutting power of a laser.

At its core, you simply draw on your workpiece with a colored marker and the camera system ensures the laser traces that line exactly. There is even a proof of concept here for different behavior based on line color, and the technique is not limited to white paper: it can also identify and cut printed materials.

This is a spin on [Anirudh]’s first version, which used computer vision with a projector to create a virtual interface for a laser cutter. This time around we can think of a few different uses for this. The obvious one is the ability for anyone to use a laser cutter by drawing their designs by hand. Imagine introducing grade-school children to this type of technology by having them draw paper puppets and scenery, then having the pieces cut in shop class for use in art projects.

A red arrow indicates a cut line, while a pink arrow indicates positioning on a workpiece. The example shows a cellphone design etched next to a positioning marker, but we could see this used to position expensive things (like a MacBook) for etching. We also think the red marker could be used to make slight adjustments to cut pieces by scribing the workpiece with the marker and having the laser cut the marked material away.
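A minimal sketch of the marker-tracing idea (the HSV ranges and simplification tolerance are our own guesses, not Chalkaat’s code): isolate the red ink in a camera frame, trace its contour, and hand the resulting polyline to the laser as a cut path. A separate pink mask would be extracted the same way but used only for positioning.

```python
# Minimal sketch: find the red marker line in a camera frame and turn it
# into an ordered polyline a laser controller could follow. The HSV ranges
# and the simplification tolerance are guesses, not Chalkaat's values.
import cv2

def red_ink_path(frame_bgr, tolerance_px=2.0):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # red wraps around the hue axis, so combine two ranges
    mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    if not contours:
        return []
    line = max(contours, key=cv2.contourArea)
    # thin the contour into a simpler polyline for the motion controller
    simplified = cv2.approxPolyDP(line, tolerance_px, closed=False)
    return [tuple(pt[0]) for pt in simplified]
```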

This concept is a product of [Nitesh Kadyan] and [Anirudh Sharma] at the Fluid Interfaces group at the MIT Media Lab and is something we could see being built into future laser cutter models. What do you think?

Continue reading “What You See Is What You (Laser) Cut”