Double-pendulum Spray Gives This Graffiti Bot Some Style

Here’s an art exhibit that does its own painting. The Senseless Drawing Bot (translated) uses the back-and-forth motion of the wheeled base to get a double-pendulum arm swinging. At the end of the out-of-control appendage, a can of spray paint is let loose. We’re kind of surprised by the results, as they don’t look like a machine made them.

The video after the break gives a pretty good synopsis of how the robot performs its duties. The site linked above is a bit difficult to navigate, but if you start digging you’ll find a lot of build information. For instance, it looks like this was prototyped with a small RC car along with sticks of wood as the pendulums.

We can’t help but be reminded of this robot that balances an inverted double pendulum. We wonder if it could be hacked to purposefully draw graffiti that makes a bit more sense than what we see here.

Continue reading “Double-pendulum Spray Gives This Graffiti Bot Some Style”

Snake-bot Gives Us The Mechanical Heebie-jeebies

Basilisk? Nope, just your run-of-the-mill giant serpentine robot build. The project aims to recreate Titanoboa, a prehistoric snake which measured more than fifty feet long and weighed over a ton. They’re well on their way to completing the goal, as what you see above is fully operational, lacking only cosmetic niceties which would only serve to make the beast less horrifying.

The video after the break shows the snake getting around an open space, presumably at the eatArt headquarters in Vancouver. You may remember the team from one of their other builds also featured in that clip, the Mondo Spider. Eventually, the snake will have a rider just like the spider does, sitting in a saddle mounted just behind the head. There are few details about the hardware, but we know it’s hydraulic, and that they raised $10k to make the build possible.

For some reason seeing these bots interact gives us flashbacks to childhood cartoons. Is it possible the eatArt crew has been watching too many old G.I. Joe cartoons and the like?

Continue reading “Snake-bot Gives Us The Mechanical Heebie-jeebies”

There’s A Lot Packed Into This BeagleBoard Controlled Rover

That black box is hiding all kinds of goodies that make this rover a hacking playground. [Andrey] built the device around a BeagleBoard, which offers the processing power and modules that he needed to make the rest of it work.

The control unit effectively shrinks the pilot down to the rover’s size: a cockpit with a steering wheel and other controls, plus a monitor playing the stream from the camera on the front of the bot. The rover has a WiFi adapter which allows control via the Internet. The camera, which can be rotated thanks to its servo mounting, feeds the video to the BeagleBoard where it is compressed using the h264 codec (more about that and the cockpit here) to lighten the streaming load. You’ll also find an ultrasonic rangefinder on the front for obstacle avoidance, and a magnetic compass for orientation information. Finally, a GPS bolsters that data, allowing you to plot your adventures on the map.
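The write-up doesn’t spell out the exact encoding chain, but as an illustration of the capture-compress-stream step, a GStreamer pipeline along these lines would do the job on the BeagleBoard (the device path, destination address, and bitrate are placeholders, not [Andrey]’s actual settings):

```c
/* Illustrative only: grab frames from a V4L2 camera, encode to h264, and
 * push them out over UDP/RTP. Device, host, and bitrate are assumptions. */
#include <gst/gst.h>

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    GError *err = NULL;
    GstElement *pipeline = gst_parse_launch(
        "v4l2src device=/dev/video0 ! videoconvert ! "
        "x264enc tune=zerolatency bitrate=500 ! rtph264pay ! "
        "udpsink host=192.168.1.10 port=5000", &err);
    if (!pipeline) {
        g_printerr("Failed to build pipeline: %s\n", err->message);
        return 1;
    }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    /* Block until an error or end-of-stream shows up on the bus */
    GstBus *bus = gst_element_get_bus(pipeline);
    GstMessage *msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
        GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
    if (msg)
        gst_message_unref(msg);

    gst_object_unref(bus);
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return 0;
}
```

The cockpit side would then run the mirror-image pipeline to depayload, decode, and display the stream.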

It’s great, but it will cost you. Material estimates are north of five hundred Euros!

Video: Working With The 3pi Robot’s Line Sensors

This week, we are serving up part five in our series where we are using the Pololu 3pi robot as a fancy development board for the ATmega328P processor. This time we’re taking a quick break from the peripherals specific to the processor to show how to work with the 3pi’s line sensors. A quick look at the schematic for the 3pi might lead you to think that you should be reading the line sensors with the A2D peripheral. Even though they are wired to the A2D pins, they need to be read digitally. In the video, [Jack] shows how to read raw values from the sensors and then how to calibrate the results so that you get a nice clean 8-bit value representing what the sensors are seeing. At least, that’s what would happen under normal circumstances. Murphy had his way with this video: our studio lighting interfered a bit with the sensor readings while we were shooting, so the calibration didn’t come out as cleanly as we would have liked.
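The Pololu library wraps all of this up for you, but as a rough sketch of what “reading digitally” means here: each sensor is an RC circuit, so you charge it, release the line, and time how long it takes to discharge; dark surfaces reflect less IR and discharge slowly. The snippet below illustrates the idea for a single sensor (the PC0/PC5 pin assignments and 20 MHz clock follow our reading of the 3pi schematic, but treat them, and the code, as illustrative rather than a substitute for the official routines):

```c
/* Illustrative RC-timing read of one 3pi line sensor -- not the Pololu library. */
#define F_CPU 20000000UL        /* the 3pi clocks its ATmega328P at 20 MHz */
#include <avr/io.h>
#include <util/delay.h>

static uint16_t read_sensor_raw(void)
{
    PORTC |=  _BV(PC5);          /* IR emitters on (PC5 per our schematic reading) */
    DDRC  |=  _BV(PC0);          /* drive the sensor line high... */
    PORTC |=  _BV(PC0);
    _delay_us(10);               /* ...long enough to charge the capacitor */
    DDRC  &= ~_BV(PC0);          /* release the line: input, no pull-up */
    PORTC &= ~_BV(PC0);

    uint16_t count = 0;
    while ((PINC & _BV(PC0)) && count < 2000) {
        _delay_us(1);            /* crude ~1 us resolution; dark = slow discharge */
        count++;
    }
    PORTC &= ~_BV(PC5);          /* emitters back off */
    return count;
}

/* Squash a raw reading into a clean 8-bit value using min/max readings
 * captured while sweeping the sensors across the line during calibration. */
static uint8_t calibrated(uint16_t raw, uint16_t min, uint16_t max)
{
    if (raw <= min) return 0;
    if (raw >= max) return 255;
    return (uint8_t)(((uint32_t)(raw - min) * 255UL) / (max - min));
}
```

In practice you would read all five sensors this way and run the calibration sweep before each run, which is essentially what the library routines bundle up for you.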

Video is after the break.

In case you have missed the previous videos here are some links:

Part 1: Setting up the development environment
Part 2: Basic I/O
Part 3: Pulse Width Modulation
Part 4: Analog to Digital conversion

Continue reading “Video: Working With The 3pi Robot’s Line Sensors”

Pikachu Is Coming For You (especially On Carpet)

If you look closely, you’ll see that Pikachu isn’t sporting a pair of funky throwing stars, but is actually suspended between them. Our furry friend is just putting a happy face on this carpet-roving robot called the Carpet Monkey V5. It’s been in the works for years, and this is just one more stop in the prototyping process, as development of version 6 is already under way.

The project is a testament to what can be accomplished using all of the design tools at your disposal. The motive mechanism was conceived as a cross between the gripping qualities of legs and the simplicity of wheels. Each of the appendages is covered with strategically placed points meant to grab onto carpet, allowing the ‘wheel’ to grip objects as the machine vaults over them. You can see that each has a spring mechanism to further facilitate gripping with each turn of the axle. This seems to go far beyond what usually comes out of hobby robotics, and we think that’s a great thing!

After the break there’s a video showing how all the parts of these grippers are assembled. See the bot cruising around the room at about 3 minutes in.

Continue reading “Pikachu Is Coming For You (especially On Carpet)”


Reverse Engineering MyKeepon

[qDot] recently got his hands on a MyKeepon toy and after messing with it a bit, decided to tear it down to see what was inside. He had hopes of easily modding the toy, but like most adventures in hacking, things might take a while longer than he first imagined.

In his teardown you can see the various components that make up the MyKeepon, including a trio of motors for movement, along with a series of buttons and a microphone used to interact with the toy. Of course, the part that interested him the most was MyKeepon’s circuit board, since that’s where the real work would begin.

There, he discovered two Padauk processor chips, described as “Field Programmable Processor Arrays” in their data sheets. He says the brand is well known for lifting text verbatim from PIC data sheets, so he doesn’t have a ton of faith in what’s printed there. Sketchy documentation aside, he poked around on the I2C bus connecting the two chips and was able to sniff a bit of traffic. He is documenting his findings as he goes along, which you can see more of on his Github project site.
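His posts don’t include a sniffer listing, but decoding I2C traffic mostly comes down to watching two lines: a START is SDA falling while SCL is high, a STOP is SDA rising while SCL is high, and data is sampled on each rising SCL edge. As a purely illustrative sketch (assuming SDA/SCL samples captured with a logic analyzer at a rate well above the bus clock), the decode boils down to something like this:

```c
/* Illustrative I2C decode of captured SDA/SCL samples -- not [qDot]'s tooling. */
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

struct sample { uint8_t sda, scl; };   /* one snapshot of both bus lines */

static void decode_i2c(const struct sample *s, size_t n)
{
    uint8_t byte = 0, bits = 0;

    for (size_t i = 1; i < n; i++) {
        int scl_rise = !s[i-1].scl && s[i].scl;
        int scl_high =  s[i-1].scl && s[i].scl;

        if (scl_high && s[i-1].sda && !s[i].sda) {        /* START condition */
            printf("\nSTART ");
            byte = bits = 0;
        } else if (scl_high && !s[i-1].sda && s[i].sda) { /* STOP condition */
            printf("STOP");
        } else if (scl_rise) {                            /* clock in a bit */
            if (bits < 8) {
                byte = (byte << 1) | (s[i].sda & 1);
                bits++;
            } else {                                      /* 9th clock: ACK/NACK */
                printf("0x%02X%s ", byte, s[i].sda ? "(NAK)" : "");
                byte = bits = 0;
            }
        }
    }
    printf("\n");
}
```

The first byte after each START carries the 7-bit slave address plus the read/write bit, which is exactly the sort of traffic that reveals which chip is talking to which.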

He has made a few simple modifications to the toy already, but there’s plenty more to do before he has complete control over it. His work is bound to make tons of MyKeepon fans happy, including our own [Caleb Kraft], whose love for the toy can be seen in the video below taken at last year’s CES.

Continue reading “Reverse Engineering MyKeepon”

Build A Kinect Bot For 500 Bones

[Eric] sent in his tutorial on building a Kinect-based robot for $500, a low-cost solution for a wife who thinks her husband spends too much on robots.

For the base of his build, [Eric] used an iRobot Create, a derivative of the Roomba built expressly for hardware hackery. For command and control of the robot, an EEE netbook takes data from the Kinect and sends it to the iRobot over a serial connection.

The build itself is remarkably simple: two pieces of angle aluminum were attached to the iRobot, and a plastic milk crate was installed with zip ties. The Kinect sits on top of the plastic crate and the netbook comfortably fits inside.

A few weeks ago, [Eric] posted a summary of the history and open-source software for the Kinect that covers the development of the Libfreenect driver. [Eric] used this same driver for his robot. Currently, the robot is configured for two modes. The first mode has the robot travel to the furthest point from itself. The second mode instructs the robot to follow the closest thing to itself – walk in front of the robot and it becomes an ankle biter.

There is a limitation of the Kinect that [Eric] is trying to work around. Objects closer than 19 inches to the Kinect appear to be very far away. This caused a lot of wall bumping, but he plans on adding a few ultrasonic sensors to fill the gap in the sensor data. Not bad for a very inexpensive autonomous robot.
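The post doesn’t include code for the two modes, but the “follow the closest thing” behavior boils down to scanning each depth frame for the nearest valid pixel and steering toward its column. Here’s a minimal sketch of that loop using the libfreenect sync wrapper and the Create’s Open Interface drive command; the serial port name, baud rate, speeds, and steering thresholds are illustrative guesses rather than [Eric]’s actual values:

```c
/* Illustrative "ankle biter" loop: chase the nearest thing the Kinect can see. */
#include <stdio.h>
#include <stdint.h>
#include <fcntl.h>
#include <unistd.h>
#include <termios.h>
#include <libfreenect/libfreenect_sync.h>

static void drive(int fd, int16_t velocity, int16_t radius)
{
    /* Open Interface "Drive" packet: opcode 137, velocity and radius big-endian */
    uint8_t cmd[5] = { 137, velocity >> 8, velocity & 0xFF,
                            radius  >> 8, radius  & 0xFF };
    write(fd, cmd, sizeof(cmd));
}

int main(void)
{
    int fd = open("/dev/ttyUSB0", O_RDWR | O_NOCTTY);   /* Create's serial port */
    if (fd < 0) return 1;
    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);
    cfsetspeed(&tio, B57600);                            /* Create's default baud */
    tcsetattr(fd, TCSANOW, &tio);
    write(fd, (uint8_t[]){128, 131}, 2);                 /* START, then SAFE mode */

    for (;;) {
        uint16_t *depth; uint32_t ts;
        if (freenect_sync_get_depth((void **)&depth, &ts, 0,
                                    FREENECT_DEPTH_11BIT) < 0)
            break;

        /* Find the nearest valid pixel; 2047 is the "no reading" value, which
         * also covers the too-close dead zone mentioned above, so skip it. */
        int best = -1; uint16_t best_d = 2047;
        for (int i = 0; i < 640 * 480; i++)
            if (depth[i] < best_d) { best_d = depth[i]; best = i; }
        if (best < 0) { drive(fd, 0, 0); continue; }

        int col = best % 640;                       /* which way is the target? */
        if (col < 280)      drive(fd, 100,  1);     /* spin counter-clockwise   */
        else if (col > 360) drive(fd, 100, -1);     /* spin clockwise           */
        else                drive(fd, 200, 0x7FFF); /* roughly straight ahead   */
        usleep(100000);                             /* ~10 Hz control loop      */
    }
    drive(fd, 0, 0);
    close(fd);
    return 0;
}
```

Flipping the comparison to hunt for the largest valid depth value instead would get you something like the first mode, heading for the farthest point the sensor can see.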