ROBOCHOP! It Slices, Dices, But Wait! There’s More…

You’re gunna love my cuts. 

KUKA robots are cool. They’re both elegant and terrifying to watch in action as they move unyieldingly to perform tasks. Not many of us get to use industrial tools like this because they aren’t exactly trivial to wield (or cheap!). Artists [Clemens Weisshaar] and [Reed Kram], however, created an installation that allows anyone to potentially control one of these orange beauties to do their bidding… all from the safety and comfort of a computer chair.

For their piece, “ROBOCHOP”, the artists developed a web app that allows you to easily manipulate the surface of a virtual cube. You can rotate for positioning and then use a straight or curved line tool to draw vectors through its surface and subtract material. Once you’re finished sculpting your desired masterpiece, one of the four KUKA robots in the installation will retrieve a 40 x 40 x 40 cm block of foam and shape it into a real-life version of whatever you created in the app.

Starting today you can visit the project’s website and upload your own mutilated cube designs. If your design is selected by the artists, it will be among the 2000 pieces carved by the robots throughout their installation during CeBIT in Hanover. After the show, your cube spawn will then be mailed to you free of charge! The only way I could see this being cooler is if they filmed the process so you could watch your shape being born.

Anyhow, I personally couldn’t resist the invitation to sculpt Styrofoam remotely with an industrial grade robot arm and came up with this gem.

You can go to their page if you want to give the app a go, and really… why wouldn’t you?

Continue reading “ROBOCHOP! It Slices, Dices, But Wait! There’s More…”

Putting Oculus Rift on a Robot

Many of the early applications for the much-anticipated Oculus Rift VR rig have been in gaming, but it’s interesting to see some more useful applications besides gaming before its commercial release sometime this year. [JoLau] at the Institute i4Ds of FHNW School of Engineering wanted to go a step beyond rendering virtual worlds, so he built the Intuitive Rift Explorer, a.k.a. IRE. The IRE is a moving-reality system consisting of a gimbaled stereo-vision camera rig that transmits video to the Rift and follows the head movements received from the Oculus Rift. The vision platform is mounted on a remote-controlled robot, and the whole setup is completely wireless.

One of the big challenges with using VR headsets is lag, which can cause motion sickness. He had to tackle the problem of latency: reducing the time from moving the head to seeing a matching image on the headset, which the Oculus Rift team specifies should be less than 20 ms. The other important requirement is a high frame rate, in this case 60 frames per second. [JoLau] succeeded in overcoming most of the problems, although in his conclusion he does mention a couple of enhancements he would like to add, given more time.
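Some rough arithmetic shows how tight those two numbers are together. This is just a back-of-the-envelope illustration of the constraint, not figures from [JoLau]'s write-up:

```python
# Back-of-the-envelope latency budget: a 60 fps camera delivers a new
# frame only every 1/60 s, so that frame interval alone eats most of
# the 20 ms motion-to-photon target in the worst case.
TARGET_LATENCY_MS = 20.0   # Oculus' recommended upper bound
FPS = 60.0

frame_period_ms = 1000.0 / FPS
print(f"Frame period: {frame_period_ms:.1f} ms")

# What's left for capture, encoding, radio transmission, and display:
headroom_ms = TARGET_LATENCY_MS - frame_period_ms
print(f"Worst-case headroom for the rest of the pipeline: {headroom_ms:.1f} ms")
```

In other words, at 60 fps a stale frame can already be nearly 17 ms old, leaving only a few milliseconds for everything else in the chain.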

[JoLau] provides a detailed description of the various sub-systems that make up the IRE: the stereo camera, audio and video transmission, media processing, the servo-driven gimbal for the stereo camera, and the control system code. His reasoning behind the hardware choices for several components makes for interesting reading. Watch a video of the IRE in action below.

Continue reading “Putting Oculus Rift on a Robot”

Ro-Bow, The Violin Playing Robot

There are robots that will vacuum your house, mow your lawn, and keep their unblinking electronic eyes on you at all times while hovering hundreds of feet in the air. How about a robot that plays a violin? That’s what [Seth Goldstein] built. He calls it a ‘kinetic sculpture’, but there’s more than enough electronics and mechatronics to keep even the most discerning tinkerer interested.

There are three main parts to [Seth]’s violin-playing kinetic sculpture. The first is a bow carriage that draws the bow across the strings, using an electromagnet to press the bow against them. The individual strings are fingered by four rubber disks, and a tilting mechanism rotates the violin so the desired string is always underneath the bow and mechanical fingers.

As far as software goes, the Ro-Bow transforms MIDI files into the robotic motions that make the violin sing. From what we can tell, it’s not quite as good as a human player; only one string can be played at a time. It is, however, great at what it does and is an amazing mechanical sculpture.
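For a monophonic instrument like this, the core of the MIDI translation is deciding which string plays each note. Here is a hypothetical sketch of one way to do that; the string tunings are standard violin (G3, D4, A4, E5), but the mapping logic is our guess, not [Seth]'s actual code:

```python
# Hypothetical monophonic MIDI-to-string mapping -- an illustration,
# not the Ro-Bow's real firmware.
# Open-string MIDI note numbers for standard violin tuning.
OPEN_STRINGS = {"G": 55, "D": 62, "A": 69, "E": 76}

def choose_string(midi_note):
    """Pick the highest open string at or below the note, so the
    stopped position (note minus open string) stays small."""
    candidates = {s: n for s, n in OPEN_STRINGS.items() if n <= midi_note}
    if not candidates:
        raise ValueError("note below the violin's range")
    string = max(candidates, key=candidates.get)
    return string, midi_note - candidates[string]  # (string, semitones stopped)

print(choose_string(69))  # A4 -> open A string: ('A', 0)
print(choose_string(60))  # C4 -> 5 semitones up the G string: ('G', 5)
```

The string choice then drives the tilt mechanism, and the stopped-semitone count picks which rubber-disk finger to engage.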

Video below.

Continue reading “Ro-Bow, The Violin Playing Robot”

Openhand Combines 3D Printing with Urethane Casting

Yale University brings us quite a treat with their Openhand Project.

If you’ve ever operated a robotic arm, you know that one of the most cumbersome parts is always the end effector. It will quickly make you realize what an amazing work of engineering the human hand really is, and what a poor imitation a simple open-close gripper ends up being.

[Yale] is working to bring tendon-driven robotic hands to the masses with an interesting technique combining 3D printing and resin/urethane casting. Known as Hybrid Deposition Manufacturing (HDM), it allows the team to 3D print robotic fingers with the molds for the finger pads and joints built right into the printed part. The tendon-driven fingers allow for a very simple design that is not only easy to make but also has a low parts count. Because of the human-like tendons, the fingers naturally curl around an object, distributing their force much more evenly and naturally, much like a human hand would. In the videos after the break, you can see the building process as well as the hand in action.

The best news is that it’s all open source. They also include some Python libraries so you can customize the CAD files to fit your needs.

Continue reading “Openhand Combines 3D Printing with Urethane Casting”

Whiteboard Clock Draws the Time

[Maurice] recently built a clock that draws the time (Google Doc) on a white board. We’ve seen plenty of clock hacks in the past, and even a very similar one. It’s always fun to see the different creative solutions people can come up with to solve the same problem.

This device runs on a PIC16F1454 microcontroller. The code for the project is available on GitHub. The micro is also connected to a 433MHz receiver. This allows a PC to keep track of the time, instead of having to include a real-time clock in the circuit. The USB connector is only used for power. All of the mounting pieces were designed in OpenSCAD and printed on a 3D printer. Two servos control the drawing arms. A third servo can raise and lower the marker to the whiteboard. This also has the added benefit of being able to place the marker tip inside of an eraser head. That way the same two servos can also erase the writing.
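Two servos driving a pair of linked arms to reach a point on the board is the classic planar two-link inverse kinematics problem. A minimal sketch of the math involved; the link lengths and the particular solution branch here are assumptions for illustration, not [Maurice]'s actual values:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Return (shoulder, elbow) angles in radians for a planar
    two-link arm with link lengths l1, l2 reaching point (x, y)."""
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle.
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    # Shoulder angle: direction to target minus the offset the
    # bent elbow introduces.
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

# Reaching straight out to the arm's full extent:
print(two_link_ik(2.0, 0.0, 1.0, 1.0))  # -> (0.0, 0.0)
```

Each (x, y) point along a drawn character gets converted to a pair of servo angles like this, with the third servo lifting the pen between strokes.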

The communication protocol for this system is interesting. The transmitter shows up on [Maurice’s] PC as a modem, so all he needs to do to update the time is run “echo 12:00 > /dev/whiteboard”. In this case, the command is run by a cron job every five minutes, which makes it easy to tweak how often the time updates on the whiteboard. All communication is one-way. The drawing circuit verifies a checksum each time it receives a message; if the check fails, the circuit simply waits for another message. The computer transmits each message multiple times, just in case there is a problem during transmission.
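A fire-and-forget scheme like that can be surprisingly small. Here is a sketch of how the framing and verification might work; the actual checksum and frame format in [Maurice]'s firmware aren't specified in the write-up, so these are our guesses:

```python
# Illustrative one-way time protocol -- the real firmware's framing
# and checksum may differ.
def checksum(payload: bytes) -> int:
    return sum(payload) & 0xFF          # simple 8-bit sum, an assumption

def encode(time_str: str) -> bytes:
    """Sender side: append the checksum; transmit this frame repeatedly."""
    payload = time_str.encode("ascii")
    return payload + bytes([checksum(payload)])

def try_decode(frame: bytes):
    """Receiver side: return the time string if the checksum matches,
    else None -- a bad frame is ignored and we wait for a repeat."""
    payload, received = frame[:-1], frame[-1]
    if checksum(payload) != received:
        return None
    return payload.decode("ascii")

frame = encode("12:00")
print(try_decode(frame))                          # a clean frame decodes
print(try_decode(bytes([frame[0] ^ 0xFF]) + frame[1:]))  # a corrupted one is dropped
```

Since the sender repeats every frame and the receiver silently drops bad ones, no back-channel is needed at all.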

Caption CERN Contest Turns out Big Brains and Comic Brilliance

Week 1 of Hackaday’s Caption CERN Contest is complete. We have to say that the Hackaday.io users outdid themselves with funny captions, but we also helped CERN add meaning to one of their orphan images. First, a few of our favorite captions:

The Funnies:

If you adjust that scope again, when I haven’t touched the controls, I’m donating you to a city college. – [Johnny B. Goode]

SAFTEY FIRST – The proper way to test a 6kv power supply for ripple on the output. – [milestogoh]

Dr. Otto Gunther Octavius – R&D some years before the accident. – [jlbrian7]

The prize though, goes to Hackaday commenting superstar [DainBramage], who proved he knows us all too well with his Portal inspired caption:

Here we see Doug Rattmann, one of Aperture’s best and brightest, perfecting our neurotoxin prior to delivery.

Congrats [DainBramage], enjoy your shirt from The Hackaday Store!

The Meaning of the Image:

Funny captions weren’t the only thing in the comments, though. The image tickled [jlbrian7]’s memory and led to a link to CERN Love. A four-year-old blog entry about robots at CERN turned out to be the key to unraveling the mystery of this captionless photo. The image depicts [Robert Horne] working with a prototype of the MANTIS system, a teleoperated manipulator created to work in sections of the CERN facility that were unsafe for humans due to high levels of radioactivity. The MANTIS story is an epic hack itself, so keep your eyes peeled for a future article covering it! We’ve submitted the information to CERN, and we’re giving [jlbrian7] a T-shirt as well for his contribution to finding the actual caption for this image.

Get Started on Next Week:

The image for week 2 is already up, so head over and see for yourself. We’re eager for your clever captions. Ideally we can also figure out the backstory for each week’s randomly chosen image.

Cute Tiny Robot Gets a Pair of Hacked Eyes

One day while at our poor, poor Radio Shack, [davidhend] purchased a little six-legged walking robot. It came with an infrared remote that allowed a user to control its movements from afar. After a few minutes of making the robot walk around, [davidhend] got bored and decided it would be a great toy to hack.

His plan was to make the robot autonomous and able to avoid obstacles. To start off, he took the robot apart enough to expose the circuit board. There he found an ST1155A bi-directional motor driver controlled by an on-board microcontroller. After checking out the ST1155A datasheet, [davidhend] thought he would be able to drive it with an Arduino, so out came the soldering iron, and all the unnecessary components were removed from the original circuit board.

An off-the-shelf PING))) sensor mounted on the front of the robot is responsible for detecting obstacles. That information is sent back to an Arduino Nano, which controls the motor driver to make the robot back up, turn, and then start walking straight again until another obstacle is detected. [davidhend] made his Arduino code (.zip file) available to anyone who wants to build a similar project. Check out the video after the break!
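The decision loop described above is about as simple as autonomy gets. Here it is rendered in Python for illustration (the real version lives in [davidhend]'s Arduino sketch, and the 20 cm threshold is our assumption):

```python
# Sketch of the obstacle-avoidance logic described in the write-up,
# in Python for readability -- the real code runs on an Arduino Nano.

def step(distance_cm, threshold_cm=20):
    """Turn one range reading from the PING))) sensor into motion
    commands: retreat and pivot if something is close, else walk on."""
    if distance_cm < threshold_cm:
        return ["back_up", "turn"]      # obstacle ahead
    return ["forward"]                  # path is clear

print(step(10))  # -> ['back_up', 'turn']
print(step(50))  # -> ['forward']
```

On the Arduino, the distance reading would come from timing the PING))) echo pulse, and each command would map to a drive pattern on the ST1155A motor driver.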

Oh, and if you plan to run down to the Shack to pick up a robot of your own, you better do it, like, right now.

Continue reading “Cute Tiny Robot Gets a Pair of Hacked Eyes”