This Robot Barfs Comics!

If there’s one thing that’s more fun than a comic, it’s a randomly generated comic. Well, perhaps that’s not true, but Reddit user [cadinb] wrote some software to generate a random comic strip and then built a robot case for it. Push a button on the robot and you’re presented with a randomly generated comic strip from the robot’s mouth.

The software that [cadinb] wrote is in Processing, an open source programming language and “sketchbook” aimed at coders coming from a visual arts background. The Processing code determines how the images are cropped and placed and what kind of background they get. Each image is hand drawn by [cadinb] and has metadata associated with it so the code knows what the main focus of the image is. Once the panels are created, the final image is passed on to a thermal printer. Everything is controlled from a Python script running on a Raspberry Pi, and the code, strip artwork, and case are all available online to check out.
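For a flavor of how that focus metadata might drive the cropping, here’s a minimal Python sketch. The `Drawing` fields, panel sizes, and clamping rule are our guesses at the idea, not [cadinb]’s actual code:

```python
import random
from dataclasses import dataclass

# Hypothetical sketch of the panel-assembly logic: each hand-drawn
# image carries a "focus" point, and the generator crops a
# panel-sized window around it, clamped to the image bounds.

@dataclass
class Drawing:
    name: str
    width: int
    height: int
    focus_x: int   # where the main subject of the drawing sits
    focus_y: int

def crop_box(d: Drawing, panel_w: int, panel_h: int):
    """Return (left, top, right, bottom) centered on the focus point,
    shifted as needed so the box stays inside the drawing."""
    left = min(max(d.focus_x - panel_w // 2, 0), d.width - panel_w)
    top = min(max(d.focus_y - panel_h // 2, 0), d.height - panel_h)
    return (left, top, left + panel_w, top + panel_h)

def random_strip(library, panels=3, panel_w=200, panel_h=200, seed=None):
    """Pick drawings at random and compute one crop box per panel."""
    rng = random.Random(seed)
    picks = [rng.choice(library) for _ in range(panels)]
    return [(d.name, crop_box(d, panel_w, panel_h)) for d in picks]
```

The real code then composites those crops onto backgrounds and hands the result to the thermal printer.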

Now that the comic can print, a case is needed for the printer and controls. [cadinb] designed a case in Illustrator after creating a prototype out of foam core. The design was laser cut and then coloured – the main body with fabric dye and the arms stained with coffee!

Now [cadinb] has a robot that can sit on his table at conventions and a fan can press a button and have a randomly generated comic strip printed out before their eyes! We have a neat article about printing a comic on a strand of hair, and one about bringing the Banana Jr. 6000 to life!

Continue reading “This Robot Barfs Comics!”

Hackaday Links: October 8, 2017

On the top of the popcorn pile for this weekend is an ambiguous tweet from Adafruit that was offered without comment or commentary. [Lady Ada] is holding some sort of fancy incorporation papers for Radio Shack. The smart money is that Adafruit just bought these at the Radio Shack auction a month or so ago. The speculation is that Adafruit just bought Radio Shack, or at least the trademarks and other legal ephemera. Either one is cool, but holy crap please bring back the retro 80s branding.

A Rubik’s Cube is a fantastic mechanical puzzle, and if you’ve never taken one apart, oh boy are you in for a treat. Here’s an RGB LED Rubik’s Cube with not enough detail as to how each square is getting powered. Here’s an open challenge for anyone: build an RGB LED Rubik’s Cube, and Open Source the design.

Last weekend, the front fell off the engine of an Air France A380 flying over Greenland. As with all aircraft incidents, someone has to find the missing bits. It only took a week to find a mangled cowling on an ice sheet. This is incredibly impressive; if you want a comparison to another accident, it took three months to find the fan disk for UA 232 in an Iowa cornfield.

Poorly thought out Kickstarters don’t grab our attention like they used to, but this is an exception. The Aire is a mashup of one of those voice-activated home assistants (Alexa, whatever the Google one is named…) and a drone. The drone half of the build is marginally interesting as a ducted fan coaxial thingy, and building your own home assistant isn’t that hard with the right mics and a Raspberry Pi. The idea is actually solid — manufacturing is another story, though. It appears no one thought about how annoying it would be to have a helicopter following them around their house, or if the mics would actually be able to hear anyone over beating props. Here’s the kicker: this project was successfully funded. People want to buy this. A fool and his or her money…

Processing is cool, although we’re old skool and still reppin’ Max/MSP. It looks like the first annual Processing Community Day is coming up soon. The Processing Community Day will be at the MIT Media Lab on October 21st, with talks from the headliners of the Processing community.

Maker Faire NYC was two weekends ago, the TCT show in Birmingham was last week, and Open Hardware Summit was in Denver this weekend. Poor [Prusa] was at all of them, racking up the miles. He did, however, get to ride [James]’s electric longboard. There are some great videos from [James] right here and here.

Speaking of Open Hardware Summit, there was a field trip to Sparkfun and Lulzbot this Friday. The highlight? The biggest botfarm in the States, and probably the second largest in the world. That’s 155 printers, all in their own enclosures, in a room that’s kept at 80° F. They’re printing ABS. Control of the printers is through a BeagleBone running Octoprint. These ‘Bones and Octoprint only control one printer each, and there is no software layer ‘above’ the Octoprint instances for managing multiple printers simultaneously. That probably means the software to manage a botfarm doesn’t exist. There have been attempts, but nothing is in production. A glove thrown down?

Music Box Plays “Still Alive” Thanks to Automated Hole Puncher

Custom hole punch and feed system

Most projects have one or two significant aspects in which custom work or clever execution is showcased, but this Music Box Hole Punching Machine by [Josh Sheldon] and his roommate [Matt] is a delight on many levels. Not only was custom hardware made to automate punching holes in long spools of paper for feeding through a music box, but a software front end to process MIDI files means that in a way, this project is really a MIDI-to-hand-cranked-music-box converter. What a time to be alive.
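The MIDI-to-strip conversion boils down to mapping note events onto punch coordinates: distance along the paper for timing, and a lane across the paper for pitch. A hand-cranked music box can only play a fixed comb of notes, so anything outside that set has to be dropped or transposed. Here’s a rough Python sketch of the idea; the lane pitch, feed rate, and note table are placeholders, not [Josh]’s actual numbers:

```python
# Sketch of a MIDI-front-end for a paper-strip music box: each MIDI
# event either maps to a lane on the strip or gets silently dropped
# because the comb can't play it.

MM_PER_BEAT = 8.0                      # horizontal paper feed per beat
COMB_NOTES = list(range(60, 90))       # pretend 30-note comb, C4 upward
LANE = {note: i for i, note in enumerate(COMB_NOTES)}

def events_to_punches(events):
    """events: list of (beat_time, midi_note).
    Returns list of (x_mm, lane) punch positions, skipping
    notes the music box comb can't play."""
    punches = []
    for beat, note in events:
        if note in LANE:
            punches.append((beat * MM_PER_BEAT, LANE[note]))
    return punches
```

The punch machine then just has to step the paper to each `x_mm` and index the punch head to each lane.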

The hole punch is an entirely custom-made assembly, and as [Josh] observes, making a reliable hole punch turns out to be extremely challenging. Plenty of trial and error was involved, and the project’s documentation and overview video go into detail. Don’t miss the music box version of “Still Alive”, either. Both are embedded below.

Continue reading “Music Box Plays “Still Alive” Thanks to Automated Hole Puncher”

Float Spectrum, a Sound-Reactive Installation

[Sam Kent] and friends built a sound-reactive LED display as part of the Leeds (UK) Digital Festival and exhibited it at Hyde Park Book Club. The installation consists of a grid of 25 tubes, each one made out of four recycled 2-liter bottles equipped with a string of a dozen WS2812B LEDs controlled by a central Arduino.

Connected to the Arduino via USB, a computer running a Processing application analyzes the audio input and tells the Arduino which LEDs to light and when. The red tube in the center responds to bass, the ring of yellow tubes to mids, and the outer ring glows blue in response to high frequencies.
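The band-splitting part of that analysis is simple in principle: sum spectrum energy into bass/mid/high bins and scale each sum to a brightness. Here’s a minimal Python sketch of the idea (the band edges and scaling are our assumptions, not [Sam Kent]’s actual Processing code):

```python
# Map an audio spectrum onto three LED groups: sum the magnitude in
# each frequency band, then scale to a 0-255 brightness for the red
# (bass), yellow (mid), and blue (high) tubes.

BANDS = {"bass": (20, 250), "mid": (250, 2000), "high": (2000, 16000)}

def band_levels(spectrum, full_scale=1.0):
    """spectrum: iterable of (freq_hz, magnitude) bins, e.g. from an FFT.
    Returns {'bass': 0-255, 'mid': 0-255, 'high': 0-255}."""
    totals = {name: 0.0 for name in BANDS}
    for freq, mag in spectrum:
        for name, (lo, hi) in BANDS.items():
            if lo <= freq < hi:
                totals[name] += mag
                break
    return {name: min(255, int(255 * totals[name] / full_scale))
            for name in totals}
```

The three resulting levels are all the Arduino needs to receive each frame over serial.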

It’s amazing how a simple 2-liter bottle makes a rather effective light pipe to amplify the effect of each burst of color. We think this installation would be a great addition to the magnificent LED dance floor we recently looked at from our friends up in Toronto. If you seek an LED art piece that’s a lot easier to move around, what you’re after is a rave shopping cart.

Continue reading “Float Spectrum, a Sound-Reactive Installation”

Neural Network Composes Music; Says “I’ll be Bach”

[carykh] took a dive into neural networks, training a computer to replicate Baroque music. The results are as interesting as the process he used. Instead of feeding Shakespeare (for example) to a neural network and marveling at how Shakespeare-y the text output looks, the process converts Bach’s music into a text format and feeds that to the neural network. There is one character for each key on the piano, making for an 88-character alphabet used during the training. The neural net then runs wild, and the results are turned back into audio to see (or hear, as it were) how much the output sounds like Bach.
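One plausible way to build such an encoding is to assign each of the 88 piano keys (MIDI notes 21–108) its own character, so a chord becomes a short string and a whole piece becomes plain text a character-level network can train on. This Python sketch shows the idea; the particular alphabet is our assumption, not [carykh]’s actual mapping:

```python
# Text-encode piano music: one character per piano key, so MIDI note
# 21 (lowest A) maps to the first character and note 108 (top C) to
# the last. Printable ASCII from '!' upward gives exactly enough room.

ALPHABET = [chr(33 + i) for i in range(88)]  # 88 printable characters

def note_to_char(midi_note: int) -> str:
    if not 21 <= midi_note <= 108:
        raise ValueError("note is outside the piano's range")
    return ALPHABET[midi_note - 21]

def char_to_note(c: str) -> int:
    return ALPHABET.index(c) + 21

def encode_chord(midi_notes) -> str:
    """A chord is just the sorted characters of its notes."""
    return "".join(note_to_char(n) for n in sorted(midi_notes))
```

Decoding the network’s output back to audio is then the reverse lookup, one character at a time.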

The video embedded below starts with a bit of a skit but hang in there because once you hit the 90 second mark things get interesting. Those lacking patience can just skip to the demo; hear original Bach followed by early results (4:14) and compare to the results of a full day of training (11:36) on Bach with some Mozart mixed in for variety. For a system completely ignorant of any bigger-picture concepts such as melody, the results are not only recognizable as music but can even be pleasant to listen to.

Continue reading “Neural Network Composes Music; Says “I’ll be Bach””

Enter the Space Tunnel

What’s better than 1 string of LED lights? 96. That’s how many. Each of the 96 strings has 60 WS2812B LEDs, for a total of 5760 individually addressable RGB LEDs. That’s not the cool part of [jaymeekae]’s Space Tunnel installation, though; the cool part is that they’re interactive.

Starting out with some PVC piping and dark cloth as a backdrop, the LED strips were attached to the frame. Several power supplies provide the necessary power, and each strip is controlled by a FadeCandy board, which connects to, in this case, a Windows PC via USB. Initially, computer power supplies were used, but they couldn’t supply the necessary current. [jaymeekae] used them for the first installation but switched to better power supplies for later installations.
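A back-of-envelope calculation shows why ordinary PC supplies struggled. A WS2812B draws roughly 60 mA at full white (the usual rule of thumb, not a measured figure from this build), so the worst case for the whole tunnel is enormous:

```python
# Worst-case power budget for the Space Tunnel's LED strings,
# using the common 60 mA-per-LED full-white rule of thumb.

STRINGS = 96
LEDS_PER_STRING = 60
MA_PER_LED_FULL_WHITE = 60      # rough worst case per WS2812B

total_leds = STRINGS * LEDS_PER_STRING            # 5760 LEDs
worst_case_amps = total_leds * MA_PER_LED_FULL_WHITE / 1000
watts_at_5v = worst_case_amps * 5
```

That works out to around 345 A at 5 V, well over 1.7 kW, which explains the move to beefier dedicated supplies.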

Once the lights were up and powered, [jaymeekae] started work on the interface to control them. Starting with a used bureau, [jaymeekae] cut out a section for the touchscreen, and installed the controlling computer in the bottom half. Processing is used to interface with the FadeCandy controllers and HTML is used for a user interface. Each mode runs a different Processing program for different effects, including audio visualization, a space tunnel mode (hence the name) and a cool drawing app where the user draws on the touchscreen and sees the results in the lights overhead.
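FadeCandy boards are normally driven over the Open Pixel Control protocol: the `fcserver` process listens on TCP port 7890, and a “set pixel colours” message is a channel byte, command byte 0, a 16-bit big-endian length, then raw RGB data. A minimal Python client sketch looks like this (the host and port are the FadeCandy defaults, not anything specific to this installation):

```python
import socket
import struct

# Minimal Open Pixel Control client, as used to talk to a FadeCandy
# server. An OPC "set pixels" packet is: channel, command 0, 16-bit
# big-endian payload length, then 3 bytes (R, G, B) per pixel.

def opc_message(pixels, channel=0):
    """Build one OPC set-pixels packet from a list of (r, g, b) tuples."""
    data = bytes(c for px in pixels for c in px)
    return struct.pack(">BBH", channel, 0, len(data)) + data

def send_frame(pixels, host="127.0.0.1", port=7890):
    """Push one frame of pixel data to a running fcserver instance."""
    with socket.create_connection((host, port)) as s:
        s.sendall(opc_message(pixels))
```

Each Processing effect mode ultimately just streams frames like this, one per animation tick.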

Over several iterations, the Space Tunnel has evolved, with better power supplies and a better interface. It’s a great art installation and [jaymeekae] takes it to festivals, including one in Spain and one in the UK. There are some other LED string projects at Hackaday, including this one with ping-pong balls, and this one that involves drinking a lot of beer first.

[via Reddit]

Continue reading “Enter the Space Tunnel”

Algorithm Turns PCBs Into Art

Many of us have held a circuit board up to a strong light to get a sense for how many layers of circuitry it might contain. [alongruss] did this as well, but, unlike us, he saw art.

We’ve covered some art PCBs before. These, for the most part, were about embellishing the traces in some way. They also resulted in working circuits. [alongruss]’s work focuses more on the way light passes through the FR4: the way the silkscreen adds an interesting dimension to the painting, and how the tin coating reflects light.

To prove out and play with his algorithm, he started with GIMP. He ran the Mona Lisa through a set of filters until he had layers of black-and-white images that could be applied to the layers of the circuit board. He ordered a set of boards from Seeed Studio and waited.
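The GIMP workflow amounts to posterizing the source image: threshold a grayscale image at a few brightness levels so each band becomes a mask for one board layer, with more copper where the image is darker and bare FR4 or silkscreen where it’s lighter. A rough Python sketch of that idea follows; the layer names and cut points are illustrative, not [alongruss]’s actual values:

```python
# Threshold a grayscale image into per-layer masks for an art PCB:
# each pixel falls into exactly one brightness band, darkest first.

LAYERS = [                       # (layer name, upper brightness cutoff)
    ("copper_both_sides", 64),   # darkest: copper blocks light twice
    ("copper_one_side", 128),
    ("bare_fr4", 192),
]                                # anything brighter becomes silkscreen

def layer_masks(gray_rows):
    """gray_rows: 2D list of 0-255 values. Returns {layer: 2D bool mask},
    with each pixel assigned to exactly one layer."""
    masks = {name: [[False] * len(r) for r in gray_rows]
             for name, _ in LAYERS}
    masks["silkscreen"] = [[False] * len(r) for r in gray_rows]
    for y, row in enumerate(gray_rows):
        for x, v in enumerate(row):
            for name, cut in LAYERS:
                if v < cut:
                    masks[name][y][x] = True
                    break
            else:
                masks["silkscreen"][y][x] = True
    return masks
```

Each mask then becomes one Gerber layer, and the backlight does the rest.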

They came back a success! So he codified his method into Processing code. If you want to play with it, take a look at his GitHub.