How To Get Started With Fadecandy And LEDs

The internet is awash with millions of stunning LED projects, and for that, we are all very thankful. For those outside the hacker/maker matrix, it can be difficult to know how to approach such a build. Never fear, for [Amy Goodchild] has put together a beginner’s guide to building pretty glowables, using Fadecandy and Processing.

Fadecandy is a platform specifically designed to drive WS2812B LEDs for artistic purposes. This lets users focus on the visual side of things without getting bogged down in selecting the right microcontroller and hunting down the applicable libraries. It works great in combination with Processing, a piece of software designed for coders experimenting with the visual arts. Through a USB link, any graphics drawn in Processing can be mapped to the LEDs attached to the Fadecandy controller.
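Under the hood, the Fadecandy board is driven by its fcserver process, which accepts Open Pixel Control (OPC) frames over TCP — port 7890 by default. Processing's OPC client handles this for you, but the protocol is simple enough to sketch by hand. Here's a minimal Python version; the channel, host, and LED count are just example values:

```python
import socket
import struct

def opc_packet(pixels, channel=0):
    """Build an OPC 'set 8-bit pixel colours' packet (command 0).
    pixels: list of (r, g, b) tuples, each component 0-255."""
    data = bytes(c for px in pixels for c in px)
    # Header: channel byte, command byte, 2-byte big-endian data length
    return struct.pack(">BBH", channel, 0, len(data)) + data

def send_frame(pixels, host="127.0.0.1", port=7890):
    """Push one frame to a running fcserver instance."""
    with socket.create_connection((host, port)) as s:
        s.sendall(opc_packet(pixels))

if __name__ == "__main__":
    # 64 LEDs faded to a dim red: 4-byte header + 192 bytes of RGB data
    pkt = opc_packet([(40, 0, 0)] * 64)
    print(len(pkt))  # 196
```

Calling `send_frame` in a loop, once per animation frame, is essentially what the Processing OPC client does for you.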

[Amy] does a great job of explaining how to do everything required, from purchasing the right equipment, through wiring everything up, and then getting it all humming along with the correct software. If you’ve ever wanted to build a big flashy project with a ton of LEDs, this would be a great place to start.

We’ve seen Fadecandy put to good use before, too. Video after the break.

Continue reading “How To Get Started With Fadecandy And LEDs”

Interactive LED Dome Glows With The Best Of Them

With the price and availability of components these days, it’s easier than ever to throw a whole pile of LEDs at a build and get them flashing away. The hard part is doing it well. [Amy Goodchild] is an artist, and has a knack for producing rather beautiful LED projects. The When in Dome installation is no exception.

The build is based around a large geodesic dome, fitted with LED panels that glow and react to the occupants inside. Using a Microsoft Kinect as a sensor lets the dome map out what’s happening in 3D space and use this data to guide its animations. WS2812B LED strips were used, in combination with a Fadecandy controller and Processing. It’s a powerful combination that makes designing attractive LED effects easier, without forcing users to go to the effort of writing their own libraries or optimizing their microcontroller code.

For those more interested in the dome itself, you’ll be happy to know that [Amy] doesn’t skimp on the details there either. The build actually started as a commercially available kit, though there’s still plenty of manual cutting, screwing, and painting required. She does an excellent job documenting the dome build through a series of videos, and walks the reader through some of the design decisions she made (and would remake, if given the chance).

People love geodesic domes at the best of times; adding an interactive LED installation just takes things to the next level. We’ve seen them used as greenhouses too, and they make a great hackerspace project as well. Video after the break.

Continue reading “Interactive LED Dome Glows With The Best Of Them”

This Robot Barfs Comics!

If there’s one thing that’s more fun than a comic, it’s a randomly generated comic. Well, perhaps that’s not true, but Reddit user [cadinb] wrote some software to generate a random comic strip and then built a robot case for it. Push a button on the robot and you’re presented with a randomly generated comic strip from the robot’s mouth.

The software that [cadinb] wrote is in Processing, an open source programming language and “sketchbook” for learning to code if you’re coming from a visual arts background. The Processing code determines how the images are cropped and placed and what kind of background they get. Each image is hand-drawn by [cadinb] and has information associated with it so the code knows what the main focus of the image is. Once the panels are created, the final image is passed on to a thermal printer for printing. Everything is controlled from a Python script running on a Raspberry Pi, and the code, strip artwork, and case files are all available online to check out.
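The panel-assembly step boils down to: pick a drawing at random, then crop around its recorded focal point. Here's a toy Python sketch of that idea — the artwork names, focus coordinates, and jitter amounts are all invented for illustration, not [cadinb]'s actual data:

```python
import random

# Hypothetical metadata: each drawing records where its focal point sits
ARTWORK = [
    {"name": "robot.png", "focus": (120, 80)},
    {"name": "cloud.png", "focus": (60, 40)},
    {"name": "dog.png",   "focus": (90, 100)},
]

def make_strip(panel_count=3, panel_w=200, panel_h=200, rng=random):
    """Pick a drawing for each panel and crop around its focal point,
    mimicking the 'keep the main focus in frame' step."""
    strip = []
    for _ in range(panel_count):
        art = rng.choice(ARTWORK)
        fx, fy = art["focus"]
        # Centre the crop window on the focal point, jittered slightly
        x = fx - panel_w // 2 + rng.randint(-20, 20)
        y = fy - panel_h // 2 + rng.randint(-20, 20)
        strip.append({"image": art["name"], "crop": (x, y, panel_w, panel_h)})
    return strip
```

The real sketch then composites these crops over a background and ships the result to the thermal printer.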

Now that the comic can print, a case is needed for the printer and controls. [cadinb] designed a case in Illustrator after creating a prototype out of foam core. The design was laser cut and then coloured – the main body with fabric dye and the arms stained with coffee!

Now [cadinb] has a robot that can sit on his table at conventions and a fan can press a button and have a randomly generated comic strip printed out before their eyes! We have a neat article about printing a comic on a strand of hair, and one about bringing the Banana Jr. 6000 to life!

Continue reading “This Robot Barfs Comics!”

Hackaday Links: October 8, 2017

On the top of the popcorn pile for this weekend is an ambiguous tweet from Adafruit that was offered without comment or commentary. [Lady Ada] is holding some sort of fancy incorporation papers for Radio Shack. The smart money is that Adafruit just bought these at the Radio Shack auction a month or so ago. The speculation is that Adafruit just bought Radio Shack, or at least the trademarks and other legal ephemera. Either one is cool, but holy crap please bring back the retro 80s branding.

A Rubik’s Cube is a fantastic mechanical puzzle, and if you’ve never taken one apart, oh boy are you in for a treat. Here’s an RGB LED Rubik’s Cube with not enough detail as to how each square is getting powered. Here’s an open challenge for anyone: build an RGB LED Rubik’s Cube, and Open Source the design.

Last weekend, the front fell off the engine of an Air France A380 flying over Greenland. As with all aircraft incidents, someone has to find the missing bits. It only took a week to find a mangled cowling on an ice sheet. This is incredibly impressive; if you want a comparison to another accident, it took three months to find the fan disk for UA 232 in an Iowa cornfield.

Poorly thought out Kickstarters don’t grab our attention like they used to, but this is an exception. The Aire is a mashup of one of those voice-activated home assistants (Alexa, whatever the Google one is named…) and a drone. The drone half of the build is marginally interesting as a ducted fan coaxial thingy, and building your own home assistant isn’t that hard with the right mics and a Raspberry Pi. The idea is actually solid — manufacturing is another story, though. It appears no one thought about how annoying it would be to have a helicopter following them around their house, or if the mics would actually be able to hear anyone over beating props. Here’s the kicker: this project was successfully funded. People want to buy this. A fool and his or her money…

Processing is cool, although we’re old skool and still reppin’ Max/MSP. It looks like the first annual Processing Community Day is coming up soon. The Processing Community Day will be at the MIT Media Lab on October 21st, with talks from the headliners of the Processing community.

Maker Faire NYC was two weekends ago, the TCT show in Birmingham was last week, and Open Hardware Summit was in Denver this weekend. Poor [Prusa] was at all of them, racking up the miles. He did, however, get to ride [James from XRobots.co.uk]’s electric longboard. There are some great videos from [James] right here and here.

Speaking of Open Hardware Summit, there was a field trip to Sparkfun and Lulzbot this Friday. The highlight? The biggest botfarm in the States, and probably the second largest in the world. That’s 155 printers, all in their own enclosures, in a room that’s kept at 80° F. They’re printing ABS. Control of the printers is through a BeagleBone running Octoprint. These ‘Bones and Octoprint only control one printer each, and there is no software layer ‘above’ the Octoprint instances for managing multiple printers simultaneously. That probably means the software to manage a botfarm doesn’t exist. There have been attempts, but nothing is in production. A glove thrown down?
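For the curious, OctoPrint does expose a REST API — a GET on /api/job with an X-Api-Key header returns the current job and its progress — so a bare-bones 'layer above' is mostly a polling loop. A rough sketch, with the host list and key obviously placeholders:

```python
import json
import urllib.request

def job_status_url(host):
    return "http://%s/api/job" % host

def parse_job(payload):
    """Pull the bits a farm dashboard would want from OctoPrint's /api/job JSON."""
    job = json.loads(payload)
    return {
        "state": job.get("state"),
        "file": (job.get("job") or {}).get("file", {}).get("name"),
        "done_pct": (job.get("progress") or {}).get("completion"),
    }

def poll_farm(hosts, api_key):
    """Poll every printer's OctoPrint instance and collect job status."""
    results = {}
    for host in hosts:
        req = urllib.request.Request(
            job_status_url(host), headers={"X-Api-Key": api_key})
        try:
            with urllib.request.urlopen(req, timeout=5) as resp:
                results[host] = parse_job(resp.read())
        except OSError as err:  # covers URLError and socket timeouts
            results[host] = {"state": "unreachable: %s" % err}
    return results
```

Polling 155 of these and drawing a dashboard is the easy half; queueing and dispatching jobs across the farm is where the real unsolved work lives.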

Music Box Plays “Still Alive” Thanks To Automated Hole Puncher

Custom hole punch and feed system

Most projects have one or two significant aspects in which custom work or clever execution is showcased, but this Music Box Hole Punching Machine by [Josh Sheldon] and his roommate [Matt] is a delight on many levels. Not only was custom hardware made to automate punching holes in long spools of paper for feeding through a music box, but a software front end to process MIDI files means that in a way, this project is really a MIDI-to-hand-cranked-music-box converter. What a time to be alive.

The hole punch is an entirely custom-made assembly, and as [Josh] observes, making a reliable hole punch turns out to be extremely challenging. Plenty of trial and error was involved, and the project’s documentation as well as an overview video go into plenty of detail. Don’t miss the music box version of “Still Alive”, either. Both are embedded below.
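To give a feel for the software front end: converting MIDI to punch positions is essentially quantizing note-on events onto a grid, where one axis is time along the paper and the other is which of the box's teeth a hole lines up with. A simplified Python sketch — the note set of a typical 15-note box is assumed here, and [Josh]'s box may well differ:

```python
# A common 15-note music box covers two diatonic octaves of C major.
# (This exact note set is an assumption, not taken from the project.)
BOX_NOTES = [60, 62, 64, 65, 67, 69, 71,      # C4..B4
             72, 74, 76, 77, 79, 81, 83, 84]  # C5..C6

def note_to_column(midi_note):
    """Return the punch column for a MIDI note, or None if the box can't play it."""
    try:
        return BOX_NOTES.index(midi_note)
    except ValueError:
        return None

def events_to_punches(events, ticks_per_row=120):
    """Convert (tick, midi_note) events into (row, column) punch positions,
    quantizing time onto rows the puncher can advance between."""
    punches = []
    for tick, note in events:
        col = note_to_column(note)
        if col is not None:
            punches.append((tick // ticks_per_row, col))
    return punches
```

Notes that fall outside the box's range have to be dropped or transposed — one of the gotchas a converter like this has to deal with before any holes get punched.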

Continue reading “Music Box Plays “Still Alive” Thanks To Automated Hole Puncher”

Float Spectrum, A Sound-Reactive Installation

[Sam Kent] and friends built a sound-reactive LED display as part of the Leeds (UK) Digital Festival and exhibited it at Hyde Park Book Club. The installation consists of a grid of 25 tubes, each made from four recycled 2-liter bottles and equipped with a string of a dozen WS2812B LEDs, all controlled by a central Arduino.

Connected to the Arduino via USB, a computer running a Processing application analyzes the audio input and tells the Arduino which LEDs to light and when. The red tube in the center responds to bass, the ring of yellow tubes to mids, and the outer ring glows blue in response to high frequencies.
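The Processing-side analysis amounts to binning an FFT into three bands and scaling each to a value the Arduino can use. A rough Python equivalent — the band edges and the one-byte-per-band serial format are assumptions for illustration, not the actual Float Spectrum code:

```python
def band_levels(magnitudes, sample_rate=44100):
    """Collapse an FFT magnitude spectrum into bass/mid/high energy,
    roughly how a sketch might drive the three tube groups."""
    n = len(magnitudes)
    bin_hz = sample_rate / (2 * n)  # bins span 0 Hz up to Nyquist
    bands = {"bass": 0.0, "mid": 0.0, "high": 0.0}
    for i, m in enumerate(magnitudes):
        f = i * bin_hz
        if f < 250:
            bands["bass"] += m
        elif f < 4000:
            bands["mid"] += m
        else:
            bands["high"] += m
    return bands

def to_serial_frame(bands, peak):
    """Scale each band to one 0-255 byte for the Arduino (format assumed)."""
    return bytes(min(255, int(255 * bands[k] / peak))
                 for k in ("bass", "mid", "high"))
```

On the Arduino end, three bytes per frame is trivial to read off the serial port and map straight onto the red, yellow, and blue tube groups.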

It’s amazing how just a simple 2-liter makes a rather effective light pipe to amplify the effect of each burst of color. We think this installation would be a great addition to the magnificent LED dance floor we recently looked at from our friends up in Toronto. If you seek an LED art piece that’s a lot easier to move around, what you’re after is a rave shopping cart.

Continue reading “Float Spectrum, A Sound-Reactive Installation”

Neural Network Composes Music; Says “I’ll Be Bach”

[carykh] took a dive into neural networks, training a computer to replicate Baroque music. The results are as interesting as the process he used. Instead of feeding Shakespeare (for example) to a neural network and marveling at how Shakespeare-y the text output looks, the process converts Bach’s music into a text format and feeds that to the neural network. There is one character for each key on the piano, making for an 88-character alphabet used during the training. The neural net then runs wild, and the results are turned back into audio to see (or hear, as it were) how much the output sounds like Bach.
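The encoding itself is straightforward: assign each of the 88 keys its own character, so a chord at one time step becomes a short "word". A sketch of one such mapping in Python — the actual alphabet [carykh] used is an assumption here; any one-to-one mapping works:

```python
LOWEST_MIDI = 21  # A0, the bottom key of an 88-key piano
KEYS = 88

def key_to_char(midi_note):
    """Map one of the 88 piano keys to a single printable character,
    starting at '!' (ASCII 33). Purely illustrative alphabet."""
    if not LOWEST_MIDI <= midi_note < LOWEST_MIDI + KEYS:
        raise ValueError("note outside the 88-key range")
    return chr(ord("!") + (midi_note - LOWEST_MIDI))

def char_to_key(ch):
    """Inverse mapping, used when turning generated text back into notes."""
    return ord(ch) - ord("!") + LOWEST_MIDI

def encode_chord(notes):
    """One time step of simultaneous notes becomes one short 'word'."""
    return "".join(key_to_char(n) for n in sorted(notes))
```

Because the mapping is reversible, whatever character soup the network dreams up can be decoded straight back into notes and rendered as audio.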

The video embedded below starts with a bit of a skit but hang in there because once you hit the 90 second mark things get interesting. Those lacking patience can just skip to the demo; hear original Bach followed by early results (4:14) and compare to the results of a full day of training (11:36) on Bach with some Mozart mixed in for variety. For a system completely ignorant of any bigger-picture concepts such as melody, the results are not only recognizable as music but can even be pleasant to listen to.

Continue reading “Neural Network Composes Music; Says “I’ll Be Bach””