Making Minty Fresh Music With Markov Chains: The After Eight Step Sequencer

Step sequencers are fantastic instruments, but they can be a little, well, repetitive. At its core, the step sequencer is a pretty simple device: it loops through a series of notes or phrases arranged into sequential steps. The operator can change the steps while the sequencer is looping, but the result generally has a repetitive feel, since the musician isn’t likely to erase all of the steps and enter an entirely new set between phrases.

Enter our old friend machine learning. If we introduce a bit of variability on each pass through the loop, the instrument can help the musician out, making the final product a little more interesting. Such an instrument is exactly what [Charis Cat] set out to make when she created the After Eight Step Sequencer.

The After Eight is an eight-step sequencer that lets the artist set each note with a series of potentiometers (which are, of course, housed in an After Eight mint tin). The potentiometers are read by an Arduino, which passes MIDI information to a computer running the popular music-oriented visual programming language Max/MSP. The software uses a series of Markov chains to augment the series of notes the musician dials in, effectively working with the artist to create music. The result is a piece of music that’s different every time it’s performed. Make sure to check out the video at the end for a fantastic overview of the project (and to hear the After Eight in action, of course)!
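If you haven’t bumped into Markov chains before, the idea is easy to sketch: learn which notes tend to follow which, then walk that table probabilistically. Here’s a minimal, purely illustrative Ruby sketch of the concept — the real project does this inside a Max/MSP patch, and the note values below are made up:

```ruby
# Minimal first-order Markov chain over an eight-step note sequence.
# Purely illustrative -- the After Eight does this inside Max/MSP, and
# these MIDI note numbers are just an example.
steps = [60, 62, 64, 65, 67, 65, 64, 62]

# Learn which notes follow which: a hash from note -> array of successors.
transitions = Hash.new { |h, k| h[k] = [] }
steps.each_cons(2) { |from, to| transitions[from] << to }
transitions[steps.last] << steps.first   # wrap around so the chain never dead-ends

# Walk the chain. Wherever a note had more than one successor in the
# original sequence, the walk can branch, so every pass comes out a
# little different.
note = steps.first
phrase = []
16.times do
  phrase << note
  note = transitions[note].sample
end
p phrase
```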

[Charis Cat]’s wonderful creation reminds us of some of the work [Sara Adkins] has done, blending human performance with complex algorithms. It’s exactly the kind of thing we love to see at Hackaday: the fusion of a musician’s artistic intent with the stochastic unpredictability of a machine learning system to produce something unique.

Thanks to [Chris] for the tip!


Hardware Vs Software: Fight!

It’s one of the great cliches in the hacker world: the hardware type and the software type. You can tell which of these two you are quite easily. When a project is actually 20% done, but you think it’s 90% done, and you say to yourself “And the rest is a simple matter of software”, you’re a hardware type. Ask anyone who has read my code, and they’ll tell you, I’m a hardware type.

Along with my blindness to the difficulties of getting the code right, I’ve also admittedly got an underappreciation of what powers lie in the dark typing arts. But I am not too proud to tip my hat when I see an awesome application of the soft stuff. Case in point: this Go board sequencer that we ran last week. An overhead webcam parses players’ moves as they put black and white stones down while playing the game of Go, and turns this into music.

The pure software type will be saying “but there’s a webcam and a Go board”. And indeed, that’s true. There are physical elements to this project that anchor it in the shared reality of the two people playing. But a hardware project this isn’t; it’s OpenCV and Max/MSP that make it work.

For comparison, look at the complexity of this similar physical sequencer. It’s got a 16 x 16 array of LEDs and switches and a CNC milled, primed, and painted surface that’s the size of a twin bed. Sawdust and hand-soldering: that’s a hardware project.

What I love about the Go sequencer is that it uses software just right. The piece is still physical. It could have just as easily been a VR world, where the two people would interact with each other only inside their goggles. But somehow that’s not quite as human as putting stones on a wooden board, sitting across from, and maybe even looking at, your opponent. The players aren’t forced to think about the software. They don’t feel like they’re playing a video game.

But at the same time, the software side of things makes all of the horrible hardware problems go away. Nobody is soldering a rat’s nest of 169 switches. There’s a webcam plugged into the USB port of a laptop. There’s a deep simplicity there.
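To make the comparison concrete, here’s a hand-wavy Ruby sketch of that simplicity. The frame is faked with random values so the snippet runs on its own; in the real project the image comes from OpenCV and a webcam. Once you have brightness samples at each of the 169 intersections of a 13×13 grid, “reading” the board is just a threshold check per point:

```ruby
# Hand-wavy sketch: once the webcam frame is captured (OpenCV in the real
# project), reading the board is just a brightness check per intersection.
# The frame below is faked with random values so the snippet runs on its own.
SIZE = 13   # 13 x 13 = the 169 points nobody has to wire up

# Stand-in for brightness samples (0-255) taken at each board intersection.
frame = Array.new(SIZE) { Array.new(SIZE) { [30, 128, 220].sample } }

board = frame.map do |row|
  row.map do |brightness|
    if brightness < 80
      "B"   # dark blob = black stone
    elsif brightness > 180
      "W"   # bright blob = white stone
    else
      "."   # bare wood = empty point
    end
  end
end

puts board.map { |row| row.join(" ") }
```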

Should you always trade out arcade buttons for OpenCV? Absolutely not! But is it worth considering the soft side when doing it in hardware is just too, well, hard? I’m open.

Generating MIDI With Ruby

[vimeo 720761]

[Giles Bowkett] has been working on a music library for Ruby called Archaeopteryx. He describes it as a “Ruby MIDI DJing/live-coding thing”. In the video above, he’s using it to generate and then morph rhythms, with the Ruby code directly controlling the step sequencer in Reason. It’s an interesting approach to music development, and the video gives a full intro to the probability-based approach to generation. To really get a feel for the library, we suggest you watch his presentation from RubyFringe, which shows him playing music by editing a live block of code. Check out his Vimeo feed for many more demo videos.
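If you want a feel for the probability idea before diving into the video, here’s a rough Ruby sketch of the concept. This is not the Archaeopteryx API, and the weights are invented for illustration: each sixteenth-note slot fires with some probability, so the groove drifts from bar to bar while keeping its overall shape.

```ruby
# Rough sketch of probability-based rhythm generation in plain Ruby.
# NOT the Archaeopteryx API -- just the idea it demonstrates: each
# sixteenth-note slot fires with some probability, so the pattern
# mutates from bar to bar while keeping its overall shape.

weights = [1.0, 0.2, 0.6, 0.2,    # one bar of sixteenths;
           0.9, 0.2, 0.6, 0.3,    # 1.0 always fires, 0.0 never does
           1.0, 0.2, 0.6, 0.2,
           0.9, 0.3, 0.7, 0.5]

4.times do |bar|
  hits = weights.map { |w| rand < w ? "x" : "." }
  puts "bar #{bar + 1}: #{hits.join}"
end
```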

[via CDM]