Arpeggio – The Piano SuperDroid

I never had much musical talent in me. Every now and then I would pick up a guitar or try to learn the piano, romanticising a glamorous career out of it at some point. Arpeggio – the Piano SuperDroid (YouTube, embedded below) sure makes me glad I chose a different career path. This remarkable machine is the brainchild of [Nick Morris], who spent two years building it.

Although there are no detailed technical descriptions yet, at its heart this handsome robot consists of a set of machined ‘fingers’ connected to a set of actuators — most likely solenoids. The solenoids are controlled by proprietary software that combines traditional musical data with additional parameters to accurately mimic performances by your favourite pianists, right in your living room. Professional pianists, who were otherwise assuming excellent job security under Skynet, clearly have to reconsider now.
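The control software itself is proprietary, so purely as an illustrative sketch, here is one way recorded note data (pitch, velocity, timing) plus a force limit could be turned into per-key strike commands for solenoid fingers. Every name and number below is an assumption for illustration, not a detail from the actual build.

    # Hypothetical mapping from recorded note events to solenoid strike commands.
    from dataclasses import dataclass

    @dataclass
    class NoteEvent:
        key: int        # piano key number, 1..88
        velocity: int   # MIDI-style velocity, 0..127
        start_s: float  # onset time in seconds
        dur_s: float    # how long the key is held down

    def to_strike_commands(notes, max_force_n=8.0):
        """Turn note events into (time, key, force, hold) tuples for the fingers."""
        commands = []
        for n in notes:
            force = max_force_n * (n.velocity / 127.0)  # louder note, harder strike
            commands.append((n.start_s, n.key, force, n.dur_s))
        return sorted(commands)                          # dispatch in time order

    # Example: a soft middle C (key 40) followed by a firmer E above it (key 44).
    performance = [NoteEvent(40, 45, 0.00, 0.50), NoteEvent(44, 90, 0.50, 0.50)]
    for cmd in to_strike_commands(performance):
        print(cmd)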

Along with its incredible musical talent, Arpeggio is equipped with a set of omniwheels, allowing it to navigate around quite efficiently. It is not completely autonomous (yet). I can’t wait to see the havoc this robot would cause if it went rogue.

[GIF credit: Gizmodo]

43 thoughts on “Arpeggio – The Piano SuperDroid”

  1. I listen to a lot of piano music, and this is really more than just MIDI playback. When I hear this I get the feeling that a person is playing, not very descriptive I know, but this is something special. Wonder if there is a blur function that puts that delicate, varying touch in the strokes?

    1. Not by themselves, they don’t, but they are very well characterized, so it’s relatively simple in software to drive them in a way that produces any force curve you want, using pulsed power.
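      A rough sketch of that pulsed-power idea (the calibration table and numbers here are made up, not measurements from this machine): if a finger solenoid’s force versus PWM duty-cycle curve has been characterized, the software can simply invert it to hit any desired strike force.

        import bisect

        # Hypothetical measured (duty_cycle, force_newtons) pairs for one finger solenoid.
        CAL = [(0.0, 0.0), (0.2, 1.0), (0.4, 2.6), (0.6, 4.5), (0.8, 6.5), (1.0, 8.0)]

        def duty_for_force(force_n):
            """Interpolate the calibration table to find the PWM duty for a desired force."""
            duties, forces = zip(*CAL)
            if force_n <= forces[0]:
                return duties[0]
            if force_n >= forces[-1]:
                return duties[-1]
            i = bisect.bisect_left(forces, force_n)
            f0, f1 = forces[i - 1], forces[i]
            d0, d1 = duties[i - 1], duties[i]
            return d0 + (d1 - d0) * (force_n - f0) / (f1 - f0)

        print(duty_for_force(3.0))  # duty cycle for a 3 N strike, roughly 0.44 with this table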

  2. It doesn’t do Jazz, if you disagree you just don’t get what Jazz really is. There is more to live music than what can be captured: once it is recorded it is no longer live, and the next note in the sequence is predictable.

    1. What if you combined the ability to do subtle variations akin to live performances with the efficiency of direct machine control? It would be inherently unpredictable and yet still, if done right, sound really amazing. Not really the same as just playing back a prerecorded set of notes, although that is basically what a live human performance is when you break it down to a fundamental level anyway.
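      One crude way to get those subtle variations on top of exact machine control (purely a sketch, with guessed ranges) is to add small bounded random offsets to each note’s timing and loudness before playback, so no two runs are identical:

        import random

        def humanize(notes, timing_sigma_s=0.015, velocity_jitter=6):
            """notes: list of (start_s, key, velocity) tuples. Returns a subtly varied copy."""
            varied = []
            for start, key, vel in notes:
                start += random.gauss(0.0, timing_sigma_s)                # micro-timing drift
                vel += random.randint(-velocity_jitter, velocity_jitter)  # touch variation
                varied.append((max(0.0, start), key, max(1, min(127, vel))))
            return varied

        # Same two notes, slightly different every time they are played.
        print(humanize([(0.0, 40, 64), (0.5, 44, 80)]))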

      1. The interaction between jazz players in a live performance borders on the psychic/telepathic, and I say that as a person who is not inclined at all toward magic thinking.

        I’m not sure we even understand the process well enough to be able to emulate it.

          1. I see what you mean, but that is off in a direction that doesn’t fully relate to the point I made about what makes jazz special. An NN still needs its inputs defined, but we can’t define those inputs when it comes to the interaction between human jazz performers. If we could, an AI could play along with them and even take the lead without the piece going off the rails. I think we will get there in the end, because it is ultimately a problem of defining the inputs and the complexity of the process, but we are still a long way from that level of sophistication.

          2. I think what Dan is describing here is Artificial General Intelligence. So in that sense, no, an NN is not there yet, but to most people an NN can sound a lot like jazz.

        1. “It doesn’t do Jazz, if you disagree you just don’t get what Jazz really is.”
          In other words, if you get to choose the definition of ‘Jazz’, then you can make it so by definition. Your anthropocentrism will one day be obsolete.

          But maybe I’m going too far – you’re not saying it’s not Jazz unless there is a human involved, exactly. But if the “live” element is essential, this must mean that there is feedback required, and therefore it is a closed-loop process rather than the open-loop process that this robot appears to use. But that’s just a matter of adding sensors and improving the software.

          Unless you can define what consciousness is, you can’t really say that a machine can’t have it.

          1. The real question is why would we want a robot that can play jazz? Surely only other robots would want to go hear/see another robot playing robot jazz? This just seems more like a scientific and technological experiment at this point, though I can see how the technology (especially if it were based on neural networks) could be used for something actually useful or helpful for people, or help us study and understand how the human brain works when improvising, for example.

        1. Should all black people be proud of what a few did in New Orleans? Even if they don’t like jazz? Should all white people or all gay people be proud of Liberace?

          How about all Americans be proud of the music created by Americans. That is a lot less divisive.

    2. Yeah, jazz is the notes you don’t hear, lmao. Nope, Dan, it probably can. Just have it strike every 8th note it plays wrong, have it go off on wild tangential parts that are not in the original key, and maybe a volt-starve to mimic heroin use. I used to just layer piano and drum voices on top of each other to generate instant jazz. You would be surprised at how well it works.
      The very fact that one can define jazz means there are elements that are standard and therefore predictable ;)

  3. This, like all player-piano technology, is cheating. It’s playing the piano with 88 fingers. That’s not how humans play. If you really want to claim that you’ve made a robot piano player that plays “like a human,” then you need to make one with the same limitations – 10 fingers in groups of 5 with maximum separations between adjacent fingers and between the groups.

    Player piano rolls have traditionally been “programmed” with super-human playing abilities – too many simultaneous notes for a human player – in some cases even too many for “four hands” playing (which is a thing – Pictures at an Exhibition has been arranged for piano-four-hands. It may have even been originally composed that way. I don’t remember). It’s nice and all, but it’s no better than composing for MIDI.

    1. There were probably people complaining that playing piano with four hands was also cheating. But if that’s acceptable, then three people sitting at each of three piano keyboards gives you 90 note capacity, with humans. You can do it like a bell choir, with each person playing only their ten notes, so hand spans become a non-issue as well.

          1. More like half a dozen arrangers. Doug Henderson comes to mind. Making piano rolls is, I would say, a cottage industry, with some of the duplicating equipment being located in the homes of people like the Malones of Turlock, California.

  4. I don’t mean to dis the guy’s work, it’s a beautiful machine, but it’s not that hard if your source material includes all the data necessary to reproduce the keystrokes. Call me when you can hand it a sheet of music and the output is indistinguishable from a human player.

    On the other hand, it opens the doors to piano music that can have more than 10 (finger) events per beat.

      That basically isn’t how sheet music works…

      It would have to also listen to recordings of the piece, decide how to play it, then use the sheets to create a meaningful version of the song.

      “It’s not that hard” – LOL!

      Sheet music isn’t MIDI.

    1. And, as the impressive Disklavier performance of Rudess shown here demonstrates, the Disklavier can play with dynamics as well.

      Our local Yamaha dealer showed up as a surprise at my father’s funeral with an upright Disklavier playing a recorded performance by my dad. Not a dry eye in the house.

      1. Yeah this is just a transcription, not a recording. I don’t think they’d be able to accurately transcribe the individual velocities of every note for an accurate Rudess reproduction.

  5. I can understand this being a useful tool for studio _recording_ of live piano in the absence of a human pianist. But honestly at this point I hope only robots would be keen to go see a robot actually perform live for the musical/performance rather than technological value. That being said it doesn’t even perform yet – it just spits out a human’s previous performance so I’m just trying to understand which niche this fits into currently – apart from being a nice piece of engineering!
