Anyone who has been through the process of learning to play a musical instrument for the first time, or has listened to someone attempting to do so, will know that it can be a rather painful and frustrating experience. [Alessandro Perini] apparently couldn’t get enough of the sound of a first-time musician, so he created a robot to play the melodica badly for hours on end, as demonstrated in the video after the break.
The project is appropriately named “AI’ve just started to learn to play”, and attempts to copy every melody it hears in real time. The robot consists of the cartridge carriage from an old printer, mounted on a wooden frame that holds the melodica. The original carriage used a DC motor with an encoder for accurate movement, but since positional accuracy was not the goal, [Alessandro] ditched the encoder. Two small trolley wheels mounted on the cartridge holder push down on the melodica’s keys. A bistable solenoid valve controls airflow to the melodica from an air compressor. The DC motor and solenoid valve are controlled by an Arduino via a pair of L298 motor drivers.
A host computer running software written in Cycling ’74 Max listens to the melody it’s trying to imitate, and sends serial commands to the Arduino to move the carriage and open the solenoid in an attempt to match the notes. Of course, it hits a series of wrong notes in the process. The Arduino code and build instructions have been published, but the main Max software is only described briefly. [Alessandro] demonstrated the robot at a local festival, where it played along with YouTube tutorial snippets and jammed with a local band for a full 24 hours. You have to respect that level of endurance.
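Since the Max patch itself is only described briefly, here is a hedged sketch of the kind of host-side logic involved: turning a detected pitch into a carriage position and a note-on command. The three-byte command format, the function names, and the melodica’s note range are illustrative assumptions, not [Alessandro]’s actual protocol.

```python
# Hypothetical sketch of the host-side logic: map a detected MIDI note
# to a normalized carriage position and a note-on serial command. The
# command format and note range are assumptions, not the real protocol.

MELODICA_LOW = 53   # F3, a typical 32-key melodica low end (assumption)
MELODICA_HIGH = 84  # C6

def note_to_position(midi_note, low=MELODICA_LOW, high=MELODICA_HIGH):
    """Normalize a MIDI note to a 0.0-1.0 position along the keyboard."""
    note = max(low, min(high, midi_note))  # clamp to the playable range
    return (note - low) / (high - low)

def build_command(midi_note):
    """Build a toy 3-byte command: 'M', position scaled to 0-255, gate on."""
    pos = int(note_to_position(midi_note) * 255)
    return bytes([ord('M'), pos, 1])

# Example: middle C (MIDI 60) lands a bit below the keyboard's midpoint.
cmd = build_command(60)
```

Because the encoder was removed, the real robot can only drive the motor toward a key for a rough duration rather than to an exact position, which is precisely where the charmingly wrong notes come from.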
When you want to quickly pull together a combination of media and user interaction, leaning on some building blocks to do the heavy lifting can be a lifesaver. That’s the idea behind Max, a graphical programming language that’s gained a loyal following among those building art installations, technology demos (think children’s museum), and user kiosks.
Guy Dupont gets us up to speed with a getting-started-with-Max workshop that was held during the 2020 Hackaday Remoticon. His crash course goes through the basics of the program and provides a set of sixteen demos that you can play with to get your feet under you. As he puts it, if you need sound, video, images, buttons, knobs, sensors, and Internet data for both input and output, then Max is worth a look. Video of the workshop can be found below.
If you’re looking for the future of humanity, look no further than the first plasma generated in the Wendelstein 7-X Stellarator at the Max Planck Institute for Plasma Physics. It turned on for the first time yesterday, and while this isn’t the first fusion power plant, nor will it ever be one, it is a preview of what may become the invention that saves humanity.
For a very long time, it was believed the only way to turn isotopes of hydrogen into helium for the efficient recovery of power was the Tokamak. This device, basically a hollow torus lined with coils of wire, compresses plasma into a thin circular string. With the right pressures and temperatures, this plasma will transmute the elements and produce power.
Tokamaks have not seen much success, though, and this is a consequence of two key problems with the Tokamak design. First, we’ve been building them too small, although the ITER reactor currently being built in southern France may be an exception. ITER should be able to produce more energy than is used to initiate fusion after it comes online in the 2020s. Tokamaks have another problem, too: they cannot operate continuously without a lot of extra equipment. While the Wendelstein 7-X Stellarator is too small to produce a net excess of power, it will demonstrate continuous operation of a fusion device. [Elliot Williams] wrote a great explanation of this Stellarator last month which is well worth a look.
While this Stellarator is just a testbed and will never be used to generate power, it is by no means the only alternative approach to creating a sun on Earth. The Polywell – a device that fuses hydrogen inside a containment vessel made of electromagnets arranged like the faces of a cube – is getting funding from the US Navy. Additionally, Lockheed Martin’s Skunk Works claims it can put a 100-megawatt fusion reactor on the back of a truck within a few years.
The creation of a fusion power plant will be the most important invention of all time, and will earn the researchers behind it the Nobel Prizes in Physics and Peace. While the Wendelstein 7-X Stellarator is not the first fusion power plant, it might be a step in the right direction.
[vtol] is quickly becoming our favorite technological artist. Just a few weeks ago he graced us with a Game Boy Camera gun, complete with the classic Game Boy printer. Now, he’s somehow managed to create even lower resolution images with a modified typewriter that produces ASCII art images.
As with anything dealing with typewriters, machine selection is key. [vtol] is using a Brother SX-4000 typewriter for this build, a neat little daisy wheel machine that’s somehow still being made today. The typewriter is driven by an Arduino Mega; an image captured from a camera is converted to ASCII art with Pure Data and Max/MSP, then slowly (and loudly) printed on a piece of paper one character at a time.
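At the heart of any ASCII art converter is a mapping from pixel brightness to characters of increasing visual density. Here is a minimal sketch of that idea; the character ramp and the 8-bit grayscale input are generic choices, not the exact Pure Data / Max patch used in the build.

```python
# Minimal brightness-to-character mapping for ASCII art conversion.
# The ramp runs from visually dense (dark) to sparse (light); this
# ramp is a common generic choice, not the one used in [vtol]'s build.

RAMP = "@%#*+=-:. "  # dark to light, 10 characters

def pixel_to_char(gray, ramp=RAMP):
    """Map an 8-bit grayscale value (0 = black) to a character."""
    index = gray * len(ramp) // 256
    return ramp[index]

def image_to_ascii(rows):
    """Convert a 2D list of grayscale values into lines of characters."""
    return ["".join(pixel_to_char(p) for p in row) for row in rows]

# A tiny 2x4 gradient becomes two short lines of text.
art = image_to_ascii([[0, 64, 128, 255], [255, 128, 64, 0]])
```

A real converter would first downscale the camera image so each character stands in for a block of pixels, then feed each line to the typewriter one character at a time.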
The ASCII art typewriter was recently shown at the 101 Festival where a number of people stood in front of a camera and slowly watched a portrait assemble itself out of individual characters. Check out the video of the exhibit below.
This is the second-generation Manta, a touch-based controller with visual feedback made to be used with Max/MSP. The hexagonal shape and the patterns seen in the video after the break remind us of the arm-mounted computers the Predators sport in the movies. Like the previous generation, this controller can tell not only which of the 48 sensors you’re touching, but how much of your finger is touching it. The sky is the limit on extensibility with this type of data, but for now you can just try out the pre-built plugin and see how it goes. New with this rendition of the Manta is the use of bi-color LEDs, which adds another layer of interaction with the PC to which it is tethered.
The Winduino II uses fins to pick up the movement of the wind and translate it into music. Each fin is attached to the main body using a piezo vibration sensor. The signals are processed by an Arduino housed inside and the resulting data makes its way to a computer via a Bluetooth connection to facilitate the use of Max/MSP for the audio processing. Included in the design is an array of solar panels used to keep the battery for the device charged up. Hear and see this creative piece after the break.
[Robert] wrote a program using Max/MSP that lets him make music with his Guitar Hero controller. There’s another video after the break where he walks through the various features, but here’s the gist of it. The software works on Mac and Windows and offers a ‘live play’ mode and a MIDI-mapping mode. In the MIDI mode, each key can be configured to do your bidding. His example uses the pick bar to scroll through different samples, the green button to play them, and the red button to stop.
The live mode is much more involved. In the software you choose the type of scale and the key you’d like to play in. This makes up for the controller’s lack of enough frets to make it a chromatic instrument, and these settings can be adjusted from the controller. There is an up-pick offset that makes the upward movement of the pick bar play a different note than the downward movement. The motion control can also be used as an input; he demonstrates pitch bending and cutoff using that method.
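The scale-and-key trick described above can be sketched in a few lines: the five fret buttons index into a chosen scale so every press lands on an in-key note, and the up-pick offset shifts the result by a scale degree. The scale tables and the offset value here are illustrative assumptions, not [Robert]’s actual Max patch.

```python
# Hedged sketch of the live-mode mapping: fret buttons index into a
# scale, so a five-button controller always plays in-key notes. The
# scale definitions and offset are assumptions for illustration.

SCALES = {
    "major": [0, 2, 4, 5, 7, 9, 11],
    "minor_pentatonic": [0, 3, 5, 7, 10],
}

def fret_to_note(fret, key_root=60, scale="minor_pentatonic",
                 up_pick=False, up_pick_offset=1):
    """Map a fret button (0-4) to a MIDI note in the chosen scale."""
    steps = SCALES[scale]
    degree = fret + (up_pick_offset if up_pick else 0)
    octave, index = divmod(degree, len(steps))  # wrap into the next octave
    return key_root + 12 * octave + steps[index]

# Green button (fret 0) in C minor pentatonic: a down-pick plays the
# root, while an up-pick plays the next scale degree.
down = fret_to_note(0)               # C4
up = fret_to_note(0, up_pick=True)   # E-flat 4
```

Restricting the buttons to scale degrees is what makes five frets feel musical despite the controller never being a chromatic instrument.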
This looks like a lot of fun. He needs to team up with [Joran] to add drums to the mix, forming a much more creative rock band than you can buy in the store.