DIY Programmable Guitar Pedal Rocks The Studio & Stage

Ever wondered how to make your own digital guitar effects pedal? [Steven Hazel] and a friend have done exactly that, using an Adafruit Feather M4 Express board and a Teensy Audio Adapter board together to create a DIY programmable digital unit that looks ready to drop into an enclosure and be put right to work in the studio or on the stage.

The bulk of the work is done by two main parts, and the whole thing can be prototyped easily on a breadboard. [Steven] also made a custom PCB to mount everything, including all the right connectors, but the device can be up and running with not much more than those two boards.

On the inside, the Adafruit Feather M4 Express board talks to the audio board over I2S, a standard for sending serial digital audio between chips. Processing the audio itself is handled by the Teensy Audio Library, which provides a fantastic array of easy-to-use functions for processing and manipulating digital audio streams.
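
As a taste of how little glue code the library needs, here's a minimal pass-through sketch of our own (not [Steven]'s firmware), assuming the SAMD51 port of the Teensy Audio Library and the SGTL5000 codec found on the audio adapter:

```cpp
#include <Audio.h>

// Pass-through: I2S in -> I2S out, using the SGTL5000 codec on the audio adapter
AudioInputI2S        audioIn;
AudioOutputI2S       audioOut;
AudioControlSGTL5000 codec;

// Patch cords route audio between objects, left and right channels separately
AudioConnection patchL(audioIn, 0, audioOut, 0);
AudioConnection patchR(audioIn, 1, audioOut, 1);

void setup() {
  AudioMemory(12);                        // allocate audio buffers for the library
  codec.enable();
  codec.inputSelect(AUDIO_INPUT_LINEIN);  // take the instrument signal from line-in
  codec.volume(0.6);
}

void loop() {
  // nothing to do here: processing happens in the library's update interrupt
}
```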

Together, all the right pieces are in place and [Steven] provides the code for a simple tremolo effect as a glimpse of what’s possible with the unit. Interested in going a bit further? [Steven] shares additional details about what’s involved in writing a custom effect from scratch using the Teensy Audio Library.
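
The short version of writing a custom effect: subclass AudioStream and fill in an update() method that receives, processes, and transmits 128-sample blocks. Below is a rough skeleton of that shape, a trivial gain stage rather than [Steven]'s tremolo, just to show where the DSP goes:

```cpp
#include <Audio.h>

// Skeleton custom effect: one input, one output, fixed-point gain on each block
class AudioEffectSimpleGain : public AudioStream {
public:
  AudioEffectSimpleGain() : AudioStream(1, inputQueueArray) {}

  void gain(float g) { multiplier = g * 65536.0f; }   // store as 16.16 fixed point

  virtual void update(void) {
    audio_block_t *block = receiveWritable(0);        // grab the incoming block
    if (!block) return;
    for (int i = 0; i < AUDIO_BLOCK_SAMPLES; i++) {
      int32_t s = ((int64_t)block->data[i] * multiplier) >> 16;
      if (s > 32767) s = 32767;                       // clip to the 16-bit range
      else if (s < -32768) s = -32768;
      block->data[i] = s;
    }
    transmit(block, 0);                               // hand the block downstream
    release(block);                                   // return it to the pool
  }

private:
  audio_block_t *inputQueueArray[1];
  int32_t multiplier = 65536;                         // unity gain by default
};
```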

As mentioned, I2S is where it’s at when it comes to working with digital audio at the chip level, and our own Jenny List can tell you everything you need to know about this useful protocol, which has actually been around since 1986!

Reading Data From A CD, With A Microscope

There was a time when electronic engineering students studied the audio CD, for all its real-world examples of error correction and control systems. There’s still plenty to be learned from the system for young and old alike though, and thus we were intrigued when we saw [Peter Monta] reading the data from a CD using a microscope.

CDs encode data as so-called pits and lands along a spiral track across a metalised surface, with a transition between pit and land signifying a logic 1 and the absence of a transition signifying a 0. The first part of his write-up gets as far as reading a section of the raw data, while in the next installment he goes further, stitching together microscope pictures and writing some code to recover complete data frames. He’s not quite at the audio playback stage, but he plans to follow the spiral track across a full stitched image and eventually rip an entire disc.
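
To make the decoding step concrete, here's a rough sketch of the idea (ours, not [Peter]'s actual code): once the image processing has measured each pit and land length in channel-bit periods, turning those runs back into raw EFM channel bits is just a matter of emitting a 1 at every transition and zeros in between:

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// Convert measured pit/land run lengths (in channel-bit periods, T) into raw
// EFM channel bits: a 1 at each transition, zeros for the rest of the run.
// Valid EFM runs are 3T to 11T, so anything outside that range is a read error.
std::vector<uint8_t> runsToChannelBits(const std::vector<int> &runs) {
  std::vector<uint8_t> bits;
  for (int run : runs) {
    if (run < 3 || run > 11)
      std::fprintf(stderr, "warning: %dT run violates EFM limits\n", run);
    bits.push_back(1);                    // transition at the start of the run
    bits.insert(bits.end(), run - 1, 0);  // no further transitions within it
  }
  return bits;
}

int main() {
  std::vector<int> runs = {3, 7, 4, 11, 3};  // hypothetical measured runs
  for (uint8_t b : runsToChannelBits(runs)) std::printf("%d", b);
  std::printf("\n");
}
```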

There are plenty of CD drives around to read audio the conventional way, but the techniques here still find a use where less ubiquitous media have to be read. In the last decade, for example, there was an effort to read the BBC Domesday Project discs from the 1980s, as it became clear that few of the original readers survived in working order.

Audio Old And New Meet In Perfect Harmony

There’s an uneasy meeting in the world of audio between digital and analogue. Traditional analogue audio reached a very high level of quality, but as old-style media-based audio sources have fallen out of favor, there’s a need to replace them with ones that reflect a new digital audio world. The options are several: all-in-one Hi-Fi separates at a hefty price, a cheaper range of dongles and boxes for each digital input, or doing what [Keri Szafir] has done and building that all-in-one box yourself.

The result is a 1U 19″ rack unit that contains an Orange Pi for connectivity and streaming, a hard drive to give it audio NAS capability, plus power switching circuitry to bring all the older equipment under automation. Good quality audio is handled by a Behringer USB audio card, on which, in a demonstration of how even some digital audio is now becoming outdated, she ignores the TOSlink connector.

The rear panel has all the connectors for power, USB, network, and audio laid out, while the front has an array of status lights and switches. We particularly like the hand-written lettering, which suits its homebrew character. It certainly makes the Bluetooth dongle dangling at the back of our amplifier seem strangely inadequate.

If audio is your thing, we had a look at some fundamentals of digital audio as part of our Know Audio series.

How The BBC (Still) Sends Audio To Transmitter Sites

Running a radio station is, on the face of it, a straightforward technical challenge. Build a studio, hook it up to a transmitter, and you’re good to go. But what happens when your station is not a single Rebel Radio-style hilltop installation, but a national chain of transmitter sites fed from a variety of city-based studios? This is the problem facing the BBC with their national UK FM transmitter chain, and since the 1980s it has been fed by a series of NICAM digital data streams. We mentioned back in 2016 how the ageing equipment had been replaced with a modern FPGA-based implementation without any listeners noticing, and now thanks to [Matt Millman], we have a chance to see a teardown of the original 1980s units. The tech is relatively easy to understand from a 2020s perspective, but it still contains a few surprises.

Each studio or transmitter site would have had a 19″ rack containing one of these units — a card frame with a collection of encoder or decoder cards. These are all custom-made by the BBC’s engineering department to a very high standard, and use period parts such as the familiar Z80 microprocessor and some Philips digital audio chips, which followers of high-end consumer audio may recognize. As you’d expect for a mission-critical device, many of the functions are duplicated for redundancy, with their outputs compared to give warning of failures.

The surprise comes in the NICAM encoder and decoder — it’s a custom LSI chip made exclusively for the BBC. This indicates the budget available to the national broadcaster, and given that these units have in some cases been working for over 35 years, we’re guessing that the license payers got their money’s worth.
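
NICAM is short for Near Instantaneous Companded Audio Multiplex, and the companding is the clever bit: samples are grouped into short blocks, and each block shares a scale factor chosen from its peak level, so quiet passages keep their resolution. As a heavily hedged illustration of the idea only (the 14-bit-in, 10-bit-out, 32-sample-block parameters below are hypothetical, not the BBC's exact distribution format), a toy encoder might look like this:

```cpp
#include <algorithm>
#include <cstdint>
#include <cstdlib>

// Toy near-instantaneous companding: each short block of samples shares one
// scale factor picked from the block's peak level, so quiet blocks keep their
// low-order bits while loud blocks sacrifice them.
// Hypothetical parameters: 14-bit input, 10-bit mantissas, 32-sample blocks.
struct CompandedBlock {
  uint8_t shift;         // scale factor transmitted alongside the block
  int16_t mantissa[32];  // samples reduced to 10 bits
};

CompandedBlock compand(const int16_t in[32]) {
  int peak = 0;
  for (int i = 0; i < 32; i++) peak = std::max(peak, std::abs((int)in[i]));

  // smallest right-shift (0..4) that fits the peak into a 10-bit signed mantissa
  uint8_t shift = 4;
  while (shift > 0 && (peak >> (shift - 1)) < 512) shift--;

  CompandedBlock out;
  out.shift = shift;
  for (int i = 0; i < 32; i++) out.mantissa[i] = in[i] >> shift;
  return out;
}
```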

You can read about the original switch-over in 2016, and a little more about NICAM, too.

Retrotechtacular: 1990s CD Mastering Fit For A King

Before it was transformed into an ephemeral stream of ones and zeroes, music used to have a physical form of some kind. From wax cylinders to vinyl discs to tapes of various sizes in different housings and eventually to compact discs, each new medium was marketed as a technological leap over the previous formats, a leap that justified charging incrementally more money for it.

But that’s the thing — each purchase resulted in you obtaining a physical item, which had an extensive manufacturing and distribution process behind it. And few artists demanded more manufacturing effort than Michael Jackson in his heyday, as revealed by this in-depth look at the CD manufacturing process for The King of Pop’s release of the HIStory double-disc set in 1995.

The video was produced as sort of a love letter to Michael from the staff and management of the Sony Music disc manufacturing plant in Pitman, New Jersey. The process is shown starting with the arrival of masters at the plant, strangely in the form of U-matic videocassettes; the 3/4″ tape format was normally used for analog video, but could also be used for recording digital audio. The digital audio is then sent for glass mastering, which is where the actual pits are created on a large glass disc under cleanroom conditions. In fact, much of the production process bears a strong similarity to semiconductor manufacturing, from the need for cleanrooms — although under less stringent conditions than in a fab — to the use of plasma etching, vapor deposition, and metal plating operations.

Once the master stampers are made, things really ramp up in replication. There the stamper discs go into injection molding machines, where hot polycarbonate is forced against the surface under pressure. The copies are aluminized, spin-coated with UV-cure lacquer, and sent on down the line to testing, screen printing, and packaging. To handle the extra load of this release, Sony hired 40 extra full-time workers, who appear to have taken on all the tedious manual tasks like assembling the jewel cases.

As cheesy as this thank-you video may be, it was likely produced with good reason. This was a time when a Michael Jackson release was essentially a guarantee of full employment for a large team of workers. The team was able to produce something like 50,000 copies a day, and given that HIStory sold over 20 million copies, that’s a lot of workdays for the good folks at Pitman.


Mythbusting Tidal’s MQA Format – How Does It Measure Up?

MQA is an audio format that claims to use a unique “origami” algorithm, promising better quality and more musicality than other formats. At times, it’s been claimed to be a lossless format in so many words, and lauded by the streaming services that use it as the ultimate format for high-fidelity music. With the format being closed source and encoders not publicly available, these claims are hard to test. However, [GoldenSound] wasn’t born yesterday, and set out to test MQA by hook or by crook. The results were concerning.

To actually put the format through its paces, the only easy way available was to publish music to the Tidal streaming service, which uses the format. [GoldenSound] went this route, attempting to get some test files published. This hit a brick wall when the publishing company reported that the MQA software “would not encode the files”. The workaround? [GoldenSound] simply cut some audio test content into the middle of an acoustic track and resubmitted the files, which were then accepted without further complaint.

Testing with the content pulled from Tidal, [GoldenSound] found concerning evidence that the claims made around MQA don’t stack up. Significant amounts of added noise are often found in the MQA-processed files, and files served from Tidal are clearly not lossless. Additionally, MQA’s “blue light” authentication system, designed to guarantee to listeners that they’re hearing an identical-to-studio release, is demonstrated to be misleading at best, if not entirely fake.
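
For anyone wondering how you show that a supposedly lossless stream isn’t, the simplest tool is a null test: subtract the decoded stream from the original master and see whether anything remains. A rough sketch of that comparison (our illustration, not [GoldenSound]'s actual tooling; it assumes both signals are already decoded to floating point, sample-aligned, and at the same rate) might be:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Null test: subtract the decoded stream from the original and report the
// residual level in dBFS. A genuinely lossless path would null to silence;
// added noise or altered samples show up as a measurable residual.
double residualDb(const std::vector<double> &original,
                  const std::vector<double> &decoded) {
  const size_t n = std::min(original.size(), decoded.size());
  double sumSq = 0.0;
  for (size_t i = 0; i < n; i++) {
    const double diff = original[i] - decoded[i];
    sumSq += diff * diff;
  }
  const double rms = std::sqrt(sumSq / static_cast<double>(n));
  return 20.0 * std::log10(rms + 1e-12);  // guard against log10(0)
}
```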

When [GoldenSound] wrote to MQA for a response to his findings, his test files were quickly stripped from Tidal. The company eventually disputed some of the findings, which is discussed in his video. The general upshot is that without open, transparent tools publicly available to analyse the format’s performance, it’s impossible to verify the company’s claims.

We’ve had fun looking at audio formats before, from the history of MP3 to musing on digital audio at truly ridiculous sample rates.