Tearing into Delta Sigma ADCs Part 2

In part one, I compared the different types of Analog to Digital Converters (ADCs) and the roles and properties of Delta Sigma ADCs. I covered a lot of the theory behind these devices, so in this installment I set out to find a design or two that would help me demonstrate the important points: oversampling, noise shaping, and the relationship between signal-to-noise ratio and resolution.

Modulator Implementation

Check out part one to see the block diagrams that got us to this point. The schematics shown below are a couple of implementations I played with, depicting a single-order and a dual-order Delta Sigma modulator.

Basically I used a clock-enabled, high-speed comparator, with both polarities available in case I got the logic backwards in my current state of burnout-to-grey-matter ratio. The video includes the actual schematic used.
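
The hardware is the point of the video, but if you want to poke at the behavior before wiring anything up, a short simulation shows the same ideas. Here is a minimal numpy sketch of a first-order modulator loop, an illustration of the block diagram from part one rather than the circuit in the video; in software, the clocked comparator is just a sign decision and the 1-bit DAC is the fed-back reference level.

```python
import numpy as np

def first_order_dsm(x, v_ref=1.0):
    """Behavioral model of a first-order Delta Sigma modulator.

    x     : input samples, assumed to stay within +/- v_ref
    v_ref : level fed back by the 1-bit DAC
    Returns the 1-bit output stream as +v_ref / -v_ref values.
    """
    integrator = 0.0
    out = np.zeros(len(x))
    for i, sample in enumerate(x):
        # delta: input minus the fed-back 1-bit DAC level
        # sigma: accumulate that difference in the integrator
        integrator += sample - (out[i - 1] if i else 0.0)
        # the clocked comparator quantizes the integrator state to one bit
        out[i] = v_ref if integrator >= 0 else -v_ref
    return out

# Oversample a slow sine wave and recover it with the crudest possible
# decimation filter: a moving average over one oversampling period.
osr = 256                                         # oversampling ratio
t = np.arange(20000)
x = 0.5 * np.sin(2 * np.pi * t / (osr * 50))      # input well below the clock rate
bits = first_order_dsm(x)
recovered = np.convolve(bits, np.ones(osr) / osr, mode="same")
# Worst-case reconstruction error (ignoring the edges); it shrinks as osr grows
print(np.max(np.abs(recovered[osr:-osr] - x[osr:-osr])))
```

Swapping in a second integrator stage turns this into the dual-order loop, which shapes more of the quantization noise up and out of the band of interest.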

Since I wasn’t designing for production, I accepted the need for three supply voltages: my bench supply can provide them, and this widget is destined for the drawer with the other widgets made for just a few minutes of video time anyway. Continue reading “Tearing into Delta Sigma ADCs Part 2”

Tearing into Delta Sigma ADC’s

It’s not surprising that Analog to Digital Converters (ADCs) now employ several techniques to achieve higher speeds and resolutions than their simpler counterparts. Enter the Delta-Sigma (Δ∑) ADC, which combines a few techniques including oversampling, noise shaping, and digital filtering. That’s not to say you need several chips to accomplish this; these days single-chip Delta-Sigma ADCs are very small and available for a few dollars. Sometimes they are called Sigma-Delta (∑Δ) just to confuse things, a measure I applaud as there aren’t enough sources of confusion in the engineering world already.

I’m making this a two-parter. I will be talking about some theory and showing the builds that demonstrate Delta-Sigma properties and when you might want to use them.

Continue reading “Tearing into Delta Sigma ADC’s”

Hackaday Prize Entry: Open Source FFT Spectrum Analyzer

Every machine has its own way of communicating with its operator. Some send status emails, some illuminate, but most of them vibrate and make noise. If it hums happily, that’s usually a good sign, but if it complains loudly, maintenance is overdue. [Ariel Quezada] wants to make sense of machine vibrations and draw conclusions about their overall mechanical condition from them. With his project, a 3-axis Open Source FFT Spectrum Analyzer, he is not only entering the Hackaday Prize 2016 but also the highly contested field of acoustic defect recognition.

For the hardware side of the spectrum analyzer, [Ariel] equipped an Arduino Nano with an ADXL335 accelerometer, which is able to pick up vibrations within a frequency range of 0 to 1600 Hz on the X and Y axes. A film container, equipped with a strong magnet for easy installation, serves as an enclosure for the sensor. The firmware [Ariel] wrote is an efficient piece of code that samples the analog signals from the accelerometer in a free-running loop at about 5000 Hz. It streams the digitized waveforms to a host computer over the serial port, where they are captured and stored by a Python script for further processing.
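
The exact wire format of that stream isn’t spelled out here, so the capture script below is only a sketch of the host side: it assumes the Nano prints one comma-separated X,Y reading per line, and the port name and baud rate are placeholders. [Ariel]’s actual script may well look different.

```python
import serial  # pyserial

# Assumptions: port, baud rate, and the "x,y" line format are placeholders.
PORT, BAUD = "/dev/ttyUSB0", 115200
N_SAMPLES = 50_000                     # roughly 10 s of data at ~5 kHz

with serial.Serial(PORT, BAUD, timeout=1) as ser, open("capture.csv", "w") as f:
    f.write("x,y\n")
    count = 0
    while count < N_SAMPLES:
        line = ser.readline().decode(errors="ignore").strip()
        if line.count(",") == 1:       # keep only well-formed "x,y" lines
            f.write(line + "\n")
            count += 1
```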

From there, another Python script filters the captured waveform, applies a window function, calculates the Fourier transform, and plots the spectrum. With the analyzer up and running, [Ariel] went on to test the device on a large bearing of an arbitrary rotating machine he had access to. A series of tests that involved adding eccentric weights to the rotating shaft shows that the analyzer already makes it possible to discriminate between different grades of imbalance.
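
A minimal version of that processing chain, working on the CSV file written by the capture sketch above (and standing in for [Ariel]’s actual script), could look like this:

```python
import numpy as np
import matplotlib.pyplot as plt

FS = 5000.0                                   # approximate sample rate in Hz

data = np.genfromtxt("capture.csv", delimiter=",", names=True)
x = data["x"] - np.mean(data["x"])            # strip the accelerometer's DC offset

window = np.hanning(len(x))                   # taper the record to reduce spectral leakage
spectrum = np.abs(np.fft.rfft(x * window))
freqs = np.fft.rfftfreq(len(x), d=1.0 / FS)

plt.semilogy(freqs, spectrum)
plt.xlabel("Frequency (Hz)")
plt.ylabel("Magnitude (arbitrary units)")
plt.title("Vibration spectrum, X axis")
plt.show()
```

An imbalance typically shows up as a peak at the shaft’s rotation frequency and its harmonics, which is why the eccentric-weight tests are easy to spot in the spectrum.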

1024 “Pixel” Sound Camera Treats Eyes to Real-Time Audio

A few years ago, [Artem] learned about ways to focus sound in an issue of Popular Mechanics. If sound can be focused, he reasoned, it could be focused onto a plane of microphones. Get enough microphones, and you have a ‘sound camera’, with each microphone a single pixel.

Movies and TV shows about comic books are now the height of culture, so a device using an array of microphones to produce an image isn’t an interesting demonstration of FFT, signal processing, and high-speed electronic design. It’s a Daredevil camera, and it’s one of the greatest builds we’ve ever seen.

[Artem]’s build log isn’t a step-by-step process on how to make a sound camera. Instead, he went through the entire process of building this array of microphones, and like all amazing builds, the first step never works. The first prototype was based on a flatbed scanner camera, simply a flatbed scanner in a lightproof box with a pinhole. The idea was that by scanning a microphone back and forth, using the pinhole as a ‘lens’, [Artem] could detect where a sound was coming from. He pulled out his scanner and a signal generator and ran the experiment. It didn’t work. The box was not soundproof, the inner chamber should have been anechoic, and even if it had worked, this camera would only be able to produce an image or two a minute.

8×8 microphone array (mics on opposite side) connected to Altera FPGA at the center

The idea sat on a shelf in [Artem]’s mind for a while, and along the way he learned about the FFT and how the gigantic Duga over-the-horizon radar actually worked. Math was the answer: by using the FFT to transform each microphone’s signal from up-and-down samples into buckets of frequency and intensity, he could build this camera.

That was the theory, anyway. Practicality has a way of getting in the way, and to build this gigantic sound camera he would need dozens of microphones, dozens of amplifiers, and a controller with enough analog pins, ADCs, and processing power to make sense of all of this.

This complexity collapsed when [Artem] realized there was an off-the-shelf part that was a perfect microphone camera pixel. MEMS microphones, like the kind found in smartphones, take analog sound and turn it into a digital signal. Feed this into a fast enough microcontroller, and you can perform FFT on the signal and repeat the same process on the next pixel. This was the answer, and the only thing left to do was to build a board with an array of microphones.
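
To make the ‘microphone as a pixel’ idea concrete, here is a small host-side Python illustration: take a short buffer of samples from every microphone, FFT each buffer, and let the energy in a band of interest set that pixel’s brightness. The buffer layout, sample rate, and band edges are invented for the example; the real build does this work in dedicated logic rather than on a PC.

```python
import numpy as np

def sound_frame(buffers, fs, f_lo, f_hi):
    """Turn per-microphone sample buffers into one image frame.

    buffers    : array shaped (rows, cols, n_samples) -- a made-up layout
    fs         : microphone sample rate in Hz
    f_lo, f_hi : band of interest; its energy becomes pixel brightness
    """
    n = buffers.shape[-1]
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    spectra = np.abs(np.fft.rfft(buffers, axis=-1))   # FFT every pixel at once
    frame = spectra[..., band].sum(axis=-1)           # band energy per microphone
    return frame / frame.max()                        # normalize for display

# 8x8 panel, 256 samples per microphone of random noise as a stand-in for real audio
demo = sound_frame(np.random.randn(8, 8, 256), fs=48_000, f_lo=1_000, f_hi=4_000)
print(demo.shape)                                     # (8, 8)
```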

[Artem]’s sound camera is constructed out of several modules, each consisting of an 8×8 array of MEMS microphones controlled by an FPGA. These individual modules can be chained together, and the ‘big build’ is a 32×32 array. After a few problems with manufacturing, the board actually worked. He was recording 64 channels of audio from a single panel. Turning on the FFT visualization and pointing it at a speaker revealed that yes, he had indeed made a sound camera.

The result is a terribly crude movie with blobs of color, but that’s the reality of a camera with only 32×32 resolution. Right now the sound camera works, the images are crude, and [Artem] has a few ideas of where to go next. A cheap PC is fast enough to record and process all the data, but now it’s an issue of bandwidth; thirty sound frames per second adds up to 64 Mbps of data. That’s doable, but it would need another FPGA implementation.

Is this sonic vision? Yes, in that technically the board works. No, in that the project is stalled and it’s expensive by any electronics hobbyist’s standards. Still, it’s one of the best builds to grace our front page.

[Thanks zakqwy for the tip!]

FPGA Powers Blazingly Fast LED Matrix Audio Visualizer

[Sam Miller], [Sahil Gupta], and [Mashrur Mohiuddin] worked together on a very fast LED matrix display for their final project in ECE 5760 at Cornell University.

Real time!

They started, as all good engineering students do, by finding a way to make their lives easier. [Sam] had built a 32×32 LED matrix for another class. So, they made three more and ended up with a larger and more impressive 64×64 LED display.

They claim their motivation was the love of music, but we have a suspicion that the true reason was the love all EEs share for unnaturally bright LEDs; just look at any appliance at night and try not to be blinded.

The brains of the display is an Altera DE2-115 FPGA board. The code is all pure Verilog; the FFT and LED control are implemented in hardware on the FPGA, none of that Altera core stuff. To generate images and patterns they wrote a series of Python scripts. But for us it’s the particle test shown in the video below that really turns our heads. The system is capable of tracking and reacting to a lot of different elements on the fly while scanning the display at about 310 FPS. They have tested display scanning at twice that speed, but some screen-wrap artifacts need to be worked out before that’s ready for prime time.
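
The heavy lifting here happens in Verilog on the FPGA, so any Python is only an analogy, but a rough host-side sketch of the audio-to-pixels mapping makes the idea concrete: FFT a block of audio, group the bins into 64 columns, and light each column to a height set by its energy. This is purely illustrative and not the team’s code.

```python
import numpy as np

HEIGHT = WIDTH = 64

def spectrum_to_frame(samples):
    """Map one block of audio samples to a 64x64 bar-graph frame (1 = LED on)."""
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    # Group the FFT bins into WIDTH columns; a log keeps quiet bins visible
    columns = np.array_split(spectrum[1:], WIDTH)     # drop the DC bin
    heights = np.array([np.log1p(c.sum()) for c in columns])
    heights = (heights / heights.max() * (HEIGHT - 1)).astype(int)
    frame = np.zeros((HEIGHT, WIDTH), dtype=np.uint8)
    for col, h in enumerate(heights):
        frame[HEIGHT - 1 - h:, col] = 1               # fill the column from the bottom up
    return frame

frame = spectrum_to_frame(np.random.randn(1024))      # noise stands in for a music block
print(frame.sum(), "LEDs lit")
```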

The team has promised to upload all the code to GitHub, but it will likely be a while before the success hangover blows over and they can approach the project again. You can view a video interview and samples of the visualizations in the videos after the break.

Thanks to their professor, [Bruce Land], for submitting the tip! His students are always doing cool things. You can even watch some of his excellent courses online if you like; here’s one on the AVR microcontroller.

Continue reading “FPGA Powers Blazingly Fast LED Matrix Audio Visualizer”

A Better Spectrum Analyzer for your Rigol Scope

The Rigol DS1000 series of oscilloscopes are popular with hobbyists for good reason: they provide decent specs at a low price. However, their spectrum analysis abilities are lacking. While these scopes do have a Fast Fourier Transform (FFT) function, it’s limited and nearly useless for RF.

An FFT plotted by the PyDSA tool and a Rigol oscilloscope

[Rich] wanted a spectrum analyzer for amateur radio purposes, but didn’t want to build his own sampling hardware for it. Instead, he wrote PyDSA, a software spectrum analyzer for Rigol DS1000 oscilloscopes. This tool uses the USB connection on the scope to fetch samples and does the number crunching on a far more powerful PC. It’s able to plot a 16,000-point FFT at two sweeps per second when run on a decent computer.

PyDSA is a Python script that makes use of the Virtual Instrument Software Architecture (VISA) interface to control the scope and fetch the sample data. Fortunately, there are Python libraries that take care of the protocol.
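
With PyVISA (plus a VISA backend) installed, pulling a waveform off the scope and turning it into a spectrum only takes a few lines. The sketch below is illustrative rather than a piece of PyDSA: the SCPI commands, record lengths, and scaling vary between DS1000 sub-models, and PyDSA handles those details for the scopes it supports.

```python
import numpy as np
import pyvisa

rm = pyvisa.ResourceManager()
scope = rm.open_resource(rm.list_resources()[0])   # assumes the scope is the first VISA device
print(scope.query("*IDN?"))                        # sanity check: should name the Rigol model

# Fetch raw 8-bit samples from channel 1 -- the exact command differs by sub-model
raw = scope.query_binary_values(":WAV:DATA? CHAN1", datatype="B", container=np.array)
samples = (raw - 127.5) / 127.5                    # rough rescale; real scaling uses scope settings
window = np.hanning(len(samples))
spectrum_db = 20 * np.log10(np.abs(np.fft.rfft(samples * window)) + 1e-12)
```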

[Rich] is now able to use his scope to measure amateur radio signals, which makes a nice companion to his existing Teensy-based SDR project. If you have a Rigol, you can grab the source on GitHub and try it out.

Visualizing the Fourier Transform

If you do any electronics work–especially digital signal processing–you probably know that any signal can be decomposed into a bunch of sine waves. Conversely, you can generate any signal by adding up a bunch of sine waves. For example, consider a square wave. A square wave of frequency F can be made with a sine wave of frequency F along with all of its odd harmonics (that is, 3F, 5F, 7F, etc.). Of course, to get a perfect square wave, you need an infinite number of odd harmonics, but in practice only a few will do the job.
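
To make that concrete, here’s a quick numpy sketch that builds a square wave out of its odd harmonics and shows the approximation improving as more terms are added:

```python
import numpy as np

F = 1.0                                        # fundamental frequency in Hz
t = np.linspace(0, 2, 2000, endpoint=False)    # two periods of the waveform
ideal = np.sign(np.sin(2 * np.pi * F * t))     # the square wave we are aiming for

def square_from_harmonics(t, f, n_terms):
    """Sum the square wave's Fourier series: (4/pi) * sum over odd k of sin(2*pi*k*f*t)/k."""
    x = np.zeros_like(t)
    for k in range(1, 2 * n_terms, 2):         # odd harmonics k = 1, 3, 5, ...
        x += (4 / np.pi) * np.sin(2 * np.pi * k * f * t) / k
    return x

for n in (1, 3, 10, 100):
    approx = square_from_harmonics(t, F, n)
    err = np.sqrt(np.mean((approx - ideal) ** 2))
    print(f"{n:3d} odd harmonic(s): RMS error vs. ideal square wave = {err:.3f}")
```

The error never quite reaches zero with a finite number of terms (the ringing near the edges is the well-known Gibbs phenomenon), but a handful of harmonics already looks convincingly square.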

Like a lot of abstract concepts, the basic premise is easy to understand, and you could look up any of the mathematical algorithms that take a signal and perform a Fourier transform on it. But can you visualize why the transform works the way it does? If you can’t (or even if you can), you should check out [Mehmet’s] MATLAB visualization of harmonic circles. If you don’t have MATLAB yourself, you can always check out the video (see below).

Continue reading “Visualizing the Fourier Transform”