Agreeing By Disagreeing

While we were working on the podcast this week, Al Williams and I got into a debate about the utility of logic analyzers. (It’s Hackaday, after all.) He said they’re almost useless these days, and I maintained that they’re more useful than ever. When we got down to it, however, we were actually completely in agreement – it turns out that when we said “logic analyzer” we each had different machines, and use cases, in mind.

Al has a serious engineering background and a long career in his pocket. When he says “logic analyzer”, he’s thinking of a beast with a million probes that you could hook up to each and every data and address line in what would now be called a “retrocomputer”, giving you this god-like perspective on the entire system state. (Sounds yummy!) But now that modern CPUs are 64-bit, everything’s high-speed serial, and it’s all deeply integrated on the same chip anyway, such a monster machine is nearly useless.

Meanwhile, I’m a self-taught hacker type. When I say “logic analyzer”, I’m thinking maybe 8 or 16 signals, and I’m thinking of debugging the communications between a microcontroller and an IMU, or maybe a QSPI flash chip. Heck, sometimes I’ll even break out a couple of pins on the micro for state. And with the proliferation of easy and cheap modules, plus the need to debug and reverse commodity electronics, these logic analyzers have never been more useful.

So in the end, it was a simple misunderstanding – a result of our different backgrounds. His logic analyzers were extinct or out of my price range, and totally off my radar. And he thinks of my logic analyzer as a “simple serial analyzer”. (Ouch! But since when are 8 signals “serial”?)

And in the end, we both absolutely agreed that great open-source software is what has made the modern logic analyzers as useful as they are, and that the lack thereof is partially responsible for the demise of the old beasts. Well, that and the fact that he needed a lab cart to carry around what I can slip in my pocket today. Take that!

35 thoughts on “Agreeing By Disagreeing”

    1. I was there and knew some of the guys in the book. My group (Comms and Networking) was under West. I worked at DG for 14 years, and in all that time only worked a few nights or weekends. When that did happen, it was because the schedules were overly optimistic or there were serious technical problems. Any company that expects night and weekend work as a regular thing is in trouble.

  1. Now that we’ve heard one side of the story, do we get to hear it from the other side?
    I, personally, wonder what he thinks about those pocket analyzers’ complex-triggering ability [or lack thereof?]… for… uhm… reasons.

    1. There is no need to hear the other side. People who think logic analyzers are obsolete are either short-sighted or cross-eyed. Even Keysight still makes new logic analyzers, and with “4Gb/s and 136 channels per module” I guess they’re not going to be within the average hacker’s budget, but I’m sure there are markets where these things are indispensable.

      1. Those kinds of logic analyzers are invaluable for DRAM debugging, which is really the only wide parallel bus in use today. Unfortunately 4Gb/s is barely enough to cover DDR5.

        For the SerDes buses, like PCIe or Ethernet *GMII, it depends on how far up the protocol stack you’re debugging. If you’re having link level issues, you’ll need a six-figure scope with diff probes to check SI and tune equalization, ideally one with a decoder for your protocol. If the issue is farther up the stack, a similarly expensive protocol analyzer is the new logic analyzer. In both cases, the buses are high enough speed that you’re going to have a bad time if you didn’t design your hardware with appropriate test points and midbus probe sites.

        But even large systems still have tons of applications where a Saleae or other 100MHz 8-16 channel logic analyzer is helpful. Need to check that the 10 or so separate resets and rails are sequencing correctly? Put a probe on each of the power good signals and the resets. Is your VRM not going to the right voltage when the CPU commands it over SVI? Use the logic analyzer to see what’s being commanded over the bus. Firmware not executing? Hook the analyzer to the SPI boot ROM and trace the firmware fetches.

  2. I used logic analyzers back then as well, including the ones that could halt on a particular address or other clue and show the instructions being executed, and all that good stuff. They were invaluable for finding out what bugs were doing, and quickly. Leasing an HP analyzer during development paid for itself many times over.

    Today, with JTAG or other serial debug protocols on everything, GDB under Linux, and all that, it is a much different landscape. However, I do not find ANY of the software GUIs and software-based logic capture devices for PCs to be as easy to set up and use as the HP. (I have the same problem with my Rigol DS1054 – horrible UI.)

    Here is my desk at Information Appliance, 1980-something. https://digibarn.com/friends/jef-raskin/slides/canon-cat/A%20-669%20CHARLESO.JPG

    1. I cut my teeth on an HP 16500C tracing embedded CPU code execution. That, or maybe the (Agilent) 16702B, which had a full-fledged PA-RISC workstation inside, was the pinnacle of logic analyzers.

      However, I now use Saleae extensively. Although it doesn’t have the infinitely programmable triggers of the HP analyzers, it’s very easy to use. If I can’t get it to do what I want, I can easily export the data and write my own post-processor in Python. The infinite-depth capture capability makes it possible to capture everything and sort it out later, something the HPs couldn’t do.
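
      For example, a toy post-processor over an exported capture might look like the sketch below. It just measures high-pulse widths on one channel; the file name and column headers (“Time [s]”, “Channel 0”, capture.csv) are placeholders for whatever your own export actually contains.

      ```python
      # Toy post-processor for a logic-analyzer CSV export.
      # Assumes a time column plus one 0/1 column per channel -- adjust the
      # header names to match your own export.
      import csv

      def pulse_widths(path, channel="Channel 0"):
          """Return (start_time, width) in seconds for every high pulse on one channel."""
          widths, rise = [], None
          with open(path, newline="") as f:
              for row in csv.DictReader(f):
                  t, level = float(row["Time [s]"]), int(row[channel])
                  if level and rise is None:            # rising edge
                      rise = t
                  elif not level and rise is not None:  # falling edge
                      widths.append((rise, t - rise))
                      rise = None
          return widths

      for start, width in pulse_widths("capture.csv"):
          print(f"{start:.6f} s  {width * 1e6:.2f} us")
      ```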

    2. Tektronix 7D01/DF1 Logic Analyser. That was when minicomputers were starting to be built with microcode and the first of the 16-bit microcomputers were appearing. No more console switches, just a monitor/boot ROM of some kind. If a CPU board appeared dead, monitor the program counter and hit the reset button. Usually the code missed a jump, which indicated a faulty input line/chip. Scope the chip and replace it. Problem solved.

  3. I have used large logic analyzers from HP and Tektronix, but they were tricky to set up and often it was difficult to interpret the information. I bought a Saleae as soon as it was available. The ability to decode information into hex or ASCII makes it a must-have tool in my lab. The price was right, too. I’ve seen many older LAs on eBay, but often the seller doesn’t have the cables, or you need a “mainframe” of some sort to hold the LA module.

  4. “His logic analyzers were extinct or out of my price range, and totally off my radar.”

    A cheap FPGA and the built-in logic analyzer cores will, for the most part, blow the doors off of most low-to-mid logic analyzers for a fraction of the price.

    1. I just had this come up last weekend myself. I have an 8-channel USB and an 8-channel handheld analyzer, which these days are my go-to staples.
      A discussion about making a new hardware add-on for an older computer (an Apple IIe, so yeah, I guess a “retrocomputer”) marks the first time in about 20 years that I’ve wished I had a 32-channel analyzer, and whoo boy, is the selection small and expensive!

      I pondered how feasible making one might be, using an RP2040, or perhaps looking into what the latest ESP32-S3s are up to these days. (I only didn’t consider an FPGA because I don’t have any lying around.)
      I don’t know if this use case counts as “mid range”, but it does seem to be the cheapest and most flexible option, at least until you get up into crazy-fast sample rates.
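
      If I do try the RP2040 route, something like this MicroPython/PIO toy is probably where I’d start. It’s only a sketch, not an analyzer: the GP0–GP7 pins and the 10 MHz rate are arbitrary choices, and without DMA the RX FIFO fills far faster than Python can drain it, so the capture stalls almost immediately.

      ```python
      # MicroPython on an RP2040: sample 8 pins (GP0-GP7 here, purely illustrative)
      # once per state-machine clock with a one-instruction PIO program.
      import rp2
      from machine import Pin

      @rp2.asm_pio(autopush=True, push_thresh=8, in_shiftdir=rp2.PIO.SHIFT_LEFT)
      def sample8():
          in_(pins, 8)  # shift 8 pin states into the ISR; autopush sends them to the RX FIFO

      sm = rp2.StateMachine(0, sample8, freq=10_000_000, in_base=Pin(0))
      sm.active(1)

      # Blocking reads; a real capture would use DMA rather than draining the FIFO in Python.
      samples = [sm.get() & 0xFF for _ in range(1024)]
      print(samples[:16])
      ```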

      1. I’d call that low range. There are RP2040-based logic analyzer designs out there, claiming 100 MHz. Pretty much any major FPGA out there should hit 200 MHz easily and 400 MHz without breaking a sweat using DDR capture.

        The three main benefits of using an FPGA are that

        1) for the major ones (Xilinx/AMD, Altera/Intel, Lattice) the software’s basically built for you already. I mean, they’re not *fantastic* or anything but they’re very usable – and they’ve got stuff like built-in graphing as well.

        2) you can just use the FPGA itself to do protocol decoding, complicated triggers, and data selection.

        3) “channel counts” are absurd. I mean, those silly EBAZ4205 ex-bitcoin miners from China will get you 60+ channels.

    1. I am a physicist. I think we can agree we are scientists.

      During my degree I encountered no fewer than four different definitions of the Fourier transform. And in my five-year stint in biochemistry I encountered no shortage of unclear and diverse terminology.
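
      (For the curious, the four usual suspects differ mainly in where the 2π ends up; roughly, they look like this:)

      ```latex
      % Four common conventions for the Fourier transform -- same machinery, four "definitions".
      \begin{align*}
      \hat{f}(\xi)    &= \int_{-\infty}^{\infty} f(x)\, e^{-2\pi i x \xi}\, dx
        && \text{(ordinary frequency, unitary)}\\
      \hat{f}(\omega) &= \int_{-\infty}^{\infty} f(x)\, e^{-i\omega x}\, dx
        && \text{(angular frequency, $2\pi$ in the inverse)}\\
      \hat{f}(\omega) &= \tfrac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} f(x)\, e^{-i\omega x}\, dx
        && \text{(angular frequency, unitary)}\\
      \hat{f}(\omega) &= \tfrac{1}{2\pi} \int_{-\infty}^{\infty} f(x)\, e^{-i\omega x}\, dx
        && \text{($2\pi$ on the forward transform)}
      \end{align*}
      ```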

      Plus, [Eliott] is an economist. And I am not sure, but [Al] sounds like an EEE.

        1. I don’t think mix-ups like this happen only in your field; similar ones occur in all technical fields. You learn to just live with it. I was at a conference in NYC a few years back, chatting with a few newly made friends. We were discussing an issue in communications at a specific technical level. After participating for a few minutes, it seemed that the conversation was going off track. So I asked, “Wait. My definition of xxx is this…, what is yours?” Well, all 8 people had different definitions that could each be applied, just not in that conversation. We all laughed over it, and then continued on the agreed-to path. Just my step-back-and-look…

  5. The trouble with the stuff from the “big kids” (HP, Tek), besides being very expensive, is that their big corporate marketing people ruined the products by making them modular and forcing you to buy many costly add-ons to have a useful system. And that went for disassemblers too. Everything was a closed system, and you were limited to whatever disassemblers they chose to supply.

    Then the affordable but underpowered “software logic analyzers” came along a decade or so ago. Trouble is, many of these were more like “data loggers” feeding some nice-looking software. These were logic analyzers designed by people who never used a real analyzer. They could present beautiful (and colorful) timing diagrams… that were both deceptive and incorrect. Poor triggering abilities, and no glitch detection.

    For example, if you were investigating whether the WE# pulse was occurring too soon and preceding CE#, you’d never catch it. Both edges were sampled on a too-slow logging clock and drawn (presented to the user) as coincident… just like in the pretty picture in the databook, ultimately disguising their true relationship.
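
    To put numbers on it, here’s a toy illustration (plain Python, with made-up edge times of 210 ns and 230 ns): sampled every 10 ns the 20 ns skew is obvious, but sampled every 100 ns both edges land in the same bucket and come out looking coincident.

    ```python
    # Toy model of undersampling: a 20 ns skew between WE# and CE# disappears
    # when both signals are logged on a 100 ns sample clock.
    def sampled(edge_ns, period_ns, n=40):
        """Signal is high before edge_ns and low after, sampled every period_ns."""
        return [1 if i * period_ns < edge_ns else 0 for i in range(n)]

    def first_falling_edge(samples, period_ns):
        """Time of the first sampled 1 -> 0 transition, or None if there isn't one."""
        for i in range(1, len(samples)):
            if samples[i - 1] == 1 and samples[i] == 0:
                return i * period_ns
        return None

    for period in (10, 100):  # fast vs. slow sample clock, in ns
        we = first_falling_edge(sampled(210, period), period)  # WE# really falls at 210 ns
        ce = first_falling_edge(sampled(230, period), period)  # CE# really falls at 230 ns
        print(f"{period:>3} ns sampling: WE# seen at {we} ns, CE# seen at {ce} ns")
    ```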

    The FPGA stuff has the speed and ability to correct many of these shortcomings, but once again it lacks support for anything beyond the basics that the limited staff of these small companies can supply. We need good hardware and decent software, and ultimately open source (hardware and software), so end users can extend the systems to fill in the gaps.

  6. Reminds me of when I was at a seminar and chatting with some folks. We were discussing a topic, which escapes me now, but all of a sudden it sounded funny, so I said, “My definition of XX is this, what’s yours?” We wound up with 7 different definitions for the same thing. It happens.

  7. Why do Americans/Brits say ‘heck’ instead of ‘hell’ these days? I mean, it means the same thing, right? And ‘hell’ isn’t in the original language of the religious texts either, so that can’t be any part of the argument.
    And places like YouTube specifically say you can say hell all day without any effect on your monetization or anything.

    Also/alternatively: Why do I expect logic or common sense from religious folks?

    1. Lemme put it this way, regardless of religion, we still share a language… The use of words has meaning.
      You know the story of ‘The Boy Who Cried Wolf’? Now, imagine an entire society continually ‘crying wolf’ by calling things like going through the line at the DMV ‘Hell’ [pretty much the strongest word in our language describing a horrible experience].
      Now, imagine going through an excruciating headache that makes you stumble, slur your speech, lack words, and see spots. Now imagine going to the ER and trying to be taken seriously amongst all the ‘wolf criers’ in the waiting line who say on a scale of 1-10 their 3 is 100, and that they’re ‘going through Hell.’
      Words *should* have meaning, regardless of religion. Our society has done so much ‘crying wolf’ as to destroy meaning where meaning is necessary. Maybe it’s FINALLY getting the picture?

  8. So Al Williams is simply wrong. A logic analyzer is a device that can capture logic signals.
    Logic signals still exist and still need to be captured to be analyzed. Especially in embedded devices.
    I love my Saleae. I use it for high-speed DDR OctoSPI, medium-speed SPI/I2C, and low-speed UART signals, for GPIO pins toggled in interrupts, and on parallel signals. I wouldn’t call that a “simple serial analyzer”, as it can measure precise bit and byte timing and capture parallel data.
    I wrote my own decoder (HLA, high-level analyzer) for our proprietary serial protocols, and the same Python code can also do encoding and is used to emulate devices on the PC over a USB-serial cable (the general shape of an HLA is sketched below). This way we can quickly analyze and replay faults in our machines.
    tl;dr Logic signals still exist and we need logic analyzers to see them.
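
    For anyone curious, the general shape of such an HLA looks roughly like the skeleton below – not our actual decoder, just the outline of Saleae’s Logic 2 extension API with illustrative frame and field names; check their docs for the exact details.

    ```python
    # Skeleton of a Saleae Logic 2 high-level analyzer (HLA). The "field"/"value"
    # names are illustrative; a real decoder accumulates bytes into protocol frames.
    from saleae.analyzers import HighLevelAnalyzer, AnalyzerFrame

    class MyProtocolHla(HighLevelAnalyzer):
        # How decoded frames are labelled in the capture view.
        result_types = {
            "field": {"format": "value: {{data.value}}"}
        }

        def decode(self, frame: AnalyzerFrame):
            # The input analyzer (e.g. Async Serial) delivers raw bytes in frame.data["data"].
            return AnalyzerFrame("field", frame.start_time, frame.end_time,
                                 {"value": frame.data["data"]})
    ```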
