Learning Obsolete Technology

Tom Nardi and I were talking about his trip to the Vintage Computer Festival on the podcast, and he admitted to not having been a retrocomputer aficionado before his first trip. But he ended up keying some binary machine code into some collection of archaic silicon, and he got it. In the same episode, the sound of the week was a Strowger switch, the electromechanical “brain” of telephone switching centers of old. The sample I used was from Sam of Look Mum No Computer on YouTube, who got one for his museum and thinks it’s just awesome.

Why do people like this kind of old (obsolete?) tech? It’s certainly not because it’s overwhelmingly capable — the giant old switch is replaced easily by a stack of silicon, and don’t even get me started on the old blinkenlights computer that Tom was keying on. In both of these cases, the people are significantly younger than the tech they’re playing around with, so that rules out nostalgia. What’s left?

I think it’s that sometimes the older technology is more immediate, more understandable, more tangible, and that resonates with people. In a time when we all have wonder devices that can do anything, programmed in languages that are pleasant, using libraries that are nothing short of magical in terms of making difficult things easy, understanding how things work down to the ground is a rare commodity.

But it’s a strange position to find ourselves in, technologically, where there’s almost necessarily a trade-off between the usefulness and functionality of a device and the ability to understand, fundamentally, how it works.

33 thoughts on “Learning Obsolete Technology”

    1. If you’re a coder, there’s an appeal to learning the layers underlying your previous knowledge. Because these machines are built into the current machines, the same way human bodies are built from single-celled organisms.

      1. So true. It’s almost a dying art in some respects. There’s a zen in building from the ground up (sometimes, at least), even if it’s the least efficient route to the end goal.

  1. For me, the thing with old tech is that it isn’t abstraction on top of abstraction, where you can’t know how the little box of magic does its stuff; in many cases there are proprietary layers in the abstraction stack too, so the actual underlying functions are entirely opaque.

    Digging through an API to link a, b, and c creates a working d, but it doesn’t really give you any real understanding of HOW d happens, or any ability to make d happen if the API isn’t available and you want to work with the hardware directly.

    Plus, even with that mythical thing of entirely open hardware, modern silicon with its web of co-processors is so complex that you just can’t know how each and every internal part of the whole links up; it’s too much for most if not all of us to really keep track of. The old-school stuff is simple enough that you could pretend to be the computer and complete a processor cycle on paper, albeit slowly.
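    To see what “being the computer” feels like, here’s a toy fetch-decode-execute loop in C. The four opcodes and the memory layout are invented for illustration, but the cycle is the same one you’d trace on paper:

    ```c
    #include <stdio.h>
    #include <stdint.h>

    /* Toy opcodes, each followed by one address operand. */
    enum { HALT = 0, LOAD = 1, ADD = 2, STORE = 3 };

    int main(void) {
        /* Program: load mem[7], add mem[8], store to mem[9], halt. */
        uint8_t mem[16] = { LOAD, 7, ADD, 8, STORE, 9, HALT, 40, 2, 0 };
        uint8_t pc = 0, acc = 0;
        int running = 1;

        while (running) {
            uint8_t op = mem[pc++];                    /* fetch */
            switch (op) {                              /* decode + execute */
            case LOAD:  acc = mem[mem[pc++]];  break;
            case ADD:   acc += mem[mem[pc++]]; break;
            case STORE: mem[mem[pc++]] = acc;  break;
            default:    running = 0;           break;  /* HALT or unknown opcode */
            }
        }
        printf("mem[9] = %d\n", mem[9]);               /* prints 42 */
        return 0;
    }
    ```

    You can run the whole thing in your head, one fetch at a time, which is exactly the point.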

    1. I agree. It was relatively simple in concept. Take DOS on the ol’ PC: it was OS -> hardware. You could interface directly with the hardware (processor, graphics card, etc.) as desired. No complicated memory access; you just wrote to and read from addresses as you pleased (see the sketch after this comment). When your program ran, it was all by itself on the hardware. You had to dig a bit and understand what was going on… none of the ‘abstraction’ layers found in OSes today, nor multitasking to speak of unless you wrote it yourself. It was ‘fun’ and still is… which is why I like programming the RP2040 and such. Or even Ultibo for the RPi gets you ‘closer’ to the hardware… and of course other SBCs and Arduino-style boards. You have ‘control’ (mostly) without the ‘abstractions’.

      Of course, on the flip side, the standardization of abstraction layers makes the programmer’s life easier; he or she can be a bit more productive, and more complicated things can be accomplished without the ‘distractions’ of what is underneath. And it’s more secure, as most computers are ‘connected’ nowadays.
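      To make “you just wrote to and read from addresses” concrete, here’s a minimal sketch in DOS-era C (Turbo C / Borland style; MK_FP and the far keyword are 16-bit DOS conventions, and B800:0000 is the standard VGA color text-mode buffer):

      ```c
      #include <dos.h>   /* MK_FP() in Turbo C / Borland C */

      int main(void) {
          /* Each text-mode cell is a character byte followed by
             an attribute byte, starting at segment B800. */
          unsigned char far *video = (unsigned char far *)MK_FP(0xB800, 0);

          video[0] = 'A';    /* character in the top-left corner */
          video[1] = 0x1F;   /* attribute: bright white on blue */
          return 0;
      }
      ```

      No driver, no API, no permission checks: the screen is just memory.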

      1. You’re still way too high up the stack.

        Why not a computer where you enter binary by switches? At least my KIM-1 offered a few useful routines, and a monitor.

        But neither was particularly useful, other than for learning about computers.

        Operating systems came later.

        1. You’re missing an intermediate step, I believe.
          Remember the original Tron movie? There were so-called “monitor” programs: kind of a cross between simple firmware, a debugger, and a loader routine. The 8-bit Sharp MZ series had them in ROM, for example.

        2. LOL. I do have O.G.’s PDP-11/70 front panel for that… I think, at my age, I am ‘past’ entering a program with switches. Trying to just enter the boot loader sequence into the PDP is enough for me…

          1. My point was “where do you stop?” The post I replied to thought MS-DOS was the best scenario. But it’s still an operating system.

            By the time I got an Apple II in the early nineties, I didn’t have the stamina to get good at it, and it offered nothing that I couldn’t do better with other computers.

            If we spend all our time going for simple, where does that leave the present, and the future?

            There was a recent post about using CP/M. Why? It wasn’t about learning, it was about using. An inferior OS, and no connectivity. I lived through that (well, I skipped CP/M and used Microware OS-9, which was more advanced), and I don’t want to go back. There’s nothing educational about using C on a 2 MHz computer with two 5.25″ floppy drives, except to learn how agonizingly slow it is. I did that in 1988, and promptly gave up on C.

            Ham radio is doing the same thing, with some loud people fixated on the 1930s, as if nothing has changed. Go back fifty years and there were big advances, but those are too complicated and hard.

            I’d argue some of the problem isn’t that current technology is “too complicated”, but that the world has moved on. 47 years ago, we understood computers because there were magazines and books explaining them. Now the focus is on using. Same with ham radio: the technical stuff is elsewhere, so one has to go looking for detail about direct digital synthesizers and SDRs. PLL synthesizers weren’t hard for me to understand fifty years ago when I was 12, but if they’re not being explained in simple terms on a regular basis in a magazine, of course you’ll stick with analog VFOs.

            What we get is a lot of explanation of the simple stuff, rather than about the advanced stuff.

        3. Define useful – that phone-switching mechanism and the 8-bit (maybe 32-bit) Z80-style computers are all very capable of doing something useful. In the case of the phone switch, it can only take a pulsed input and use it to connect two lines, but with your early, simpler computers you can do anything you can do on a modern computer. The only meaningful thing a modern computer does at the hardware level that the older ones do not is run really fast (yes, they have more ASIC elements inside them to further aid running things really fast, but really it’s the high gate-switching clock speed and being 64-bit that is the key differential).

          Which is where something like DOS becomes a good example: it’s a great initial framework to build your bespoke program on, with direct enough access to the hardware and simple, understandable abstractions to make your life easier and the code more portable. Very few programs that exist now don’t have an equivalent in DOS. But it’s still possible to actually comprehend every step of how it’s working, which then lets you do things like be sure it’s secure, or bend the way it works to a new task efficiently.

          You flat out can’t do that on a modern system, even a stupidly slow one that is entirely open source. Nobody is able to take in and comprehend all the many, many abstraction layers. You end up being the UEFI/BIOS specialist, or the person who trusts that the BIOS level works as it should and works on the low-level driver on the device (one the OS likely doesn’t even know exists, as it’s the onboard firmware), or on the boot loader, the kernel power-management interface, a device driver of the sort an OS does interface with (which likely depends on other device drivers, given these fancy bus architectures), or perhaps the microcode or management-engine type stuff.

          By the time you get to a userspace program, you have absolutely no idea how it works, or whether it’s only doing what it is supposed to be doing. Debugging is often the nearest thing to impossible: once you figure out the error isn’t in your code but in one of the many, many layers it’s passing through, quite likely proprietary ones, you are at the mercy of others.

          It is not that there are no good explanations for how things work in a modern system; usually there are (though with some proprietary secrets in the mix too). It’s that you have to translate through 8, 9, who knows how many different sets of explanations for each of the various layers. In effect, you need to fluently speak that many more languages full of odd, very technical minutiae to be able to comprehend why something is failing, or taking a very long time, or is not secure.

          It really is that current tech is “too complicated”, and in many ways there isn’t a need for it to be so. With the switching-speed increases and really good, efficient compilers, you can do stuff on the Arduino’s little ATmega chips or the Pi Pico’s RP2040 that supercomputers not all that long ago couldn’t do at the same speed, and better/faster/more reliably than you can on your brand-new monster Ryzen or Intel 12th-gen CPU, as you don’t have to waste time and power going through all those abstractions before you get to the bare-metal hardware! All the many, many abstractions do is allow sloppier practices to function, and let more companies put out their own brand of whatever that works just differently enough to be legally distinct, and thus needs lots of extra help to match whatever your highest-level OS elements expect.
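          As a taste of what “no abstractions” looks like on those little chips, here’s a minimal bare-metal blink sketch for an ATmega328P (the classic Arduino chip), written against avr-libc with direct register writes; the 16 MHz clock is an assumption matching an Uno-style board:

          ```c
          #define F_CPU 16000000UL   /* assumed 16 MHz system clock */
          #include <avr/io.h>
          #include <util/delay.h>

          int main(void) {
              DDRB |= _BV(DDB5);       /* PB5 (Arduino pin 13) as output */
              for (;;) {
                  PORTB ^= _BV(PB5);   /* toggle the LED */
                  _delay_ms(500);      /* busy-wait half a second */
              }
          }
          ```

          Two register writes and a delay; nothing sits between you and the pin.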

  2. “Why do people like this kind of old (obsolete?) tech?”

    I think you sense the wonder and adventure of the people who, at the time, put some of their souls into the machines.

  3. Some time ago, I went to see the silent film “Phantom of the Opera” at Davies Symphony Hall, with an organist playing the score and providing a few foley effects. In essence, it was a reproduction of what a night at the cinema in the 1920s would have been.

    The biggest surprise to me was that there was a brief bit of full-color footage included; even more impressive was that some of the frames were painstakingly hand-tinted and others had a color wash applied.

    Most of us born after a certain point remember early cinema only as monochrome, but that’s purely an artifact of monochrome television, where such color effects would have been impossible. This sort of thing was even happening in the 1940s – Jacques Tati’s film “Jour de Fête” was shot on monochrome film, but they hand-tinted the French flag, and this is preserved in the Criterion Collection DVD.

    Early films did have some use of color. It just wound up being a technological cul-de-sac that almost nobody remembers today.

  4. I just LOVE all that old technology and science. I have a collection of old cameras, a device for self-administered electrotherapy that still works, and many old books that targeted the DIY crowd, some digitized, some more traditional. Some of them are in my native language, Polish, and were written when Poland was a communist country under Soviet influence, so basically we had almost nothing. For example, I have chemistry books for kids and teens showing how to prepare hard-to-get chemicals from more common ones, and how to perform (sometimes dangerous) experiments with them. From my first electronics book I learned about protons, electrons, and Lenin, and also how vacuum tubes and transistors work. I don’t own it, but I read a book that showed how to build model steam engines, make an entire model train set (including making powered tracks from strips of sheet metal, with a simple jig for bending them), and even a basic washing machine using an old barrel, some other parts, and an electric motor. I also have some English books from the 19th and early 20th centuries that cover everything from math and geometry aids to scientific instruments and clocks. My dream is to build some of this stuff, but I lack space, time, and money, at least for now…

  5. Arthur C. Clarke famously said, “Any sufficiently advanced technology is indistinguishable from magic”.

    When people don’t understand the technological underpinnings, technology becomes “magic”. People don’t understand it, so they make bad decisions about its proper use. Advances become difficult or impossible, since no one can see how to fix or improve it. A technological “priesthood” can form to monopolize it for its own gain.

    Technology needs to be more like a ladder that must be climbed to be appreciated. To truly learn and understand something, you have to start with the basics. Earlier, simpler tech is the way to learn how things work.

    To paraphrase an old saying, “He who does not learn a technology is doomed to misuse it.”

      1. Ship has sailed.

        Some of the world’s largest companies use Javascript on the server. Huge steaming piles of Javascript.
        Not in a browser, on the server, where they had a choice. One tool, and the whole world’s a nail.

  6. Leaving the fundamentals of a technology behind for abstractions is part of the evolution of every technology. Shakespeare did not know the etymology of all the words he used, or all the mechanics of how English was built. He would have been either less prolific or less good as a result of chasing some mythic version of expert knowledge. Abstraction is the act of standing on the shoulders of giants: leveraging the work of those who came before, who did great work (with the limitations they had) that allows us to build on top of it.

    If your goal is recreation and you enjoy experiencing old technology because it is fun to you, that is one thing, but if your goal is productivity and building something technically better, this feeling of needing to “understand the whole stack” is counterproductive. Turing wouldn’t still be using a Turing machine if he were alive today.

    1. “Leaving the fundamentals of a technology behind for abstractions is part of the evolution of every technology.”

      Is that really ‘progress’, though? If a society loses the technological skills of its forefathers? The ability to understand and fix the underlying mechanisms? In just about any sci-fi novel there’s that one race/society that used to be great and flourishing once, but forgot the basics and began to fall apart and degenerate. Personally, I think that complexity has nothing to do with ingenuity per se. Just think of native tribes: they seem primitive at first glance, but carry the wisdom of thousands of years. They have skills in agriculture and medicine, can build living bridges from tree roots, etc.

      1. You’ve made comments about tubes being better. But when I said something about older books, you disagreed.

        There’s a difference between understanding the past and living in it. Charles Kitchin went back to the twenties and researched regen and superregen receivers. The result wasn’t another tube project; he built solid-state receivers. And instead of the primitive receivers from back then (they were primitive not just because they were regens and superregens, but because they had as few components as possible), he added a buffer between the antenna and detector, and voltage regulation: extra circuitry that added little to the cost or size in the solid-state era. He also played with the waveform of the quench oscillator in superregens, claiming narrower operation. Superregens had become a black box, with no real details except “they are wide”. So, endless articles about hi-Q circuitry, but never dealing with what amounts to a modulated oscillator.

        Lots can be learned from early tutorials about SSB. Later, there was less detail, because it was commonplace. Though if you look at the in-between years, AM is kept separate from SSB, as if they were two completely different modes. The ARRL Handbook only in the past ten or twenty years tried to consolidate them. This isn’t about building tube SSB rigs, though that 75-metre transceiver in the 1971 Handbook shows how to do a rig without bilateral stages or lots of switching.

        Some complain that SDR is too complicated, a cheat because it uses a computer. But what’s an SDR but a phasing rig with the audio phasing done digitally (see the sketch after this comment)? People who bypassed theory see things differently.

        What I keep seeing is people wanting to go back in time to “when it was simpler”. They aren’t talking about learning from the past, but living in the past. A simpler time of one-tube receivers. I still can’t grasp why people built up a culture of tubes being simpler. Likewise, people want to run old computers rather than learn from them, even though the rhetoric is that early computers got you closer to the metal.
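        Since the phasing point above is easy to show in code, here’s a minimal sketch in C, assuming you already have baseband I/Q samples from an SDR front end. It approximates the 90-degree audio phase shift with a windowed FIR Hilbert transformer; the tap count and the sign convention for which sideband survives are assumptions:

        ```c
        #include <math.h>

        #define NTAPS 65            /* odd-length FIR Hilbert approximation */
        #define DELAY (NTAPS / 2)   /* matching delay for the I channel */

        /* Phasing-method SSB demodulation: out = delayed(I) - hilbert(Q)
           passes one sideband and cancels the other (flip the sign of
           the subtraction to select the opposite sideband). */
        void ssb_demod(const float *i_in, const float *q_in, float *out, int n)
        {
            float h[NTAPS];
            for (int k = 0; k < NTAPS; k++) {   /* windowed ideal Hilbert taps */
                int m = k - DELAY;
                double ideal = (m % 2) ? 2.0 / (M_PI * m) : 0.0;
                double win = 0.54 - 0.46 * cos(2.0 * M_PI * k / (NTAPS - 1));
                h[k] = (float)(ideal * win);
            }
            for (int t = 0; t < n; t++) {
                float hq = 0.0f;                /* FIR Hilbert transform of Q */
                for (int k = 0; k < NTAPS; k++)
                    if (t - k >= 0) hq += h[k] * q_in[t - k];
                float i_del = (t >= DELAY) ? i_in[t - DELAY] : 0.0f;
                out[t] = i_del - hq;            /* sideband selection */
            }
        }
        ```

        It’s the old phasing architecture, with the audio phase-shift network replaced by arithmetic.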

        1. “You’ve made comments about tubes being better. But when I said something about older books, you disagreed.”

          Ah, was it about TV sets and old TV books?
          – If so, yes, I remember saying that the old books may lack some information. However, I didn’t mean to say they’re useless.

          Rather, the contrary is the case, I think. Early books, say from the crystal radio days, were written with practical people in mind; they had beautifully drawn pictures and ‘simple’, but true, descriptions.

          What I meant to say: old TV books from the ’50s/’60s lacked the improvements that came later.
          In the 1970s/80s, the same old CRTs were still in use and in production, but sometimes coupled with newer circuit designs, too: say, 12 V power supplies, comb filters, or plastic chassis. The old books do contain all the basic circuits, but maybe not MOSFETs or other new paradigms for driving a CRT. Things like PWM (pulse-width modulation) didn’t really exist in ’50s books, for example. Anyway, these are just hypothetical scenarios.

          Personally, I think that new and old is no contradiction. I’ve used an EF95 tube and a 4 MHz crystal to build a receiver for Digital Radio Mondiale. It worked fine as an RF front end for a soundcard; the 7 kHz IF was enough for the Dream software to decode RTL Radio. The same thing didn’t work well with a bipolar transistor, because it couldn’t handle the strong signal coming from the longwire antenna. The simple tube, however, could.

          That being said, I still think that a technology should only be used if it can be understood. Without basic knowledge of how things work, it’s us that’s inferior. This is a tragic issue that applies to humanity as a whole, I think. It would be favorable if our societies were at least roughly on par with our technology. In reality, I’m afraid, they’re far behind. For the moment, at least.

          Don’t get me wrong, I’m a philanthropic person and have hopes, but some comments and mindsets on the internet simply depress me. “With great power comes great responsibility” is an old saying perhaps, but it still holds some truth, I think. Using technology carelessly without knowing how it works can be dangerous. It’s like playing with fire.

          Of course, I’m not saying that artists and kids shouldn’t be allowed to make their own playful discoveries with technology. It’s just better if someone experienced is still around that can help and fix things up in case everything goes wrong.

          Or, long story short: let’s not make ‘Idiocracy’ come true.

  7. Whenever I’m working with components with date codes that precede my birth year, I get this unique feeling that’s hard to put into words. I’m interacting with something that existed before I was even an idea. The world was turning before me and people “figured it all out” before the internet was a thing. There’s a cool cross-generational connection when you interface with hardware from years gone by.
    The more you work with vintage components, the more you can begin to understand the “why’s” and “how’s” of our modern, abstracted technological landscape. Every great idea started somewhere and it’s really cool to peer into the engineering zeitgeist of previous eras.

  8. Limitations of more primitive computers, paradoxically, nurture creativity. The joy is in overcoming limits. Consider a retro platform your art medium and challenge yourself to push it to its limits. Newborns are swaddled because they feel safer when their movements are limited; constraints can bring comfort. I recently pushed an Apple Lisa’s 1-bit sound output to its limits by coding music in low-level machine language, and have experienced moments of bliss.

  9. That’s a nice looking Nova 1200 you’ve got in the photograph!

    A friend once described the draw of vintage computers (and other old technology) as this: you can poke at the bits that make it tick. You can see the fundamental processes at work because they aren’t fully encapsulated in the proverbial black box, and if you know where to put a probe, you can see a computer… compute.

  10. I’d say it’s something like the difference between gardening and farming, topiary and hedge trimming, painting figurines and painting the Forth Bridge. It’s about being able to care about the individual plants, or the individual leaves and twigs, the individual brush strokes, the individual bits… instead of dealing with them en masse.

  11. I absolutely loved learning to program the MSP430 in assembly. I was a CS major at the time, but I convinced one of the EE/CE teachers to let me sit in on his entry-level embedded systems course, where we learned to program the MSP430. Since I was basically informally auditing, instead of doing the assignments in C (with only one assembly assignment), I did all of them in assembly, because I could, and with only 40-some instructions it wasn’t difficult. I also ended up reading through most of the datasheet and learning to do things like interrupts, which the course didn’t even cover (a register-level sketch of this kind of programming follows this comment).

    I already had significant experience programming on Intel, in higher-level languages (almost 20 years of experience, but sadly that piece of paper is still important), but having a processor simple enough to understand almost completely, all at the same time, made a huge difference. I had tried to do assembly and interrupt stuff with Intel back in the DOS days, but the combination of a lack of learning resources and high complexity (even with the 486) made for a huge and tedious learning curve. The MSP430 isn’t exactly obsolete, but it’s incredibly low-end and as simple as a lot of obsolete stuff. And that made it so much fun to learn to use! With that basis, I was later able to move on to ARM microcontrollers with a sufficiently solid foundation to learn quickly despite their higher complexity, and now I’m learning to use the RP2040, which has some absolutely awesome special features. And even in higher-level languages, the path that started with the MSP430 has made me a far better programmer. (Working heavily in ARM assembly, and eventually teaching it at the undergrad level, gave me an understanding of C/C++ pointers that I never expected, and I thought I understood them really well before that.)

    Obsolete technology is fascinating precisely because it is so accessible, compared to the incredible complexity of modern hardware, but it is also valuable to learn, because it’s a metaphorical rung on the educational ladder that many people miss but that provides important foundational value.
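    For flavor, here’s roughly what that “simple enough to understand completely” programming looks like on the MSP430, sketched in C with direct register writes (msp430-gcc style; the assembly version maps nearly one-to-one). A LaunchPad-style board with the LED on P1.0 is assumed:

    ```c
    #include <msp430.h>

    int main(void)
    {
        WDTCTL = WDTPW | WDTHOLD;    /* stop the watchdog timer */
        P1DIR |= BIT0;               /* P1.0 (on-board LED) as output */

        for (;;) {
            P1OUT ^= BIT0;           /* toggle the LED */
            __delay_cycles(100000);  /* crude busy-wait */
        }
    }
    ```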

  12. Great article. Ultimately, there is a familiarity for those of us who are older, and it compels us to use the old commands and programming languages we started out with. I am a Linux fan and work with it frequently, since the system calls for frequent use of the command line. It takes me back to when computers were still sort of magical.

  13. Besides the value of learning all levels of the modern computing stack and the more visceral experience of working “closer to the metal,” there’s also value in learning about history for its own sake. The development of technology was not a linear sequence of progress—there were roads not taken, and the only way to find out about those roads is to look back.

    This has helped me at least once in my computing career: I needed a fast way to run algorithms that are not known at compile time, and while JIT compilation with LLVM would be one way to do it, we needed to minimize dependencies (LLVM is a huge one). In a retrocomputing group, I learned about Forth, which may well be the fastest interpreted language possible, though it isn’t pleasant (in my opinion) to write by hand. No problem: we needed it for automatically generated code, so it didn’t matter how arcane the language was. It was also easy enough to write a new interpreter in our environment (something like 1000 lines of C++, not big at all; a toy version is sketched after this comment).

    This is not how Forth was used back in the day, and Forth has very little visibility these days. (It had its peak in the 1980s, on personal computers that were incapable of running a true compiler.) If I had not been curious about history for its own sake, I wouldn’t have stumbled upon this great tool.

    I’ve heard similar stories in other technical fields. Richard Feynman, for instance, said that his idea for Feynman path integrals came from the outdated textbook he learned calculus from. In his day, everyone else was using differential techniques, but because he learned from an old book, he knew some integral techniques that could be put to a new application. This wasn’t the reason he was reading the old books, but it had the side benefit that he was exposed to more ideas than he would have been if he’d only learned what was considered cutting edge.
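    To give a sense of why a Forth-style interpreter is such a small dependency, here’s a toy sketch in C rather than the commenter’s actual C++ (integer-only, whitespace-separated tokens, a handful of words):

    ```c
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <ctype.h>

    static int stack[256], sp = 0;             /* the data stack */
    static void push(int v) { stack[sp++] = v; }
    static int  pop(void)   { return stack[--sp]; }

    /* Interpret a single whitespace-separated token. */
    static void eval(const char *tok)
    {
        if (isdigit((unsigned char)tok[0]))   push(atoi(tok));
        else if (!strcmp(tok, "+"))   { int b = pop(); push(pop() + b); }
        else if (!strcmp(tok, "-"))   { int b = pop(); push(pop() - b); }
        else if (!strcmp(tok, "*"))   { int b = pop(); push(pop() * b); }
        else if (!strcmp(tok, "dup")) { int a = pop(); push(a); push(a); }
        else if (!strcmp(tok, "."))   printf("%d ", pop());
        else fprintf(stderr, "unknown word: %s\n", tok);
    }

    int main(void)
    {
        char line[256];
        while (fgets(line, sizeof line, stdin))   /* a bare-bones REPL */
            for (char *t = strtok(line, " \t\n"); t; t = strtok(NULL, " \t\n"))
                eval(t);
        putchar('\n');
        return 0;
    }
    ```

    Typing `2 3 + dup * .` prints 25. A real Forth adds a dictionary and user-defined words, but the core loop stays about this small, which is the appeal.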


