Physical Computing Used To Be A Thing

In the early 2000s, the idea that you could write programs on microcontrollers that did things in the physical world, like run motors or light up LEDs, was kind of new. At the time, most people thought of coding as stuff that stayed on the screen, or in cyberspace. This idea of writing code for physical gadgets was uncommon enough that it had a buzzword of its own: “physical computing”.

You never hear much about “physical computing” these days, but that’s not because the concept went away. Rather, it’s probably because it’s almost become the norm. I realized this as Tom Nardi and I were talking on the podcast about a number of apparently different trends that all point in the same direction.

We started off talking about the early days of the Arduino revolution. Sure, folks had been building hobby projects around microcontrollers before Arduino, but the combination of a standardized board, a wide-ranging software library, and abundant examples to learn from brought embedded programming to a much wider audience. In particular, it brought embedded programming to an audience of beginners who were not only blinking an LED for the first time, but maybe even taking their first steps into coding. For many, the Arduino hello world was their coding hello world as well. These folks are “physical computing” natives.

Now, it’s to the point that when Arya goes to visit FOSDEM, an open-source software convention, there is hardware everywhere. Why? Because many successful software projects support open hardware, and many others run on it. People port their favorite programming languages to microcontroller platforms, and as those platforms become more powerful, the lines between the “big” computers and the “micro” ones start to blur.

And I think this is awesome. For one, it’s somehow more rewarding, when you’re just starting to learn to code, to see the letters you type cause something in the physical world to happen, even if it’s just blinking an LED. At the same time, everything has a microcontroller in it these days, and hacking on these devices is another flavor of physical computing – there’s code in everything that you might think of as hardware. And with open licenses, everything under version control, and more openness in open hardware than we’ve ever seen before, the open-source hardware world reflects the open-source software ethos.

Are we getting past the point where the hardware / software distinction is even worth making? And was “physical computing” just the buzzword for the final stages of blurring out those lines?

69 thoughts on “Physical Computing Used To Be A Thing”

  1. I remember in the late 1990s wanting to control something extremely basic with a Windows computer. Back then, there was no obvious way to do this. Maybe you could control the lines of a parallel port or something, but what language would you use? Did you have to write a “driver”? There was no way to even really know what questions to ask. And if you did ask questions, it was on some online forum where your ego was sure to be bruised (ChatGPT eliminating all that has completely changed the game!). I remember I found a fluorescent digital display on the street in Brooklyn and figured out it had a conventional parallel bus. I wired it to the pins of an ISA board in a way that somehow allowed me to control it with 8088 out() instructions. When I managed to get a readable message to appear on it, it was hugely satisfying. About eight years later I discovered the Arduino, and suddenly doing these sorts of things became easy.

    1. Similar experience here. In high school, which was… pretty much pre-internet, “controlling” a thing meant wiring up a super simple circuit through a 9V battery and switch to a light bulb. LEDs were hard to come by. I bought some random components from a garage sale that in retrospect included some simple ICs (555s etc.), and was amused by wiring up the various pins to a battery, and got exactly zero anythings to happen. Radio Shack intro-to-electronics books were impenetrable, and the circuits therein (a burglar alarm? Really?) were useless to a 15-year-old anyway. With no access to any information at all, the idea that robots or whatever even worked at all seemed like a miracle. No idea even where to start to understand. Unless you knew an old dude or your dad was an engineer, it was impossible.
      Now I’ve bought a PICAXE dev board and am up and running relatively frustration-free. This is a tremendous time to be alive, and I’m partially thankful I got to see the “before days” to put it in perspective.

      1. Despite all the advances, one of the things I wanted to do back then that still seems sort of impossible is making a device that shows up automatically when you plug it into a USB port. Maybe, say, I want to make my own custom mouse with amazing haptics. I think that might be doable fairly easily with some microcontrollers, but how about making my own open-source printer? How might I write the computer-side software to control it? Maybe there is an Arduino-like system for doing this, but I’ve yet to find any that don’t involve a steep learning curve. I’ve generally bypassed this issue by using, say, ESP8266s to talk directly to a web API on a server (something I know how to write code for), allowing a device to function as a networked device, but the idea lingers in my mind that custom-building my own peripherals isn’t really supported, even after all this time, by an ecosystem as rich as, say, Arduino’s.

        1. The trick is you need a microcontroller with a native USB Device peripheral – not one going through a USB-serial bridge like a CH340 or FTDI chip. Then you have to get the descriptors set up, configure USB endpoints, and then your micro can be any USB Device you want it to be. A USB Device that falls under any of the standard device classes should not require drivers to function.

          The USB spec can be pretty daunting to read and understand; for me it was the descriptors in particular that were confusing. It helps to look at the descriptors of other devices you have that are like what you want to build (e.g. a mouse). On Linux this can be done with lsusb -v. (There are other useful options for lsusb; the man page is helpful.)

          In my case I was trying to get a correctly functioning USB Audio Class device with as many channels as I had resources for. I was eventually successful using an STM32F072, and ended up with 2 playback and 4 record channels at 24-bit/48 kHz.
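
          For anyone heading down that road, here is a minimal sketch of the kind of thing those descriptors contain: the standard 3-button mouse report descriptor from the appendix of the HID specification, written out as a C byte array. The exact bytes depend on your device, so treat it as an illustration rather than a drop-in:

            /* HID report descriptor for a basic 3-button mouse,
               per the example in the USB HID spec appendix. */
            static const unsigned char mouse_report_desc[] = {
                0x05, 0x01,  /* Usage Page (Generic Desktop)         */
                0x09, 0x02,  /* Usage (Mouse)                        */
                0xA1, 0x01,  /* Collection (Application)             */
                0x09, 0x01,  /*   Usage (Pointer)                    */
                0xA1, 0x00,  /*   Collection (Physical)              */
                0x05, 0x09,  /*     Usage Page (Buttons)             */
                0x19, 0x01,  /*     Usage Minimum (button 1)         */
                0x29, 0x03,  /*     Usage Maximum (button 3)         */
                0x15, 0x00,  /*     Logical Minimum (0)              */
                0x25, 0x01,  /*     Logical Maximum (1)              */
                0x95, 0x03,  /*     Report Count (3 button bits)     */
                0x75, 0x01,  /*     Report Size (1 bit each)         */
                0x81, 0x02,  /*     Input (Data, Variable, Absolute) */
                0x95, 0x01,  /*     Report Count (1)                 */
                0x75, 0x05,  /*     Report Size (5) -- pad to a byte */
                0x81, 0x01,  /*     Input (Constant)                 */
                0x05, 0x01,  /*     Usage Page (Generic Desktop)     */
                0x09, 0x30,  /*     Usage (X)                        */
                0x09, 0x31,  /*     Usage (Y)                        */
                0x15, 0x81,  /*     Logical Minimum (-127)           */
                0x25, 0x7F,  /*     Logical Maximum (127)            */
                0x75, 0x08,  /*     Report Size (8 bits)             */
                0x95, 0x02,  /*     Report Count (2: X, then Y)      */
                0x81, 0x06,  /*     Input (Data, Variable, Relative) */
                0xC0,        /*   End Collection                     */
                0xC0         /* End Collection                       */
            };

          Each input report is then just three bytes: a button bitmap (with padding), a signed X delta, and a signed Y delta.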

          1. Trigger warning: Discussion of USB Device Descriptors.

            I’m typing this on a custom mouse/keyboard device of my own construction, and I think I would straight-up have to re-lose the same five hours again if I lost that file.

        2. This isn’t quite what you describe, but I made a Bluetooth library for human interface devices (HID) for the micro:bit: https://makecode.microbit.org/pkg/bsiever/microbit-pxt-blehid . It lowers the barrier for creating custom mouse/keyboard/media-control devices. Here’re some demos, including using the micro:bit’s accelerometer as a mouse: https://youtu.be/n4J5GN72N_4 . My favorite applications so far include low-cost customizable accessibility devices, a Minecraft controller, and a MaKey-inspired fruit keyboard.

      2. “Radio shack intro to electronics books were impenetrable and circuits therein (a burglar alarm? Really?) useless to a 15 year old anyway.”

        It was a classic example, though, next to the house telephone/Morse apparatus, I think, because it’s just a closed circuit.
        It had been in electronics books since the 1950s or so.
        But back then it used an electro-mechanical bell or a buzzer,
        or a good old 6 V incandescent lamp (a bicycle light).

        More ambitious versions triggered the alarm if the circuit was opened instead of closed (if the wire to the sensor was cut by a burglar).

        Here in Germany, in the early 90s, the red LED (standard model) was very affordable and common.
        Yellow and green LEDs were available, too.

        Blinking LEDs had been available since the 80s or so and cost more.
        Some tinkerers used one as a simple time base instead of a 555:
        they used it to drive a transistor’s base, which then did the rest.
        A matrix could be built using a couple of them, which caused very funny blinking patterns.

        Blue LEDs weren’t common until the early 2000s, I think,
        and “white” LEDs became more of a phenomenon in the 2010s.

        1. Precisely. Those classic examples were teaching you a principle of operation that could be applied elsewhere. When we were kids, we used the inverted burglar alarm circuit to trigger a laughing skull toy for a “haunted house” at the school fair.

          Another one was the classic “combination lock safe” where a false combination would lock you out for ~1 minute. I’ve used that circuit since to add a simple delay to a device, to prevent the user from activating a mechanism twice in short succession. That was simpler than trying to get access to the firmware of a commercial product.

    2. My two cents: the USB interface also complicated things a lot at first, and then dramatically simplified them. When it appeared, it became clear that the computer would no longer be able to control anything except through some proprietary custom IC. But then microcontrollers with integrated USB appeared en masse, and everything became extremely simple like never before.

    3. “I remember in the late 1990s wanting to control something extremely basic with a Windows computer.
      Back then, there was no obvious way to do this.
      Maybe you could control the lines of a parallel port or something, but what language would you use?”

      Simple: QBasic, which was included with MS-DOS 5/6 and Windows 9x.
      Or debug.com, for the masochists. ;)

      Dozens of electronics books were written with Turbo Pascal 6/7, QuickBasic 4.5 and Visual Basic example code.
      Visual Basic 3 was a top seller.

      No drivers were needed. Programming worked via a) direct access, such as a port address, b) a device name (COMx, LPTx), or c) standard libraries provided by the development system (Turbo Pascal etc.).
      Windows applications written in Visual Basic could use third-party VBX extensions (a sort of DLL) written for that purpose.

      On DOS-based Windows, such as Windows 3.x and Windows 95/98/Me, it was easily possible to access hardware directly.
      On Windows NT, PortTalk was needed to restore this functionality.
      The clean way was to use system devices, though.
      Serial ports and the gameport were supported; the parallel port had no such API.
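
      To give a flavor of option a), here is roughly what poking the parallel port looked like from Turbo C under DOS. A sketch, assuming the usual LPT1 base address of 0x378 (worth checking against the BIOS data area on real hardware); QBasic’s OUT statement and Turbo Pascal’s Port[] array did the same job:

        /* Toggle the eight data lines of LPT1 under DOS (Turbo C). */
        #include <dos.h>

        int main(void)
        {
            int i;
            for (i = 0; i < 10; i++) {
                outportb(0x378, 0xFF);  /* all data lines high */
                delay(500);             /* Turbo C delay(), in ms */
                outportb(0x378, 0x00);  /* all data lines low */
                delay(500);
            }
            return 0;
        }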

      Here in Germany, Delphi was popular, too, circa the late 90s/early 2000s.
      There were electronics books with samples for Delphi 3 and up.
      Delphi 3 and 7 were included for free in books like Delphi for Kids.
      Earlier releases had Borland Pascal 7 and Delphi 1 included, which worked on DOS/Win 3.1.

      A DLL named “PORT.DLL”, written in Delphi, gained popularity here in the early 2000s.
      It allowed “bit-banging” on both Windows 9x and NT, so no direct port writes needed to be performed by the user application.
      The DLL was usable from within VB5/6, too, I think.

      Speaking of which, 32-bit Visual Basic 6 used to be the #1 programming language on Windows 9x by the late 90s.
      It was loved by hobby programmers and electronics tinkerers because it allowed rapid application development (RAD).
      It was the successor to 16-bit Visual Basic 3, so to speak.
      The only downside was that VBX file support had been dropped.
      Instead, OCX libraries were now used, or real DLLs (*.DLL) with a bit of work.

    4. There were lots of people in the 80’s and 90’s controlling LEDs, motors, fuel injectors, power steering pumps, central door locking, the list goes on. I was one of those people. The idea that it was in any way “new” in the early 2000’s is a bit silly. Maybe that’s when hobbyists woke up to the possibilities.

  2. I mean yes and no, right?

    This always existed, but the common Windows PCs weren’t focused on controlling machines. There were tons of industrial standards; that’s how people controlled robots in factories and interfaced with scientific equipment.

    In the early days, it was actually a lot easier for hobbyists to get into it, with cheap home platforms like the C64 being repurposed to control things. There are all kinds of weird projects out there developed for commercial purposes with C64s and other early PCs for specific machine-interface applications.

    It was specifically the specialization of the Windows ecosystem on word processing and other kinds of typical home use software that led to the mainstream deviation from “physical computing,” but that doesn’t mean it ever went away.

    1. The low-priced, small ZX81 had also been used for electronics projects.
      It was mentioned for that purpose in articles in Elektor-type electronics magazines.
      The Z80 core made it easy to understand; besides the 6502, the Z80 had many books written about it.

      Then there were robot arm interfaces for the C64, Apple II and IBM PC.
      Fischertechnik had them in the 80s, I vaguely remember.

      In the early 90s, there was the Kosmos Hi-Tec PC interface for Windows PCs (3.1 and 95).
      It was a little box for the parallel port that allowed measuring voltages and currents up to 9 or 12 V, I think.
      The graphical example programs were visually appealing (simulations of analog meters etc.).
      It was meant as an add-on to our Kosmos XN1000/XN2000 electronic construction sets here in Germany.

      On the DOS platform, many homebrew projects for Turbo Pascal and QuickBasic/Turbo Basic/Power Basic existed.
      It was common to build interfaces for the parallel port, serial port and gameport,
      to control running lights, motors, relay cards, robots…

      The Sound Blaster was used, too,
      as a tone generator and for software-based decoding of RTTY, Morse and packet-radio signals. As a DSP, so to speak.

      Then there were microcontrollers as part of early electronic construction sets.
      The heart was a simple microcontroller with a hex keypad, datasette port and 7-segment display.
      It could be used to build a timer, a digital temperature display etc.
      1970s stuff, in short: like a small version of a KIM-1 or COSMAC ELF/COSMAC VIP.

  3. I had never heard that term, but I suppose it’s generational. “We” had been controlling hardware with our hobbyist computers in the 70s and 80s, because non-networked computers are a lonely, isolated world indeed.
    I do suspect the rapid progress of video, storage, and especially networking did color the general perception of computers as being ‘interactive TVs’ in the later 80s through the turn of the millennium. So maybe getting back to interfacing with the real world was a bit like breaking out of the simulation of ‘the matrix’.
    But interfacing digital computation machinery with analog reality was always the one truth.

    1. To be fair, “we” in the 90s had done similar things on humble DOS PCs with the serial, parallel and game ports.
      Turbo Pascal and QuickBasic (or the cut-down QBasic) were popular. Visual Basic 3 on Windows 3.1/95 was popular, too;
      it allowed QBasic/QB45 code to be mixed into VB code.
      The circuits were usually built on veroboards, just like in the 80s and before.
      Speaking of the 80s, GW-BASIC and BASICA on 8088 dinosaur PCs were the predecessors of QB/QBasic.
      There were DIY interfaces on the IBM PC back when the C64 was still young.

      I’m speaking under correction, but Amiga and Atari ST users used Amiga Basic, GFA Basic and whatnot to control electronic circuits.
      The phenomenon wasn’t limited to 8-bit micros such as the C64, Apple II or ZX Spectrum.
      Electronics hobbyists existed on 16-bit “home computers” just as well.
      They had more capable development systems, too,
      and bigger storage to save data on (useful for dataloggers etc.).

  4. Take a look at old steam trains. Their running gear was effectively a mechanical computer controlling the timing of valves, length of stroke, and other functions. The linkages alone became quite complex and multidimensional over time.

  5. It’s like people are forgetting that we had line-following robots and do-it-yourself pen plotters, and all sorts of mechatronic gadgets built by hobbyists since the 80s.

    “the combination of a standardized board, a wide-ranging software library, and abundant examples to learn from brought embedded programming to a much wider audience”

    BASIC Stamp and PICAXE came out in the 90’s.

    1. The barrier to entry was higher with the BASIC Stamp and PICAXE; I remember it being outside my budget to get what I needed in the 1990s. Admittedly, back then I was making $6/hr babysitting servers overnight at an ISP, and now my financial health is far better. But you can now also buy Arduinos for $2 on AliExpress, and that would’ve been well within my budget even back when I was broke.

      1. Cost might have been against it, but the point of a standard board and a library of examples and functions is there.

        The PICAXE had Arduino’s basic idea: it had a bootloader that let you program it using a computer serial port, so you could just drop it on a breadboard and go from there without extra hardware. The hardware cost was minimal. That’s also how I got started with programming AVRs – using the direct parallel port programming method, so all you needed were the IC and some resistors on a breadboard to get started. LEDs were blinking within the hour, and I even made a simple “electric organ” by pulsing the output pins at different rates and touching a speaker wire to them.

        Using just the plain AVR chip cost something like 6 dollars, so I skipped the Arduino entirely until much later. AVR Freaks was the go-to source for information and programming examples, which got you everything from printing to a terminal and controlling the registers to firing up interrupts to turn servomotors…
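
        For reference, that whole first blink program on a bare AVR is only a handful of lines of avr-gcc C. A generic sketch, assuming an LED on PB0 and a 1 MHz clock rather than any particular board:

          /* Minimal AVR blink with avr-gcc; assumes an LED on PB0
             and F_CPU matching the chip's actual clock. */
          #define F_CPU 1000000UL
          #include <avr/io.h>
          #include <util/delay.h>

          int main(void)
          {
              DDRB |= _BV(PB0);        /* PB0 as output */
              for (;;) {
                  PORTB ^= _BV(PB0);   /* toggle the LED */
                  _delay_ms(500);
              }
          }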

    2. In comparison, Arduino came to the scene around 2005-2008. The developers of Arduino themselves had used BASIC Stamp to learn about microcontrollers in school. On the general point of it, I got to play around with LEGO Mindstorms around ’99 in the school computer lab.

      Also, this “physical computing” is just a re-badge or a retcon of “embedded systems”, where the latter consists of both fixed-circuit functions (e.g. logic chips) and programmable parts such as microcontrollers controlling some device. It goes way back.

    3. Some time ago in the early to mid 90s there was a 68HC11 dev board that I wanted. I had read an article about a fox hunt, and it briefly mentioned using this board and some debugging effort to get the fox working correctly. I knew nothing about programming, but I knew that I wanted to know more. The trouble was, $100 was a LOT of money to invest into something that I knew almost nothing about.

      My first experience with programming of any sort came a few years later, with TI BASIC on a graphing calculator in middle school. It wasn’t long before I ran into the ceiling with what I could do. I had also loaded zShell and a few games on there, and wondered how on earth it was possible for the games to run fast enough to even be playable, compared to my relatively simple programs that ran agonizingly slow in comparison.

      Fast forward to 2000, and I get introduced to the BASIC Stamp II at my first full-time job. I grew up a Hardware Guy, so this was more rewarding in every way. But it still wasn’t enough – even something like driving a multiplexed LED display scanned too slowly to be effective.

      It wasn’t until 8 or 9 years later, when I started messing around with assembly on a Parallax SX microcontroller (what the BS2-SX was based on), that I started getting real results. After that I was handed a book about C and had some interesting projects using a souped-up SiLabs 8051. From there I graduated to the STM32 family. After that my embedded development career really took off.

      Lesson learned: I should have bought that 68HC11 dev board; it would have been worth every penny.

      1. The point of an Arduino is to give you the examples and the libraries so you don’t need to learn or understand much about micro-controllers to get a LED blinking or a servo turning. The rest of the ecosystem around Arduino works much the same: you get modules with libraries that work with Arduino, so you don’t need to learn basic electronics to get your stuff working. That helps novices in, but also keeps them on the Arduino platform because you’re lifting them up the tree without teaching them how to climb.

        Without such background as yours, starting from Arduino, there’s a chance you would not have become an embedded developer at all because you would have hit a wall at some point where advancing further takes more learning than you have time for, so you’d have to pivot to some other career. If you’re a “hardware guy” looking into embedded systems, it actually pays to ignore the Arduino at first, so you don’t waste time backtracking to the basics. Then it becomes useful as a testing and development platform once you know what you’re doing.

    1. Back in the early 2000s, I put a 24 dB parabolic dish on a pole turned by that rotator and used it to pick up WiFi from various houses in the valley. (I live on top of a mountain, like an evil count.) Back then a lot of people hadn’t set up passwords, and when they did, they used WEP, which was easily cracked. Our internet at the time had 3 Mb/s download speeds, so when I needed more (this was in the days of file sharing!) I would sometimes take advantage of my capabilities. It would be impossible to do this today.

  6. The level of activity was much lower because stuff was expensive. And aside from Don Lancaster and Dr. Dobb’s, information and parts were not abundant.

    Every MCU board cost no less than 5-10x a VIC-20.

    I designed and built an alarm system with entry and exit delays and an alarm timeout using 555s and 7400 logic, because in 1987 they were cheaper than any MCU solution by a factor of 100-200x.

    Dev boards cost over 100x present costs. Or more! Until the Arduino scared the OEMs to death.

    So, like most people my age, I used 7400-series TTL and 555s. The enclosures cost more than the electronics, as is still often the case.

    It’s not that we were not doing it. We simply chose cheaper solutions of which you appear ignorant. Read Don Lancaster’s books and write a follow-up.

    1. In the 1980s, well before I attempted any sort of interfacing to an x86 platform, I heavily customized a VIC-20, putting all sorts of devices and banked RAM in every scrap of memory (and there were lots of little scraps around the I/O, which was a little above the 32K address). I found interfacing to the VIC-20 relatively easy, because all you had to do was write some code that POKEd or PEEKed into memory locations, and then you could make LEDs blink or relays fire. When more complex computers running real OSes came along, this sort of hacking became more difficult (or more fraught, since your investment in the underlying computer was substantial, and everything you were doing risked damaging it).

    2. It was lower also because computers and programming itself wasn’t yet a hobby for the masses. Fewer people were doing anything related to programming and there was a sharper divide between “nerds” and “normies”.

  7. And here I thought “Physical Computing” with that image would be about purely mechanical automation in, e.g., chemical plants.

    Maybe “transistor valves” where a small stream from a syphon overflow triggers the main valve to empty a full tank.

    Or mechanical / geared combinations of several weighing outputs from different containers to modulate their input or output or whatever…

    (making those examples off the top of my head – so no idea if anything like that actually exists).

  8. “In the early 2000s, the idea that you could write programs on microcontrollers that did things in the physical world, like run motors or light up LEDs, was kind of new”

    Oh. Good. Grief.

    I started controlling hardware with microcomputers in the early 1980s.

    Using C on 8-bit micros.

    When I returned to it a third of a century later, that was still the usual way.

    I was not only horrified that nothing significant had changed, but also delighted at how quickly I was back up to speed.

    The only change was “smaller cheaper faster”. Depressing.

      1. In the early 80s, I designed, built and programmed a ham radio repeater controller, using a Motorola 6802. We had used it in a work project, and had some leftover parts. I wirewrapped it. It even had a landline control interface with 300 baud serial protocol.

        1. Alicia Gibb’s thesis paper, about Arduino’s impact on the art world, is a great read about this. And there was enough pushback from the old guard amongst her reviewers at Pratt that it took a fight to convince them that Arduino constituted an artistic medium. Computational art, by comparison, was pretty well established. I think “physical computing” versus “embedded systems” was an important term in that it drew focus away from the electronics engineering aspect, and refocused it on extending “creative coding” (another buzzword of the time, which I think some even still use in their job titles) into the physical world. It’s telling that the Arduino IDE started as a fork of the Processing IDE, which was already getting pretty heavy traction amongst previously non-technical artists who were experimenting with algorithmic and generative graphics. Though I did mess with some electronics kits as a kid, I got my practical start with embedded systems as an adult, by way of proximity to the physical computing art scene. My neighbor, a mechanical engineering major at Pratt (an art school, with mechanical and architectural engineering being the full extent of their engineering programs), knocked on my door looking for help with his thesis prototype. “Hey Noah, you know about computers, right? I’ve got this thing called an Arduino and…” 18 years later… well, I live in Shenzhen.

      2. Speaking of context, I’d like to comment on “oh woe is us, we didn’t have LEDs until the 1990s”. In 1976 I lived in a DC suburb where the 7th-floor folks from the CIA tend to congregate, and I could take my lawn-mowing money into Empire Electronics on Rugby Avenue in Bethesda, MD and buy all the LEDs I wanted and have money left over for porn and food.

        If I’d wanted to go all in, I could have bought an Altair. I’d like to see someone make a faithful reproduction of it, but at today’s presumably much lower price. Always meant to do one of those Radio Shack kits with the transparent plastic top where the box is the perfboard. Never did though; I snoozed, I losed. Now it’s not possible. Alas!

    1. “I started controlling hardware with microcomputers in the early 1980s.” And I in the early 90s — maybe I’m a bit younger than you.

      I remember thinking, at the time, what all the fuss was with the whole Arduino thing. Couldn’t everyone just get the AVR chips and program them? How hard is it to make a parallel port programmer, or bootstrap up a TinyUSB? How difficult is it to learn a little C and type “#include” yourself instead of pulling it down from a menu? Etc…

      But I was wrong. The combination of the breaking down of multiple tiny barriers, having open compiler software, and selling a relatively cheap kit made a huge difference. And once the community took to it, it snowballed.

      So yeah. There was absolutely “physical computing” before the Arduino crowd buzzworded it. But it was the province of a few wizards and nerds, and this is no longer the case. It’s standard fare in grade-school / middle-school curricula now.


    1. This may turn into a rehash of the Four Yorkshiremen sketch, but Adam Osborne wrote a book in 1978 called Z80 Programming for Logic Design which was all about controlling stuff with a micro. I devoured it.

      You got to use C? It was assembler all the way for me. Memory was WAY too expensive to waste on HLLs.

  10. Hot take: I dislike the popularity and trend of MicroPython/CircuitPython on microcontrollers.

    They are simply too slow, and too much of a crutch for people learning. Sure, something like a keyboard/macropad/light-blinking project would probably do just fine with either of those. But the folks learning don’t understand how frankly slow they are. The other day I saw someone trying to do RF with CircuitPython, which is completely absurd when a pulse could be less than 30 microseconds.

    1. Don’t dislike it. Use it where it makes sense. Python is ‘not’ too slow for most things (that I do, anyway). MicroPython on my Pico and Pico 2s just flies for what I need done. For example, I recently used a robotics board that a Pico (or Pico 2) plugs into. You control the two motors and the 8 servos via I2C. Plus you have the full GPIO pins available for other things (except two I2C pins, of course). More than fast enough for the job I planned for it! I think MicroPython/CircuitPython is great. That said, for some applications like you talk about, just drop down to the ‘C’ SDK and you’re off and running (see the sketch after this comment). Use the proper language for the job, i.e. don’t force a square peg into a round hole. That’s not going to turn out well.

      For automation projects at college (80s) we used C64s with PEEKs and POKEs, and a bit of assembly, to interface to electronics, like my light-following robot project with a 555.

      But it is true that when parts like the Arduino came out, it opened a whole new world. Now we have SBCs and microcontrollers by the dozen, and relatively cheap too. Pico boards are like potato chips… never have enough :) . One thing we are ‘losing’ with all this, though, is the ability to map logic in hardware. Now we simply write a program to do the logic (if this switch is in state 1, then turn this relay on).
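
      As a point of comparison for the MicroPython-vs-C question, here is roughly what the classic blink looks like when you drop down to the Pico C SDK; a sketch assuming the original Pico’s on-board LED on GPIO 25:

        /* Blink the original Pico's on-board LED (GPIO 25)
           using the Raspberry Pi Pico C SDK. */
        #include "pico/stdlib.h"

        int main(void)
        {
            const uint LED_PIN = 25;
            gpio_init(LED_PIN);
            gpio_set_dir(LED_PIN, GPIO_OUT);
            while (true) {
                gpio_put(LED_PIN, 1);   /* LED on */
                sleep_ms(250);
                gpio_put(LED_PIN, 0);   /* LED off */
                sleep_ms(250);
            }
        }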

      1. Sure, I agree: use the right tool for the job, and you and I both know enough to do so.

        I write a lot of Python when I don’t have time constraints (mostly on x86, though).

        I just think kids/beginners get sucked in by the REPL and the ease of Python, and think it’s appropriate for anything and everything (I see it often, actually).

        Still, I suppose getting people interested in experimenting/code is at least a partial win.

        1. The REPL is a huge part of Python’s charm. Type “set_led(1)” and the LED goes on. Beginner coder instantly hooked.

          A REPL on embedded is frankly magical.

          When chips weren’t as powerful, and when it was the only REPL in town, I used Forth a lot. I do not regret having mostly swapped up to Python, even though it’s natively a bunch slower.

  11. I remember I stopped buying electronics magazines because they started to run a lot of microcontroller articles and fewer “traditional” electronics projects, so I was like: ugh, more gibberish (code) that I don’t understand, to write down on a PC that I don’t have, to put inside an expensive chip that I can’t buy; and even if I could, I’d still need to buy an expensive programmer… electronics is dead!

    On the other hand, I think some knowledge from those days is lost; the internet used to host a lot of information and projects using the venerable PIC16F84. Not anymore.

    1. Yep, as I stated above… knowledge of how to do relay logic in hardware is being lost. Not sure if that’s good or bad, but there it is. Let the microcontroller do it :) . One thing it is ‘good’ for: it allows you to do much more complex logic ‘easily’, rather than breadboarding a bunch of chips and wires to make it happen!

    2. Same for me. I got sick of endless articles of “Build your own GPS! Just take this specialty unaffordable microchip and hook up power to it and you’ve built your own GPS!” There used to be actual articles in there about electronics, especially analog designs and the theory of operation, not just reference designs copied straight from the manufacturers’ data sheets.

    3. Oh yes, the PIC16F84(A)! It was very popular among radio amateurs.
      I’ve used it, too, at least a decade before the Arduino Uno arrived.
      The little chip was good enough as a morse ID keyer, as a simple protocol converter for old devices (such as AT keyboard protocol to XT and vice versa), as a decoder for time signals (such as DCF-77), etc.

  12. Maybe it is because I’m old enough to remember things “pre-1990”, but apparently many people have never heard of TAB books (ISBN-10: 0704201712, ISBN-13: 9780704201712).

    Ice-age IoT? See: https://www.ibm.com/think/topics/iot-first-device

    Getting computers to interact with the world was commonplace in the 1970s – we were even building computers with the first available microprocessors, with articles appearing in magazines.

    1. Reminds me of the COSMAC ELF and CHIP-8 articles in my dad’s old electronics magazines.

      CP/M computers existed in the mid-to-late 1970s, too, though.
      My dad ran it on a Sharp MZ-80K (Z80 based) via 8″ floppies, using a DIY floppy controller.
      He wrote various programs at the time, also for university.

      The original Digital Research CP/M did have one standard floppy format, which used the drive geometry of 8″ floppy disks.
      Using it on 5.25″ floppies was possible, too, but wasted space.
      That’s why the Osborne format and others became more popular, I suppose.

  13. Yeah, well, I started doing computer programming before computers had operating systems. EVERYTHING on the IBM 1401/1440, CDC 160 (A/G), and CDC 1604 required specific programming control in assembly or machine code. I LOVED those days! I felt like I was in control of the system, without Tonto there to tell me, “Kemo Sabe, him say…” to my equipment. I am grateful for the rise of Arduino and other controllers for making cool little automations, and I hope to spend the rest of my remaining years happily diddling along in blissful machine code.

  14. There are three earlier platforms that come to mind:
    – 8052AH-BASIC from the early 80s. The interpreter had EEPROM load/save ability, interrupts etc.
    It was released as freeware by Intel, I think, incl. source code,
    and could be installed in other, compatible microcontrollers.
    – Conrad Electronic’s C-Control units (old and new) from the mid-90s (here in Germany).
    – The PIC16C84 from 1993, the spiritual predecessor of the ATmega168/328 (the Arduino Uno/Uno R3 chip).
    It was re-released as the PIC16F84 and PIC16F84A. It was the #1 for embedded/hobbyist projects, and many hex files exist.
    It was easy to program via both serial and parallel port (no EPROM programmer needed).

  15. Funny, I’m not even 30 years old yet,

    but when I hear “physical computing” I think of something that uses relays, or planetary gear sets and motors.

    Then again, I grew up knowing there were computer chips in pretty much everything.

  16. Ah, analog computers. More information is on the wiki. There have been several types of computer design, but the others failed or had scalability problems.
    The general-purpose computer we have also comes out of this theory; it was chosen (according to the author) because it is simple, scalable and cheaper compared with the others.
    However, the success of the general-purpose computer led the other kinds to be scaled back, even though the general-purpose computer is slower than they are.
    Thanks

  17. It’s a shame so few people seem to have discovered the C64’s User Port. It was just a bunch of bits of I/O that you could PEEK and POKE, including a data direction register. That’s all you need, right?

    I didn’t know where to get connectors that would fit the port, but the pads were spaced pretty far apart (3.96mm, I found out later), and all the signals I needed were on the bottom of the board, so I put a few layers of tape on top and just positioned some small alligator clips very carefully. I was able to light LEDs under program control, and I was off to the races!

    Next thing I did, having no understanding of coil inductance and the flyback effect, was to use those signals to directly run the coils on some Radio Shack reed relays. I figured out a reversing structure that I later learned was called an H-bridge, and used them to control motors of a primitive robot, with plywood wheels. It wasn’t fast and it wasn’t smart, but it was physical and it only cost me a few bucks.

    Somehow I don’t think I even blew up the pin drive structures in the 6526. I still have that C64; I should dig it out (now that I can source some connectors) and revisit that project 35 years later…

  18. I was a development engineer on a microcomputer-controlled ink jet printer. I specified the pumps, pressure switches, level sensors, and the rest of the mechanical design and algorithm. A colleague did the programming. This was at IBM in 1976.
