Software Driving Hardware

We were talking about [Christopher Barnatt]’s very insightful analysis of what the future holds for Raspberry Pi single-board computers on the Podcast. On the one hand, they’re becoming such competent computers that they are beginning to compete with lightweight desktop machines, instead of just being a hacker curiosity.

On the other hand, especially given the shortage and the price increases that have come with the Pi’s expanding memory endowments, a lot of people who would “just throw in a Raspberry Pi” are starting to think more carefully about their options. Five years ago, this would have meant looking into what you could whip together on an Arduino-based platform, either on actual Arduino hardware or on an ESP8266 or similar, but that’s a very different beast from a programmer’s perspective: working with microcontrollers used to be nothing like working with even the smallest Linux machines.

These days, there is no shortage of microcontrollers that have enough memory – both flash and RAM – to support a higher-level environment like MicroPython. And if you think about it, MicroPython brings to microcontrollers a lot of what people were using a Raspberry Pi for in projects anyway: a friendly interactive programming environment, free of the compile-here, flash-there debug cycle. If you’re happy coding Python on a single-board Linux computer, you’ll be more or less happy coding in MicroPython or CircuitPython on a microcontroller.
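
If you haven’t felt that difference, here’s a minimal sketch of the interactivity, typed straight into a MicroPython prompt. We’re assuming a Pi Pico here, where GPIO 25 is the onboard LED; the pin and timer details vary by board:

    # Typed live at the MicroPython REPL -- no compile or flash step.
    from machine import Pin, Timer

    led = Pin(25, Pin.OUT)   # onboard LED on the original Pi Pico
    tim = Timer()

    # Blink the LED twice a second, set up entirely from the prompt.
    tim.init(freq=2, callback=lambda t: led.toggle())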

And what this leaves us with, as hackers, is a fantastic spectrum of choices. Where before there was a hard edge between programming C on an 8-bit PIC or an AVR and working with something that had a full Linux operating system like a Pi, it’s all blurry now. And the Pis, the Jetsons, and all the other Linux SBCs are in turn blurring the boundary with more traditional computers as they become more competent and gain more computer-like peripherals. Nowadays your choice is much freer, and the hardware landscape more fluid. You don’t have to let software development concerns drive your hardware choices, and we think that’s a great thing.

69 thoughts on “Software Driving Hardware”

  1. I’ve used MicroPython on ESP32 in a 6-hour intro-to-electronics class. We had the students use Thonny as the IDE, which makes connectivity for the terminal and file copying seamless. None of them had any embedded experience going in, and by the end most were building their own projects without much assistance from us. I don’t think that using a full-fat desktop OS on a Pi would have been any more effective.

    1. We use CircuitPython to teach the basics of interactive electronics to our Industrial Design students. It’s a blessing compared to trying to teach non-programmers C++ and more than adequate for 95% of their needs. Plus the Python you learn can be applied on your laptop.

    1. Android TV boxes turn out to be pretty good value. My dumpster-dive Amlogic S805 quad core has kept me amused for weeks, and the performance… incredible! Forget the RPi5 (if Jason can ever make that happen); my next diversion will be the latest mass-market Android box, for which I expect to fork out maybe $40!

        1. Most of them don’t have anything aside from USB and maybe an internal TTL serial console port, but unless one needs really high speeds, GPIOs to read sensors and drive actuators can be added to pretty much any system in the form of a cheaper and smaller uC that accepts commands over serial (TTL or USB) and does something with its available pins. See the USB IO board by Hardkernel as an example.
          https://wiki.odroid.com/accessory/add-on_boards/usbioboard
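
          A minimal host-side sketch of that pattern, assuming pyserial on the Linux box and a made-up “SET <pin> <value>” protocol – whatever firmware you put on the helper micro defines the real command set:

              import serial  # pyserial

              port = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)

              def set_pin(pin, value):
                  # One ASCII command per line; the helper micro parses
                  # it and drives the matching GPIO.
                  port.write(f"SET {pin} {int(value)}\n".encode())

              set_pin(5, True)  # drive pin 5 high on the helper micro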

      1. As a career-long C programmer, I’m having a hoot with MMBasic (a structured BASIC with modern constructs and extensive hardware support) on the Raspberry Pi Pico and Pico W. There’s a really great developer forum, *and* the language developers, on the “Back Shed Forum”.

  2. MicroPython – software to drive hardware to a crawl. Just learn C already. Seriously, most projects based on an RPi would run happily on either an 8-bit microcontroller or a 32-bit MIPS or ARM uC.

    1. People are gonna have feelings… It’s an eternal discussion involving topics like complexity, laziness, time spent on development, and so on. For myself, I still enjoy C on 8-bit and I just don’t like the Python syntax. It’s not a hard language and it’s really popular, but I just don’t like it…

      1. For me, any language that uses whitespace as part of its syntax is just a bad idea, especially considering that any half-decent IDE or code editor can indent code automatically. And color the syntax. And recall all the names as needed.

          1. Python is popular because it’s used for teaching programming, and it’s used for teaching because it’s easy to embed in web-based teaching tools and run scripts in a browser.

            This then has the hammer-nail effect: people who learn python don’t look at better programming languages because they already know python.

          2. Popularity and success come when someone or something arrives in the right place at the right moment, and can be completely unrelated to intrinsic quality. Python is a very powerful language, but certainly not the best option in the embedded world, and many makers use it because they already know it, not for its merits.
            Visual Basic was extremely popular decades ago, and it was a huge pile of smoking crap in every possible respect, except that it could make the average Joe User appear to be a programmer and snatch a well-paid job.

        1. Quite some years ago I was interested in Python (it was still halfway through the horrible V2 vs. V3 debacle), and I found a bug in an open-source project, so I put some effort into tracking it down. It turned out a few spaces were missing from the last statement, which should have been in a for loop, so the statement was only executed once, after the loop instead of in it.

          And the more I learned about Python, the more I came to dislike it.
          The absence of declarations for variables and their types is another such horrible thing. If you mistype a variable name, Python just invents a new variable, which can take quite a lot of time to debug.

          And, oh, when you want to add items to a list (for example in a loop), you have to declare an empty list first, or Python does not understand it. That is quite inconsistent.

          Combined with some other nuisances, I concluded that Python can be an “adequate” scripting language for small scripts, but it’s just not fit for writing big programs. There are too many ways for bugs to creep in during maintenance.

          1. Actually, modern Python supports typing as an opt-in now. You can have typed lists, sets, and classes parameterized with types, too. Not only does it let editors find errors, it also lets libraries (such as Pydantic) get metadata on the fields of DTOs.
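
            A minimal sketch of what the opt-in typing looks like – a checker such as mypy flags both mistakes below before the script ever runs (exact message wording varies):

                readings: list[float] = []

                def add_reading(value: float) -> None:
                    readings.append(value)

                add_reading("3.2")   # flagged: a str is not a float
                total = sum(readings)
                print(totl)          # flagged: name "totl" is not defined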

      2. C is a perfectly valid choice for a microcontroller. In this instance, you just need to have your compiler run on the target, rather than compiling elsewhere and downloading a file.
        Oh, and you’ll need a file manager too. And if your C code includes libraries, a library of code to compile and link.
        And a command-line interface, to invoke all these compile / edit / run / file-management commands from.

    2. I know just enough C to do what I want to, and if I don’t know how to do something I can usually figure it out with research and trial and error.

      I got started with Python maybe a year ago due to RfCat, and I gotta say, it is so easy to do basically anything with it. I would never claim my code is perfect or efficient (in either C or Python), but I have become a huge Python fan because it is so simple to get a project working with only a text editor, and it is very intuitive. Rather than having to compile, you simply do up a Python script and run it. While C may be the better choice, I definitely understand the draw of Python. It is my go-to these days.

        1. Ok, good for you? For a complex project working with C these days you pretty much need vscode/vscodium to keep it all in order and you need to create projects.

          Not so with python today. I don’t even bother with an ide for even complex python projects.

      1. That’s what I like about Python. It might not be the best, or the most efficient, or the most “elegant”. But, out of the languages I’ve tried, Python gives me the shortest time and least effort between an idea and code that does what I want.

        I also like the fact that I can use a Python shell as a programmable calculator–a very powerful one.
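
        Case in point, the kind of one-off session that beats any pocket calculator (numbers picked arbitrarily):

            >>> v_in, v_out, i = 12.0, 5.0, 0.3
            >>> (v_in - v_out) * i   # watts burned in a linear regulator
            2.1
            >>> [round(1023 * v / 3.3) for v in (0.5, 1.0, 2.5)]   # ADC counts
            [155, 310, 775]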

    3. Yeah, but if what you want to do isn’t complicated, and you’ve got a microcontroller with MHz to spare, there’s no real downside to MicroPython then, is there? At that point it’s down to whatever language/environment is more comfortable to work in.
      (Haven’t used MicroPython in any projects myself, yet)

      1. “Not complicated” and “can be done with a slow abstract scripting language” do not necessarily overlap.

        Plenty of MCU stuff goes right down to the lower levels, and while the thing you’re doing is very simple – like “read ADC, stream out of serial port” – it requires speed, efficiency, and deterministic timing. Then, if you really want to use Python, you need to dig up libraries made for just that purpose, and you’re down the Arduino route where you can’t do anything unless someone else has already done it for you.

        Whereas, if you know your C or ASM, and you can read a datasheet, you can just do it in a few simple lines of code.
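
        To be fair, the MicroPython take on “read ADC, stream out of serial port” is only a few lines too – the problem isn’t length, it’s that every trip through the interpreter adds jitter. A sketch, with made-up pin and UART numbers:

            from machine import ADC, UART

            adc = ADC(26)                    # some ADC-capable pin
            uart = UART(0, baudrate=115200)  # some spare UART

            while True:
                # Each read/write pays interpreter overhead, so the
                # sample timing is anything but deterministic.
                uart.write(str(adc.read_u16()) + "\n")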

        1. Based on what? Your tinfoil conspiracy theories? Do you also not trust phones, all major computer chip manufacturers, credit card companies…I could go on, but the point is made.

          If you had a company it seems like it would operate using, well, whittling maybe? But then again with people you still have issues with both QC and Security.

        2. In the late 90s, in my professional job, I recommended that our team use gcc, and a pointy-haired boss shot me down because “it can’t be any good if it’s free”. How can you argue with stubborn ignorance?

    1. I wonder why you would want such an SBC in an industrial environment?

      Also, at least once a month I see an ad for a “new, awesome” industrial computer. It’s either an RPi-killer or an x86 platform, usually advertised as an “all you can interface” buffet with hardened hardware for those harsh industrial conditions. Are those platforms actually used by anyone?

      1. > I wonder why you would want such an SBC in an industrial environment?

        I assume you’re asking about the temperature range here. There are dozens and dozens of applications that fall into that range. A car – especially in Nordic countries? A glasshouse environment? Environmental monitoring “outside”, or “inside” a freezer? Etc., etc.

        1. I know why you need an extended temperature range in an industrial setting. I’m asking why anyone would use an ARM-based SBC barely running a bloated Linux distro. At the basic level there are PLCs, and there are various other systems, including SBCs designed specifically for industrial use – that is, with an extended temperature range and more immunity to both the environment and EMI.

          You don’t use a toy to operate an expensive industrial machine.

          1. I don’t understand your question. You’re the only one who mentioned a bloated Linux distro that barely runs. ARM can run QNX, VxWorks, Yocto, and just about everything else that somebody might want to run on ARM. I personally would never use a Broadcom SoC unless they started making the reference material available, but I have no problem with SBCs or SOMs as a category.

          2. The reason you use a single board computer running a big operating system to accomplish a very simple task is because the development team was given three weeks to do it, and the team consists of freshly hired graduates because they’re cheap.

            You don’t ask what is the proper solution, you ask “Where’s the nearest store that sells a Raspberry Pi?”

  3. Why anyone would even consider using a RPI in any project just blows my mind. Sure you may find one in stock somewhere today to start the project, but then you’ll be waiting for months before you find another.

  4. My Raspberry Pis are currently sitting on a shelf, gathering dust. I just don’t know what to use them for at this point. Every Pi project I had was eventually replaced by cheaper, purpose-built solutions.

    Personally I go with the flow and keep with the times. Right now machine learning is the new hotness, so if it doesn’t support add-on GPUs then I’m not interested.

    1. You’ll find all the computers you need for machine learning on AWS and the like, and you can rent them for very cheap, so why would anybody want to own one unless they’re using it to its full capacity every day?

      1. I do own a powerful server for machine learning, but I don’t run it 24/7. I do use it to full capacity when I am running it, especially when running inference on other people’s models.

        The benefits of owning your own hardware are that it’s cheaper over time, it’s still available when cloud services are down, you don’t have to compete with other users for available compute time, you can choose your own GPUs from alternate brands like AMD and Intel, and if set up properly the performance far exceeds what you can get from the cloud.

        If you know your way around PC hardware and have some coding experience then you have no reason to utilize cloud compute. Some models can even run on consumer laptops and gaming PCs, no specialized hardware necessary.

        I’ve worked extensively with both on-premises and cloud infrastructure throughout my career. Cloud is not the answer to every question of where to run workloads, and it is most definitely not the answer to cost savings.

      2. You’d be surprised how quickly you can burn through cloud credits when using inference in practical ways in your daily life. It’s much better to be able to run locally without limits.

        For example, if you want to create a convincing fake photo of the Pope using Stable Diffusion you’ll need at least a few dozen iterations to dial it in. Prompt engineering is not straightforward and you will inevitably generate a lot of junk before you’re satisfied with the output. That’s why services like Midjourney charge you per GPU hour and put monthly limits on fast GPU access, as they know their customers are generating more than they can practically use.

        Even if the cloud services are able to scale up and down accordingly to save cost, the hardware will spend way more time crunching numbers and creating useless outputs, negating those savings.

      1. Heck no. I use inference on various models all the time to speed up my daily workflow. There is no value for me to be doing it slowly on cheap hardware. If it takes too long to produce outputs you’re better off completing the tasks manually.

        Besides that, I need a good GPU for my other hobbies as well, like when using Dream Textures in Blender.

  5. For a short period of time there were machines that worked instantly when they were powered on.
    Please allow me to explain:
    – 80 years ago, a radio needed to warm up before it could be used (the vacuum tubes had to heat up)
    – 40 years ago, you switched on your radio: instant sound
    – today, you switch on your smart radio and first it needs to boot (maybe even update), then it loads all sorts of shit from the internet (if that’s available; if not, it takes even longer), and if all is OK you might enjoy some sound. And if you want to switch it off, you need to do that properly – don’t pull the plug, or it won’t start properly any more.

    I have no problem with modern functions being taken over by software, but why does it always seem to be a crappy version of software that never gets properly updated? Or worse, software that is no longer supported, so you need to discard the device because it has become unsafe or unreliable while technically there is nothing wrong with it.

    1. When my daughter was a toddler, I often grabbed the digital camera when she was doing something cute. But by the time the camera booted, she was doing something else, and maybe in another room.

      1. A DSLR can sit ready to go after booting for a pretty long time without running out of battery, and while mine takes a bit longer to boot because it has to load an extra third party firmware, it’s still much faster than that. I think it was just the earlier or cheaper point and shoot digital cameras that were so much slower than their film predecessors. Maybe the mirrorless ones nowadays are slower, I haven’t tried one.

    2. At work we have a Rohde & Schwarz lab power supply. It’s a great power supply – 3 channels – but it takes 20 seconds to boot and it has the fans blowing at max while booting. Apparently it runs Linux for some reason.
      I recently bought a new smartphone because my 5-year-old smartphone took 5 whole seconds to boot the camera app and would often crash before I could take a picture. But my 9-year-old PC with Windows 10 works reasonably well, though it’s starting to show its age. Nowadays an i3 has higher performance than my i7. It seems Android outgrows its hardware quicker than Windows does.

    3. Reading this, the first thing that came to mind was that my vape pod (meh… I know) literally boots for 5 seconds and feels the need to display a logo before you can use it, and if you keep it in standby it drains the battery in 2 days. I’m not even trying to think about modern cars needing to boot full OSes, and fuel caps/sockets needing to be motorized, etc.

  6. People should consider FlashForth if they want to program their microcontroller in a live environment. It’s interactive, has a small footprint, and has nearly assembly-language execution speeds. flashforth.com
    PS: easy syntax – everything separated by whitespace is a function

    1. Looks like assembly too. A Martian version. So, for people not versed in the language, it is a significant barrier that a more “verbose” language like Python (or even Pascal, if possible) would not present.

      But for those fluent in Forth, I can see that it would give some good results.

      1. You get fluent by using it, like any language. There are great microcontroller-based examples on the website, as well as places like arduino-forth.com. The syntax is easy: everything separated by whitespace is a function, no syntactic sugar. If you don’t like something, redefine it.

  7. Having used MicroPython for the last 2 years, I’d say it’s great for simple projects, but the speed is a big issue if you’re trying to do anything complex. Luckily it’s not too hard to add C for the heavier lifting…
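
    Short of writing a full C module, MicroPython’s viper code emitter has been a handy halfway house for hot loops. A rough sketch from memory, so treat the details as illustrative:

        import micropython

        @micropython.viper
        def count_over(buf: ptr8, n: int, threshold: int) -> int:
            # Emitted as native machine code -- much faster than bytecode.
            count = 0
            for i in range(n):
                if buf[i] > threshold:
                    count += 1
            return count

        samples = bytearray(1000)  # fill from an ADC, a sensor, etc.
        print(count_over(samples, 1000, 128))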

    1. MicroPython also seems to be quite memory-hungry in an ESP32 environment – hungry enough that by the time you reach any sort of complexity, resources are exhausted.

      1. From what I can see, MicroPython is useful with the Pi Pico for quickly testing interfacing with peripherals or programming the PIO.

        The REPL interface is quite useful for that, without needing to compile/upload etc.

        Once the basic tests are successful, it’s better to make a nicely tested module in C, and move on to the next sub-problem.
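
        For instance, the classic PIO blinker is pleasant to poke at from the REPL. This is the standard pattern from the rp2 docs, reproduced from memory, so double-check the details:

            import rp2
            from machine import Pin

            @rp2.asm_pio(set_init=rp2.PIO.OUT_LOW)
            def blink():
                set(pins, 1) [31]   # pin high, stall 31 extra cycles
                nop()        [31]
                set(pins, 0) [31]   # pin low, stall again
                nop()        [31]

            sm = rp2.StateMachine(0, blink, freq=2000, set_base=Pin(25))
            sm.active(1)   # start, stop, and retune it live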

  8. “And what this leaves us with, as hackers, is a fantastic spectrum of choices.”

    bah humbug i hate choices

    no seriously man! making a choice is doing labor.

    i know i sound my age but experiencing my choice to reinvent CMSIS, and experiencing my choice to reinvent the RP2040 SDK (which itself reinvents CMSIS!) has now got me doing the labor of imagining my future choices, and there’s too many of them and too unclear motivations.

    13 years ago i had a project to use a pic18 and an hd44780 to process a model airplane control signal (~1kHz PWM) and the constraints of the available timer/PWM/interrupt/RAM made all my choices for me. i didn’t have any choices. well, okay, i chose needless reinvention there as well. but the limits at least kept me from using an interpreted language with faked POSIX threads to implement the ISR as a busy loop running on the second core, with a “map” statement thrown into the critical section just for the lols.

    now the RP2040 can do anything. every clumsy decision i make, i ask, “can i spare 10 bytes? can i spare 10,000 bytes?” yes yes yes. i can do anything and now i have to choose. it is too many choices. why not pull in some bloated library? why not?

    alright, you tell me. why not?

  9. I think the hacker community still has a gap between full multitasking OSes and single-thread microcontroller projects. The blurring of languages is great, but I haven’t seen the hacker community really embrace any kind of tasking model or RTOS that competes with Linux. Maybe that’s fine – Linux is for complex multitasking and sharing hardware resources after all, but I’d like to see the hacker community try a more embedded OS on these microcontrollers.

    As for _which_ embedded OS, there’s no shortage. ESP-type boards are usually already running some kind of FreeRTOS (with the Arduino environment running in a single task). Things like Zephyr are up-and-coming industrial solutions. However, my favorite is probably RTEMS. Great software history, clean code, optional POSIX interface, highly configurable, and it runs on everything from an STM32F1 to a full-fat x86 processor. With community adoption, I think it could become the powerhouse it deserves to be.

  10. I just switched a sound project from a Pi Zero to Pi Pico. No cost savings with the hardware, but software is much easier with just a python file over maintaining an entire system image.

  11. Ironic, given that the Broadcom chip of the original Raspberry Pi was derived from a design for TV set-top boxes, generally to display information and provide a little interactivity for hotel television systems.

    Some time way back around the turn of the century, I recall settling my hotel bill and checking out via the complicated remote control and the room television…

    This is why the Raspberry Pi had a relatively powerful GPU that booted the whole system; the CPU was a small adjunct processor there to do the network protocols and drive the UI.

  12. C/C++: (1) buggy, (2) malware-vulnerable, and (3) software modules longer than one page of code are in violation of Boeing hardware engineers’ software standards. Are these technologies an issue for nanocomputers?

    Other software technologies are needed… in addition to C compilers, of course.

  13. I’ve built PIC-based projects professionally and as a hobby for years. Microchip manufactures a wide range of PIC microcontrollers with a wide range of capabilities. Later I bought a series of Pi computers, my last being the first quad-core model. And that quad core is a very capable machine. The Pi reminded me of the Timex Sinclair of the early 1980s, only 100X better and much, much more stable.

    But I’ve passed my Pi computers on to my youngest brother. He’s 60 and I’m 63 – at least I’m 63 for a few more days. He loves them and has various Pi projects running around the house. And I’m back to PIC projects. Why? Because PIC projects are simpler. PIC projects are easy to design if you understand basic analog I/O design and can do a PCB layout. With a little bit of code you can even drive a display, and coding itself is easy with a C or BASIC compiler. The neat thing is I can actually adapt a lot of low-cost I/O designed for the Pi to a PIC design. I’d even argue that, except for something with a GUI, if you can do it with a Pi I can do it with a PIC!
