Modern technology builds on abstractions. Most application programmers today don’t know what a non-maskable interrupt is, nor should they have to. Even fewer understand register coloring or reservation stations for instruction scheduling, and fewer still can explain the physics behind the transistors in the CPU. Sometimes tech starts out where you need to know everything (programming a bare-metal microprocessor, for example) and then evolves toward abstraction. That’s where [WerWolv] wants to get you for writing USB code, with the recent post USB for Software Developers.
Many USB tutorials assume you want to know the intricacies of protocol negotiation, the details of the hardware layer, and that you are willing to write a Linux kernel module to provide a driver. But thanks to abstraction, none of this has been absolutely necessary for many use cases for a long time.
While the post focuses on Linux, libusb is also available for Windows. We presume the same principles would apply, more or less.
Interestingly, the target device for the tutorial is an Android phone in bootloader mode. We thought that was strange at first, until we read the rationale. You can easily get your hands on an Android phone if you don’t already have one. The device is simple. Plus, it is unlikely you already have drivers installed on your system that would interfere with your tutorial driver. Makes sense.
After that, it is pretty straightforward to use libusb to find the phone, determine what you can do with it, and communicate with it. Sure, the phone’s “fastboot” protocol is simple, but that’s just like using a TCP socket. You may implement a fancy protocol on top of it, but that doesn’t mean sockets are hard to use.
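To give a feel for just how simple fastboot is: the host sends a short ASCII command over a bulk endpoint, and the device answers with a four-byte status word (`OKAY`, `FAIL`, `DATA`, or `INFO`) followed by optional payload text. Here is a minimal sketch of parsing such a reply — the function name and return shape are ours for illustration, not from the linked post:

```python
# Illustrative sketch of parsing a fastboot reply. The fastboot wire
# format starts every response with a 4-byte ASCII status word:
#   OKAY<result>   - command succeeded
#   FAIL<reason>   - command failed
#   DATA<hex len>  - device expects that many bytes of data next
#   INFO<text>     - informational message; more replies follow

def parse_fastboot_reply(raw: bytes):
    """Split a raw bulk-in transfer into (status, payload)."""
    status = raw[:4].decode("ascii", errors="replace")
    payload = raw[4:].decode("ascii", errors="replace")
    if status not in ("OKAY", "FAIL", "DATA", "INFO"):
        raise ValueError(f"unexpected fastboot status: {status!r}")
    if status == "DATA":
        # For DATA, the payload is the transfer length as 8 hex digits.
        return status, int(payload, 16)
    return status, payload

print(parse_fastboot_reply(b"OKAY0.4"))       # ('OKAY', '0.4')
print(parse_fastboot_reply(b"DATA0000f000"))  # ('DATA', 61440)
```

On the wire, you would write the command to the bulk OUT endpoint and read the reply from the bulk IN endpoint (with libusb, via `libusb_bulk_transfer()`); the parsing above is independent of the transport.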
We’ve looked at simplified USB drivers before. Of course, for some applications, you can bend a USB serial port to handle something a bit more complex.

“Most application programmers today don’t know what a non-maskable interrupt is, nor should they have to.” What??? They most certainly should know!
This really does depend on what sort of applications are being written. Having worked in software development for a number of different companies, most of my work was writing GUI front-ends to relational databases. Never in that time did I need to know what an NMI is. It was more important for me to understand the way the software was going to be used and the business of the company for whom I was working. There will be roles where understanding NMIs is important, but it really isn’t for a lot of developers. I’m happy that abstraction layers exist, and also that there are people willing to develop them.
If you don’t work on the bare metal or scheduler layer then I agree with the article: you don’t need to know.
As soon as you get your thread and memory allocation from a kernel, why would you care about things like the watchdog?
Unless it’s a high reliability embedded system of course, then you do need to know.
I agree with Mike…you most certainly do. Software developers who do not understand the basics of processing hardware, do things like: writing a serial port driver that polls the serial port in a loop…on a multi-user operating system…that has to run 60+ copies of that driver…and then wants to blame someone else because the machine hits 100% CPU utilization after only one copy starts running.
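The polling anti-pattern described above can be illustrated without any serial hardware — here a pipe stands in for the port, but the design point is the same: a busy loop burns CPU checking for data that isn’t there, while a blocking `select()` sleeps until the kernel has something to deliver. This is our illustrative sketch, not code from the article:

```python
import os
import select
import threading

def blocking_reader(fd: int) -> bytes:
    """Sleep in select() until the fd is readable -- ~0% CPU while idle."""
    select.select([fd], [], [])       # kernel wakes us only when data arrives
    return os.read(fd, 4096)

def busy_reader(fd: int):
    """Spin checking the fd -- pegs a core; shown only as the anti-pattern."""
    os.set_blocking(fd, False)
    spins = 0
    while True:
        try:
            return os.read(fd, 4096), spins
        except BlockingIOError:
            spins += 1                # every iteration is wasted CPU time

r, w = os.pipe()
# Simulate data arriving 0.1 s later, as it would on a real serial port.
threading.Timer(0.1, os.write, (w, b"hello")).start()
print(blocking_reader(r))             # b'hello', after sleeping with no CPU use
```

Run sixty copies of the busy-wait version and the machine pegs every core doing nothing; sixty copies of the blocking version cost essentially nothing while idle.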
But vast numbers of software developers are never asked to write that sort of software. I could as well say that software developers who do not understand the UK pensions reforms of 1997 will end up writing programs that pay people the wrong pension amounts and could be liable to legal action. Writing serial port drivers is one problem domain that requires certain skills and knowledge; writing financial software is a different domain and requires different skills and knowledge. Both sorts of developers are needed, and making sweeping statements about what skills they should all have seems like gatekeeping to me.
I agree with both of you: the abstraction shouldn’t require you to deeply understand the inner workings, but if the abstraction fails at its job of protecting you, then yes, you will need to understand it to get better (or just good) performance.
I’m still learning to code “correctly”, but the more I learn, the more I feel that the languages got stuck at the level of C:
Not that C is perfect or whatever; in general, I do believe that newer languages give more “vocabulary” to the programmer to better express what he wants to do, and thus should help the compiler better optimize the code, or just allow a fellow programmer to better understand what was written.
But I also think that a language can influence the way you program with the tools that it provides. Languages like Smalltalk or Objective-C (I have no experience in them, just read about them) force you to work in an asynchronous way.
Doing something synchronously can probably be done, but it is not the main path.
The user should be given abstractions and tools that help him get the best performance or optimisation he needs for his application. It should be obvious when something isn’t efficient, why it isn’t, and what the cost of doing so is.
Not having the big picture in programming is what hinders users from getting a sense of what is wrong. And I feel visualisation plays a big role in that in other domains, but it is completely absent in programming (or at least not standardized in the tools provided :) )
The PS/2 thing?
“What ??? they most certainly should know!”
Nah. Vibe Coding is the new Honey Badger.
Right! You don’t have to know anything, even about coding… At least that appears to be the goal of the vibers. Let the AI do it…
As above it depends on your career path. Mine was real-time systems, so did have to know. If you are a web builder using js, or business apps with Cobol or Java then not necessary or an SQL person…
To continue, as I couldn’t edit… I did have to know about NMIs, scheduling, and all that goes with RT programs, and writing, say, a serial interrupt driver in assembly… Part of the job.
Haven’t had to do a USB driver yet…
Note I said “application programmers.” I would argue that most do not need to know this as opposed to embedded programmers or OS developers, etc.
I agree. As a hobby, I started OS development years ago and have even written a few books on my progress, including a 725+ page book on this very subject of USB and the hardware used to communicate with external devices. As far as this low-level USB programming is concerned — much lower-level than the subject of the posted article — every interrupt is maskable: whether it was the old 8259 PIC, the somewhat modern (IO)APIC, or a more modern MSI-X, they are still maskable. You have to get pretty low-level to get into NMIs, much lower than this article goes, even lower than the actual USB hardware such as the xHCI controller.