Ask Hackaday: Learn Assembly First, Last, Or Never?

A few days ago, I ran into an online post where someone pointed out the book “Learn to Program with Assembly” and asked if anyone had ever learned assembly language as a first programming language. I had to smile because, if you are a certain age, your first language may well have been assembly, even if it was assembly for machines that never existed.

Of course, that was a long time ago. These days, if you are over 40, it is more likely you learned BASIC first. Go younger, and you start skewing towards Java, JavaScript, or even C. It got me thinking, though: should people learn assembly, and if so, when?

Assembly language is a text representation of the machine code your CPU executes

I’m no stranger to assembly languages, but I’m not sure I know a modern and defensible answer to this question. You can find plenty of tutorials, of course (including some from me). There are plenty of cases where a few lines of embedded assembly can make a big difference in a program. Debugging a bad compiler can also require assembly chops. So it seems that at least some people still need to learn assembly. That leaves the question of when to learn it and, as a corollary, who needs to learn it.

My traditional answer would be that everyone should learn it as soon as possible. But that could be because that’s how I did it years ago. I’m not convinced that’s the right answer for everyone today. However, I’ll make my case.

The Case for Early

If you are satisfied writing code to validate ZIP codes in JavaScript, you probably don’t need to learn assembly. But if you want to really be a top programmer, you are going to have to confront it sooner or later. Learning it early has some benefits. If you understand what’s really going on at the CPU level, a pointer in C doesn’t seem like a tough concept. Being able to look at the output from a compiler and understand what it means is often illuminating when you are trying to learn something new.
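
For instance, consider this trivial C fragment, with comments sketching roughly what a compiler turns each line into. The exact instructions depend on your CPU and compiler, but the mental model is the point:

#include <stdio.h>

int main(void)
{
    int x = 42;         /* reserve a slot in memory (or a register); store 42 there */
    int *p = &x;        /* p holds the address of that slot; an address is just a number */
    *p = 99;            /* store 99 to the address held in p (an indirect store) */
    printf("%d\n", x);  /* prints 99, because x and *p name the same memory location */
    return 0;
}

Once you have seen a pointer as nothing more than an address sitting in a register, indirection stops being mysterious.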

However…

The only problem is that modern assembly language is difficult. CPU instruction sets are strange, and there are issues with CPUs that do out-of-order execution. Then there is relocatable code and other details that are easy to trip over and not very useful to know much about.

Even if you can’t afford an Altair, you can get a replica. Or just stick with software emulation

So there are two ways to go. First, pick an older CPU. Something like the Z80, the 1802, or the 6502 isn’t that hard to learn, and there are a ton of resources available. Don’t have any hardware like that? Who cares? Use an emulator. You can probably even find some that run in the browser and have excellent debugging capabilities not available in the real hardware. I’ve programmed on dozens of CPUs, and they are all pretty similar. Given the oddness of the 1802, I might not recommend it even though I love it myself. It is, on the other hand, very simple to learn. The PDP-8 or PDP-11 are other good candidates, although some of the ways minicomputers do things are rarely seen today.

Or, pick a machine that doesn’t exist. Most of these were made for learning, so that’s a plus, and many of them also have modern emulators. If you were to pick one, I’d suggest MIX. The books from Knuth are classics and use MIX. They influenced everyone, so you’ll see echoes of Knuth’s ideas in every computer system you ever touch (that isn’t a quantum computer, at least).

Just don’t go too far back. Programming the EDSAC, TUTAC, or the 4004 is probably not representative of modern computing at all. Honestly, none of these CPUs are. But they can help set the stage for tackling more advanced processors if needed. Like the old adage: You have to crawl before you walk, and walk before you run.

The Case for Late

You could easily argue the other side, too. Maybe assembly language makes more sense once you understand why you need pointers to make linked lists and why conditional jumping is a thing. If you’ve been programming for a while, some ideas like hex numbers and addresses might already be familiar.
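
To make that concrete, here is a minimal linked-list walk in C. At the assembly level it is nothing but pointer loads and a conditional branch, which is why the two views reinforce each other:

#include <stddef.h>

struct node {
    int value;
    struct node *next;   /* address of the next node, or NULL at the end */
};

int sum(const struct node *head)
{
    int total = 0;
    while (head != NULL) {      /* compare, then conditionally jump out */
        total += head->value;   /* load from the address held in 'head' */
        head = head->next;      /* follow the pointer to the next node  */
    }
    return total;
}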

Probably the wrong time to learn it, though, is when you have an intractable bug, and you suspect the compiler is optimizing your code incorrectly. Or at the midnight hour before a deadline when you need to shave that extra little bit from your code.

What Do You Think?

This is, after all, “Ask Hackaday,” so tell us what you think in the comments. Make your case for why no one needs to learn assembly. Or tell us your opinion about the best processor to learn. Learn in the browser? We don’t suggest WebAssembly, but maybe you disagree. What are the best online resources and books? What’s your favorite story about that time that knowing assembly saved the day?

(Digital rain effect from [Rezmason]’s generator.)

153 thoughts on “Ask Hackaday: Learn Assembly First, Last, Or Never?”

    1. I was going to make a ZX81 comment too. BASICally, it was such a slow computer (and one I also loved) that learning assembly was essential. More correctly, though, I think most of us learned machine code, since assemblers needed at least 4 kB. On the positive side, the ZX81 user manual listed the entire Z80 instruction set, along with the character set, in an appendix.

      In reality, it was a two-stage process. I learned BASIC on the ’81, and then typed in magazine listings that loaded machine code into REM statements using a hex loader (and sometimes a decimal loader), and only after that started to learn it. At first, machine code seemed like magic!

      10 REM xxxxxxxxxxxxxxx
      20 LET ADDR=16514
      30 CLS
      40 PRINT ADDR
      50 INPUT X
      60 POKE ADDR,X
      70 LET ADDR=ADDR+1
      80 GOTO 30

      1 192 02 LD BC,704
      Loop:
      197 PUSH BC
      62 138 LD A,138 ;Checkerboard/Black.
      215 RST 16 ;print chr$(A)
      193 POP BC
      11 DEC BC
      120 LD A,B
      177 OR C
      32 246 JR NZ,Loop
      201 RET

      RUN
      Type the numbers one at a time, in turn.
      Type STOP to stop entering machine code.
      Delete lines 20 to 80.
      20 RAND USR 16514
      RUN

      (Tested).

    2. My first computer was a ZX81 too
      I learnt the rudiments of Z80 assembly early on using it. With modern CPU architectures, that is unlikely to be the route today. But there are still plenty of 8-bit microcontrollers to target (including the Z80).

  1. I’m biased by my background, but I would say fairly early.

    Learn any language (it pretty much doesn’t matter which; back in high school they used BASIC to good effect). Then learn about algorithms: what they are, how to think about them, how to evaluate them (Big O notation for complexity, and how fixed overhead can matter more for small tasks; there’s a sketch of this after this comment). Included in this, spend time on algorithms that are now simple functions in most languages, like the different ways to sort a list (and different list types, etc.).

    Then learn assembly so you can understand what’s really going on under the covers, what sorts of things the hardware can actually do, and what you have to build from.

    Then go on to other languages, databases, networking, etc
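
    A quick C sketch of the Big-O point above: linear search is O(n) and binary search is O(log n), yet for a handful of elements the linear scan’s lower fixed overhead often wins, so measure before choosing.

    #include <stddef.h>

    /* returns the index of key, or -1 if absent */
    int linear_search(const int *a, size_t n, int key)
    {
        for (size_t i = 0; i < n; i++)
            if (a[i] == key) return (int)i;
        return -1;
    }

    /* same contract, but 'a' must be sorted ascending */
    int binary_search(const int *a, size_t n, int key)
    {
        size_t lo = 0, hi = n;
        while (lo < hi) {
            size_t mid = lo + (hi - lo) / 2;
            if (a[mid] < key)      lo = mid + 1;
            else if (a[mid] > key) hi = mid;
            else                   return (int)mid;
        }
        return -1;
    }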

    1. I have the feeling that I’m right at the point of my journey where it’s time to learn algorithms.
      With a knowledge of C and Python (and a little bit of assembly), do you have some recommendations for learning algorithms? Preferably books, but online is good too.

      1. The traditional algorithms texts are a bit heavy (for example: Cormen, Rivest, et al.) but quite good.
        I lean toward Sedgewick, as it has been kept up to date and there are a LOT of free materials online.
        The O’Reilly Algorithms in a Nutshell is OK.
        The Aho, Hopcroft, and Ullman text has been around a long time and kept updated, and it steps back from some of the math that Sedgewick brings in.
        Jon Bentley’s Programming Pearls (vols. 1 and 2) are pretty much a must-read.

      2. My all-time favorite is Wirth’s “Algorithms + Data Structures = Programs”, but I doubt it’s in print any longer. Next up on my list is Sedgewick’s “Algorithms in C”, and although it’s not an algorithms text, “Design Patterns” by Gamma, Helm, Johnson, and Vlissides is a fine reference to have handy for object-oriented programming. The great thing about algorithms is that they are timeless, so if you can lay your hands on a used copy of Wirth or Sedgewick you’ll have at hand a reference you can use through your entire career.

        1. @Bob A: Easily found second-hand. Pretty cheap, $15 or less in good condition, including CONUS shipping. Check the usual sources such as amazon.com, thriftbooks.com, abebooks.com, et cetera.
          Title: Algorithms + Data Structures = Programs (Prentice-Hall Series in Automatic Computation) First Edition
          Author: Niklaus Wirth
          Publisher: Prentice Hall; First Edition (January 1, 1976)
          Language: English
          Hardcover: 366 pages
          ISBN-10: 0130224189
          Remember the HaD effect: May sell out after this post. Be patient, there are plenty out there for those who can wait a bit.

      3. Why did you steal my name? :)
        Cliff and Bob mentioned Sedgewick and Wirth. I have both, and both are excellent. But The Book on the subject is the series by Knuth, The Art of Computer Programming.

      4. I love this community for all the stuff you can learn here.
        Prof. Wirth is actually Swiss and it looks like his books are still in print in German.
        Nevertheless I think I will start with a used copy or the online version.
        Thank you all for the recommendations!

        (I hope this reply ends up in a useful position)

      5. “Grokking Algorithms: An Illustrated Guide for Programmers and Other Curious People”

        I found it in a book store – it seemed so good I bought it, but then I had no time to really read it. The illustrations looked pretty straightforward.
        The previously mentioned Wirth’s “Algorithms + Data Structures = Programs” also seems to be an often-recommended classic.

  2. When I learned programming you didn’t get a choice: it was assembler, or wiring up your 407 to do limited accounting on a set of well-sorted Hollerith cards if you weren’t in a place that used union keypunchers.

  3. I am following Casey Muratori’s “Computer, Enhance!” series, and he uses straight x86 assembly for learning purposes. The jump from the 8086 to modern x86 CPUs is not that steep, and you end up with a fairly good understanding of the architecture, good enough to read disassemblies and follow what the compiler is doing.

  4. I learned Macro-11 after taking the standard (for the time) languages in college: Fortran, COBOL, and BAL-360. So, not first, but pretty early. I later learned a number of other assembly languages. I have to say that, as a professional developer, knowing assembly has helped me many times with such things as debugging and optimization.

    1. My BAL 360 instructor had a mantra he kept repeating to us students,
      “If you can’t do it in Assembly, it can’t be done!”
      It took a while for that to sink in for some of the students.

      1. College was PL/1, then BAL on a 370. Later I got a free account on a DECsystem-20… That assembly I *really* enjoyed. Ralph Gorin’s book “Introduction to DECSYSTEM-20 Assembly Language Programming” was sublime. Sample JSYS code like: “I enjoy assembly language programming, because it is fun!”. On a DEC-20, you can EXECute the registers (see Emacs’s original search kernel). Also built-in HW dereferencing on pointers. Arbitrary character lengths. Etc., etc., etc.

    1. On the contrary, it is quite fun, especially when you see what can be done in one instruction cycle. It’s incredible. It just means you need to check the ISA for an instruction that does what you want, or you’re leaving performance on the table. It also means being aware of necessary alignment, instruction ordering, etc., but it’s quite fun IMO. To each their own :)

  5. The only assembly I wrote was for a Univac, in the assembly class in college. But I had to sorta understand 8088/8086 back when debugging C meant using ADB. To find slow code, you printed your C code with the generated assembly.

    I want to learn 6502 for the hell of it, or program an HP calculator.

      1. I wrote a 2K program in mnemonics on a Tektronix development system for a 6805. It cured me forever from becoming a code writer of any kind. Although the final result was like magic, the process was most frustrating. Events like stack overflow, PIA setup, and understanding the ramifications of the WAIT state caused many hurdles. Even the floppy gave out once, and I could see light through it. I am very proud of the accomplishment, similar to a new hardware design, but could not imagine spending my life staring at a CRT.

    1. It offended me in school that they wanted us to write assembly for the 1802 that would do a divide by zero to trap and dump, and that was your output. So, of course, I figured out how to do the whole thing. Turned in the first program and was told to just stop coming to class.

  6. For me, it was FORTRAN first, followed by assembly for the CDC Cyber 6400.

    You really can’t call yourself a hacker if you aren’t familiar with assembly for at least one processor. You don’t need to be fluent, but you don’t really understand a computer without getting a grasp of the assembly level. I have lost count myself — MC68000, ARM, x86, Z80, 6502, SPARC, and no doubt others I forget.

  7. Although I learned BASIC first, I got into 6502, 6802 and 68000 assembly fairly soon after that. It greatly helped me to understand pointers in Pascal and C. For that reason alone I think it is still good to learn assembly (though not necessarily for any Intel processor, those are just weird and confusing).

    1. We may be the same age.

      Yup, BASIC on a timeshare system in jr. high.

      6502 and 1802 assembly at home, then 8085 at college.

      Z80 and 68000 assembly at home; Pascal, C, and PL/1 in college.

      Learning different architectures helps in knowing why RISC vs. CISC arguments made sense.

    2. The Motorola 68000 with Amiga OS was, for me, the best environment for doing assembly. The OS was organised as a framework of sorts and very accessible; it was easy to do even GUI applications with it. Also, the Motorola assembly language was way cleaner than Intel’s.

      1. The 68000 had a nice instruction set, essentially a PDP-11 grown up a bit, with separate address and data register sets. In the late 1970s, I wrote an assembler for it in the intersection of the 68K and PDP-11 instruction sets, using PDP-11 macros to convert the 68K instruction mnemonics so that it would assemble on the PDP-11, producing a PDP-11 executable. The resulting executable was able to assemble itself on the PDP-11, producing a 68K executable that ran on the 68K, providing self-development support in the early days before there was an OS and development environment.

  8. I taught myself assembly first when I was a teen so I could program some PIC16F84s I got as free samples from Microchip. I moved on to C within a year or two of that. Fast forward a few years, and freshman year of undergrad I took a course in Java, and then in my 3rd or 4th year took Bruce Land’s course on embedded microcontroller programming. In grad school, though, I ended up mostly doing a lot of simulations in MATLAB, DAQ stuff in LabVIEW, and a mix of DSP/real-time FPGA hardware-in-the-loop stuff. Now I work in medical devices, designing embedded hardware and writing firmware.

    1. Lol, same here :D
      I’m nowadays amazed how on earth teenage me could learn assembly pretty much on my own, without really any support, while building my own dev board simultaneously; and the internet was way different back then.

  9. I learned machine code before I learned assembly, simply because I had no access to any documentation. Which made learning my very first assembly language pure joy – it was so much easier. But even once I had access to documentation, I did not have access to an assembler, so I still had to translate from assembly into machine code by hand. I ended up learning about seven flavours of assembly for various architectures. Next was LISP, followed by Pascal (it was horrible). I think that C was next, and it brought back happy memories of assembly language but was so much easier; it was a pure pleasure to learn C. And then I think I learned B.A.S.I.C. (Beginner’s All-purpose Symbolic Instruction Code), which was just weird compared to everything else I had learned up until then. I ended up learning about 30 computer languages and stopped. My go-to language has always been C: nearly all the power of assembly language, if you think about what you are doing.

    In saying the above I am now thinking about “At Last The 1948 Show – The Four Yorkshiremen Sketch” – https://www.youtube.com/watch?v=VKHFZBUTA4k

    1. I think that on the programming language wall at the computer history museum in Nottingham, BASIC is essentially an island unto itself.
      C, on the other hand, is the root of an almost countless number of languages.

        1. I do not know of a computer museum in Nottingham. The two I do know about, and both are well worth the visit, are The Museum of Computing in Swindon and The National Museum of Computing at Bletchley Park.

  10. Started out with a Commodore 64 learning BASIC. Then moved on to assembly for the 6502 and HC11s.
    Way back then, when memory was limited, I would say it was a must to learn assembly. But IMHO, nowadays most controllers have decent RAM and flash. Back then we were dealing with 8 MHz clocks, so assembler got you many advantages over BASIC and C/C++. But nowadays, with controllers typically at 160-400 MHz and C/C++ compilers much better, I’m not really sure what you will gain going the assembly route.

  11. I’d have said start with a high-level language and how to put together algorithms. Once you have a reasonable grasp of those, assembler is an extra if you want to make your code more efficient (understanding what the compiler is generating and why) or create custom blocks of code for speed.

    For the most part, modern optimising compilers, faster processors, and more RAM make up for the reasons we learned assembler 30+ years ago.

    1. I disagree with this as a blanket statement.

      Learning assembly gives you an understanding of the types of things that the computer can do, and the types of things that you have to use a bunch of commands to do.

      I agree that a lot of current programmers don’t understand efficiency and just count on throwing money at it to solve their problems, and that can work to a large extent. But those programmers who understand what’s going on under the covers end up being able to write stuff that scales (both up under load and down to smaller processors, mobile devices under load, etc) even when they don’t write a line of assembly to do so.

      1. Years ago, at the place where I worked, we were taking raw radar data and displaying it on Unix workstations. We were using an old Data General, an Alliant, two Symbolics/Pixars, and a number of Sun workstations. An engineer in our group, using SPARC assembly, managed to get a display running in a window of his Sun IPC, bypassing most of the aforementioned equipment. It was a huge step in our lab’s productivity.

  12. It depends… Most people probably don’t need to learn assembly. Just use Node-RED, Scratch, Python, or whatever high-level language fills the bill. That said, all college Computer Science majors definitely should be exposed to computer architecture and, of course, in the process, assembly language. Professionally, I think it wise to have the background. I personally like assembly language programming for fun… But then, I made a career out of writing real-time/normal applications as an applied CS graduate. Dealt with several processors, from the lowly Z80, 68xxx, and x86 on up to today’s whiz-bang CPUs.

    1. Oh, forgot to add: it should be introduced fairly early, but not too early, as you need general computer ‘architecture’ under your belt first. At least by the second year of college. I don’t exactly remember when I took it, other than it was an early class, 07:00 or 08:00 hours! Seems like it was a full semester, too. VAX assembly.

    2. BASIC was my first language, used in high school. Pascal was the main language at college. I never touched assembly until college. And then I was off and running, because in those years assembly was fun to use on x86 PCs to get some performance out of them. Even to draw lines, circles, etc. All in assembly. Fun times back then.

  13. Well, directly addressing Mr. Williams’ question, I would advocate for early. I wouldn’t start with assembly language. I would start with JavaScript maybe, or Python, maybe even Kotlin (but not Java). I would get involved with C as soon as possible (but not C++ or C#). Then I would get my hands on assembly language. Honestly, the ARM is simple and clean, and that would be my top recommendation. I would avoid x86 like the cursed plague that it is. Maybe learning C after assembly would be best.

    So – start with Python, then ARM assembly, then plain C. After that, the world is your onion.

    8-bit processors? Not unless nostalgia is your thing.

  14. Personally, I say ‘forget assembly’ – start with an old (70s-era) text on microprocessor design. Once you’ve covered that (understanding the role of the PC, the ALU, registers, etc.), assembly will suddenly start to make a *whole* lot more sense.

    I feel that without an understanding of underlying processor design, coming to assembly blind from a high-level language will make it seem *incredibly* arcane.

    I had to learn ASM first on the PIC because, unfortunately, at that time it was the only free offering (the C compiler was super pricey). It is OK to learn this way too, because you are likely working on a simple project with physical outputs that can give you at least *some* sense of whether your project is working.

    But in later years I took the edX MITx 6.004 series online (quite unfortunately, this series was only offered *once*, which is really quite a shame because it was truly excellent. So hard, but I learned so much from that class). There, in that context, you start to *really* understand the who, what, and why of how assembly works.

    1. “..Personally, I say ‘forget assembly’– start with an old (70’s era) text on microprocessor design. Once you’ve covered that (understanding the role of the PC, the ALU, registers, etc), assembly will suddenly start to make a *whole* lot more sense….”

      I agree. Understanding computer architecture is the key. Programming is just the final step that fits it all together.

  15. I learned MACRO-11 on the job. Not well and not thoroughly, just enough to customise drivers and to lock some of the features in a newly generated operating system instance.

    I learned BASIC first, in the 1970s, but had to pick up bits of COBOL, FORTRAN, Pascal and C/C++

    In the early 1980s, I learned 6809 assembly in order to do major hardware hacks on a Dragon 32

    Right now, I’m trying to get my head around MCS-51 assembly because I’m designing my own system from the chips up, and what is already available is a bit … BASIC.

  16. I would not touch Knuth’s MIX with a ten-foot pole. It is one of the things in his books that I would say is fundamentally botched. By trying not to be like any real machine, well, he succeeded, and you end up tangled in all kinds of useless abstractions that might amuse a mathematician (and surely did), but they won’t help teach a neophyte programmer anything and will just be an additional hurdle.

    1. Right. Even Knuth himself says you shouldn’t use MIX now. IIRC (and it’s a long time since I read TAOCP) MIX requires using self-modifying code to handle subroutine calls. (Correct me if I’m wrong.)

      He has produced an update called MMIX for the new volumes (or at least the fascicles).

    2. Knuth’s books drove me up the wall in college. The math was brilliant, but the coding gave you a choice of
      MIX (a horribly, unnecessarily complex assembler language) or pseudocode that was mostly GOTO spaghetti.
      At the time they were written, it would have been a much better choice to use ALGOL instead of the pseudocode (it was, after all, created to describe algorithms in journal articles more than to actually use on computers), and some simple, clean assembler like the PDP-8’s, or at worst a subset of IBM 360 (which was my second assembly language).

      Later, in the 80s, I used a bit of 68000 assembler (which I wrote by compiling C code and tweaking the code until the generated assembly used the right sizes of variables, so all the math happened in 32 bits).

      1. I mostly agree with you on the criticism of Knuth’s choice of language. It even felt odd at the time. I do think I understand some of his rationale though. He wanted to focus on the core concepts and didn’t want to get side-tracked by lots of boilerplate code that many real-life programming languages require, nor did he want to expose himself to ambiguities that can arise when using complex higher-level languages. And maybe, it was also just a sign of the times and people were still more comfortable with assembly as compared to higher-level languages.

        By and large, (M)MIX looked to me like an attempt to boil things down to just the part of the algorithm that Knuth wanted to elucidate at any given moment. As such, it does a much better job than (for instance) Pascal would have done. I have read contemporary books that used Pascal as a descriptive language, and things were way too verbose. You’d frequently encounter lengthy multi-page listings that completely distracted from the discussion of the algorithm. By comparison, Knuth’s writing style is famously compressed and would simply not have been compatible with these elaborate program listings.

        (M)MIX had a bit of a learning curve, but when you got past that point it did do the job. I don’t think it was the ideal choice though; but maybe, Knuth realized that too late and couldn’t change any more. For all its benefits as a small abstract language, it ended up getting bogged down with way too many limitations that frequently made the code unreadable. I have seen other books at the time pick different choices, and Knuth would have been well-advised to learn from them.

        ALGOL was indeed popular and did make for an OK language to convey concepts in book form. But it did suffer from some of the same problems that Pascal had when used in this fashion. I still have some amount of PTSD from having read F. L. Bauer’s books. Concurrent ALGOL is just odd! I think it was influenced by Bauer’s experience with lambda calculus — and that’s another really odd and poor choice for writing a book.

        On the other hand, I have seen other books use Lisp to great effect. It suffers from some of the same learning-curve problems that (M)MIX was burdened by, but it is overall the most concise language for demonstrating algorithmic concepts without requiring any boilerplate at all. And once you grok the basic concept, it is surprisingly readable. You can just focus on the narrative and don’t have to page through pages of listings. I suspect it would have filled most of the specific needs that Knuth felt he had when picking a language, while being a lot more readable and accessible than (M)MIX. But I am sure others would disagree. It’s hard to pick a good language for discussing these types of concepts.

  17. Who are we talking about? “Everyone” should learn a bit about how computers work, including the logical fundamentals, machine language, assembly, higher-level languages, and algorithms (to name a few items). That is, everyone should get a little bit of knowledge about the lay of the land in order to know what exists out there.

    But as far as learning the details of something and how to wield it effectively, that’s another story. At this point, I’m assuming that “everyone” is “computer science students”. Again, for something like assembly, I’d put it after the introductory courses that have taught a bit about everything, since learning assembly does require you to know about computing fundamentals, and it is very useful to know about concepts of higher-level languages and algorithms to understand why certain things exist in assembly.

    Having covered the introductions, whether someone should learn details of assembly or details of higher-level languages first, that’s a matter of personal choice, I think. Should a course in assembly be required of CS students? If the introductory courses for CS that include assembly are more detailed than similar courses for the general student body, then perhaps not. If the intro course doesn’t require writing at least a simple program, then a course that does so (and perhaps covers more than just assembly) might be a good idea.

  18. Why not go with a PIC or ATmega product? Preferably the 8-bit lines. Go ahead and learn some of the more obscure functions of the chips while you’re at it.

    They are similar to the old 8-bit micros in performance, but a bit more current. And you can poke some I/O with them to do things.

    1. And since I didn’t address it in the comment, I’d go with “whenever you’ve got a programming task that’d benefit from it” for when to learn assembly. Unless you really like learning without a practical task, in which case, go nuts.

      1. Definitely good advice. Get some motivation and purpose in the mix rather than ivory tower education. I personally could never get too excited about emulators. You can go buy a blue pill (STM32F103 board) for $2.00 and have a 32 bit ARM to play with. There is my recommendation. There is magic (for me anyway) knowing my code is running on that hardware right there on my desk.

    2. My opinion (having written a LOT of PIC code): the PIC is not good for beginners. Banking is a bear and the source of many errors. The low end has no proper stack. Some of the newer ones, or the non-8-bit parts, maybe. But the 16Cxxx is not a great place for people to start, I think. The AVR is the better of the two.

      1. I have a number of PIC chips, and boards to program them, but have never done anything more with them. The architecture just seems WRONG to me.

      2. As a beginner with a Commodore 64 in the 80’s, I learned all about memory banking. It’s just part of using some systems that were designed a specific way to maximize features while keeping cost low. So when I got into PIC assembly, the banking was just a normal thing that some 8-bit systems do.

  19. When applying power to a computer board, a small amount of assembler code, which is machine-specific, needs to run so that memory is set up for C programs to run. For example, there needs to be a memory block for the instruction call stack. Then other programs generated by software compilers can be supported.

    Programming languages tend to have a rising level of abstraction so that the programmer’s workload is managed and errors are reduced. For example, C programmers make mistakes with memory allocation, so the Rust and Java languages impose more discipline.

    It is better to first learn high-level languages (HLLs) so that best practices become habits. Let engineers fuss about performance, and let programmers attend to usability, functionality, and such.

    1. heh i never thought of framing this as “how much assembly does the *entire system* need?”

      i was personally surprised to find that once i sat down to use C on an stm32 (without an SDK runtime library), i didn’t need even a single line of ASM. i wound up using 3 instructions for an in-line asm microseconds delay loop, just to count the instructions (gcc could have created it too, using __asm(“”) to avoid optimizing it away, but with less predictability). but initializing the stack frame, providing a power-on entry point, all that stuff turned out to be regular C on the stm32. i did wind up with a 26 line ldscript, which isn’t assembly but is basically at that level. and i also needed to like “const struct vectors_rec vectors __attribute__((section(“.vectors”))) = { … };” to define the initial stack and ISR table. __attribute__((section…)) is a gcc extension, not standard C. in hindsight, i could probably have avoided the C extension if i’d written my ldscript differently. (a sketch of the idea follows this comment.)

      to do the same non-SDK-runtime C on the rp2040 pico, i wound up needing 40 instructions of assembly. that’s still “not much” but it’s quite a bit more than zero. i did have variants during development that used much less assembly (just initialize the stack pointer and branch to main, 3 or 4 instructions). but once i learned how everything worked, i wanted to write the whole crt0 in assembly in order to guarantee that it could fit within 256 bytes for the flash bootloader. otherwise, the -O setting used in compile would determine whether it fits, whether it works.

      but the rp2040 also has a bootrom, which i haven’t even looked for the source code for, but i imagine that has a lot of assembly in it, especially as it tries to provide efficient implementations of primitives like memcpy.
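
      A minimal sketch of the C-only Cortex-M startup described above. The part, the _estack symbol, and the .vectors section name are assumptions for illustration; real startup code would also copy .data and zero .bss:

      #include <stdint.h>

      extern int main(void);
      extern uint32_t _estack;   /* top-of-RAM symbol, assumed to come from the linker script */

      void Reset_Handler(void)
      {
          /* a real crt0 would copy .data from flash and zero .bss here */
          main();
          for (;;) { }           /* trap if main ever returns */
      }

      /* The core fetches word 0 (initial SP) and word 1 (reset PC) from the
         start of flash; placing this table there is the linker script's job.
         __attribute__((section)) is a GCC extension, and casting a function
         pointer to void * is a common embedded idiom rather than strict ISO C. */
      __attribute__((section(".vectors"), used))
      const void * const vectors[] = {
          &_estack,
          (void *)Reset_Handler,
      };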

  20. For your average person in school through college who isn’t going for a technical career but has a hobby interest in programming: they should be encouraged to spend a few hours playing around with a simple-architecture assembly language and do some toy programs. Set it aside when it gets boring or frustrating.

    Anyone whose career depends on using a compiled language should probably spend at least 100 hours (less if they are quick studies) writing some small-but-complete programs that teach the essentials, along with some toy/practice exercises using a debugger’s disassembler feature.

    Tech-career-minded people whose career doesn’t depend on using a compiled language should be encouraged, but not required, to delve into assembly.

    Beyond that, the “early/late/never” decision is going to vary for each person. I would be a fool to make any recommendations.

  21. Started with BASIC and assembly (the natural progression for C64 and Z8x machines in the 80s, if you only had the manuals in the box), then moved on to C, C++, Java, etc… and other processors (even did the 8085 in tech). Still prefer assembly and K&R C above all else, simply because they are “simple”… I’m older than 5 years and can decide for myself if I want to use pointers or not. I don’t need to spend 90% of my time trying to “convince” a compiler to do what I want it to do.

    I would suggest learning assembly as quickly as possible. Once you understand what, why, and how a processor works, no other processor or programming language will ever stump you. It seems to be the best-kept secret (it shouldn’t be), but… the compute paradigm on this planet has not changed since the very first processor was invented (that is now 70-odd years). If you understand one, you understand them all. Each new one just comes with differently flavoured spice (terminology). An instruction is fetched from memory, decoded, does what it was told, saves the result somewhere, and mostly sets some flags… that, and moving data back and forth. That is a processor (there’s a sketch of this cycle after this comment).

    For someone starting out in assembly, I would recommend you stay away from horrible architectures like x86 and ARM at first. You will most probably tear your hair out if you take them on for your first try. Take Al’s advice, get a simulator, and pick a Z80 or 6502… something with a flat memory map, no segmentation, no MMU, no multi-core, no branch prediction, no protection rings… just a flat processor and related architecture. Even PIC and AVR microcontrollers will do if you want to experience the hardware “feel” of coding in assembler. Once you manage this and decide you want to take the assembly thing further, go for something more advanced. The basics, no matter which processor, will stay the same. And always remember: the processor datasheet and reference manual are your friends.
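
    The fetch-decode-execute cycle mentioned above, sketched as a toy interpreter in C. The three opcodes are made up purely for illustration:

    #include <stdint.h>
    #include <stdio.h>

    enum { OP_LDI, OP_ADD, OP_HALT };  /* a hypothetical 3-instruction machine */

    int main(void)
    {
        uint8_t mem[] = { OP_LDI, 5, OP_LDI, 7, OP_ADD, OP_HALT };
        uint8_t pc = 0, a = 0, b = 0;   /* program counter and two registers */

        for (;;) {
            uint8_t op = mem[pc++];                       /* fetch          */
            switch (op) {                                 /* decode         */
            case OP_LDI:  b = a; a = mem[pc++];  break;   /* execute: load  */
            case OP_ADD:  a = a + b;             break;   /* execute: add   */
            case OP_HALT: printf("A = %d\n", a); return 0;
            }
        }
    }

    Every real CPU elaborates on this loop with pipelines, caches, and flags, but the skeleton is the same.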

    1. What is horrible about ARM? It is 32-bit; you have a bunch of registers, all the same, none of this A, B, C, DE, HL like in the nostalgic 8-bit processors. As long as you are writing user-level code rather than supervisor code for an operating system, it is all clean as a whistle.

      If you want to add r0 and r1 and put the result in r4, you do:

      add r4, r1, r0

      You don’t have to deal with most of what is mentioned above unless you are writing an operating system. Stuff like branch prediction is transparent to the programmer.

      1. ARM is not as easy for someone who has never touched assembler before… and whether you have a choice over what you use or not, for a beginner who is trying to plug the pieces of the puzzle into the right slots, it can be very confusing when there are pieces of the puzzle left over at the end. Let them first try the simplest of processors, to instil the basics… baby steps.

        Not impossible, but I would not recommend ARM for your first foray into assembly. ARM might have started off as RISC, but I think they are decidedly not RISC any more on the high end of the range.

        1. I would suggest considering something designed for simple embedded things, an Arduino or a Pi Pico.

          It’s hard to write a significant program in assembly, but being able to write something to turn on an LED, run a motor, respond to a button, etc. is fairly easy, and it gives you immediate feedback (as well as an understanding of exactly how simple computers really can be); there’s a sketch of the idea after this comment.

          Arduino is popular, but it’s slow compared to other options.
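
          A hedged C sketch of that blinky idea; the register address and pin bit are hypothetical, so check the datasheet of a real part:

          #include <stdint.h>

          #define GPIO_OUT (*(volatile uint32_t *)0x40010000u)  /* assumed output register */
          #define LED_PIN  (1u << 5)                            /* assumed LED bit         */

          int main(void)
          {
              for (;;) {
                  GPIO_OUT ^= LED_PIN;                              /* toggle the LED   */
                  for (volatile uint32_t i = 0; i < 100000; i++) {} /* crude busy delay */
              }
          }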

  22. CS at a liberal arts school. We learned all aspects, from digital logic up to the highest levels of abstraction at the time (1990). If I had been at a tech-focused school, I don’t think anything would have changed.

    1. My first “computer” was an AT&T Cardiac cardboard computer, but the first real program I wrote was assembly code for a Bendix G-15D at the local high school (I think I was in sixth grade). Not sure how much longer those computers were still available, as time-shared BASIC systems over acoustically coupled modems with an ASR-33 TTY (e.g. HP2000 or DG Nova) were replacing the old space heaters.

  23. As much as I would agree to learning assembler, I think that most “developers” will get by being oblivious to what’s actually going on behind the scenes.

    With the adoption of high-level languages and frameworks, most people don’t care. They just produce code (often crappy code, because “computers are getting better, so why optimize?”). Even error handling is a foul phrase. I see this on a daily basis.

    Had a “developer” claim that the best programming language was the one with the most commands… hence, all embedded systems should run JavaScript and not C/C++… One candidate claimed to be a C++ expert but had never used C#, because he “never worked with compiled languages”…

    1. I fully agree that most “developers” don’t have a clue about what’s going on behind the scenes.

      But I will contend that all of the top-tier developers have a pretty good idea. They may not understand all the way down (the number of people who can work in both hardware and software is EXTREMELY small), but the more they understand, the further out their limits are.

    2. And therein lies the root of much of the bloat and misery we deal with on a daily basis. But we are on the same page. I learned some things recently dabbling with C#. It is more akin to Java than it is to C++. I am not saying that is a good thing. There are several things amiss with the quoted comment by your “candidate”. Looks like you asked the right question.

      1. Believe me, the entire resume seemed fabricated. He even tried to dazzle with “Aerospace Engineer” and CNC… Unfortunately for him, I’m a former aerospace engineer and have worked a lot with CNC… And he didn’t even know terms like DUT or the regulations in his own area of “expertise”…

  24. If you plan on doing any embedded programming, even embedded Linux, you should learn the basics of assembly after learning C, C++, Rust, or whatever else you plan to use. It aids greatly in debugging and helps you understand how the CPU works, something crucial when you’re that close to the metal. If you plan to do web or application programming, you should learn a simple or synthetic assembly as part of a systems architecture or similar course sequence. Again, this is to better understand how CPUs work.

  25. Another thought here. I have learned more by READING and analyzing assembly language than anything else. I am thinking of various boot roms I have disassembled. I have probably had 10 times more experience with assembly in this way than I ever have had writing assembly code. Yes I am talking about reverse engineering and it is one of my favorite hobbies. Once you get a rom disassembled there is the potent draw of the puzzle that I (for one) cannot let go of. Highly recommended. And you see everything from amazing tricks to disasters of coding that somehow manage to work.

  26. People who program in assembly languages develop, over time, collections of pieces of code that worked well when performing a certain task or operation. One could call them functions. Back in the day, these functions were stored on pieces of punch tape, and programmers would copy the required ones and glue them together, then copy to fresh tape and thus develop an entire program from those scraps of working code. If only there were a way to perform this task on a computer without combining those scraps by hand. Hmmm, I think it actually might work…

  27. When I took THE programming course at Duke in the early 1960s, we actually wrote our first programs in machine code. The machine was an IBM 7072, a decimal machine with 10 digit + sign words – 10,000 of them. It supported integer (10 digit) and floating point (2 digit exponent, 8 digit mantissa). We programmed a few simple computations punching the machine words for the computations and data as well as their addresses on cards. Someone had written a simple monitor that would load the cards, execute the code and dump the machine’s registers and the memory area involved to a tape that would be printed on an attached IBM 1401 that had the card reader, printer, etc.

    After a week or so of this, we were introduced to the assembler and could now write our programs using symbolic addresses, mnemonics for the instructions, etc. Clearly, we were better off with assembler than with writing numeric codes and doing address calculations by hand.

    About the middle of the term, we were introduced to Fortran II and could now write mathematical computations that were relatively easy to understand. The machine had hardware support for Fortran “do” loops, an area in low memory in which the words were split into a limit and count part (5 digits each) so that a single instruction could perform the increment and test and branch function required by the loop. This enabled us to easily write substantial programs in support of other coursework.

    Having learned to program starting with the concrete and proceeding to increasing levels of abstraction had advantages. At any point, you had a pretty good understanding of what was actually happening in executing your computation. Some years later, I worked for the Naval Oceanographic Office, which had a 7074. Batch turnaround times were often several days, and I was usually able to make corrections to the machine code deck of cards that resulted from compilation and avoid another compile-and-test cycle. For years, I made a point of learning assembly for every new machine that I used. At one point, this was required to get adequate performance from CPU-intensive jobs, but it now seems to be unnecessary for most of the code that I write.

  28. If you need to optimize processes because you are working on deep OS-level stuff (including microcontrollers) or have a need where every clock cycle of efficiency shaving counts, you could learn assembly.

    If you want to learn it “to feel closer to the machine”/for props/… then first do a machining (metalworking) and instrument-maker workshop and build Babbage’s Analytical Engine. Also, before flicking on the light, learn how to wind a coil, take that same machine shop and build your own wind generator (hydro or steam power is also allowed, if that’s more your interest). Feeling like eating a hamburger? Sow a field of wheat for the bun and get a calf and some sharp knives.

    (Says the person who owns 34 tape embossers just for fun)

  29. I like the get-them-hooked-on-programming-first approach, since if you start with assembly you will drive away some very talented future programmers. You wouldn’t make an infant learn sentence structure before they are allowed to learn words; not that you could, since you need words to explain grammar. I started with the SWEET16 interpreter, then moved to other BASICs and then C. Professionally, I loved going through my compiled C code looking for redundancies and then eliminating redundant instructions to speed things up. I’m not sure how that would work with modern compilers, though. I’m retired ;-)

  30. When my dad initially tried to teach me programming, it was on his heavily upgraded first-gen IBM PC. What I can say from that experience is that you definitely do not want to be learning on late-70s hardware in the early 2000s. The idea of a simple programming environment is a good one, but it is more important that the hardware be in usable condition with accessible learning materials. That, and don’t use books for versions of programming languages that are simultaneously too old and too new for any programming environment you have access to.

  31. What about programming hex assembly for your old TI-83 calculator? It’s Z80-based, and you just need the memory map.
    My math teacher knew how, and I’m learning now because it’s self-contained and portable.

  32. Adding my 2 cents. In my case, programming started with Fortran IV, followed by IBM 360 assembler, back in the 70s on punch cards. Then BASIC on an Altair, followed by a plethora of other languages. In CS graduate school, one of my courses was processor design and microcode. From the latter, you learn how the assembler instructions’ binary is utilized in certain classes of processors.

    TLDR. I think assembler is good at some early point in one’s programming career.

    1. It’s like building with atoms – the instructions abstract nothing; it’s right there in front of you. At least on a 6502 or 68K it is, because admittedly there are ‘microcoded’ CPUs, but pay no attention to that man behind the curtain!

    1. Not really. I routinely design CPU cores with often quite peculiar instruction sets (compute acceleration, so a lot of weird instructions, sometimes auto-generated by a complex optimisation process). And yet I rarely even bother implementing a human-writable assembly for such CPUs, going straight to building a compiler backend for a high-level language. If even a CPU’s creator does not bother knowing its assembly, why would anyone else?

  33. TLDR: ANY (slightly more than trivial) assembly, even for a microcontroller, will help you understand what’s going on if you need to manage memory in something like C. It’s probably good to have some experience with high-level programming first, so you know some of the patterns you’ll be trying to implement. Starting in assembly, it might be hard to make out the forest for the trees, sort of thing.

    My programming experience has been varied and sporadic. Programming has never been part of my job description, but I’ve managed to work it in here and there “as time allows.” In college (Microcomputing Technology), at the turn of the century, I started with one class of (useless, hardcoded data, no file access) C++, and one class of (slightly less useless, but still no file access) VB6. Then some brief PBASIC low-level automation control during the sleep-deprived blur that was my second Associate degree (Electronics Technology).

    A few years later, for work, I had to teach myself PIC assembly to program some in-house test fixtures. This was a bit of trial by fire, where I learned about branch conditions while trying to figure out how to If-Then. I didn’t even know that you could compile from C! I just started reading the instruction set from the datasheet. I manually moved data into memory and figured out interrupt servicing. Google was very helpful, but couldn’t hold my hand. (Although I did copy/paste a BCD conversion algorithm. 😬) My first project used OTP chips, and I only wasted two of them! Later we used flash chips. Around this time, I also taught myself VBA on some MS Access projects. (The company did their process tracking with in-house Access.)

    Ten years ago, I decided to go back to school and get an EE. The “intro to programming” class was Java. Then they dump you into C to see if you can swim. Most of the class was drowning when it came to memory stuff. After all the manual memory management in assembly, pointers made a lot more sense. I doubt I would have done as well in that course without the assembly.

    Didn’t graduate, and the next job is where I found Python. The “project” I learned from was a kludged together test stand with an XY robot. Some of the simpler scripts pulled me in, but after reading some books and PEP 8, the C# accent in the project was so thick I could barely read it. The slightly less kludged calibration program was as hard to read and “spaghetti code” doesn’t begin to describe it. It’s like someone sat down at a plate of spaghetti and TRIED to tie it into as many knots as possible. I had a lot of aspirations for that project, and managed to develop a lot of (unofficial) tools that made things easier, but it “wasn’t my job” and I had to settle for updating it to Python 3 on my way out the door. I hope the guys in Mexico are having fun with it.

    In the midst of the fun with Python, I tried to take advantage of some employer training and learn LabVIEW (barf!) through an online course. (Which I wasn’t able to finish before moving on.) Knowing what kind of C was probably going on in the background helped, but it was the least successful programming endeavor of my life. I managed to make all kinds of test equipment jump through some basic hoops with Python and PyVISA. I even got some DAQ boards to work in Python with NI-DAQmx. I NEVER got close to controlling ANYTHING in LabVIEW.

    Now I’m back to scraping together some “misc” time here and there to work on another MS Access project. I can’t help feeling things have moved on a bit, and my brain feels a bit stiff.

    1. Well, now I know not to try html tags in the comments as pseudo formatting/emotes. That wall of text (below the TLDR) should have been wrapped in “ramble” tags.

    2. i think your comment demonstrates everything i believe about the subject.

      there is no “should”. you learn what comes in front of you, either at work or at play. the ability to learn asm (“I had to teach myself Pic assembly”) is indispensable but familiarity with any particular instruction set is going to depend on your interests and responsibilities. what should they teach at school? it doesn’t hardly matter because it’s not gonna be what you wind up using 2 jobs down the road anyways.

      personally, i learned x86 assembly because there were a few years between when i had a PC and when i got borland turbo C++. it was a practical need, to realize my whims. later, i learned assembly more thoroughly because i wanted to know everything “all the way down”. an m68k asm class was a degree requirement — i thought it was great fun but a lot of my classmates experienced it as the first weed-out course for the BS in CS degree. but i also voluntarily took the VLSI class because i wanted to know even lower. now, i need to know assembly to work on compilers, but i’m finding, even the one i use every day, i am always in the reference manual. i don’t know assembly, i know reference manual.

  34. 6502 assembly and BASIC were my first two languages. Then C, Pascal, Java, and so on. I find assembly makes understanding everything else easier. It also reminds you what is happening at the register level. As others have said, memory management is always a nightmare, but with assembly you have a better grasp of it.

    These days it pays to have x86_64 assembly and ARM assembly in your back pocket. That covers most of what you’d need in the real world unless you’re working with more esoteric hardware. I found learning assembly on Linux to be easiest – it runs virtually on the Mac, PC, etc., and the system calls make things more standardized than having to deal with Windows and Mac shenanigans. Once you get decent at all the basics, you can pick up the rest. C lets you inline assembly, and I’ve found that helpful in a couple of microcontroller projects (there’s a sketch of the idea after this comment).

    I doubt these days that anyone is going to out-optimize a decent compiler, but who knows, if you have a custom algorithm you may be able to shave a few clock cycles here or there that really make a difference. Then again if you want to write an OS, assembly is your go-to.
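
    As a sketch of that inline-assembly point, here is GCC-style extended asm reading the x86 timestamp counter; the extended-asm syntax is a GCC/Clang extension, not standard C:

    #include <stdint.h>

    /* returns the CPU's cycle counter; x86/x86-64 only */
    static inline uint64_t rdtsc(void)
    {
        uint32_t lo, hi;
        __asm__ volatile ("rdtsc" : "=a"(lo), "=d"(hi));  /* result lands in EDX:EAX */
        return ((uint64_t)hi << 32) | lo;
    }

    Wrapping one instruction like this lets the rest of the program stay in plain C while still reaching hardware features the language itself does not expose.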

  35. Back in 1967-68, in my senior year in high school, I had the choice of learning calculus or programming, via book, assembly language for the IBM 1620 – I chose programming – and was astonished to learn that the college I went to had one. After that, PDP-8 assembly, Z80, and so on – learning assembly allowed me to get down to the nitty-gritty and gave me a leg up as I developed hardware and software drivers for all the machines that I programmed after leaving college.
    50+ years later, I have never regretted learning assembly of any kind.

  36. Reverse engineering requires an understanding of assembly. It’s a fun and educational activity, but not all CPU instruction sets are nice. Various Japanese CPUs (Renesas) are a pleasure to reverse; x86 has become a mess; PPC is a bit better, but some instructions are not intuitive at all. ARM is not that bad… if you only reverse a single-core Cortex-M. 8/16-bit CPUs are not friendly if they have bank switching to address more than 64 KB, but the worst one I have found so far is the Motorola SX12; it has something like 3/4 registers and needs 30 lines of code to do a simple division.
    Sometimes I’ve assembled code patches by hand with a hex editor; other times I load the ELF into IDA to find the proper way to get the compiler to compile the code properly.

    1. This is why I came to comment.

      Assembly is informative, but unnecessary for most “developers”.
      Unless you are getting deep in the weeds with an obscure microcontroller, you are better off letting the compiler do the work.

      But for reversing and other security work?
      Absolutely necessary.

      Start learning assembly basics once you pass programming 101 and the basic C(++/#) course.

      Heck, you can learn a bunch of useful assembly just by learning to use Cheat Engine to fiddle with games (single player…please).

      1. I agree, it’s unnecessary for most developers, where HW resources are not a problem; we are used to seeing 300 MB printerdriversetup.exe files. But when resources are limited (small MCUs, like the STM32), a C/C++ developer, in my opinion, should disassemble his code to learn that the compiler is not a magician. For example, I learned to use shifts instead of div/mul where possible, to use int only if you really need the variable to be signed (otherwise go with unsigned), and to choose a size that perfectly fits (uint8_t, uint16_t for iterating small loops). Each instruction takes its specific CPU clock cycles; if the compiler chooses to use an 8-cycle instruction instead of another with 4 cycles, it’s not its fault; it’s the developer who wrote relaxed, lazy code. But yes, the solution is also to ask mom for a bigger and faster MCU and be happy.
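
        A small C sketch of those micro-optimizations; results vary by compiler and target, so check the disassembly rather than assuming:

        #include <stdint.h>

        uint16_t scale(uint16_t x)
        {
            return x >> 3;   /* unsigned divide by 8 compiles to a cheap shift */
        }

        int16_t scale_signed(int16_t x)
        {
            return x / 8;    /* signed divide needs a rounding fix-up, so the
                                compiler emits extra instructions */
        }

        uint8_t sum16(const uint8_t *buf)
        {
            uint8_t total = 0;
            for (uint8_t i = 0; i < 16; i++)   /* uint8_t index fits an 8-bit register */
                total += buf[i];
            return total;
        }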

        1. I think that you are missing one point here.

          HW resources are far more of an issue than most developers realize.

          No, it’s not a problem when you run it stand-alone on your beefy dev machine, or on your personal high-end mobile device.

          but put that same program on a low-end mobile device, or in production where it’s being hit by thousands of users, and you have a very different situation (you can say “it’s only money”, but the money that is being paid to a cloud provider could be paying you a bonus instead :-) There are times to trade development time for production costs, and times to go back and make existing code more efficient to save production costs)

          AI development is a matter of ‘can you fit this on a single machine, or do you require thousands of GPUs to run it’ (at the two extremes)

          This isn’t saying that you need to program in assembly, just that you need to understand efficiency, and learning assembly helps you do that.

    1. Also, I disagree that modern CPU instruction sets are difficult and that beginners should learn the Z80 or 6502. Frankly, 6502 assembly and the like make it a massive pain to do anything more than 8-bit arithmetic, and while I’d say x86_64 is a huge beast to learn, IMO something like AArch64 or RISC-V is reasonably simple to pick up.

      1. The real pain with the 6502, looking back after getting used to C/C++ for some decades, is how truly limited the 6502’s stack is. With C “local” or “automatic” variables, the storage lives on the stack and is only valid until the function returns. The 6502 stack will not help you there (much) at all: it has no stack frame pointer, just the basic stack pointer, so it’s really only good for saving return addresses for JSR instructions. Even if you do horse around with $0101,X-type addressing to play with stack data, the total stack capacity is a mere 256 bytes. You pretty much have to resort to what amounts to global variables for any amount of data. In a way, it was the perfect processor for the old BASIC mentality.
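
        A small C illustration of the limitation (hypothetical function; cc65, the usual C compiler for the 6502, works around this by keeping C locals on a separate software stack, or in static storage with its -Cl option):

          #include <string.h>

          /* Each call frame needs a 64-byte local buffer. On a CPU with
             real stack frames this kind of recursion is routine; on the
             6502's single 256-byte hardware stack it would overflow
             almost immediately, which is why 6502 C compilers manage
             locals off to the side. */
          int depth(int n)
          {
              char scratch[64];              /* automatic storage */
              memset(scratch, n & 0xff, sizeof scratch);
              return n == 0 ? scratch[0] : depth(n - 1);
          }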

        If anyone wants to learn an old assembly language, I’d advise going no further back than the 8086 or 68K.

  37. The first computer language that I learned was BASIC on an HP2000B, and I have always felt that was a fine place to start. My second language was IBM 1130 assembler, which afforded me a deeper understanding of how a computer actually worked, knowledge that has served me well over the years.

    No matter the processor I’m targeting, I always make it a point to learn its assembly language. Debugging a high-level language like C would simply be much more difficult without an understanding of the generated code. I’ve also had many occasions to write assembler code where no high-level language would do, for example memory-management unit code for the PowerPC.

    1. I learned IBM 1130 assembler from Bork’s book, so that I could make some patches to the operating system. We had the IBM source code on microfiche, so I could research in the library for a few hours, learn how it worked, and record the addresses. Then I wrote the new lines of assembler, translated them to machine language, and punched up the patch.

      Worked great!

      I then went on to writing new versions of the printer driver, with buffering limited only by available program memory. Suddenly our IBM 1403 printer was running at near full speed all the time, and the student programming job queue moved along almost twice as quickly.

      The next year I wrote a dynamic disassembler, which used the “single step” level-5 interrupt handler to grab each instruction as it ran, along with all of the registers, and, if a certain console switch was up, printed the corresponding assembly language mnemonics on the console printer. It was a great help in debugging a few nasty problems. One particular case was a Cobol compiler that was written by an IBM SE and released. It was trashing the open disk file under some circumstances. It turned out to be a disk buffering error, which was misusing an operating system resource. I shared the fix with the author, who had apparently been pulling his hair out over this issue.

      That was the fall of 1969. By the summer of 1971, there were two staff programmers, and we started on an upgrade to the IBM 1130 Fortran II compiler. By that December we were ready to release the EMU-IBM 1130 Fortran IV compiler to the IBM 1130-1800 community.

      My next assembler work did not happen until the IBM PC came out, and I found its assembly language similar enough to the IBM 1130’s that it was easy to get started. This was very helpful during my years doing electrical controls engineering: C for algorithms, assembler for machine interface and debugging. But the debugging became much easier as the PC hardware advanced and the developer tools took advantage of it.

      I had learned Fortran II earlier, but IBM 1130 assembler was my first professional language, though I also wrote plenty of statistical programs in Fortran during those same years.

      After I left the university, with two degrees, in math and business, I used Cobol, RPG, a lot of PL/1, then more Cobol, some IBM 360 assembler, a bunch of APL, then taught myself C, which I used during my engineering work, then Visual Basic as Windows replaced DOS, and finally some C++, and then Java. Also did a lot of Foxpro and dBase 2 applications for PCs.

      Went back to school, working with high-power, ultrafast lasers, and switched to science/engineering for my last twenty years of work. Now a couple of my kids are professional programmers, and I am completely out of the field. Unless you need an algorithm designed to do something!

      1. Ever heard of the “ripple move”, Peter? It was all the rage in my day.
        The ability to move a whole string of arbitrary length using a single (!!!) H/ASM MVI instruction.
        Memories!!!

  38. “If you want to build a ship, don’t drum up people to collect wood and don’t assign them tasks and work, but rather teach them to long for the endless immensity of the sea.” — Antoine de Saint Exupéry

    I’d say understand why the student (perhaps you) wants to learn any of this in the first place. If it’s to program G-code, make blinky lights, or build a distributed sensor network from scratch, there will be some common tools and teaching aids, but the useful sequence will be different. The short version: learn assembly if you need it for a utilitarian purpose, or if you’re just curious what’s going on under the hood… or if you just long for “the power”.

  39. Learning assembler has been well worth it for me during a 40-year programming career.

    Perhaps the easiest way to go about it is to write a program in C and then run it through a disassembler to get the equivalent assembly code. Do piecewise optimization by revising one subroutine at a time. You can link the assembler version of a C subroutine and run it with the rest of your C code just fine.
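
    For instance, a minimal sketch with GCC on a desktop machine (file and function names here are just for illustration; “gcc -S” emits the compiler’s assembly directly, and “objdump -d” disassembles a compiled object):

      /* scale.c -- "gcc -O2 -S scale.c" produces scale.s, which you can
         study, hand-edit, and link back in: "gcc main.c scale.s -o prog".
         Or compile normally and run "objdump -d scale.o" to disassemble. */
      void scale(int *v, int n, int k)
      {
          for (int i = 0; i < n; i++)
              v[i] *= k;             /* a hot loop worth inspecting */
      }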

    Another easy approach is to learn Forth and then how to implement Forth words in terms of assembler. That platform supports a very interactive experimentation environment that makes learning assembler fun.

  40. These comments are taking me back. I was hand-coding 68xx machine code on a hexadecimal keypad first, but really took off with my first programming book, on Microsoft MASM. x86 code on PC compatibles was fun and interesting.

    TSRs were the norm then. Did my own printer drivers when necessary. My first “network” was 5 PCs connected via serial ports running a multi-screen color terminal program I wrote in assembly.

    But you haven’t lived until you’ve done OO assembly. ;-)

    Of course, I moved on to C. And now stick mostly with Python.

  41. First, a bit about what I learned, starting a long time ago in high school. (A rich alumnus rented an IBM 1130 for general student use; public school.) In two years, I learned, in I think this order: Fortran IV (not the full version; the 1130 couldn’t support it), IBM 1130 assembler, LISP, and APL. I also learned a bit of RPG, and some other languages that I never really got to use. Now, I mostly write in C++ and some Python, and anything else they need me to do.
    I would suggest getting started with a higher-level language first, to concentrate on programming itself and algorithms. Then learn some assembler, probably after enough C/C++ to get the idea of memory allocation.
    But it is a good idea to understand what is going on underneath, and sometimes you need it. I haven’t written any assembler in some time, but a recent job involved debugging optimized C++, and that meant figuring out what the assembly code was doing and tracing it back to the C++ statements. Not easy, but necessary.
    And I have to say that where I work now, the young’uns appreciate my experience and ask for help. Nice people. I can easily spot errors from a program’s behavior because I made all those errors myself over many years, and I learned from them. Thus, I can figure out what is going wrong much faster than they can.

  42. These days one finds understandable instruction sets in embedded! The classic MSP430 is especially nice, as a descendant of the legendary PDP-11; and the RISC-V integer base RV32I, or maybe RV32IM, which adds the multiplication and division instructions, is a small and clean target to learn.
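
    To see what the M extension buys you, here is a hypothetical one-liner (toolchain name and flags vary; with a RISC-V GCC, “-march=rv32i” makes the multiply a call into a libgcc helper such as __mulsi3, while “-march=rv32im” turns it into a single mul instruction):

      /* area.c -- compare the output of:
           riscv64-unknown-elf-gcc -march=rv32i  -mabi=ilp32 -O2 -S area.c
           riscv64-unknown-elf-gcc -march=rv32im -mabi=ilp32 -O2 -S area.c */
      int area(int w, int h)
      {
          return w * h;              /* library call vs. one instruction */
      }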
