When we write about retrocomputers, we realize that back in the day, people knew all the details of their computer. You had to, really, if you wanted to get anything done. These days, we more often pick peripherals and just assume our C or other high level code will fit and run on the CPU.
But sometimes you need to get down to the bare metal, and if your desire is to use bare metal on the RP2040, [Will Thomas] has a YouTube channel to help you. The first video explains why you might want to do this, followed by some simple examples. Then you’ll find over a dozen other videos that give you details.
Any video that starts, “Alright, Monday night. I have no friends. It is officially bare metal hours,” deserves your viewing. Of course, you have to start with the traditional blinking LED. But subsequent videos talk about the second core, GPIO, clocks, SRAM, spinlocks, the UART, and plenty more.
As you might expect, the code is all in assembly. But even if you want to program using C without the SDK, the examples will be invaluable. We like assembly — it is like working an intricate puzzle and getting anything to work is satisfying. We get it. But commercially, it rarely makes sense to use assembly anymore. On the other hand, when you need it, you really need it. Besides, we all do things for fun that don’t make sense commercially.
We like assembly, especially on platforms where most people don’t use it. Tackling it on a modern CPU is daunting, but if you want to have a go, we know someone who can help.
29 thoughts on “Videos Teach Bare Metal RP2040”
Maybe I’m too old, but I think YouTube is not the right tool for this kind of tutorial…
YouTube is excellent for getting an overview of something, or taking in a lot of “facts” quickly, but for in-depth knowledge it’s a poor choice.
That said, it’s also excellent for getting started: for an absolute beginner to set up, say, the SDK and toolchain and get started with finding the documentation.
Setting up something like that is still explained 1000% better with a few screenshots, maybe screen-recording APNGs, and text in between.
If only for search-, recycle-, reuse-, edit- and upgrade-ability.
People learn in different ways. For a lot of folks, watching these kinds of videos is like working together with another person to understand and learn the system.
I’m left thinking exactly the same about a great deal of stuff that appears here (including the old bit!).
If it moves, lights up or makes a sound, a video is great to show it in action.
But for learning technical detail give me a written article. (Yeah I know, longer to produce and harder to get ad revenue.)
The argument that written articles take “longer to produce” always surprises me.
To produce a good video, you have to write out a script, film the individual scenes, edit the video, dictate the text/descriptions, then edit the audio track into the video.
Producing a written article is the “write out a script” part. You’ve got that whether you do a video or a written article.
I fail to see how a video can be easier to produce than a written article – at least, if the video is supposed to be any good.
Monetizing is probably harder for a written text – I don’t know where you’d post stuff to get paid a fraction of a penny each time someone reads your stuff like YouTube does for views.
I think your phrase “good video” is key here. So many seem unplanned, off-the-cuff, and unedited.
And for a video to seem that way without being so terrible that you immediately stop watching implies either lots of time in the edit, or a script and lots of takes tripping over tripods… Either way, just writing decent documentation seems like it would be substantially quicker.
yeah i was astonished that this ‘bare metal’ article is actually apparently at the level i like to program microcontrollers… (the past editions have been mostly using C with copious library/header support) but then it’s youtube? come on! i have bookmarked a bunch of stm32 resources and never once thought “gee i wish this was youtube instead of flat html”. when i get started on rp2040 this is not going to be one of the things that helps me.
A lot easier to copy code out of an html, too!
No kidding! This would be far more accessible in text form (it maybe makes sense to have a video of the blinking LED). Text you can read at your own pace and is searchable, so it boggles my mind that anyone would choose a medium without those table stakes features.
I suspect you’re right though and it’s generational. It alarms me though because I just can’t fathom what the mental furniture and thought process must be like for these young’uns. I didn’t expect the world to change out from under me to that extent but then I guess nobody ever does.
> I just can’t fathom what the mental furniture and thought process must be like for these young’uns.
Me (n?)either. But I’m asking myself if they “evolved” that way on their own accord or because capitalism trains them to (= YT monetization)…. :-/
While it’s great that Will Thomas is showing a quick and easy way to get started in assembler on an ARM Cortex-M CPU, I think this leads very quickly to a dead end, for the following reasons:
1) In his code examples, he calculates the offset address in his head. The only reason he would do this is that his assembler lacks the very VERY basic ability to do arithmetic on constants.
2) Again, problems with constants: to calculate the bit mask for GPIO 25, rather than just saying “1<<25”, he generates code for the RP2040 to calculate this, using the lsl instruction. Again, manipulating constants is an essential feature for any assembler, because the target machine should never do a calculation when the result is always the same!
3) He runs into trouble in the blink video, because he has padded the code by two bytes to word-align it. Holy cow. How can you not include a set of alignment pseudo-instructions for an assembler that is needed for the architecture?
All of these, and a number of other misunderstandings evident in his first two videos, are enough to tell me that Thomas has very little experience with assemblers. Sure, he’ll probably figure it out in time, but to me this is just an example of the YouTube culture: I got it to work, so it’s time to make a video. Already, from the first video to the second, he has changed how you specify the target architecture in his examples, so my guess is that none of his early examples are going to work, unless you go back on GitHub to the versions of his tools he was using at the time he made each video. No way I’m going to watch ten more of these.
Okay, flame on, I’m ready for it.
But before that, let me just say, I will probably download his source code for his toolchain, because it is my intention to have a 1980s-style personal computer based on the RP2040 (or some other Cortex-M) that is fully self-contained. That is, eventually a full C toolchain, but for a start, a text editor, assembler, and linker, along with a command line interpreter and loader, so that I can type in programs and build them without a “host computer”. A number of people have already written interpreters for a number of languages, to run on Cortex-M systems, and this seems like a big waste to me, since it’s on machines like these that it still matters that your code executes efficiently.
But the problem is, yes, of course I already have access to the source code for a toolchain for Cortex-M. It’s called GCC, and it already targets armv6-m, and heck, already has a C/C++ compiler. But I really don’t think I want to try to port THAT monster to a system with less than half a megabyte of RAM and 2 MB of Flash memory. So maybe, just maybe, starting with his code would be a manageable task. Yes, I realize I’m just like that guy who would rather write his own operating system than learn how to write code that works in someone else’s, because that’s too complicated, but that’s kind of where I am.
90% of all ARM dev boards and MCUs use GCC. Even AVR uses GCC.
It doesn’t really matter if it has less than half a megabyte of RAM.
You don’t run GCC on the target.
just a note on this…when i got started with the stm32, i found it really frustrating to build a gcc cross-compiler. i ran into one of those cross-compiling catch 22s, a real failure of gcc’s configure scripts, and i just got angry and quit. i wanted to program at a low-level anyways. i definitely did not want to pull in any sort of runtime library, no libc or crt1.o or libgcc.a sort of stuff. so i just used asm, and i have become pretty comfortable with the limitations of that.
but for a recent project, i just decided to use C. and this time, it was as easy as “apt install gcc-arm-none-eabi” and i have an ARM compiler that can target the stm32. anyways, that brings me to the reason i’m writing this: it’s *exactly* like asm. there’s no runtime library, i’m still rolling my own headers full of I/O mappings and so on (i am reinventing CMSIS myself because that kind of individuality is my defect). i’m still using my own ldscript that uses C global arrays __attribute__((section(“.stack”))) and another __attribute__((section(“.vectors”))). i even have a little C function that executes on start-up that copies the gcc initialization section into RAM so i can have initialized global writable variables.
so i’m saying, even if you use C, you can still use exactly the same idioms as bare metal ASM. which is pretty slick. and ARM makes it pretty compelling…like for 8051 or PIC12 or even atmega, the kind of idioms that a C compiler needs to spit out are an awkward match…it’s easy to accidentally trigger a bloated composite 16-bit arithmetic operation to satisfy C semantics even when you didn’t need it. but ARM code from C usually winds up as simple as you could imagine when hand-coding.
Yes. I’ve done the same on the Rpi.
Maybe you didn’t understand what I meant by “self-contained”. Indeed, I DO intend to run the tool chain on the target machine. Even if this means I can’t have all the latest bells and whistles. Yeah, I know that’s not how it’s done; that’s why we call it “hacking”.
heh i was never that naive about assembler (ok, i was, but i was using dial up internet at the time), but i appreciate your analysis of “youtube considered harmful.”
by comparison, i have an stm32 page where i documented basically the same process (i assume) of assembly programming on the stm32…accumulating the necessary reference manuals and tools, figuring out how to link, how to initialize the system clock and i/o peripherals and get a blinky running. i wrote it like a diary. in the first entry, i gave version numbers for the tools i was using…and in the second entry, “i just learned about the -march= and -mcpu= and .syntax directives that i should have been using all along.” exactly the sort of content that is trivial to capture in text but comes across in video as nothing more than a vague sensation that the ground is moving.
i can’t imagine getting something out of that same experience but on youtube. i’m impressed that you managed to watch it :)
Is that info posted somewhere we could get a link to?
I second this request!
shoulda posted it first huh
Wait. This can’t be right. He writes a timing loop that does a subtract 1 and branch if not zero, and has it run through that loop a million times, and that that’s enough of a delay to cause the LED to blink about once per second. HUH? This is machine code, running out of RAM on a 120 MHz RISC CPU? Seems like the kind of performance I would expect to get out of a 16 MHz AVR. What am I missing here?
If I understand the RP2040 datasheet correctly, after boot the chip runs with the clock generated by the internal ring oscillator, which is typically 6 MHz.
My guess is that the SDK will initialize the clock to use the external clock and set dividers so that it will run at 125MHz.
I haven’t seen the video, but my guess is he didn’t setup the clock and is still running on the ~6MHz clock.
That agrees with what I gleaned from the boot ROM source code (at https://github.com/raspberrypi/pico-bootrom/blob/master/bootrom/bootrom_main.c). Assuming I’ve understood correctly: if it’s booting from USB it sets up a 48MHz clock because the USB needs that, but otherwise it leaves any clock setup to the program in Flash.
Oh good. Thanks.
“When we write about retrocomputers, we realize that back in the day, people knew all the details of their computer.”
Better to write about C/C++, a 1960s software technology, forced onto computers built at 4 nm or smaller?
1) Buggy. 2) Malware-vulnerable.
Another update? https://www.prosefights2.org/irp2020/p012622/barf.wav