The make tool turns the big 4-0 next month, and we thought we’d start up the festivities early. In a two-part series, I’ll cover some of the make background that I think is particularly useful, and then focus on microcontroller-specific applications. If you’re still cut-and-pasting a general purpose makefile to run your toolchain, hopefully you’ll get enough insight here to start rolling your own. It can be a lot simpler than it appears!
Just as soon as the C programming language was invented, and projects started to get a little bit bigger than a “hello world”, it became obvious that some tool was needed to organize and automate compilation. After all, if you’ve got a program that’s spread over a number of files, modules, or libraries, it’s a hassle to have to re-compile them all any time you make a change to just a single section of code. If some parts haven’t changed, you’re just wasting time by re-compiling them. But who can keep track of all of this? Make can!
In fact, make can do just about anything. It’s very good at handling generic rules to resolve complex dependency chains among files. It knows which files have changed along the way, and only re-builds whatever is necessary to get the job done. And it’s highly extensible — you can train make, via carefully crafted makefiles, to do essentially anything. I once used make to automatically e-mail out grade updates only to those students whose grades had changed.
In this age of IDEs and GUI-driven development, there’s something to be said for writing code with an editor, a compiler, and a makefile. But makefiles can be dauntingly complex. Have a look at the “standard” AVR makefile, for instance, written by Eric Weddington and Jörg Wunsch. I used that file for countless projects over probably ten years without seriously reading through and understanding it all — it just does too much. Similarly, [sudar]’s makefiles will compile Arduino code for you without touching the Arduino IDE. That’s great when you need it, but it weighs in at 1,500 lines of code.
A one-size-fits-all magic makefile isn’t a bad thing, unless you want to learn about how the make system works. In that case, all of the generalities, special cases, and gilded lilies just get in the way. And I’ll claim that for many projects, writing a quick project-specific makefile is a good practice. It keeps things simple, and helps make sure that you know what’s actually going on.
So we’ll start here with an absolutely minimal makefile, and work our way up toward something that’s more reasonable for most projects. Along the way, I’ll work through a tiny subset of the tools that make gives you. Why? Because most of the features are for such special (or general) cases that you’ll almost never need them in practice. My goal here is to get you understanding the make system well enough that you don’t need an IDE to get your coding done.
Sometimes the Coolest Makefile is No Makefile at All
So what’s the most minimal makefile that you can use to compile your project? Would you believe none at all? The make system has a tremendous amount of functionality built into it. I want you to think of a makefile as expanding or customizing that inherent functionality.
To show you what I mean, here’s “hello world” in C, written for my desktop computer:
#include <stdio.h>

int main(void)
{
	printf("hello world!\n");
	return 0;
}
Type that in, save it in a file called hello.c, and you’ve started programming. Next, you’ll want to compile and run it. Assuming that you’re using gcc as your compiler, you could type something like gcc hello.c -o hello and then run the resulting program (hello on a Unix/Mac, hello.exe on Windows) when you’re done.
Honestly, it wasn’t all that much to type, but how much cooler is typing simply make hello? Lots cooler. But more to the point, it demonstrates that make already knows how to do some simple stuff, like compile C code into an executable program. Even with no makefile present, typing make hello will run the following command for you: cc hello.c -o hello — almost exactly what we did by hand above.
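If you want to see it in action, a session on a Unix-like machine looks roughly like this (a sketch; the exact spacing of make’s echoed command varies a little between versions):

$ make hello
cc hello.c -o hello
$ ./hello
hello world!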
Rules all the Way Down
How does it do this? Make has pre-defined rules for making files of different types from other files, and it’s pretty clever about chaining these rules up to make your project work out. So, for instance, if you need to build hello.hex to flash into your microcontroller, and there’s a rule to make hello.hex from hello.elf, and a rule to make hello.elf from hello.o (and maybe some other object files), and a rule to make hello.o from hello.c, then make will chain them all together to get the job done.
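Written out by hand, that chain might look something like the sketch below. It assumes an AVR toolchain (avr-gcc and avr-objcopy) and leaves out real-world details like the -mmcu flag; the point is just that asking for the final target pulls in every step along the way:

# final .hex for the programmer, converted from the .elf
hello.hex: hello.elf
	avr-objcopy -O ihex hello.elf hello.hex

# link the object file(s) into an .elf
hello.elf: hello.o
	avr-gcc hello.o -o hello.elf

# compile the C source into an object file
hello.o: hello.c
	avr-gcc -c hello.c -o hello.o

Typing make hello.hex walks the whole chain, re-running only the steps whose inputs have changed.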
As an example, let’s write out an explicit rule so we can see how it works:
hello: hello.c
	gcc hello.c -o hello
The first line tells make that our executable hello depends on hello.c. The tab-indented line that follows is the rule that tells make how to get from the dependency to the target. Indeed, if you wrote this into your makefile, you’d be just about where we started with no makefile.
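This is also where make starts earning its keep: it compares file timestamps and skips any work that’s already done. A rough transcript (the exact “up to date” message depends on your make version):

$ make
gcc hello.c -o hello
$ make
make: 'hello' is up to date.
$ touch hello.c
$ make
gcc hello.c -o hello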
Makefiles start to get interesting as the number of sub-parts of the project increase. Say we’ve split our code into two modules now, and we want to compile them together. One way to handle that would be to include the dependencies explicitly:
hello: hello.c extras.c
	gcc hello.c extras.c -o hello
But you can see that as we include more and more source files, we’ll have a lot more typing to do. We could define a variable to hold all of the source files:
SRC=hello.c extras.c

hello: $(SRC)
	gcc $(SRC) -o hello
Our target, hello
, depends on all of the source files that are listed in the SRC
variable, and all of those are included in the compilation command. That’s a decent solution, and fairly readable, which is a virtue. But as we demonstrated when we used no makefiles whatsoever, make
already has built-in rules to handle simple cases like this. We should use them.
Customizing the Default Rules
Make’s default rules are called “implicit rules” in the docs, and now that we understand an explicit rule, let’s look into the implicit rules a little, and see about how to tweak them. For instance, the implicit rule that turns a C file into an executable looks like this:
%: %.c
	$(CC) $(CFLAGS) $(CPPFLAGS) $(LDFLAGS) $(TARGET_ARCH) $^ $(LOADLIBES) $(LDLIBS) -o $@
(You can find this by running make -p and looking through the resulting 1,500 lines.)
There’s a lot going on here, but the basic format of the rule is the same: a target (%) depends on a C file (or files) and then there’s a tab-indented rule that says how to get there. Only here, the % is a wildcard, the special variables $^ and $@ refer to the dependencies and the target respectively, and everything else that you’d likely want to change is included via those “implicit variable” names that appear in all caps.
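As a quick illustration of those automatic variables (this just hand-writes what the implicit rule already does for us), here’s an explicit rule where $^ expands to all of the dependencies and $@ to the target:

hello: hello.c extras.c
	$(CC) $(CFLAGS) $^ -o $@

With CC and CFLAGS left at their defaults, that runs cc hello.c extras.c -o hello.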
If you’d actually tried to run the makefile-less version in the first section on Windows, it might have failed. Why? Because my Windows system doesn’t call the compiler “cc” — the default value for the variable CC — but rather “gcc”. We can fix that up by defining the CC variable to our compiler of choice.
Additionally, I like to compile with all possible compiler warnings enabled by default, passing the compiler the -Wall flag. That’s exactly what the CFLAGS variable is for. To always compile with warnings on, we simply define CFLAGS=-Wall and we’re set. Finally, the way the implicit rule is defined supports multiple C-file dependencies, so we might as well list all of the files needed for our project in a single dependency statement.
Our improved makefile, specifying a particular compiler, two dependencies for our compiled target, and the compiler warning flag, looks like this:
CC=gcc
CFLAGS=-Wall

hello: hello.c extras.c
Notice that we didn’t have to specify a rule to go with the target/dependencies because there’s already an implicit rule that matches this pattern. Since the hello target is the first explicit target in the file, it gets made automatically, and we don’t even need to type “hello” every time. The result is that we can simply type make and it runs gcc -Wall hello.c extras.c -o hello.
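A nice side effect of leaning on the implicit variables is that you can override them for a single invocation from the command line, without editing the makefile at all. A sketch, assuming you have clang installed:

$ make CC=clang CFLAGS="-Wall -O2"
clang -Wall -O2 hello.c extras.c -o hello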
For further customization, all of the implicit variables are documented, so you can read through that list if you’re curious about whether an option belongs in LOADLIBES or LDLIBS. And we’ll be working with these more next time, so stay tuned.
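As one concrete example (a sketch, not part of our little project so far): if your code calls functions from the math library, the conventional home for the -lm flag is LDLIBS, since, as the implicit rule above shows, it lands at the end of the link command:

CC=gcc
CFLAGS=-Wall
LDLIBS=-lm

hello: hello.c extras.c

With that in place, make runs gcc -Wall hello.c extras.c -lm -o hello.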
Getting Fancy
Notice how far we’ve gotten with just a three-line makefile. You’ve been introduced to the implicit rules, and gotten a bit of the flavor of how to override the implicit variables that go along with them to customize the compilation.
Make gets really interesting when cross-compiling and adding all sorts of custom targets to drive an embedded programming toolchain. We’ll get into those in the next installment, with minimal AVR and STM32 ARM Cortex makefile examples. The goal is going to be maintaining the same sort of readability, but as Albert Einstein said, we’ll be aiming for “as simple as possible, but no simpler.”
Windows Postscript
If you don’t have make installed on your Windows system yet, I had success with mysys2. Follow the initial install instructions through the update, and then pacman --needed -Su base-devel mingw-w64-i686-toolchain will get you everything you need to make and compile in C, C++, and a few other languages. The console applications from the mingw package have all the right PATH variables set so that you can just type make and it’ll run.
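Once that’s done, a quick sanity check from the MinGW console (a sketch; your version numbers will differ) confirms that the tools are on the PATH:

$ make --version
$ gcc --version

If both print version information instead of complaining about an unknown command, you’re ready to go.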
If anyone else has a favorite or streamlined way of getting a compiler and make up and running on Windows, post up in the comments.
Elliot, I am curious, have you tried using gradle to build a C/C++ project? For someone like me who only dabbles in C/C++ it seems like it would be easier to get started.
That’s kinda bloaty compared to make.
s/mysys2/msys2
Microsoft has started injecting adware and spyware into its critical patch updates…
and PC sales dropped below 2007 last year…
As far as I’m concerned as an investor, the company can die in a fire.
Never thought I’d see the day, but I now recommend people buy a Mac or make a *nix box.
Thanks for this very relevant comment.
That comment looks like something an experimental language bot would say.
maybe it was made using make
My favorite Makefile is a generic one that can be used for all my single-file programs located in the same directory. Something like:
CC=gcc
CFLAGS=-O2 -Wall -Wextra -std=c99
LDFLAGS=-lm
And then ‘make this’ and ‘make that’ and ‘make the_next_thing’ all use those options.
Similar here. I have a Makefile.template at the root of my development folder that gets included into all my other projects.
Tip, instead of “CC = gcc” use “CC ?= gcc”. This means you can easily overrule them from the commandline.
Yes, except I have yet to need to override it. :)
For the love of Pete, start using Ninja and forget about old & slow Make.
Thank you for recommending a tool that I’ve never heard of, without a link and with such a generic name that I won’t bother trying to google it. And no reason why it would be better. (As make is a lot of things. But slow it is not)
Found it in < 10 sec.,
https://ninja-build.org/manual.html
thanks man, some people don’t know this useful ‘new’ tool named google.
endearing.
I had not heard of it so I look forward to it filtering down in Linux
Slow? I have never found make to be slow.
Old? What is the matter with that, certainly nothing wrong with old in and of itself.
Searches for ninja reveal spyware eradicators, not a software build tool, but ho hum,
I am losing interest.
OK, found “Ninja” — a discussion comparing it to make called it “yet another make replacement the world really doesn’t need”, which sounds pretty apt. If you don’t comprehend make, you are going to be equally lost with a workalike replacement like Ninja. So learn how to use make. If you have a project with many thousands of source files and have some foggy notion that Ninja will save you time, go for it. But note that the linux kernel still gets built using make.
a useful article http://www.aosabook.org/en/posa/ninja.html
For windows, if you want to eat cake and keep it, you can install cmder. It’s a unix-style terminal for Windows bundled with most typical programs, make being one of them
http://cmder.net/
Makefiles are awesome. Unlike an IDE where all of your build options are scattered all over the place, you can find all of your build options by looking in just one place. Emacs+make+gcc+gdb has been my toolchain of choice for nearly 30 years now.
I always think of -Wall as being a wall for some reason.
While it is a good intro, I am afraid this very problem: “If some parts haven’t changed, you’re just wasting time by re-compiling them.” simply isn’t solved by the end of the article, because with the makefile you will still end up recompiling everything to reach your target. Make gets complicated as soon as header files are added. I have yet to see a good way of automatically generating dependencies. And then there is autoconf, which generates makefiles that are even harder to read.
Make is getting so complicated that it has its own religion. Makefiles for large projects are usually very complicated, with makefiles including makefiles, nested several levels down, oftentimes written by multiple people with different philosophies. I mean, it is hard enough to debug programs. It is even more painful having to debug Makefiles. It is like tracing a tree manually.
For me the whole point of having a GUI and using IDEs is to avoid as many unixisms as possible. Why would I want to type commands, when my IDE has buttons called “Compile”, “Compile and program”, “Compile all”, etc.? Modern IDEs keep track of file changes and adjust the compilation process to recompile only those files that changed. They deal with dependencies well enough, especially in embedded programming, where hardware limits the size of software. Also, how much time does it take to recompile a big embedded project? A few seconds? A minute? Two? It will take more time to edit a makefile, especially for someone who just fell victim to this particular unixism. And the IDE does it already, so the whole exercise is a waste of time.
I take a different approach and avoid anything that is not unix. IDEs are slow and cantankerous (I am thinking of my experience with Arduino and Eclipse). Makefiles are clean and simple once you understand them — of course some people write nasty and unduly complicated makefiles. The same people probably write nasty and unduly complicated software and may well simply be nasty people — but that is probably taking it too far.
> in embedded programming, where hardware limits the size of software.
> Also how much time it takes to recompile a big embedded project?
> Few seconds? Minute? Two?
well, on the last embedded project where I intensively took care of the makefiles, I could beat the IDE by factorS: I joined the project when a full build took around 1 1/2 hours (which is not forever, compared to projects on “real computers”), and by the time I left the makefile work I had it down to less than 10 minutes.
None of the IDE guys could give me a pointer on where to start taking fine-grained control of the build process, so I taught myself make. At times it sure was a steep road, but the lesson I learned: make IS worth getting used to.
make is old? Not as old as internal combustion piston engines, which everyone insists on using…
> And IDE does it already, so the whole exercise is a waste of time.
what IDEs don’t do (well) is blend with build automation: in a professional environment you want a computer which periodically (nightly) or upon arbitrarily defined triggers walks through the whole chain: 1. fetches the whole source off the repo, 2. builds all the variants out of it, 3. runs the collection of tests. I’ve never seen an environment where the night guard has an instruction list stating “launch IDE, open project xyz, click on button [Compile all]”…
It is definitely easier to insert lines like “make this”, “make that” into a build automation system, rather than “move mouse to button […]”
That’s not true, but the reason why is not explained in the first article. One (of many) automatic checks make does is that it only recompiles source files if the corresponding compiled file is older than the source file (by date). So if the source file hasn’t changed, its date is older than that of the corresponding object file, which means nothing changed at all and make doesn’t recompile the source file.
“it only recompiles source files if the corresponding compiled file is older than the source file (by date)”.
Gawd I hope Make uses more than time stamps in its default decision process. Hashes perhaps?
I’m no expert on what make uses for decision making, but time stamps are one thing that works quite reliably. There are more things of course, but I don’t know exactly what. I hope in the second article it will be explained in a bit more detail. :)
But I’m quite sure no hashes are calculated, because make doesn’t create any history files for tracking.
What’s wrong with date stamps? If the source is newer than the obj, that’s a pretty good sign it needs recompile. How would a hash help here?
It uses timestamps, always has and probably always will. Always works fine.
Yeah. There’s nothing wrong with timestamps. That’s why git uses hashing too, because timestamps never, ever fail. Look, a pink unicorn! (And as for “that hardly ever happens” you do realize that “million-to-one chances happen nine times out of ten” is not actually a joke, yes?)
Timestamping fails alright. It fails miserably with revision control systems like Clearcase, where you can update a source file and get an updated one whose timestamp is in the past, older than the obj file you recently built. As a result, make does not pick up the update.
make love
make: *** No rule to make target `love’. Stop.