Illumos is an OpenSolaris-derived Unix system, and no Unix is complete without a C compiler or two. With a name like Portable C Compiler (PCC), you would think it would be a great bet to get up and running on Illumos. That’s probably what [Brian Callahan] thought, too, but he found out otherwise.
PCC already generates x86 code, so that wasn’t the problem. It was a matter of reconfiguring the compiler for the environment — ironic, since PCC probably started on true Unix but now won’t build for a 64-bit Solaris-like operating system. According to the post:
It looks like some time ago someone added configuration for 32-bit x86 and SPARC64 support for the Solaris family. But no one ever tried to support 64-bit x86. So first we had to teach the configure script for both pcc and pcc-libs that 64-bit x86 Solaris
While there were some code changes, much of the problem centered on differences in the tools available under Linux versus Illumos. It is an interesting look at porting a tool and gives you a taste of what life under Illumos might be like.
We always think we want to try one of the Illumos distributions or even Solaris proper. Reports of its death were apparently premature.
This is actually probably pertinent to the people over at Oxide. The rack they just started shipping runs on an illumos kernel.
Oxide is all-in on Rust, as the company name implies
They still have an Illumos distro as their hypervisor.
They write as much code as they can in Rust, but there are still many edge cases where C is needed, like in the kernel itself. AFAIK there are also still legacy userspace components in C.
That configure script diff brings back memories… in the late 1990s and early 2000s, it felt like a lot of open source software needed such small tweaks to get it to compile on different Linux distributions, and especially on Unix.
now i visit the land of small-tweaks-to-compile whenever i build something from 5 years ago on the same OS. in my world it’s no longer linux vs. sunos but rather linux 2023 vs. linux 2018.
for example, in all of my personal projects i have had to one-by-one change my Makefile link stage from “gcc -lX11 -o foo foo.o” to “gcc -o foo foo.o -lX11”. 20 years ago, i knew linking was finicky, so i’m sure they fixed a real problem, but it just goes on the list of tiny changes that require human attention to rebuild.
the weird thing is in some of these projects i just put the libraries in LDFLAGS, and just used the default. so i would have hoped someone would have rewritten the default gnu-make rules instead of forcing me to add overrides. but i guess i probably didn’t use the correct variable or something.
anyways not a big deal but we are still in the world of tiny tweaks, and probably always will be.
Make is old and shows its age. GNU Autotools is a giant mess of slow scripts that are a major pita. I’ve switched to cmake and it is OK so far; I’ve just been too lazy to really read up on the docs for it. After 22 years as a developer, I still haven’t found a config/build system that is easy to write scripts for and isn’t so involved that it’s like learning a new language.
yeah, make is a hard problem! i don’t have anything against cmake (haven’t used it, and it’s not suitable for some of my most pressing problems due to arbitrary requirements i have to meet), but i’ve used ant and gradle and rust cargo and so on and i have not enjoyed them.
i run into this problem a lot. it’s closely related to the test harness problem. i don’t live in the kind of world where i can use off-the-shelf CI tools (again, too many platform requirements). i’ve personally come to the conclusion that it’s best to start with something that is simple and broken and insufficient and then hack at it until it meets my needs. if i try to overdesign it from the start, it doesn’t go well, and if i try to use something that someone else overdesigned before i even started, then not only am i tied to their poor past choices, but i tend to be tied to their poor future choices as well.
better to just accept that it’s a hack and hack it hackily. i’ve become a make nihilist :)
i think it’s a kind of extreme example in my experience of a philosophy Chuck Moore pointed out. always be eager to refactor, but don’t prematurely factor. solve the problem in front of you, without anticipating future problems. then refactor at the drop of a hat when you do meet future problems. if you start with a “good design” that encompasses all of your future requirements, then you’ll be too beholden to that design to change it when you find out it’s wrong.
Bringing PCC to Illumos might have been a fun exercise, but it’s hard to believe it was actually needed. Illumos is written in C, so it has to have a C compiler available; I would expect either one derived from the original Unix compiler, or a port of GCC or LLVM/Clang.
It used SunPro for the longest time, but efforts to move to GCC were successful, and that’s the reference compiler now, with minor patches.
Libraries belong in the LDLIBS make variable. In make’s built-in link recipes it is expanded at the end of the command line, and in hand-written recipes libraries should likewise go at the end.
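a minimal sketch of how the variables land in GNU make’s built-in link rule (the `foo` target name is just for illustration):

```make
# GNU make's built-in rule for linking "foo" from "foo.o" is roughly:
#   $(CC) $(LDFLAGS) $^ $(LOADLIBES) $(LDLIBS) -o $@
# LDFLAGS (linker options such as -L search paths) lands before the
# objects; LDLIBS (the -l libraries) lands after them, which is the
# order single-pass linkers need.
CC      = gcc
CFLAGS  = -O2 -Wall
LDFLAGS = -L/usr/X11R6/lib
LDLIBS  = -lX11

foo: foo.o    # implicit rule links: gcc -L/usr/X11R6/lib foo.o -lX11 -o foo
```

so putting `-lX11` in LDFLAGS “works” on forgiving linkers but breaks on strict ones, while LDLIBS keeps working on both.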
My comment was in reply to Greg A. Not sure why HaD’s comment system is being janky.
Is that because of Stallman being notoriously stubborn in accepting changes?
Not sure what Stallman has to do with this.
Using LDLIBS has always (AFAIR) been the proper way to link libraries in a makefile. Many linkers require listing libraries at the end of the command line, even though some versions of the GNU linker have accepted libraries before the object files on the command line.
hey awesome! thanks! :)