Personal Reflections On Immutable Linux

Immutable distributions are slowly spreading across the Linux world – but should you care? Are they hacker friendly? What does “immutable” mean, anyway?

Immutable means “not subject or susceptible to change” according to Merriam-Webster, which is not 100% accurate in this context, but it’s close enough and the name has stuck, so we’re stuck with it. Immutable distributions are subject to change; it’s just that the way you change them is quite a bit different from how you’d change a bog-standard Linux install. Will this matter to you? Read on to find out! (Or, if you know the answers already, read on to find out how angry you should be in the comments section.) Continue reading “Personal Reflections On Immutable Linux”

Crunching The News For Fun And Little Profit

Do you ever look at the news, and wonder about the process behind the news cycle? I did, and for the last couple of decades that question has been the subject of one of my projects. The Raspberry Pi on my shelf runs my word trend analysis tool for news content, and since my journey from curious geek to having my own large corpus analysis system has taken twenty years, it’s worth a second look.

How Career Turmoil Led To A Two Decade Project

[Image: a hanging sign surrounded by ornate metalwork, with the legend “Cyder House”. This is very much a minority spelling. Credit: Colin Smith, CC BY-SA 2.0.]

In the middle of the 2000s I had come out of the dotcom crash mostly intact, and was working for a small web shop. When they went bust I was casting around as one does, and spent a while as a Google quality rater while I looked for a new permie job. These teams are employed by the search giant through temporary employment agencies, and in loose terms their job is to be the trained monkeys against whom the algorithm is tested: the algorithm chose X, and if the humans also chose X, the algorithm is probably getting it right. Being a quality rater is not in any way a high-profile job, but with the big shiny G on my CV I soon found myself in demand from web companies seeking some white-hat search engine marketing expertise. What I learned mirrored my lesson from a decade earlier in the CD-ROM business: that on the web, as in any other electronic publishing medium, good content well presented takes priority over any black-hat tricks.

But what makes good content? Forget the obsession with stuffing bogus keywords into the text, and instead talk about the right things, and do it authoritatively. What are the right things in this context? If you are covering a subject, you need to do so using the right language: that which the majority uses, rather than language only you use. I can think of a bunch of examples which I probably shouldn’t talk about, but an example close to home for me comes from cider. In the UK, cider is a fermented alcoholic drink made from apples, and as a craft cidermaker of many years’ standing I have a good grasp of its vocabulary. The accepted spelling is “Cider”, but there’s an alternative spelling, “Cyder”, used by some commercial producers of the drink. It doesn’t take long to realise that online hardly anyone uses cyder with a Y, and thus pages concentrating on that word will fare less well than those talking about cider.

[Graph: the word “football” versus the word “soccer” in British news. We Brits rarely use “soccer” unless there’s a story about the Club World Cup in America.]

I started to build software to analyse language around a given topic, with the aim of discerning the metaphorical cider from the cyder. It was a great surprise a few years later to discover that I had invented for myself the already-existing field of computational linguistics, something that would have saved me a lot of time had I known about it when I began. I was taking a corpus of text and computing the frequencies and collocates (words that appear alongside each other) of the words within it, and from that I could quickly see which wording mattered around a subject, and which didn’t. This led seamlessly to an interest in what the same process would look like for news data with a time axis added, so I created a version which harvested its corpus from RSS feeds. Thus began my decades-long project.
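The heart of such a tool is simple enough to sketch. The Python below is not my actual code, just a minimal illustration of the idea: pull headlines from an RSS feed, then count word frequencies and collocates within a small window. The feed URL and window size are arbitrary placeholders.

```python
# A minimal sketch of frequency and collocate counting over an RSS feed.
# Illustrative only; the feed URL and window size are placeholders.
import re
import urllib.request
import xml.etree.ElementTree as ET
from collections import Counter

FEED_URL = "https://hackaday.com/feed/"   # any RSS 2.0 feed will do
WINDOW = 4                                # collocate window, in words

def fetch_titles(url):
    """Return the <title> text of every <item> in an RSS feed."""
    with urllib.request.urlopen(url) as resp:
        root = ET.fromstring(resp.read())
    return [item.findtext("title", "") for item in root.iter("item")]

def tokenise(text):
    """Lower-case the text and split it into plain word tokens."""
    return re.findall(r"[a-z']+", text.lower())

freqs = Counter()        # how often each word appears
collocates = Counter()   # how often each (word, neighbour) pair appears

for title in fetch_titles(FEED_URL):
    words = tokenise(title)
    freqs.update(words)
    for i, w in enumerate(words):
        neighbours = words[max(0, i - WINDOW):i] + words[i + 1:i + 1 + WINDOW]
        for n in neighbours:
            collocates[(w, n)] += 1

print(freqs.most_common(10))
print(collocates.most_common(10))
```

Run something like this daily against a set of feeds, store the counts with a timestamp, and you have everything needed to plot word trends over time, such as the football-versus-soccer graph above.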

Continue reading “Crunching The News For Fun And Little Profit”

The End Of The Hackintosh Is Upon Us

From the very dawn of the personal computing era, the PC and Apple platforms have gone very different ways. IBM compatibles surged in popularity, while Apple was able to more closely guard the Macintosh from imitators wanting to duplicate its hardware and run its software.

Things changed when Apple announced it would hop aboard the x86 bandwagon in 2005, and soon enough the Hackintosh was born. It was difficult, yet possible, to run MacOS on your own computer built with the PC parts your heart desired.

Only now, the Hackintosh era is coming to an end. With the transition to Apple Silicon all but complete, MacOS will abandon the Intel world once more.

Continue reading “The End Of The Hackintosh Is Upon Us”

The Hackaday Summer Reading List: No AI Involvement, Guaranteed

If you have any empathy at all for those of us in the journalistic profession, have some pity for the poor editor at the Chicago Sun-Times, who let through an AI-generated summer reading list made up of novels which didn’t exist. The fake works were all attributed to real authors and thus looked plausible, so we expect that librarians and booksellers throughout the paper’s distribution area were left scratching their heads as to why they weren’t in the catalogue.

Here at Hackaday we’re refreshingly meat-based, so with a guarantee of no machine involvement, we’d like to present our own summer reading list. None of them are new works, but we think you’ll find them as entertaining, informative, or downright useful as we did when we read them. What are you reading this summer? Continue reading “The Hackaday Summer Reading List: No AI Involvement, Guaranteed”

Back To The Future, 40 Years Old, Looks Like The Past

Great Scott! If my calculations are correct, when this baby hits 88 miles per hour, you’re gonna see some serious shit. — Doc Brown

On this day forty years ago, July 3rd, 1985, the movie Back to the Future was released. While not as fundamental as Hackers or as realistic as Sneakers, this movie worked its way into our pantheon. We thought it would be appropriate to commemorate this element of hacker culture on its fortieth anniversary.

If you just never got around to watching it, or if it has been a few decades since you did, then you might not recall that the movie is set in two periods. It opens in 1985 and then goes back to 1955, where most of the action takes place, with Marty trying to get back to 1985 — “back to the future”. The movie celebrates the advanced technology and fashions of 1985, and plays up how silly those of 1955 look by comparison. But now it’s the far future, the year 2025, and we thought we might take a look at some of the technology that was enchanting in 1985 but that turned out to be obsolete in “the future”, forty years on. Continue reading “Back To The Future, 40 Years Old, Looks Like The Past”

It’s 2025, And We Still Need IPv4! What Happens When We Lose It?

Some time last year, a weird thing happened in the hackerspace where this is being written. The Internet was up, and blisteringly fast as always, but only a few websites worked. What was going on? Fortunately, with more than one high-end networking specialist on hand, it was quickly established that we had a problem with our gateway’s handling of IPv4 addresses, and normal service was restored. But what happens if you’re not a hackerspace with access to the dodgy piece of infrastructure, and you’re left with only IPv6? [James McMurray] had this happen, and has written up how he fixed it.

His answer was to use a Wireguard tunnel to his VPS, and to NAT-map the IPv4 space into a section of IPv6 space. The write-up goes into extensive detail on the process should you need to follow his example, but for us there’s perhaps more interest in why, here in 2025, the loss of IPv4 still means losing half the Internet. As of this writing, that even includes Hackaday itself. If we had the magic means to talk to ourselves from a couple of decades ago, our younger selves would probably be shocked by this.
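The write-up has the specifics of his setup; what follows is only an illustration of the address arithmetic behind mapping IPv4 into IPv6, not his configuration. An entire IPv4 address fits into the low 32 bits of an IPv6 /96 prefix, and the well-known NAT64 prefix 64:ff9b::/96 from RFC 6052 exists for exactly this purpose. Python’s standard ipaddress module makes the embedding easy to demonstrate:

```python
# Sketch of how IPv4 addresses embed into an IPv6 /96 prefix, as a
# NAT64-style gateway does. Illustrative only, not [James McMurray]'s
# actual setup; 64:ff9b::/96 is the well-known NAT64 prefix (RFC 6052).
import ipaddress

NAT64_PREFIX = ipaddress.IPv6Network("64:ff9b::/96")

def ipv4_to_ipv6(addr):
    """Embed an IPv4 address in the low 32 bits of the /96 prefix."""
    v4 = ipaddress.IPv4Address(addr)
    return ipaddress.IPv6Address(int(NAT64_PREFIX.network_address) | int(v4))

def ipv6_to_ipv4(addr):
    """Recover the original IPv4 address from a mapped IPv6 address."""
    v6 = ipaddress.IPv6Address(addr)
    return ipaddress.IPv4Address(int(v6) & 0xFFFFFFFF)

v6 = ipv4_to_ipv6("93.184.216.34")
print(v6)                  # 64:ff9b::5db8:d822
print(ipv6_to_ipv4(v6))    # 93.184.216.34
```

A gateway translating between the two families does this mapping on every packet; run at a host which still has IPv4 connectivity, it’s what lets a v6-only client reach the legacy Internet.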

Perhaps the answer lies in the inescapable conclusion that while IPv6 answers an address-space problem of concern to many in technical circles, it neither solves anything most internet users care about, nor is it worth the switch for so much infrastructure when mitigations such as NAT make IPv4 address exhaustion less pressing. Will we ever entirely lose IPv4? We’d appreciate your views in the comments. For readers anxious for more, it’s something we looked at last year.

Why The Latest Linux Kernel Won’t Run On Your 486 And 586 Anymore

Some time ago, Linus Torvalds made a throwaway comment that sent ripples through the Linux world: was it perhaps time to abandon support for the now-ancient Intel 486? Developers had already abandoned the 386 in 2012, and Torvalds openly mused whether the time was right to make further cuts for the benefit of modernity.

It would take three long years, but that eventuality has finally come to pass. As of version 6.15, the Linux kernel no longer supports chips of the 80486 generation, along with a gaggle of early “586” chips. It’s all down to some housekeeping and precise technical changes that render the new code inoperable on the machines of the past.
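The full article has the details, but the widely reported cut-off is that the 32-bit x86 kernel now assumes the TSC (timestamp counter) and CX8 (CMPXCHG8B) CPU features, both of which the 486 and the earliest 586-class parts lack. As a rough illustration rather than anything taken from the kernel itself, here’s how you might check whether a given Linux machine’s CPU advertises those flags:

```python
# Rough check of whether a CPU meets the reported Linux 6.15 x86-32
# baseline of TSC and CX8 (CMPXCHG8B). Reads /proc/cpuinfo, so it only
# works on a Linux host.
REQUIRED = {"tsc", "cx8"}

with open("/proc/cpuinfo") as f:
    for line in f:
        if line.startswith("flags"):
            flags = set(line.split(":", 1)[1].split())
            missing = REQUIRED - flags
            print("OK for 6.15" if not missing else f"missing: {missing}")
            break
```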

Continue reading “Why The Latest Linux Kernel Won’t Run On Your 486 And 586 Anymore”