What’s The Deal With Snap Packages?

Who would have thought that software-packaging software would cause such a hubbub? But such is the case with snap. Developed by Canonical as a faster and easier way to get the latest versions of software installed on Ubuntu systems, the package format has ended up igniting a fiery debate in the larger Linux community. For the more casual user, snap is just a way to get the software they want as quickly as possible. But for users concerned with the ideology of free and open source software, it’s seen as a dangerous step towards the types of proprietary “walled gardens” that may have driven them to Linux in the first place.

Perhaps the most vocal opponent of snap, and certainly the one that’s gotten the most media attention, is Linux Mint. In a June 1st post on the distribution’s official blog, Mint founder Clement Lefebvre made it very clear that the Ubuntu spin-off does not approve of the new package format and would not include it in base installs. Further, he announced that Mint 20 would actively block users from installing the snap framework through the package manager. It can still be installed manually, but this move is seen as a way to prevent it from being added to the system without the user’s explicit consent.
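
For the curious, the block itself is reportedly nothing exotic. As a hedged sketch of the mechanism: Mint 20 is said to ship an APT preferences file (commonly cited as /etc/apt/preferences.d/nosnap.pref) that pins snapd at a negative priority, so the package manager simply refuses to install it:

```
Package: snapd
Pin: release a=*
Pin-Priority: -10
```

Deleting or overriding such a pin is all it would take to opt back in, which is what makes this a consent mechanism rather than a hard lock.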

The short version of Clement’s complaint is that the snap packager installs from a proprietary, Canonical-specific source. If you want to distribute snaps, you have to set up an account with Canonical and host them there. While the underlying software is still open source, the snap packager breaks with the long tradition of keeping the distribution of software itself open and free. This undoubtedly makes installation simple for naive users, and easier for Canonical to maintain, but it also takes away freedom of choice and diversity of package sources.

Continue reading “What’s The Deal With Snap Packages?”

USB-C Is Taking Over… When, Exactly?

USB is one of the most beloved computer interfaces of all time. Developed in the mid-1990s, it undertook a slow but steady march to the top. Offering an interface with good speeds and a compact connector, it became the standard for hooking up interface devices and storage, and even became the de facto way to talk old-school serial.

In late 2014, the USB Implementers Forum finalised the standard for the USB-C plug. Its first major application was on smartphones like the Nexus 5X, and it has come to dominate the smartphone market, at least if you leave aside the iPhone. However, it’s yet to truly send USB-A packing, especially on the desktop. What gives? Continue reading “USB-C Is Taking Over… When, Exactly?”

Hunting Neutrinos In The Antarctic

Neutrinos are some of the strangest particles we have encountered so far. About 100 billion of them pass through every square centimeter of Earth each second, but their interaction rate is so low that they can easily zip through the entire planet. This is how they earned the popular name ‘ghost particle’. Neutrinos are part of many unsolved questions in physics: we still do not know their mass, they might even be their own antiparticles, and their siblings could make up the dark matter in our Universe. In addition, they are valuable messengers from the most extreme astrophysical phenomena like supernovae and supermassive black holes.

The neutrinos on Earth have different origins: there are solar neutrinos produced in the fusion processes of our Sun, atmospheric neutrinos produced by cosmic rays hitting our atmosphere, man-made reactor neutrinos created in the radioactive decays inside nuclear reactors, geoneutrinos which stem from similar processes naturally occurring inside the Earth, and astrophysical neutrinos produced outside of our solar system by supernovae and other extreme processes, most of which are still unknown. Continue reading “Hunting Neutrinos In The Antarctic”

Ask Hackaday: Are 80 Characters Per Line Still Reasonable In 2020?

Software developers won’t ever run out of subjects to argue and fight about. Some of them can be fundamental to a project, like the choice of language or the programming paradigm to begin with. Others seem more of a personal preference at first, but can end up equally fundamental on a bigger scale: which character to choose for indentation, where to place the curly braces, or how to handle line breaks. At the latest once more than one developer is collaborating, it’s time to find a common agreement in the form of a coding style guide, which might of course require a bit of compromise.

Regardless of taste, the worst decision is having no decision, and even if you don’t agree with a specific detail, it’s usually best to make peace with it for the benefit of uniformly formatted code. In a professional environment, a style guide is ideally worked out collaboratively inside or between teams, taking the input and opinions of everyone involved into consideration. And if your company doesn’t have one to begin with, the best step to take is probably one towards the exit.

The situation can get a bit more complex in open source projects though, depending on the structure and size of a project. If no official style guide exists, the graceful thing to do is to simply adopt the code base’s current style when contributing to it. But larger projects that are accustomed to a multitude of random contributors will typically have one defined, which was either worked out by the core developers, or declared by its benevolent dictator for life.

In the case of the Linux kernel, that’s of course [Linus Torvalds], who has recently shaken up the community with a mailing list response declaring a common, often even unwritten rule of code formatting essentially obsolete: the 80-character line limitation. Considering the notoriety of his rants and crudeness, his response, prompted by a line-break change in a submitted patch, seems downright diplomatic this time.

[Linus]’ reasoning against continued enforcement of 80-character line limits is primarily the fact that screens are simply big enough today to comfortably fit longer lines, even with multiple terminals (or windows) next to each other. As he puts it, the only reason to stick to the limitation is using an actual VT100, which won’t serve much use in kernel development anyway.

Allowing longer lines, on the other hand, would encourage the use of more verbose variable names and whitespace, which in turn would actually increase readability. All to a certain extent, of course, and [Linus] obviously doesn’t call for abolishing line breaks altogether. But he has a point: does it really make sense to stick to a decades-old, nowadays rather arbitrary-seeming limitation in 2020?
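
To make the tradeoff concrete, here’s a minimal sketch in Python rather than kernel C; the function and parameter names are made up purely for illustration. The first call is wrapped to respect a hard 80-column limit, the second spends well over 100 columns on a single line:

```python
# Hypothetical helper with deliberately verbose names, defined only so
# this sketch runs; the point here is the formatting, not the math.
def compute_average_throughput(samples, window, discard_outliers=False):
    return samples / window

samples_per_second = 4800
measurement_window_in_seconds = 60

# Wrapped to respect a hard 80-column limit: every continuation line
# costs vertical space and chops the call into pieces.
average_throughput = compute_average_throughput(
    samples_per_second,
    measurement_window_in_seconds,
    discard_outliers=True,
)

# The same call on one long line, which a modern screen renders
# comfortably and which reads as a single thought.
average_throughput = compute_average_throughput(samples_per_second, measurement_window_in_seconds, discard_outliers=True)
```

Whether the second version is clearer or just longer is, of course, exactly what the debate is about.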

Continue reading “Ask Hackaday: Are 80 Characters Per Line Still Reasonable In 2020?”

Why You (Probably) Won’t Be Building A Replica Amiga Anytime Soon

Early in 2019, it became apparent that the retro-industrial complex had reached new highs of innovation and productivity. It was now possible to create entirely new Commodore 64s from scratch, thanks to the combined efforts of a series of disparate projects. It seems as if the best-selling computer of all time may indeed live forever.

Naturally, this raises questions about the C64’s proud successor, the Amiga. For a variety of reasons, it’s less likely we’ll see scratch-built Amiga 500s popping out of the woodwork anytime soon. Let’s look at what it would take, and maybe, just maybe, in a few years you’ll be firing up Lotus II (or, ideally, Jaguar XJ220: The Game) on your brand new rig running Workbench 1.3. Continue reading “Why You (Probably) Won’t Be Building A Replica Amiga Anytime Soon”

Samsung’s Leap Month Bug Teaches Not To Skimp On Testing

Date and time handling is hard; that’s an ugly truth about software development we’ll all learn the hard way one day. Sure, it might seem like some trivial everyday thing that you can easily implement yourself without relying on a third-party library. I mean, it’s basically just adding seconds on top of one another, rolling them over into minutes, and from there on into hours, days, and months, until you hit the years. Throw in the occasional extra day every fourth February, and you’re good to go, right?

Well, obviously not. Assuming you thought about leap years in the first place, which sadly isn’t a given, there are a few exceptions that cause, for instance, the years 1900 and 2100 to be regular years, while the year 2000 was still a leap year. And then there are leap seconds, which occur irregularly. But there are still more gotchas lying in wait. Case in point: back in May, faulty handling of the lunar leap month in the Chinese calendar turned Samsung phones all over China into bricks. And while you may not plan to ever add support for non-Gregorian calendars to your own project, it’s just one more example of unanticipated peculiarities gone wild. Except, Samsung did everything right here.
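
Even the Gregorian part of that rule is easy to get subtly wrong. As a minimal sketch (plain Python, nothing Samsung-specific):

```python
import calendar

def is_leap(year: int) -> bool:
    # Full Gregorian rule: every 4th year is a leap year, except
    # century years, except every 400th year. Stopping after the
    # first clause is the classic bug.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

for year in (1900, 2000, 2100):
    print(year, is_leap(year))  # 1900 False, 2000 True, 2100 False

# The standard library already encodes the rule, which is the real
# lesson: don't hand-roll calendar math if you can avoid it.
assert calendar.isleap(2000) and not calendar.isleap(1900)
```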

So what happened?

Continue reading “Samsung’s Leap Month Bug Teaches Not To Skimp On Testing”

Quantum Computing And The End Of Encryption

Quantum computers stand a good chance of changing the face of computing, and that goes double for encryption. For encryption methods that rely on the fact that brute-forcing the key takes too long with classical computers, quantum computing seems like their logical nemesis.

For instance, the mathematical problem that lies at the heart of RSA and other public-key encryption schemes is factoring a product of two prime numbers. Searching for the right pair using classical methods takes approximately forever, but Shor’s algorithm can be used on a suitable quantum computer to do the required factorization of integers in almost no time.
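
To see why the two are so tightly coupled, here’s a toy sketch with deliberately tiny numbers (not a real implementation; requires Python 3.8+ for the modular inverse via pow): an RSA keypair falls apart the instant the public modulus is factored.

```python
# Toy RSA with deliberately tiny primes. Real keys use primes of
# 1024+ bits, which is exactly what makes the factoring step below
# classically infeasible.
p, q = 61, 53
n = p * q                        # public modulus: 3233
phi = (p - 1) * (q - 1)          # 3120
e = 17                           # public exponent, coprime to phi
d = pow(e, -1, phi)              # private exponent (Python 3.8+)

msg = 42
cipher = pow(msg, e, n)          # encrypt with the public key
assert pow(cipher, d, n) == msg  # decrypt with the private key

# An attacker who can factor n rebuilds the private key instantly.
# Trial division only works here because n is tiny; for a 2048-bit
# modulus this loop outlasts the universe, while Shor's algorithm on
# a large enough quantum computer would not.
for candidate in range(2, int(n ** 0.5) + 1):
    if n % candidate == 0:
        p2, q2 = candidate, n // candidate
        d2 = pow(e, -1, (p2 - 1) * (q2 - 1))
        assert d2 == d
        break
```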

When quantum computers become capable enough, the threat to a lot of our encrypted communication is a real one. If one can no longer rely on simply making brute-force decryption computationally impractical, all of today’s public-key encryption algorithms are essentially useless. This is the doomsday scenario, but how close are we to this actually happening, and what can be done?

Continue reading “Quantum Computing And The End Of Encryption”