The WIMP Is Dead, Long Live The Solar Axion!

For decades scientists have been building detectors deep underground to search for dark matter. Now one of these experiments, the XENON1T detector, has found an unexpected signal in its data. Although the signal does not stem from dark matter, it may still revolutionize physics.

Since the 1980s, the majority of scientists have believed that the most likely explanation for the missing mass problem is some yet-undiscovered Weakly Interacting Massive Particle (WIMP). They also figured that if we built a large and sensitive enough detector, we should be able to catch these particles, which are constantly streaming through Earth. So since the early 1990s, we have been putting detectors made from ultrapure materials in tunnels and mines, where they are shielded from cosmic radiation and natural radioactivity.

Over the decades these detectors have increased their sensitivity by a factor of about 10 million thanks to ever more sophisticated techniques for shielding and discriminating against the aforementioned backgrounds. So far they haven’t found dark matter, but that doesn’t mean the high-end sensing installations will go unused.

Continue reading “The WIMP Is Dead, Long Live The Solar Axion!”

The ISS Is Getting A New WC

Every home needs renovations after a few decades, and the International Space Station is no different. This fall, they’ll be getting a new Universal Waste Management System (UWMS), aka a new toilet.

Though the news coincides with increased traffic to the ISS, this move stems from a more serious issue with bacterial contamination during longer-term space travel. Today’s ISS toilets already recycle urine back into potable water and scrub the air reclaimed from solid waste as it gets compacted and stored. The new UWMS will act more like a food dehydrator, reducing the water content as much as possible to save on space, and petrifying the poo to inactivate the bacteria.

The current commode on the American side of the ISS was designed in the 1990s and is based on the Space Shuttle’s facilities. It has a funnel with a hose for urine and a bag-lined canister with a seat for solid waste, both of which are heavily vacuum-assisted.

Though the current toilet still does everything it’s supposed to do, there is room for improvement. For instance, women find it difficult to engage both parts of the system at the same time, and almost everyone prefers the toe bars on the Russian toilet to the more encumbering thigh bars on the American throne. Also, the current commode’s interface is more complicated than it needs to be, which takes up valuable crew time. Continue reading “The ISS Is Getting A New WC”

USB-C Is Taking Over… When, Exactly?

USB is one of the most beloved computer interfaces of all time. Developed in the mid-1990s, it undertook a slow but steady march to the top. Offering good speeds and a compact connector, it became the standard for hooking up interface devices and storage, and even became the de-facto way to talk old-school serial, too.

In late 2014, the USB Implementers Forum finalised the standard for the USB-C plug. Its first major application was on smartphones like the Nexus 5X, and it has come to dominate the smartphone market, at least if you leave aside the iPhone. However, it’s yet to truly send USB-A packing, especially on the desktop. What gives? Continue reading “USB-C Is Taking Over… When, Exactly?”

Ask Hackaday: Are 80 Characters Per Line Still Reasonable In 2020?

Software developers won’t ever run out of subjects to argue and fight about. Some of them can be fundamental to a project — like the choice of language or the programming paradigm to begin with. Others seem more of a personal preference at first, but can end up equally fundamental on a bigger scale — like which character to choose for indentation, where to place the curly braces, or how to handle line breaks. At the latest when more than one developer is collaborating, it’s time to find a common agreement in the form of a coding style guide, which might of course require a bit of compromise.

Regardless of taste, the worst decision is having no decision, and even if you don’t agree with a specific detail, it’s usually best to make peace with it for the benefit of uniformly formatted code. In a professional environment, a style guide is ideally worked out collaboratively inside or between teams, with the input and opinions of everyone involved taken into consideration — and if your company doesn’t have one to begin with, the best step to take is probably one towards the exit.

The situation can get a bit more complex in open source projects though, depending on the structure and size of a project. If no official style guide exists, the graceful thing to do is to simply adopt the code base’s current style when contributing to it. But larger projects that are accustomed to a multitude of random contributors will typically have one defined, which was either worked out by the core developers, or declared by its benevolent dictator for life.

In the case of the Linux kernel, that’s of course [Linus Torvalds], who has recently shaken up the community with a mailing list response declaring a long-standing, often even unwritten rule of code formatting essentially obsolete: the 80-character line limit. Considering the notoriety of his rants and crudeness, his response, prompted by a line break change in the submitted patch, seems downright diplomatic this time.

[Linus]’ reasoning against a continuing enforcement of 80-char line limits is primarily the fact that screens are simply big enough today to comfortably fit longer lines, even with multiple terminals (or windows) next to each other. As he puts it, the only reason to stick to the limitation is using an actual VT100, which won’t serve much use in kernel development anyway.

Allowing longer lines, on the other hand, would encourage the use of more verbose variable names and whitespace, which in turn would actually increase readability. Of course, all to a certain extent, and [Linus] obviously doesn’t call for abolishing line breaks altogether. But he has a point; does it really make sense to stick to a decades-old, nowadays rather arbitrary-seeming limitation in 2020?
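To make the tradeoff concrete, here is a minimal sketch in Python (the function and variable names are invented for this illustration, not taken from any real code base): the same loop written with terse names that squeeze under 80 columns, and with verbose, self-documenting names that push past it.

```python
# Terse names keep every line comfortably under 80 columns:
def scl(c, r, n, t):
    for i in range(n):
        r[i] = c[i] * t

# Verbose names push the signature well past 80 columns, but every
# identifier now documents itself without a comment:
def scale_calibrated_readings(calibration_factors, output_readings, reading_count, temperature_coefficient):
    for index in range(reading_count):
        output_readings[index] = calibration_factors[index] * temperature_coefficient
```

Both functions do the same work; the question is simply whether the longer, more readable names are worth the wider lines, or whether the second signature should be wrapped.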

Continue reading “Ask Hackaday: Are 80 Characters Per Line Still Reasonable In 2020?”

Lonnie Johnson, Prolific Engineer And Hero To Millions Of Kids (Even If They Don’t Know It)

The current generation Super Soaker XP30. (Hasbro)

To be a child in the 1970s and 1980s was to be of the first generations to benefit from electronic technologies in your toys. As those lucky kids battled blocky 8-bit digital foes, the adults used to fret that it would rot their brains. Kids didn’t play outside nearly as much as generations past, because modern toys were seducing them to the small screen. Truth be told, when you could battle aliens with a virtual weapon that was in your imagination HUGE, how do you compete with that?

How those ’80s kids must have envied their younger siblings then when in 1990 one of the best toys ever was launched, a stored-pressure water gun which we know as the Super Soaker. Made of plastic, and not requiring batteries, it far outperformed all squirt guns that had come before it, rapidly becoming the hit toy of every sweltering summer day. The Super Soaker line of water pistols and guns redefined how much fun kids could have while getting each other drenched. No longer were the best water pistols the electric models which cost a fortune in batteries that your parents would surely refuse to replace — these did it better.

You likely know all about the Super Soaker, but you might not know it was invented by an aerospace engineer named Lonnie Johnson whose career included working on stealth technology and numerous projects with NASA.

Continue reading “Lonnie Johnson, Prolific Engineer And Hero To Millions Of Kids (Even If They Don’t Know It)”

Quantum Computing And The End Of Encryption

Quantum computers stand a good chance of changing the face of computing, and that goes double for encryption. For encryption methods that rely on the fact that brute-forcing the key takes too long with classical computers, quantum computing seems like its logical nemesis.

For instance, the mathematical problem that lies at the heart of RSA and other public-key encryption schemes is factoring a product of two prime numbers. Searching for the right pair using classical methods takes approximately forever, but Shor’s algorithm can be used on a suitable quantum computer to do the required factorization of integers in almost no time.
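A toy Python sketch makes the stakes concrete. The primes and exponents below are textbook-sized and purely illustrative; the trial-division factoring stands in for the step that Shor's algorithm would make fast on a quantum computer.

```python
def trial_factor(n):
    """Factor n by trial division -- the step that 'takes approximately
    forever' for real key sizes, and that Shor's algorithm speeds up."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1

# Textbook RSA key: public modulus n = p*q, public exponent e.
p, q = 61, 53
n = p * q                      # 3233, published with the key
e = 17                         # public exponent
phi = (p - 1) * (q - 1)        # 3120, secret unless n can be factored
d = pow(e, -1, phi)            # private exponent (Python 3.8+ modular inverse)

ciphertext = pow(65, e, n)     # encrypt the message 65

# An attacker sees only n, e, and the ciphertext -- but once n is
# factored, the private exponent falls right out:
fp, fq = trial_factor(n)
recovered_d = pow(e, -1, (fp - 1) * (fq - 1))
recovered_message = pow(ciphertext, recovered_d, n)
```

With real 2048-bit moduli, `trial_factor` is hopeless on classical hardware; the entire security of the scheme rests on exactly that hopelessness.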

When quantum computers become capable enough, the threat to a lot of our encrypted communication is a real one. If one can no longer rely on simply making the brute-forcing of a decryption computationally heavy, all of today’s public-key encryption algorithms are essentially useless. This is the doomsday scenario, but how close are we to this actually happening, and what can be done?

Continue reading “Quantum Computing And The End Of Encryption”

The Seedy World Of Message Serialization

Look, I’ve been there too. First the project just prints debug information for a human in nice descriptive strings that are easy to understand. Then some tool needs to log a sensor value, so the simple debug messages gain structure. Now your debug messages {{look like : this}}. This is great until a second sensor is added that uses floats instead of ints. Now there are sprinklings of even more magic characters between the curly braces. A couple of days later and things are starting to look Turing complete. At some point you look up and realize, “I need a message serialization strategy”. Well you’ve come to the right place! Continue reading “The Seedy World Of Message Serialization”
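The slide described above can be sketched in a few lines of Python (the function names here are invented for illustration): an ad-hoc debug format that will keep growing magic characters, next to an established serialization format that already handles ints, floats, strings, and nesting with one well-defined set of rules.

```python
import json

# The ad-hoc format: works today, grows magic characters tomorrow, and
# needs a hand-written (and ever-growing) parser on the other end.
def format_adhoc(sensor_name, value):
    return "{{%s : %s}}" % (sensor_name, value)

# A real serialization strategy: the same record as JSON, which copes
# with new value types and nested structure without new delimiters.
def format_json(sensor_name, value):
    return json.dumps({"sensor": sensor_name, "value": value})

# The JSON record round-trips back into structured data for free:
record = json.loads(format_json("temp_c", 21.5))
```

JSON is only one of many choices, of course; the point is that picking any well-specified format early beats growing a private one by accident.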