Stand by the shore and watch the waves roll in, and you’ll notice that most come in at roughly the same size. There’s a little variation, but the overwhelming majority don’t stand out from the crowd. On all but the stormiest of days, they have an almost soothing regularity about them.
Every so often though, out on the high seas, a rogue wave comes along. These abnormally large waves can strike without warning, and are dangerous even to the largest of ships. Research is ongoing into what creates these waves, and how they might be identified and tracked ahead of time.
Perhaps rather unexpectedly, on the 14th of March this year the GCC mailing list received an announcement regarding the release of the first-ever COBOL front-end for the GCC compiler. For the uninitiated, COBOL saw its first release in 1959, making it, at 63 years old, one of the oldest programming languages still in regular use. Its persistence is due mostly to its focus, from the very beginning, on being a transaction-oriented, domain-specific language (DSL).
Its acronym stands for Common Business-Oriented Language, which clearly references the domain it targets. Even in the current COBOL 2014 standard, it remains essentially the same, primarily transaction-oriented language, though the standard has added support for structured, procedural, and object-oriented programming styles. Deriving most of its core from Admiral Grace Hopper’s FLOW-MATIC language, it allows the business logic one would encounter at financial institutions or other businesses to be described efficiently, in clear English.
Unlike the older GnuCOBOL project – which translates COBOL to C – the new GCC-COBOL front-end project does away with that intermediate step, and directly compiles COBOL source code into binary code. All of which may raise the question of why an entire man-year was invested in this effort for a language which has been declared ‘dead’ for probably at least half its 63-year existence.
Does it make sense to learn or even use COBOL today? Do we need a new COBOL compiler?
You might have heard of the cochlear implant. It’s an electronic device also referred to as a neuroprosthesis, serving as a bionic replacement for the human ear. These implants have brought an improved sense of hearing to hundreds of thousands around the world.
However, the cochlear implant isn’t the only game in town. The auditory brain stem implant is another device that promises to bring a sense of sound to those without it, albeit by a different route.
The rise of streaming services on the Internet was a revolutionary shift in the world of media. No more would content be pumped into homes in a one-way fashion, broadcast by major conglomerates and government-run organizations. Instead, individuals would be free to hunt for content suiting their own desires on an all-you-can-watch basis.
It’s led to a paradigm shift in the way we consume media. However, it’s also led to immense frustration thanks to the overwhelming amount of content on offer. Let’s take a look at why that is, and some creative ways you can get around the problem.
The Paradox of Choice
Many find the masses of content on streaming services to be overwhelming to choose from. Credit: author screenshot
Traditionally, when it came to media, there were two major arms of delivery: broadcast and home media. One might listen to the radio or flick on the TV; alternatively, one could spin up a record or select a movie to watch on tape. If none of those options sufficed, one might take a walk down to the local video store to rent something more appealing.
Fundamentally, it was an era in which choices were limited. There were a handful of TV stations to choose from, and if nothing good was on, you could hunt for something watchable on tape or simply go without. Many will remember afternoons and evenings spent watching reruns, or a Friday night movie that had been on a million times before. Some shows became legendary for their seemingly endless replay, from The Simpsons to M*A*S*H.
As the Internet grew, though, the game started to change. Torrent websites and streaming services came along, offering up the sum total of the world’s cultural output for free, or for a nominal cost for those averse to piracy. Suddenly, when it came to choosing a movie to watch, one wasn’t limited to the five or so films at the local cinema, nor to what was left on the shelves of the local video rental. Instead, virtually any movie made since the invention of the format could be yours to watch at a moment’s notice.
With so many options on the table, many of us find it harder to choose. This is the Paradox of Choice, a term popularized by US psychologist Barry Schwartz in 2004. When our options are limited to a select few, choice is easy: they can quickly be compared and ranked, and an ideal option chosen.
Add thousands of choices to the pile, and the job escalates in complexity to the point of becoming overwhelming. With so many different choices to contrast and compare, finding the mythical right choice becomes practically impossible. Continue reading “The Joy Of Broadcast Media Vs. The Paradox Of Choice”→
Bulk material is stuff handled ‘in bulk’. One LEGO piece is a brick, but 1,000 of them poured into a bag are bulk material. Corn starch, sand, flour, powder-coat powder, gravel, cat food, Cap’n Crunch, coins, screws, Styrofoam beads, lead shot, and gummy worms are all bulk materials.
Applications abound where you need to move stuff in bulk. Selective sintering 3D printers, animal feeders, DIY injection molders, toner-based PCB makers, home powder coating, automatic LEGO/domino/whatever sorters or assemblers, automated gardeners, airsoft accessories – handling bulk material is part of hacking. College science classes cover solids and liquids, but rarely bulk materials.
Most hackers just pray it works and tap the bin when it doesn’t. Industry does better, but the slang term “bin rash” – the long-term result of tapping a 300-ton bin with sledgehammers (video) – shows they don’t get it right all the time either. At the same time, it’s a fun area you can experiment with using kitchen items. So come along with us for a short series on the basics of bulk material handling. Continue reading “Handling Bulk Material: Why Does My Cat Food Get Stuck?”→
Artificial satellites have transformed the world in many ways: not only by relaying communications and observing the planet in ways previously inconceivable, but also by enabling incredibly accurate navigation. A so-called global navigation satellite system (GNSS), or satnav for short, uses the data provided by satellites to pinpoint a position on the surface to within a few meters, or even centimeters with augmentation techniques.
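At its heart, that pinpointing is a ranging exercise: each satellite broadcast lets the receiver work out its distance from a known position, and enough such distances nail down a single location. The C sketch below shows the idea in a deliberately toy form – two dimensions, three perfect ranges, and made-up coordinates – whereas a real receiver solves for three dimensions plus its own clock bias from at least four satellites’ pseudoranges.

```c
/* Toy 2-D trilateration: recover a receiver position from three
 * known "satellite" positions and measured ranges. All numbers are
 * invented for the example. Build with: cc trilat.c -lm */
#include <stdio.h>
#include <math.h>

int main(void)
{
    /* Satellite positions, and ranges to a receiver at (3, 4). */
    double sx[3] = { 0.0, 10.0,  0.0 };
    double sy[3] = { 0.0,  0.0, 10.0 };
    double r[3]  = { 5.0, sqrt(65.0), sqrt(45.0) };

    /* Each range gives a circle (x-sx)^2 + (y-sy)^2 = r^2.
     * Subtracting the equations pairwise cancels the x^2 + y^2
     * terms, leaving a linear 2x2 system A * [x y]^T = b. */
    double a11 = 2.0 * (sx[1] - sx[0]), a12 = 2.0 * (sy[1] - sy[0]);
    double a21 = 2.0 * (sx[2] - sx[0]), a22 = 2.0 * (sy[2] - sy[0]);
    double b1  = r[0]*r[0] - r[1]*r[1] + sx[1]*sx[1] - sx[0]*sx[0]
                                       + sy[1]*sy[1] - sy[0]*sy[0];
    double b2  = r[0]*r[0] - r[2]*r[2] + sx[2]*sx[2] - sx[0]*sx[0]
                                       + sy[2]*sy[2] - sy[0]*sy[0];

    /* Solve with Cramer's rule. */
    double det = a11 * a22 - a12 * a21;
    double x = (b1 * a22 - a12 * b2) / det;
    double y = (a11 * b2 - a21 * b1) / det;

    printf("estimated position: (%.2f, %.2f)\n", x, y); /* (3.00, 4.00) */
    return 0;
}
```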
The US Global Positioning System (GPS) was the first GNSS, with its first satellites launched in 1978, albeit available to civilians only in a degraded-accuracy mode. When full-accuracy GPS was released to the public in 2000, under the Clinton administration, it caused a surge in the uptake of satnav by the public, from fishing boats and merchant ships to today’s navigation using nothing but a smartphone with its built-in GPS receiver.
Even so, there is a dark side to GNSS that extends beyond its military use in guiding cruise missiles and their kin to their targets. This comes in the form of jamming and spoofing of GNSS signals, which can hide illicit activities from monitoring systems and disrupt or disable an enemy’s systems during a war. Along with other forms of electronic warfare (EW), disrupting GNSS signals forms a potent weapon that can render the most modern avionics and drone technology useless.
With this in mind, how significant is the threat from GNSS spoofing in particular, and what are the ways that this can be detected or counteracted?
There is gathering momentum around the idea of adding Rust to the Linux kernel. Why exactly is that a big deal, and what does this mean for the rest of us? The Linux kernel has been just C and assembly for its entire lifetime. A big project like the kernel has a great deal of shared tooling around making its languages work, so adding another one is quite an undertaking. There’s also the project culture developed around the language choice. So why exactly are the grey-beards of kernel development even entertaining the idea of adding Rust? To answer in a single line: it’s because C was designed in 1971 to run on the minicomputers at Bell Labs. If you want to shoot yourself in the foot, C will hand you the loaded firearm.
On the other hand, if you want to write a kernel, C is a great language for doing low-level coding. Direct memory access? Yep. Inline assembly? Sure. Runs directly on the metal, with no garbage collection or virtual machines in the way? Absolutely. But all the things that make C great for kernel programming also make C dangerous for kernel programming.
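To make that concrete, here’s a small userspace sketch of those capabilities – assuming an x86-64 machine and GCC or Clang, and with a made-up MMIO address standing in for what a device datasheet would normally supply:

```c
/* A minimal sketch of why C suits kernel work: raw pointers and
 * inline assembly, with no runtime in between. x86-64, GCC/Clang. */
#include <stdint.h>
#include <stdio.h>

static inline uint64_t rdtsc(void)
{
    uint32_t lo, hi;
    /* Inline assembly: read the CPU's timestamp counter directly. */
    __asm__ volatile ("rdtsc" : "=a"(lo), "=d"(hi));
    return ((uint64_t)hi << 32) | lo;
}

int main(void)
{
    /* Direct memory access: cast any integer to a pointer and C will
     * happily let you read or write through it. (Dereferencing this
     * placeholder address from user space would fault immediately,
     * which is exactly the loaded-firearm part.) */
    volatile uint32_t *mmio_reg = (volatile uint32_t *)0x1000BEEF;
    (void)mmio_reg;

    printf("cycles: %llu\n", (unsigned long long)rdtsc());
    return 0;
}
```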
Now I hear your collective keyboards clacking in consternation: “It’s possible to write safe C code!” Yes, yes it is possible. It’s just very easy to mess up, and when you mess up in a kernel, you have security vulnerabilities. There are also some things that are objectively terrible about C, like undefined behavior. C compilers do their best to do the right thing with cursed code like i++ + i++; or a[i] = i++;. But that’s almost certainly not going to do what you want it to, and even worse, it may sometimes do the right thing.
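Here are those two cursed expressions in a minimal, compilable form. What this prints is entirely up to the compiler and optimization level – which is exactly the problem:

```c
#include <stdio.h>

int main(void)
{
    int i = 1;
    /* Two unsequenced modifications of i in one expression: the C
     * standard places no requirement on the result. GCC and Clang
     * will even warn with -Wall ("operation on 'i' may be undefined"). */
    int sum = i++ + i++;
    printf("sum = %d, i = %d\n", sum, i);

    int a[4] = { 0 };
    i = 0;
    /* The read of i for the array index and the write from i++ are
     * unsequenced relative to each other: also undefined behavior. */
    a[i] = i++;
    printf("a[0] = %d, a[1] = %d, i = %d\n", a[0], a[1], i);
    return 0;
}
```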
Rust seems to be gaining popularity. There are some ambitious projects out there, like rewriting coreutils in Rust, and many other standard applications are getting a Rust rewrite. It was fairly inevitable that Rust developers would start to ask: could we invade the kernel next? This was pitched for a Linux Plumbers Conference, and the mailing list response was cautiously optimistic. If Rust could be added without breaking things, and without losing the very things that make Rust useful, then yes, it would be interesting. Continue reading “Things Are Getting Rusty In Kernel Land”→