The Rise And (Eventual) Fall Of The SIM Card

There are few devices that better exemplify the breakneck pace of modern technical advancement than the mobile phone. In the span of just a decade, we went from flip phones and polyphonic ringtones to full-fledged mobile computers with quad-core processors and gigabytes of memory.

While rapid advancements in computational power are of course nothing new, the evolution of mobile devices is something altogether different. The Razr V3 of 2003 and the Nexus 5 of 2013 are so vastly different that it’s hard to reconcile the fact they were (at least ostensibly) designed to serve the same purpose — with everything from their basic physical layout to the way the user interacts with them having undergone dramatic changes in the intervening years. Even the network technologies they use to facilitate voice and data communication are different.

Two phones, a decade apart.

Yet, there’s at least one component they share: the lowly SIM card. In fact, if you don’t mind trimming a bit of unnecessary plastic away, you could pull the SIM out of the Razr and slap it into the Nexus 5 without a problem. It doesn’t matter that the latter phone wasn’t even a twinkle in Google’s eye when the card was made; the nature of the SIM card means compatibility is a given.

Indeed there’s every reason to believe that very same card, now 20 years old, could be installed in any number of phones on the market today. Although, once again, some minor surgery would be required to pare it down to size.

Such is the beauty of the SIM, or Subscriber Identity Module. It allows you to easily transfer your cellular service from one phone to another, with little regard for the age or manufacturer of the device, and generally without even having to inform your carrier of the swap. It’s a simple concept that has served us well for almost as long as cellular telephones have existed, and it neatly separates the phone from the phone contract.
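That portability is easy to demonstrate: the subscriber identity really does live on the card, and any cellular modem that speaks the standard 3GPP AT command set will report the IMSI stored on whatever SIM is currently inserted. Below is a minimal sketch of the idea; the serial port path and baud rate are assumptions, so adjust them for your modem.

```python
# Minimal sketch: read the subscriber identity (IMSI) off whatever SIM is
# inserted, using the standard 3GPP TS 27.007 "AT+CIMI" command.
# Assumes a USB modem at /dev/ttyUSB0 -- adjust for your hardware.
import serial  # pip install pyserial

def read_imsi(port="/dev/ttyUSB0", baud=115200):
    with serial.Serial(port, baud, timeout=2) as modem:
        modem.write(b"AT+CIMI\r\n")  # standard "request IMSI" command
        reply = modem.read(256).decode(errors="replace")
    # The modem echoes the command, then the IMSI digits, then "OK".
    for line in reply.splitlines():
        line = line.strip()
        if line.isdigit():  # the IMSI is a string of up to 15 digits
            return line
    raise RuntimeError(f"no IMSI in modem reply: {reply!r}")

if __name__ == "__main__":
    print("IMSI on this SIM:", read_imsi())
```

Swap the card into a different phone or modem and the same digits come back, which is the whole trick: the network identity follows the card, not the device.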

So naturally, there’s mounting pressure in the industry to screw it up.

Continue reading “The Rise And (Eventual) Fall Of The SIM Card”

PCIe For Hackers: The Diffpair Prelude

PCIe, also known as PCI-Express, is a highly powerful interface. So let’s see what it takes to hack on something that powerful. PCIe can be a bit intimidating at first; however, it is reasonably simple to start building PCIe stuff, and the interface is quite resilient for hobbyist-level technology. There will come a time when we want to use a PCIe chip in our designs, or perhaps, make use of the PCIe connection available on a certain Compute Module, and it’s good to make sure that we’re ready for that.

PCIe is everywhere now. Every modern computer has a bunch of PCIe devices performing crucial functions, and even iPhones use PCIe internally to connect the CPU with the flash and WiFi chips. You can get all kinds of PCIe devices: Ethernet controllers, high-throughput WiFi cards, graphics, and all the cheap NVMe drives that gladly provide you with heaps of storage when connected over PCIe. If you’re hacking on a laptop or a single-board computer and you’d like to add a PCIe device, you can get some PCIe from one of the PCIe-carrying sockets, or just tap into an existing PCIe link if there’s no socket to connect to. It’s been two decades since we started getting PCIe devices – now, PCIe is on its 5.0 revision, and it’s clear that it’s here to stay.
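Since this installment is a prelude about differential pairs, the first practical question is impedance: PCIe pairs are routed to a controlled differential impedance, with commonly quoted targets in the 85 Ω to 100 Ω range depending on the spec revision. As a quick sanity check, here is a sketch using the classic IPC-2141 closed-form approximations for an edge-coupled surface microstrip; treat it as a rule of thumb, not a substitute for your fab’s stackup calculator, and note that the example trace geometry is invented.

```python
# Rule-of-thumb differential impedance for an edge-coupled surface microstrip
# pair, using the classic IPC-2141 approximations. Sanity-check only; for
# real PCIe routing, use your board house's stackup calculator.
import math

def microstrip_z0(w, h, t, er):
    """Single-ended microstrip impedance in ohms.
    w: trace width, h: dielectric height, t: copper thickness (same units)."""
    return (87.0 / math.sqrt(er + 1.41)) * math.log(5.98 * h / (0.8 * w + t))

def differential_z(w, h, t, s, er):
    """Edge-coupled differential impedance in ohms. s: gap between traces."""
    z0 = microstrip_z0(w, h, t, er)
    return 2.0 * z0 * (1.0 - 0.48 * math.exp(-0.96 * s / h))

if __name__ == "__main__":
    # Invented example: 0.15 mm traces over 0.10 mm prepreg, 35 um copper,
    # 0.15 mm gap, FR-4 with er ~ 4.3 -> roughly 87 ohms differential.
    zdiff = differential_z(w=0.15, h=0.10, t=0.035, s=0.15, er=4.3)
    print(f"estimated differential impedance: {zdiff:.1f} ohms")
```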

Continue reading “PCIe For Hackers: The Diffpair Prelude”

The First GUI? Volscan Controls The Air

In the 1950s, computers were, for the most part, ponderous machines. But one machine offered a glimpse of the future. The Volscan was probably the first real air traffic computer designed to handle high volumes of military aircraft operations. It used a light gun that looked more like a soldering gun than a computer input device. There isn’t much data about Volscan, but it appears to have been ahead of its time, and it arguably put the first GUI ever on a computer system.

The Air Force had a problem. The new jets of the 1950s needed long landing approaches and timely landings, since they burned more fuel at lower altitudes. According to the Air Force, they could land 40 planes in an hour, but they needed to be able to handle 120 planes an hour. The Whirlwind computer had proven that computers could process radar data — although Whirlwind was getting the data over phone lines from a distance. So the Air Force’s Cambridge Research Center started working on a computerized system to land planes called Volscan, later known as AN/GSN-3.

Continue reading “The First GUI? Volscan Controls The Air”

Will A.I. Steal All The Code And Take All The Jobs?

New technology often brings with it a bit of controversy. When considering stem cell therapies, self-driving cars, genetically modified organisms, or nuclear power plants, fears and concerns come to mind as much as, if not more than, excitement and hope for a brighter tomorrow. New technologies force us to evolve perspectives and establish new policies in hopes that we can maximize the benefits and minimize the risks. Artificial Intelligence (AI) is certainly no exception. The stakes, including our very position as Earth’s apex intellect, seem exceedingly weighty. Mathematician Irving Good’s oft-quoted wisdom that the “first ultraintelligent machine is the last invention that man need make” describes a sword that cuts both ways. It is not entirely unreasonable to fear that the last invention we need to make might just be the last invention that we get to make.

Artificial Intelligence and Learning

Artificial intelligence is currently the hottest topic in technology. AI systems are being tasked to write prose, make art, chat, and generate code. Setting aside the horrifying notion of an AI programming or reprogramming itself, what does it mean for an AI to generate code? It should be obvious that an AI is not just a normal program whose code was written to spit out any and all other programs. Such a program would need to have all programs inside itself. Instead, an AI learns from being trained. How it is trained raises some interesting questions.

Humans learn by reading, studying, and practicing. We learn by training our minds with collected input from the world around us. Similarly, AI and machine learning (ML) models learn through training. They must be provided with examples from which to learn. The examples that we provide to an AI are referred to as the data corpus of the training process. The robot Johnny 5 from “Short Circuit”, like any curious-minded student, needs input, more input, and more input.
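To make the “learning from a corpus” idea concrete, here is a toy sketch; it is nothing like a production model, just the shape of the concept. A character-level bigram model stores statistics derived from its training examples, so feed it code and it babbles something code-shaped: it contains what the corpus taught it, not a library of all programs.

```python
# Toy illustration of "learning from a data corpus": a character-level
# bigram model. It holds statistics about its training examples rather
# than containing every possible program.
import random
from collections import defaultdict

def train(corpus):
    """Record, for each character, which characters followed it in training."""
    followers = defaultdict(list)
    for prev, nxt in zip(corpus, corpus[1:]):
        followers[prev].append(nxt)
    return followers

def generate(model, seed, length=80):
    """Sample text one character at a time from the learned statistics."""
    out = seed
    for _ in range(length):
        choices = model.get(out[-1])
        if not choices:
            break
        out += random.choice(choices)
    return out

if __name__ == "__main__":
    corpus = "for i in range(10):\n    print(i)\n" * 20
    model = train(corpus)
    print(generate(model, seed="f"))  # babbles something code-shaped
```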

Continue reading “Will A.I. Steal All The Code And Take All The Jobs?”

Laptop Motherboard? Let’s Boot And Tinker

Last time, I shared my experience of why you might want to consider a laptop motherboard for a project of yours, and noted some things to keep in mind when buying one. Now, let’s go through the practical considerations!

Making It Boot

Usually, when you plug some RAM and a charger into a board, then press the power button, your board should boot up and eventually show the BIOS on the screen. However, there are some caveats – it’s very firmware-dependent. Let me walk you through some confusing situations you might encounter.

If the board was unpowered for a while, the first boot might take longer – or it might power on immediately after a charger has been plugged in, and then, possibly, power off. A bit of erratic behaviour is okay, since boards might need to do memory training, or recover after having lost some CMOS settings. Speaking of those, some boards will not boot without a CMOS battery attached, and some will go through the usual ‘settings lost’ sequence. Sometimes the battery will be on a daughterboard; other times, especially with new boards, there will be no CR2032 in sight and the board will rely on the main battery to preserve CMOS settings – in that case, if you don’t use the battery, expect the first boot to take longer, at least. Overall, however, pressing the power switch will cause the board to boot. Continue reading “Laptop Motherboard? Let’s Boot And Tinker”

Repurposing Old Smartphones: When Reusing Makes More Sense Than Recycling

When looking at the specifications of smartphones that have been released over the past years, it’s remarkable to see how aspects like CPU cores, clock speeds, and GPU performance have improved during this time, with even new budget smartphones offering a lot of computing power, as well as a smattering of sensors. Perhaps even more remarkable is that of the approximately 1.5 billion smartphones sold each year, many will be discarded again after a mere two years of use. This seems rather wasteful, and a recent paper by Jennifer Switzer and colleagues proposes a so-called Computational Carbon Intensity (CCI) metric to determine when it makes more sense to recycle a device than to keep using it.
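As a back-of-the-envelope illustration of the intuition behind such a metric (the paper’s actual definition and measurements differ, and every number below is invented), the carbon attributable to each year of service drops as the embodied manufacturing footprint is amortized over a longer life:

```python
# Back-of-the-envelope sketch of the carbon-amortization intuition behind
# CCI: spread a device's embodied (manufacturing) carbon over its service
# life, plus the operational carbon of the electricity it consumes.
# All numbers are invented for illustration; see the paper for the real
# metric and measured values.

def carbon_per_service_year(embodied_kg, power_w, grid_kg_per_kwh, years):
    """kg of CO2 attributable to each year of service after `years` of use."""
    hours = years * 365 * 24
    operational_kg = (power_w / 1000.0) * hours * grid_kg_per_kwh
    return (embodied_kg + operational_kg) / years

if __name__ == "__main__":
    for years in (2, 4, 8):
        cost = carbon_per_service_year(
            embodied_kg=60.0,     # invented manufacturing footprint
            power_w=3.0,          # invented average draw of a phone
            grid_kg_per_kwh=0.4,  # invented grid carbon intensity
            years=years,
        )
        print(f"{years} years of use: {cost:.1f} kg CO2 per service-year")
```

The longer the device stays in service, the cheaper, in carbon terms, each year of computing becomes, which is the argument for reuse over recycling.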

What complicates the decision of when it makes more sense to reuse than recycle is that there are many ways to define when a device is no longer ‘fit for purpose’. It could be argued that the average smartphone is still more than good enough after two years to keep serving as a smartphone for another few years, or at least until the manufacturer stops supplying updates. Beyond their use as smartphones, they’re still devices with a screen, a WiFi connection, and a capable processor, which should make them suitable for a myriad of roles.

Unfortunately, as we have seen with the disaster that was Samsung’s ‘upcycling’ concept a few years ago, or Google’s defunct Project Ara, as promising as the whole idea of ‘reuse, upcycle, recycle’ sounds, establishing an industry standard here is frustratingly complicated. Worse, over the years smartphones have become ever more sealed-up, glued-together devices that complicate the ‘reuse’ narrative.

Continue reading “Repurposing Old Smartphones: When Reusing Makes More Sense Than Recycling”

Laptop Motherboard? No, X86 Single-Board Computer!

Sometimes a Raspberry Pi will not cut it – especially nowadays, when the prices are high and the in-stock amounts are low. But if you look in your closet, you might find a decently-specced laptop with a broken screen or faulty hinges. Or perhaps someone you know is looking to get rid of a decent laptop with a shattered case. Between electronics recycling and eBay, chances are you can score a laptop with at least some life left in it.

Let’s hack! I’d like to show you how a used laptop motherboard could be the heart of your project, and walk you through some specifics you will want to know.

And what a great deal it could be for your next project! Laptop motherboards can help bring a wide variety of your Linux- and Windows-powered projects to life, in a way that even NUCs and specialized SBCs often can’t. They’re way cheaper, way more diverse, and basically omnipresent. The CPU can pack a punch, and as a rule, PCIe, USB3, and SATA ports are easily accessible, with no nonsense like USB-throttled Ethernet ports.

Continue reading “Laptop Motherboard? No, X86 Single-Board Computer!”