Ask Hackaday, What’s Next?

Writing for Hackaday involves drinking from the firehose of tech news, and seeing the latest and greatest of new projects and happenings in the world of hardware. But sometimes you sit back in a reflective mood, and ask yourself: didn’t this all used to be more exciting? If you too have done that, perhaps it’s worth considering how our world of hardware hacking is fueled, and what makes stuff new and interesting.

Hardware projects are like startup fads

When AliExpress has hundreds of kits for them, Nixie clocks are a mature project sector, by any measure.

Hardware projects are like startup fads: they follow the hype cycle. Take Nixie clocks, for instance. They're cool as heck, but here in 2024 there's not much that's exciting about them. If you made one in 2010 you were the talk of the town, in 2015 everyone wanted one, but by 2020 yours was simply Yet Another Nixie Clock. Now you can buy any number of Nixie clock kits on AliExpress, and their shine has definitely worn off. Do you ever have the feeling that the supply of genuinely new stuff is drying up, and it's all getting a bit samey? Perhaps it's time to explore this topic.

I have a theory that hardware hacking goes in epochs, each one driven by a new technology. If you think about it, the Arduino was an epoch-defining moment: a readily available and easy-to-use microcontroller board. It may be merely a part, and hugely superseded here in 2024, but back in 2008 it was nothing short of a revolution if you'd previously had a BASIC Stamp. The projects an Arduino enabled produced a huge burst of creativity, from drones to 3D printers to toaster-oven reflow and many, many more, and it's fair to say that Hackaday owes its early-day success in no small part to that little board from Italy. To think of more examples, the advent of affordable 3D printers around the same period as the Arduino, the Raspberry Pi, and the arrival of affordable PCB manufacture from China were all similar enabling moments. A favourite of mine is the Espressif range of Wi-Fi-enabled microcontrollers, which produced an explosion of cheap Internet-connected projects. Suddenly having Wi-Fi went from a big deal to built-in, and an immense breadth of new projects came from those parts. Continue reading “Ask Hackaday, What’s Next?”


Olympic Sprint Decided By 40,000 FPS Photo Finish

Advanced technology played a crucial role in determining the winner of the men’s 100-meter final at the Paris 2024 Olympics. In a historically close race, American sprinter Noah Lyles narrowly edged out Jamaica’s Kishane Thompson by just five-thousandths of a second. The final decision relied on an image captured by an Omega photo finish camera that shoots an astonishing 40,000 frames per second.

This cutting-edge technology, originally reported by PetaPixel, ensured the accuracy of the result in a race where both athletes recorded a time of 9.78 seconds. If SmartThings’ shot pourer from the 2012 Olympics were still around, it could once again fulfill its intended role of celebrating US medals.

Omega, the Olympics’ official timekeeper for decades, has continually innovated to enhance performance measurement. The Omega Scan ‘O’ Vision Ultimate, the camera used for this photo finish, is a significant upgrade from its 10,000 frames per second predecessor. The new system captures four times as many frames per second and offers higher resolution, providing a detailed view of the moment each runner’s torso touches the finish line. This level of detail was crucial in determining that Lyles’ torso touched the line first, securing his gold medal.
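For a sense of scale, here's a quick back-of-the-envelope calculation using only the figures reported above, plus an assumed average sprint speed derived from the 9.78-second time:

# Back-of-the-envelope: what a 40,000 fps photo finish resolves.
margin_s = 0.005           # Lyles' reported winning margin, five-thousandths of a second
speed_m_s = 100 / 9.78     # average speed over a 9.78 s 100 m, roughly 10.2 m/s

for fps in (10_000, 40_000):
    interval_us = 1_000_000 / fps
    print(f"{fps} fps: {interval_us:.0f} us per frame, margin spans {margin_s * fps:.0f} frames")

print(f"distance covered in the margin: ~{speed_m_s * margin_s * 100:.1f} cm")

At 40,000 frames per second, the five-millisecond gap between the two sprinters spans a comfortable two hundred frames, and corresponds to about five centimetres of torso travel at the line.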

This camera is part of Omega’s broader technological advancements for the Paris 2024 Olympics, which include computer vision systems that use AI and high-definition cameras to track athletes in real time. For a closer look at how technology decided this historic race, watch the video by Eurosport that captured the event.

Continue reading “Olympic Sprint Decided By 40,000 FPS Photo Finish”

Australia’s Controlled Loads Are In Hot Water

Australian grids have long run a two-tiered pricing scheme for electricity. In many jurisdictions, regular electricity is charged at the standard rate, while cheaper electricity is available for certain applications if your home is set up with a “controlled load.” Typically, this involves high-energy equipment like pool heaters or hot water heaters.
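To put some illustrative numbers on the two-tier idea, here's a minimal sketch of how such a bill splits out. The rates and consumption figures below are invented for the example and don't correspond to any actual Australian tariff:

# Hypothetical two-tier bill: general usage vs. a cheaper controlled load.
# All rates (in $/kWh) and kWh figures are made up for illustration.
GENERAL_RATE = 0.30      # everything in the house, at the standard rate
CONTROLLED_RATE = 0.18   # the hot water heater on its controlled circuit
general_kwh = 400        # monthly household consumption
hot_water_kwh = 150      # monthly hot water consumption

with_cl = general_kwh * GENERAL_RATE + hot_water_kwh * CONTROLLED_RATE
without_cl = (general_kwh + hot_water_kwh) * GENERAL_RATE
print(f"with controlled load:  ${with_cl:.2f}")
print(f"all at the full rate:  ${without_cl:.2f}")
print(f"saving for the month:  ${without_cl - with_cl:.2f}")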

This scheme has long allowed Australians to save money while keeping their water piping-hot at the same time. However, the electrical grid has changed significantly in the last decade, and these controlled loads are starting to look increasingly out of step with what the grid and the consumer need. What is to be done?

Continue reading “Australia’s Controlled Loads Are In Hot Water”

Reviewing Nuclear Accidents: Separating Fact From Fiction

Few types of accidents speak as much to the imagination as those involving nuclear fission. From the unimaginable horrors of the nuclear bombs dropped on Nagasaki and Hiroshima to the fever-pitch reporting on the accidents at Three Mile Island, Chernobyl, and Fukushima, these events have spawned countless descriptions and visualizations that are merely imaginative flights of fancy, with no connection to physical reality. Because radiation is invisible to the naked eye, and because the interpretation of radiation measurements in popular media is generally restricted to the harrowing noise from a Geiger counter, the reality of nuclear power accidents in said media has become diluted and often replaced with half-truths and outright lies that feed strongly into fear, uncertainty, and doubt.

Why is it that people are drawn more to nuclear accidents than a disaster like that at Bhopal? What is it that makes the one nuclear bomb on Hiroshima so much more interesting than the firebombing of Tokyo or the flattening of Dresden? Why do we fear nuclear power more than dam failures and the heavy toll of air pollution? If we honestly look at nuclear accidents, it’s clear that the panic afterwards invariably did more damage than the event itself. One might postulate that this is partially due to the sensationalist vibe created around these events, and largely due to a public that is poorly informed on topics like nuclear fission and radiation. It’s a situation worsened by harmful government policies pertaining to things like disaster response, often inspired by scientifically discredited theories like the Linear No-Threshold (LNT) model, which killed so many in the USSR and Japan.
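For the unfamiliar: the LNT model assumes that excess risk scales linearly with dose all the way down to zero, with no safe threshold, which is what makes mass evacuations over small doses look justified on paper. Here's a minimal sketch of how it differs from a threshold model, with an invented risk coefficient and cutoff purely for illustration:

# Linear No-Threshold vs. a threshold model. The coefficient and
# cutoff below are invented for illustration, not accepted values.
RISK_PER_SV = 0.05       # hypothetical excess risk per sievert of dose
THRESHOLD_SV = 0.1       # hypothetical dose below which a threshold model predicts no harm

def lnt_risk(dose_sv):
    # LNT: risk is proportional to dose, all the way down to zero
    return RISK_PER_SV * dose_sv

def threshold_risk(dose_sv):
    # Threshold model: no predicted harm below the cutoff
    return RISK_PER_SV * max(0.0, dose_sv - THRESHOLD_SV)

for dose in (0.01, 0.05, 0.2, 1.0):
    print(f"{dose:.2f} Sv: LNT {lnt_risk(dose):.4f}, threshold {threshold_risk(dose):.4f}")

Under LNT, any dose multiplied across a large enough population produces predicted casualties; under a threshold model, small doses predict none at all. Which of the two a government believes drives wildly different disaster responses.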

In light of a likely restart of Unit 1 of the Three Mile Island nuclear plant in the near future, it might behoove us to consider what we can learn from the world’s worst commercial nuclear power disasters, all from the admittedly difficult perspective of a world where ideology and hidden agendas play no role, as we ask ourselves whether we really should fear the atom.

Continue reading “Reviewing Nuclear Accidents: Separating Fact From Fiction”

Smart Ball Technology Has Reached Football, But The Euros Show Us It’s Not Necessarily For The Better

Adidas brought smart balls to Euro 2024, for better or worse. Credit: Adidas

The good old-fashioned game of football used to be a simple affair. Two teams of eleven, plus a few subs, all wrangled by a referee and a couple of helpful linesmen. Long ago, these disparate groups lived together in harmony. Then, everything changed when VAR attacked.

Suddenly, technology was being used to adjudicate all kinds of decisions, and fans were cheering or in uproar depending on how the hammer fell. That’s only become more prevalent in recent times, with smart balls the latest controversial addition to the world game. With their starring role in the Euro 2024 championship more than evident, let’s take a look at what’s going on with this new generation of intelligent footballs.

Continue reading “Smart Ball Technology Has Reached Football, But The Euros Show Us It’s Not Necessarily For The Better”

The Flash Memory Lifespan Question: Why QLC May Be NAND Flash’s Swan Song

The late 1990s saw the widespread introduction of solid-state storage based around NAND Flash. With products ranging from memory cards for portable devices to storage for desktops and laptops, the data storage future was prophesied to rid us of the shackles of magnetic storage that had held us down until then. As solid-state drives (SSDs) took off in the consumer market, there were those who confidently predicted that before long everyone would be using SSDs, with hard-disk drives (HDDs) relegated to the dustbin of history once the price per gigabyte and general performance of SSDs became too competitive.

Fast-forward a number of years, and we are now in a timeline where people are modifying SSDs to have less storage space, just so that their performance and lifespan are less terrible. The reason for this is that by now NAND Flash has hit a number of limits that prevent it from scaling further density-wise, mostly in terms of its feature size. Workarounds include stacking more layers on top of each other (3D NAND) and increasing the number of voltage levels – and thus bits – within an individual cell. Although this has boosted storage capacity, the transition from single-level cell (SLC) to multi-level cell (MLC) and on to today’s triple-level cell (TLC) and quad-level cell (QLC) NAND Flash has come at severe penalties, mostly in the form of limited write cycles and much-reduced transfer speeds.
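The penalty is easy to see in the numbers: every extra bit per cell doubles the count of voltage levels that must be told apart within the same cell. As a rough illustration, treating the cell's usable voltage range as a normalized window (an idealization, not any particular part's datasheet figure):

# Bits per cell vs. voltage levels to discriminate in one NAND cell.
# "Spacing" is the gap between adjacent levels in a normalized 0..1
# voltage window -- an idealization, not a datasheet number.
for name, bits in (("SLC", 1), ("MLC", 2), ("TLC", 3), ("QLC", 4)):
    levels = 2 ** bits
    spacing = 1 / (levels - 1)
    print(f"{name}: {bits} bit(s)/cell, {levels:2d} levels, spacing ~{spacing:.3f} of the window")

Going from SLC to QLC shrinks the spacing between adjacent levels from the full window to roughly a fifteenth of it, which is why QLC needs slower, more careful programming and tolerates far fewer write cycles.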

So how did we get here, and is there life beyond QLC NAND Flash?

Continue reading “The Flash Memory Lifespan Question: Why QLC May Be NAND Flash’s Swan Song”

RIP Lynn Conway, Whose Work Gave Us VLSI And Much More

Lynn Conway, American engineer and computer scientist, passed away at the age of 86 from a heart condition on June 9th, at her Michigan home. Her work in the 1970s led to the integrated circuit design and manufacturing methodology known as Very Large Scale Integration, or VLSI, something which touches almost all facets of the world we live in here in 2024.

It was her work at the legendary Xerox PARC that resulted in VLSI, and its subsequent publication had the effect through the 1980s of creating a revolution in the semiconductor industry. By reducing an IC to a library of modular units that could be positioned algorithmically, VLSI enabled much more efficient use of space on the die, and changed the process from one of manual layout into one of design. In simple terms, by laying out pre-defined assemblies with a computer rather than placing individual components by hand, a far greater density of components could be achieved, and more powerful circuits could be produced.
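To see what "positioned algorithmically" means in practice, here's a toy sketch of the standard-cell idea at the heart of the Mead-Conway methodology: fixed-height cells from a pre-characterized library get packed into rows by a program rather than drawn by hand. The cell names and widths below are invented for illustration:

# Toy standard-cell row placer: pack fixed-height cells from a library
# into rows of a given width. Cell names/widths invented for illustration.
CELL_LIBRARY = {"INV": 2, "NAND2": 3, "NOR2": 3, "DFF": 8}  # widths, arbitrary units
ROW_WIDTH = 20

def place(netlist):
    rows, current, used = [], [], 0
    for cell in netlist:
        width = CELL_LIBRARY[cell]
        if used + width > ROW_WIDTH:   # row full: start a new one
            rows.append(current)
            current, used = [], 0
        current.append(cell)
        used += width
    if current:
        rows.append(current)
    return rows

design = ["DFF", "NAND2", "INV", "NOR2", "DFF", "INV", "NAND2", "DFF", "INV"]
for i, row in enumerate(place(design)):
    print(f"row {i}: {row}")

Real placement tools also optimize wire length and timing, but even this naive packer captures the shift: layout becomes a program operating on a library, rather than a hand-drawn artwork.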

You may also have heard of Lynn Conway not because of her VLSI work, but because as a transgender woman she pursued a parallel career as an activist in her later decades. As an MIT student in the 1950s she had tried to transition but was beaten back by the attitudes of the time, dropping out and only returning to Columbia University to finish her degree in the early 1960s. A job at IBM followed, but when she announced her intent to transition she was fired and lost access to her family. Continue reading “RIP Lynn Conway, Whose Work Gave Us VLSI And Much More”