A Soviet Cassette Recorder Receiving Some Love

For those of us who lived in the capitalist west during the Cold War, there remains a fascination to this day about the Other Side. The propaganda we were fed as kids matched theirs in describing the awful things on the other side of the wall, something that wasn’t borne out when, a decade or so later in the 1990s, we met people from the former communist side and found them, unsurprisingly, to be just like us. It’s thus still of interest to have a little peek into Eastern Bloc consumer electronics, something we have the chance to do courtesy of [DiodeGoneWild], who’s fixing a 1980s Soviet cassette recorder.

The model in question is a Vesna 309, and it has some audio issues and doesn’t turn the tape. It gets a teardown, the motor is cleaned up inside, and a few capacitor and pot cleanups later it’s working again. But the interest lies as much in the machine itself as it does in the repair, as it’s instructive to compare with a Western machine of the same period.

We’re told it would have been an extremely expensive purchase for a Soviet citizen, and in some ways such as the adjustable level control it’s better-specified than many of our equivalents. It’s based upon up-to-date components for its era, but the surprise comes in how comparatively well engineered it is. A Western cassette deck mechanism would have been a much more sketchy affair than the substantial Soviet one, and its motor would have been a DC part with a simple analogue speed controller rather than the brushless 3-phase unit in the Vesna. Either we’re looking at the cassette deck for senior comrades only, or the propaganda was wrong — at least about their cassette decks. The full video is below, and if you’re hungry for more it’s not the first time we’ve peered into electronics from the eastern side of the Iron Curtain.

Continue reading “A Soviet Cassette Recorder Receiving Some Love”

AMD Returns To 1996 With Zen 5’s Two-Block Ahead Branch Predictor

An interesting finding in fields like computer science is that much of what is advertised as new and innovative was actually pilfered from old research papers submitted to the ACM and others. That’s not to say this is necessarily a bad thing, as many such ideas were simply not practical at the time. Case in point: the new branch predictor in AMD’s Zen 5 CPU architecture, whose two-block ahead design is based on an idea proposed a few decades ago. The details are laid out by [George Cozma] and [Camacho] in a recent article, which follows an interview that [George] did with AMD’s [Mike Clark].

The 1996 ACM paper by [André Seznec] and colleagues titled “Multiple-block ahead branch predictors” is a good starting point before diving into [George]’s article, as it will help to make sense of many of the details. The reason for improving branch prediction in CPUs is fairly self-evident: today’s heavily pipelined, superscalar CPUs rely on branch prediction and speculative execution to get around the glacial speeds of system memory once past the CPU’s speediest caches. While predicting the next instruction block after a branch is already common practice, the two-block ahead approach also predicts the instruction block that follows the first predicted one.
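The core idea can be illustrated with a toy sketch in Python, using a trivial last-target table standing in for the real predictor: a conventional predictor produces one fetch address per lookup, while a two-block ahead predictor chains a second lookup through the first prediction to produce two fetch blocks per cycle. This is a made-up illustration of the concept only, and none of the names or structures here reflect AMD’s actual Zen 5 implementation.

```python
# Toy model of one-block vs. two-block ahead branch prediction.
# The table maps a fetch block's address to the predicted address
# of the *next* block (a trivial last-target predictor).

def predict_one_ahead(table, pc):
    """Conventional predictor: one predicted fetch block per lookup."""
    return [table.get(pc, pc + 0x40)]  # fall through to next block if unknown

def predict_two_ahead(table, pc):
    """Two-block ahead: chain two lookups to get two fetch blocks at once."""
    first = table.get(pc, pc + 0x40)
    second = table.get(first, first + 0x40)
    return [first, second]

# "Train" the table from an observed control-flow trace of block addresses.
trace = [0x1000, 0x2000, 0x3000, 0x1000, 0x2000, 0x3000]
table = {a: b for a, b in zip(trace, trace[1:])}

print([hex(a) for a in predict_one_ahead(table, 0x1000)])  # ['0x2000']
print([hex(a) for a in predict_two_ahead(table, 0x1000)])  # ['0x2000', '0x3000']
```

The payoff is that the fetch unit gets two predicted blocks per prediction cycle instead of one, which is what the dual-ported hardware described in the paper makes affordable.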

Perhaps unsurprisingly, this multi-block ahead branch predictor by itself isn’t the hard part, but making it all fit in the hardware is. As described in the paper by [Seznec] et al., the relevant components are now dual-ported, allowing for three prediction windows. Theoretically this should result in a significant boost in IPC and could mean that more CPU manufacturers will be looking at adding such multi-block branch prediction to their designs. We will just have to see how Zen 5 works once released into the wild.

A History Of Internet Outages

We heard a story that after the recent hurricane, a man noted that while the house was sweltering hot because the power was still out, his kids were more anxious for the Internet to come back online. The Internet is practically a basic necessity for most people, but as you may have noticed with the recent CrowdStrike debacle, it isn’t always reliable. Granted, the problem in that case wasn’t the Internet per se, but a problem with many critical hosts that provide services. [Thomas Germain] from the BBC took the opportunity to recall some of the more bizarre reasons we’ve had massive Internet outages in the past.

While teens after a hurricane might miss social media, global outages can be serious business. With 8.5 million computers dead, 911 services went down, medical surgeries were canceled, and — of course — around 46,000 flights were grounded in a single day. We have short memories for these outages, but as [Thomas] points out, this was far from the first massive outage, and many of them have very strange backstories.

Continue reading “A History Of Internet Outages”

End Of An Era: Sony Cuts Production Of Writable Optical Media

The 1990s saw a revolution occur, launched by the CD burner. As prices of writable media and drives dropped, consumers rushed to duplicate games, create their own mix CDs, and back up their data on optical discs. It was a halcyon time.

Fast forward to today, and we’re very much on a downward curve when it comes to optical media use. Amidst ever-declining consumer interest, Sony has announced it will cut production of writable optical media. Let’s examine what’s going on, and explore the near future for writable optical discs.

Continue reading “End Of An Era: Sony Cuts Production Of Writable Optical Media”

The Rise Of The Disappearing Polymorphs

Science and engineering usually create consistent results. Generally, once you figure out how to make something, you can repeat the process at will to make more of it. But what if, one day, you ran the same process and got different results? What if you double-checked and triple-checked, and kept ending up with a different end product?

Perhaps it wasn’t the process that changed, but the environment? Or physics itself? Enter the scary world of disappearing polymorphs.

Continue reading “The Rise Of The Disappearing Polymorphs”

Reviewing Nuclear Accidents: Separating Fact From Fiction

Few types of accidents speak as much to the imagination as those involving nuclear fission. From the unimaginable horrors of the nuclear bombs dropped on Hiroshima and Nagasaki, to the fever-pitch reporting about the accidents at Three Mile Island, Chernobyl, and Fukushima, all of these have resulted in many descriptions and visualizations that are merely imaginative flights of fancy, with no connection to physical reality. With radiation being invisible to the naked eye, and the interpretation of radiation measurements in popular media generally restricted to the harrowing noise from a Geiger counter, the reality of nuclear power accidents in said media has become diluted and often replaced with half-truths and outright lies that feed strongly into fear, uncertainty, and doubt.

Why is it that people are drawn more to nuclear accidents than a disaster like that at Bhopal? What is it that makes the single nuclear bomb dropped on Hiroshima so much more interesting than the firebombing of Tokyo or the flattening of Dresden? Why do we fear nuclear power more than dam failures and the heavy toll of air pollution? If we honestly look at nuclear accidents, it’s clear that invariably the panic afterwards did more damage than the event itself. One might postulate that this is partially due to the sensationalist vibe created around these events, and largely due to a public poorly informed on topics like nuclear fission and radiation, a situation worsened by harmful government policies pertaining to things like disaster response, often inspired by scientifically discredited theories like the Linear No-Threshold (LNT) model which killed so many in the USSR and Japan.

In light of a likely restart of Unit 1 of the Three Mile Island nuclear plant in the near future, it might behoove us to ask what we can learn from the world’s worst commercial nuclear power disasters, viewed from the difficult perspective of a world where ideology and hidden agendas play no role, as we ask ourselves whether we really should fear the atom.

Continue reading “Reviewing Nuclear Accidents: Separating Fact From Fiction”

Carbon–Cement Supercapacitors Proposed As An Energy Storage Solution

Although most grid-level energy storage solutions focus on batteries, a group of researchers at MIT and Harvard University have proposed using supercapacitors instead, with their 2023 research article by [Nicolas Chanut] and colleagues published in Proceedings of the National Academy of Sciences (PNAS). The twist here is that rather than using any existing supercapacitor technology, their proposal involves conductive concrete (courtesy of carbon black) on both sides of an electrolyte-infused insulating membrane. They foresee this technology being used alongside green concrete to become part of a renewable energy transition, as per a presentation given at the American Concrete Institute (ACI).

Functional carbon-cement supercapacitors (connected in series) (Credit: Damian Stefaniuk et al.)

Putting aside the hairy issue of a massive expansion of grid-level storage, could a carbon-cement supercapacitor perhaps provide a way to turn the concrete foundation of a house into a whole-house energy storage cell for use with roof-based PV solar? While their current prototype isn’t quite building-sized yet, in the research article they provide some educated guesstimates to arrive at a very rough 20–220 Wh/m³, which would make this solution either not very great or somewhat interesting.

The primary benefit of this technology would be that it could be very cheap, with cement and concrete already extremely prevalent in construction due to their affordability. As the researchers note, however, adding carbon black does compromise the concrete somewhat, and there are many open questions regarding longevity. For example, a short within the carbon-cement capacitor due to moisture intrusion and rust jacking around rebar would surely make short work of these capacitors.

Swapping out the concrete foundation of a building to fix a short is no small feat, but maybe some lessons could be learned from self-healing Roman concrete.