There was a time when nuclear power plants were going to save the world. Barring accidents, the plants are clean and generate a lot of power. However, a few high-profile accidents and increased public awareness of some key issues have made nuclear power a hard sell, at least in the United States. By some accounts, the fastest growing nuclear power-related business in the US is decommissioning nuclear power plants. However, there’s a move afoot to make nuclear power a viable solution again. The company behind it says its plants will be cheaper to build, cheaper to operate, and much safer than conventional plants. Are those claims reasonable?
A colleague of mine used to say he juggled a lot of balls: steel balls, plastic balls, glass balls, and paper balls. The trick was not to drop the glass balls. How do you know which is which? For example, suppose you were tasked with making sure a nuclear power plant was safe. What would be important? A fail-safe way to drop the control rods into the pile, maybe? A thick containment wall? Two loops of cooling so that only the inner loop gets radioactive? I’m not a nuclear engineer, so I don’t know, but ensuring electricians at a nuclear plant aren’t using open flames wouldn’t be high on my list of concerns. You might think that’s really obvious, but history shows that was a glass ball that got dropped.
In the 1960s and 70s, there was a lot of optimism in the United States about nuclear power. Browns Ferry, a Tennessee Valley Authority (TVA) nuclear plant, broke ground in 1966 on its first two units. Unit 1 began operations in 1974, and Unit 2 the following year. By 1975, the two units were producing about 2,200 megawatts of electricity.
That same year, an electrical inspector and an electrician were checking for air leaks in the spreading room, a space where control cables split to go to the two different units from a single control room. To find the drafts, they used a lit candle and watched the flame being drawn toward each leak. In the process, they accidentally started a fire that nearly led to a massive nuclear disaster.
Rockets propelled by nuclear bombs sound like a Wile E. Coyote cartoon, but the idea has been seriously considered as an option for the space program. Chemical rockets combust a fuel with an oxidizer within themselves and exhaust the result out the back, causing the rocket to move in the opposite direction. What if, instead, you used the higher energy density of nuclear fission by detonating nuclear bombs?
Detonating the bombs within a combustion chamber would destroy the vehicle, so instead you’d do so from outside and behind. Each bomb would include a little propellant, which would be thrown as plasma against the back of the vehicle, giving it a brief but powerful push.
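The physics of those pushes is just momentum transfer, and a back-of-the-envelope sketch shows why a series of discrete kicks can add up to a useful velocity. Every number below is invented purely for scale (they are not Project Orion design figures), and the model ignores the ship getting lighter as bombs are expended:

```python
# Illustrative only: all numbers are assumptions for scale, not Orion data.
ship_mass = 4_000_000.0   # kg, hypothetical large pulse-propelled vehicle
pulse_mass = 1000.0       # kg of propellant vaporized per bomb (assumed)
plasma_speed = 30_000.0   # m/s, speed of plasma hitting the pusher plate (assumed)

# Each detonation transfers momentum pulse_mass * plasma_speed to the ship,
# so the velocity gained per pulse is that momentum divided by the ship mass.
dv_per_pulse = pulse_mass * plasma_speed / ship_mass
pulses_to_orbit = 9800 / dv_per_pulse  # ~9.8 km/s is a rough delta-v to reach orbit

print(round(dv_per_pulse, 1), "m/s per pulse")   # 7.5 m/s per pulse
print(round(pulses_to_orbit), "pulses to orbit")  # ~1307 pulses
```

A constant-mass model like this overstates how many pulses are needed late in the burn, but it captures the core idea: many small, violent impulses standing in for one continuous thrust.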
That’s just what a group of top physicists and engineers at General Atomic worked on between 1958 and 1965 under the name Project Orion. They came close to doing nuclear testing a few times and did have success with smaller tests, exploding a series of chemical bombs which pushed a 270-pound craft up 185 feet, as you’ll see below.
Our better-traveled colleagues having provided ample coverage of the 34C3 event in Leipzig just after Christmas, it is left to the rest of us to pick over the carcass as though it was the last remnant of a once-magnificent Christmas turkey. There are plenty of talks to sit and watch online, and of course the odd gem that passed the others by.
When it comes to risks facing the planet, it probably doesn’t get much worse than nuclear conflagration: countries nervously peering at each other, each jealously guarding its stock of warheads. It seems an unlikely place to find a 34C3 talk about 6502 microprocessors, but that’s what [Moritz Kütt] and [Alex Glaser] managed to deliver.
Policing any peace treaty is a tricky business, and one involving nuclear disarmament is especially so. There is a problem of trust: with so much at stake, no party is anxious to reveal more than the most basic information about its arsenal, and neither does it trust verification instruments manufactured by a state agency from another player. Thus the instruments used by the inspectors must not harvest too much information about what they are inspecting and can only store something analogous to a hash of the data they do acquire, and they must be of a design open enough to be verified. This last point becomes especially difficult when the hardware in question is a modern high-performance microprocessor board; an object of such complexity could easily have been compromised by a nuclear player attempting to game the system.
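The hash-like storage mentioned above is essentially a commitment scheme: the instrument records something that can later confirm a measurement without revealing it. Here is a toy sketch of that idea in Python; the measurement string and the salt are entirely hypothetical, and real instruments implement this in physically verifiable hardware, not in a script:

```python
import hashlib

# Toy commitment sketch. The measurement and salt values are hypothetical
# placeholders, not anything from an actual inspection protocol.
measurement = b"gamma spectrum: characteristic peak present"
salt = b"agreed-inspection-nonce"  # shared nonce so commitments can't be precomputed

# The instrument stores only the digest, not the raw measurement.
commitment = hashlib.sha256(salt + measurement).hexdigest()

# Later, if the inspected party reveals the measurement, anyone can recheck
# that it matches the stored commitment without the instrument having
# retained the sensitive data itself.
assert hashlib.sha256(salt + measurement).hexdigest() == commitment
print(commitment[:16], "...")  # only the digest is retained
```

The point of the design constraint in the talk is that even this simple behavior can hide mischief if the hardware running it is too complex to audit.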
We are taken through the design of a nuclear weapon verification instrument in detail, with some examples and the design problems they highlight. Something as innocuous as an ATtiny microcontroller seeing to the timing of an analogue board takes on a sinister possibility, as it becomes evident that with compromised code it could store unauthorised information or try to fool the inspectors. They show us their first model of detector using a Red Pitaya FPGA board, but make the point that this has a level of complexity that makes it unverifiable.
Then comes the radical idea: if the technology used in this field is too complex for its integrity to be verified, what technology exists at a level that can be verified? Their answer brings us to the 6502, a processor in continuous production for over 40 years and whose internal structures are so well understood as to be de facto in the public domain. In particular, they settle upon the Apple II home computer as a 6502 platform, because of its ready availability and the expandability of [Steve Wozniak]’s original design. All parties can both source and inspect the instruments involved.
If you’ve never examined a nuclear warhead verification device, the details of the system are fascinating. We’re shown the scintillation detector for measuring the energies present in the incident radiation, and the custom Apple II ADC board which uses only op-amps, an Analog Devices flash ADC chip, and easily verifiable 74-series logic. It’s not intentional but pleasing from a retro computing perspective that everything except perhaps the blue LED indicator could well have been bought for an Apple II peripheral back in the 1980s. They then wrap up the talk with an examination of ways a genuine 6502 system could be made verifiable through non-destructive means.
It is not likely that nuclear inspectors will turn up to the silos with an Apple II in hand, but this does show a solution to some of the problems facing them in their work and might provide pointers towards future instruments. You can read more about their work on their web site.
[lasersaber] has a passion: low-power motors. In a bid to challenge himself and inspired by betavoltaic cells, he has 3D printed and built a small nuclear powered motor!
This photovoltaic battery uses fragile glass vials of tritium extracted from keychains and a small section of a solar panel to absorb the light, generating power. After experimenting with numerous designs, [lasersaber] went with a 3D printed pyramid that houses six coils and three magnets, encapsulated in a glass cloche and accompanied by a suitably ominous green glow.
Can you guess how much power and current are coursing through this thing? Guess again. Lower. Lower.
Under 200mV and 20nA!
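Multiplying those upper bounds shows just how little power that is, on the order of single-digit nanowatts:

```python
# Upper-bound figures from the article; the true running power is even lower.
voltage = 0.2       # volts  (under 200 mV)
current = 20e-9     # amps   (under 20 nA)

power = voltage * current  # P = V * I, about 4e-9 W, i.e. roughly 4 nanowatts
print(f"{power * 1e9:.1f} nW")  # ~4.0 nW
```

For comparison, a single dim indicator LED typically draws millions of times more power than this motor.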
More than one hundred years ago, Henri Becquerel discovered that uranium emitted penetrating rays similar to those used by Wilhelm Röntgen to take the first X-ray image (of his wife’s hand), starting a new era of far-reaching applications. There are of course many dangers that come with the use of radioactivity, but there are also many beneficial uses for our society.
The nuclear age changed steel, and for decades we had to pay the price for it. The first tests of the atomic bomb were a milestone in many ways, and have left a mark in history and in the surface of the Earth. The level of background radiation in the air increased, and this had an effect on the production of steel, so that steel produced since 1945 has had elevated levels of radioactivity. This can be a problem for sensitive instruments, so there was a demand for so-called low background steel, which was made before the Trinity test.
Steel was traditionally produced with the Bessemer process, which takes the molten pig iron and blasts air through it. The oxygen in the air reacts with the impurities, which are drawn out either as gas or as slag that is then skimmed off. The problem is that atmospheric air has radioactive impurities of its own, which are deposited into the steel, yielding a slightly radioactive material. Since the late 1960s, steel production has used a slightly modified technique called BOS, or Basic Oxygen Steelmaking, in which pure oxygen is pumped through the iron. This is better, but radioactive material can still slip through. In particular, we’re interested in cobalt, which dissolves very easily in steel and so isn’t removed by either the Bessemer or BOS method. Sometimes cobalt is intentionally added to steel, though not the radioactive isotope, and only for very specialized purposes.
Recycling is another reason that modern steel stays radioactive. We’ve been great about recycling steel, but the downside is that some of those radioactive impurities stick around through each remelt.
Why Do We Need Low Background Steel?
Imagine you have a sensor that needs to be extremely sensitive to low levels of radiation: a Geiger counter, a medical imaging device, or an instrument destined for space exploration. If its housing is slightly radioactive, that creates an unacceptable noise floor. That’s where low background steel comes in.
So where do you get steel, which is a man-made material, that was made before 1945? Primarily from the ocean, in sunken ships from WWII. They weren’t exposed to the atomic age air when they were made, and haven’t been recycled and mixed with newer radioactive steel. We literally cut the ships apart underwater, scrape off the barnacles, and reuse the steel.
Fortunately, this is a problem that’s going away on its own, so the headline is really only appropriate as a great reference to a popular movie. After 1975, testing moved underground, reducing, but not eliminating, the amount of radiation pumped into the air. Since the various treaties ending the testing of nuclear weapons, and thanks to the short half-lives of some of the radioactive isotopes, the background radiation in the air has been decreasing. Cobalt-60 has a half-life of 5.26 years, which means that steel is getting less and less radioactive on its own (Cobalt-60 from 1945 would now be at about 0.008% of original levels). The newer BOS technique exposes the steel to fewer impurities from the air, too. Eventually the need for special low background steel will be just a memory.
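That 0.008% figure follows directly from the exponential decay law. A quick check, assuming roughly 72 years between the Trinity test and when this was written:

```python
# Exponential decay: remaining fraction = 0.5 ** (elapsed_time / half_life)
half_life = 5.26          # years, half-life of Cobalt-60
elapsed = 2017 - 1945     # ~72 years; the end year is an assumption here

fraction = 0.5 ** (elapsed / half_life)
print(f"{fraction:.3%}")  # ~0.008% of the original Cobalt-60 remains
```

Shift the end year by a few years and the answer barely moves on a percentage scale, which is the article's point: Cobalt-60 contamination from the 1940s is already practically gone.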
Oddly enough, steel isn’t the only thing that we’ve dragged from the bottom of the ocean. Ancient Roman lead has also had a part in modern sensing.