Fukushima Daiichi at night

A Tritium Story: How Afraid Should You Be Of Hydrogen’s Big Brother?

Despite being present in everything that contains water, tritium is not an isotope many people are all that familiar with outside of select (geeky) channels, such as DEF CON’s tritium-containing badge, the always excellent NurdRage’s assembly of a tritium-based atomic battery, or the creation of a tritium-phosphor-based glow-in-the-dark tesseract cube.

Tritium is a hydrogen isotope that shares many characteristics with its two siblings, ¹H (protium) and ²H (deuterium). The main distinction is that tritium (³H) is not a stable isotope: it has a half-life of ~12.32 years and decays into ³He. Most naturally occurring tritium on Earth originates from interactions between fast neutrons (>4.0 MeV) from cosmic radiation and atmospheric nitrogen.
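For those curious about the physics, the decay in question is a low-energy beta decay, and the standard exponential decay law shows how quickly a given amount of tritium disappears. A quick worked example (ours, not from the article):

$$
{}^{3}\mathrm{H} \;\rightarrow\; {}^{3}\mathrm{He} + e^{-} + \bar{\nu}_e,
\qquad
N(t) = N_0 \left(\tfrac{1}{2}\right)^{t/12.32\,\mathrm{yr}}
$$

so after 50 years only about $(1/2)^{50/12.32} \approx 6\%$ of an initial quantity remains.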

Recently tritium has become a politically hot topic on account of the announced release of treated water at the Japanese Fukushima Daiichi nuclear plant. This has raised for many the question of just how much tritium is ‘too much’, and what we’re likely to notice when this treated, but still tritium-containing, water is released into the ocean.

Continue reading “A Tritium Story: How Afraid Should You Be Of Hydrogen’s Big Brother?”

Keith Thorne, Engineer At LIGO, To Deliver Remoticon Keynote

It is my pleasure to announce that Keith Thorne has graciously agreed to deliver a keynote talk at Hackaday Remoticon 2. Get your ticket now!

Keith is an astrophysicist and has worked on the Laser Interferometer Gravitational-Wave Observatory (LIGO) since 2008, literally looking for the ripples in space-time that you know as gravitational waves. The effects of the phenomena are so subtle that detecting an event requires planet-scale sensors in the form of 4 km long interferometers placed in different parts of the United States, whose readings can be compared against one another. A laser beam inside these interferometers bounces back and forth 300 times for a total travel distance of 1,200 km, over which any interaction with gravitational waves will ever-so-slightly alter how the photons from the beam register.

The challenges of building, operating, and interpreting such a device are manifold. These interferometers are the highest-precision devices ever devised, able to detect motion of 1/10,000 the diameter of a proton! To get there, the mirrors need to be cooled to 77 nano-Kelvins. Getting the most out of it is what Keith and the rest of the team specialize in. This has included things like hacking the Linux kernel to achieve a sufficient level of real-time digital control, and using “squeezed light” to improve detection sensitivity in frequencies where quantum mechanics gets in the way. While the detectors were first run in 2015 & 2016, successfully observing three events, the work to better understand this phenomenon is ongoing and may include a third site in India and a space-based detector in the future.
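To put that number in perspective, here is a rough back-of-the-envelope estimate (ours, assuming a proton diameter of roughly $1.7\times10^{-15}$ m) that turns the quoted displacement into a strain across one 4 km arm:

$$
\Delta L \approx \frac{1.7\times10^{-15}\,\mathrm{m}}{10^{4}} \approx 1.7\times10^{-19}\,\mathrm{m},
\qquad
h = \frac{\Delta L}{L} \approx \frac{1.7\times10^{-19}\,\mathrm{m}}{4\,\mathrm{km}} \approx 4\times10^{-23}
$$

which gives an idea of the absurdly small fractional length changes these detectors must resolve.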

In getting to know Keith he mentioned that he is excited to speak to a conference packed with people who want to hear the gory technical details of this fantastic piece of hardware. I’m sure we’re all giddy to learn what he has to say. But if you’re someone who wants to work on a project like this, he tipped us off that there’s an active EE job posting for LIGO right now. Maybe you’ll end up doing the keynote at a future Hackaday conference.

Call for Proposals is Still Open!

We’re still on the hunt for great talks about hardware creation. True creativity is fed by a steady stream of inspiration. Be that inspiration by giving a talk about the kinds of things you’ve been working on!

Things Are Looking Brighter! But Not The Stars

Growing up in Montana, I remember looking out at night and seeing the Milky Way, reminding me of my insignificance in the universe. Now that I live in a city, such introspection is no longer easy, and like the half of humanity that also lives in urban areas, I must rely on satellites to provide the imagery. Yet satellites are part of the problem. Light pollution has been getting worse for decades, and with the recent steady stream of satellite launches and billionaire joyrides we have a relatively new addition to the sources of interference. So how bad is it, and how much worse will it get?

Looking up at the night sky, you can usually tell the difference between various man-made objects. Planes go fairly slowly across the sky, and you can sometimes see them blinking green and red. Meteors are fast and difficult to see. Geostationary satellites don’t appear to move at all because they orbit at the same rate as the Earth’s rotation, while satellites in other orbits will zip by.

SpaceX has committed to reducing satellite brightness, and some observations have confirmed that new models are a full magnitude darker, right at the threshold of naked-eye observation. Unfortunately, it’s only a step in the right direction, and not enough to satisfy astronomers, who aren’t looking up at the night sky with their naked eyes, naturally.
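For reference (general astronomy numbers, not figures from the article), the magnitude scale is logarithmic, with a difference of one magnitude corresponding to a brightness ratio of $100^{1/5}$:

$$
\frac{F_1}{F_2} = 100^{(m_2 - m_1)/5}
\;\Rightarrow\;
\Delta m = 1 \;\Rightarrow\; \frac{F_1}{F_2} \approx 2.512
$$

and the commonly cited naked-eye limit under dark skies is around magnitude +6, which is why a one-magnitude reduction can push a satellite right to the edge of visibility.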

The satellites aren’t giving off the light themselves. They are merely reflecting light from the sun back to the Earth, exactly the same way the Moon does. Thus a satellite that is directly in the shadow of the Earth will not reflect any light, but near the horizon the reflection from the satellites can be significant. It’s not practical to focus our observatories only on the narrow area that falls within the Earth’s shadow during the night, so we must look closer to the horizon and capture the reflections of the satellites. Continue reading “Things Are Looking Brighter! But Not The Stars”

Creating Methane From Captured Carbon Dioxide And The Future Of Carbon Capture

There’s something intrinsically simple about the concept of carbon (CO₂) capture: you simply have the CO₂ molecules absorbed or adsorbed by something, after which you separate the thus-captured CO₂ and put it somewhere safe. Unfortunately, in physics and chemistry what seems easy and straightforward tends to be anything but simple, let alone energy-efficient. While methods for carbon capture have been around for decades, making the process economically viable has always been a struggle.

This is true both for carbon capture and storage/sequestration (CCS) as well as carbon capture and utilization (CCU). Whereas the former seeks to store and ideally permanently remove (sequester) carbon from the atmosphere, the latter captures carbon dioxide for use in e.g. industrial processes.

Recently, Pacific Northwest National Laboratory (PNNL) announced a breakthrough CCU concept involving a new amine-based solvent (2-EEMPA) that is claimed to be not only more efficient than e.g. the commonly used MEA, but also compatible with directly creating methane in the same process.
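The methane-making step is, in essence, catalytic hydrogenation of the captured CO₂; the overall methanation chemistry is the textbook Sabatier reaction (the equation and enthalpy below are standard chemistry values, not figures from the PNNL announcement):

$$
\mathrm{CO_2} + 4\,\mathrm{H_2} \;\rightarrow\; \mathrm{CH_4} + 2\,\mathrm{H_2O},
\qquad
\Delta H^{\circ} \approx -165\ \mathrm{kJ/mol}
$$

The reaction itself is exothermic, but those four hydrogen molecules have to come from somewhere, which is a big part of why the overall energy balance of CCU schemes is so tricky.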

Since methane is the major component of natural gas, might this be a way for CCU to create a carbon-neutral source of synthetic natural gas (SNG)? Continue reading “Creating Methane From Captured Carbon Dioxide And The Future Of Carbon Capture”

Decoding SMD Part Markings

You’ve probably encountered this before — you have a circuit board that is poorly documented, and want to know the part number of a tiny SMD chip. Retro computer enthusiast [JohnK] recently tweeted about one such resource he found, entitled The Ultimate SMD Marking Codes Database. Judging from the Wayback Machine, this database is only a couple of years old, but it seems to be fairly exhaustive and can be found referenced in quite a few electronics forums.

Unlike their larger SMD siblings, the chips in question are so small that there is no room to print the entire part number on the device. Instead, the standard practice is for manufacturers to use an abbreviated code of just a few characters. These codes are unique only to each part or package, and aren’t necessarily unique across an entire product line. And just because it is standard practice does not imply the marking codes themselves follow any standard whatsoever. This seemingly hodgepodge system works just fine for the development, procurement, and manufacturing phases of a product’s lifecycle. It’s during the repair, refurbishment, or just hacking-for-fun phases where these codes can leave you scratching your head.

Several sites like the one [JohnK] found have been around for years, and adding yet another database to your toolbox is a good thing. But none of them will ever be exhaustive. There’s a good reason for that — maintaining such a database would be a herculean task. Just finding the part marking information for a known chip can be difficult. Some manufacturers put it clearly in the data sheet, and some refer you to other documentation which may or may not be readily available. And some manufacturers ask you to contact them for this information — presumably because it is dynamic and changes from time to time. Continue reading “Decoding SMD Part Markings”

Need A New Programming Language? Try Zig

Maybe you’ve heard of it, maybe you haven’t. Zig is a new programming language that seems to be growing in popularity. Let’s do a quick dive into what it is, why it’s unique, and what sort of things you would use it for. (Ed Note: Other than “for great justice“, naturally.)

What Is It?

You’ve likely heard of Rust as it has made significant inroads in critical low-level infrastructure such as operating systems and embedded microcontrollers. As a gross oversimplification, it offers memory safety, with many traditionally run-time checks pushed to compile time. It has been the darling of many posts here at Hackaday as it offers some unique advantages. With Rust on the rise, it makes sense that there might be some space for some new players. Languages like Julia, Go, Swift, and even Racket are all relative newcomers vying for the highly coveted mindshare of software engineers everywhere.

So let’s talk Zig. In a broad sense, Zig is really trying to provide some of the safety of Rust with the simplicity and ease of C. It touts a few core features such as the following, with a small illustrative snippet after the list:

  • No hidden control flow
  • No hidden memory allocations
  • No preprocessor, no macros
  • First-class support for optional standard library
  • Interoperable by design
  • Adjustable runtime safety
  • Compile-time code execution

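To make a couple of those bullet points concrete, here is a minimal sketch of ours (not from the article) showing compile-time code execution and the “what you read is what runs” philosophy. It assumes a circa-2021 (0.8-era) Zig toolchain; the standard library has shifted in later releases.

```zig
const std = @import("std");

// An ordinary recursive function; nothing marks it as compile-time-only.
fn fib(n: u64) u64 {
    if (n < 2) return n;
    return fib(n - 1) + fib(n - 2);
}

pub fn main() void {
    // Compile-time code execution: `comptime` forces the call to run during
    // compilation, so only the result (55) ends up in the binary.
    const answer = comptime fib(10);

    // No hidden control flow: no exceptions and no operator overloading,
    // so this line does exactly what it says and nothing more.
    std.debug.print("fib(10) = {}\n", .{answer});
}
```

Anything that does allocate memory in Zig takes an explicit allocator argument, which is where the “no hidden memory allocations” bullet comes from.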
Continue reading “Need A New Programming Language? Try Zig”

Counting Down To The Final Atlas Rocket

The Atlas family of rockets has been a mainstay of America’s space program since the dawn of the Space Age, when unused SM-65 Atlas intercontinental ballistic missiles (ICBMs) were refurbished and assigned more peaceful pursuits. Rather than lobbing thermonuclear warheads towards the Soviets, these former weapons of war carried the first American astronauts into orbit, helped build the satellite constellations that our modern way of life depends on, and expanded our knowledge of the solar system and beyond.

SM-65A Atlas ICBM in 1958

Naturally, the Atlas V that’s flying today looks nothing like the squat stainless steel rocket that carried John Glenn to orbit in 1962. Aerospace technology has evolved by leaps and bounds over the last 60 years, but by carrying over the lessons learned from each generation, the modern Atlas has become one of the most reliable orbital boosters ever flown. Since its introduction in 2002, the Atlas V has maintained an impeccable 100% success rate over 85 missions.

But as they say, all good things must come to an end. After more than 600 launches, United Launch Alliance (ULA) has announced that the final mission to fly on an Atlas has been booked. Between now and the end of the decade, ULA will fly 28 more missions on this legendary booster. By the time the last one leaves the pad the company plans to have fully transitioned to their new Vulcan booster, with the first flights of this next-generation vehicle currently scheduled for 2022.

Continue reading “Counting Down To The Final Atlas Rocket”