Mechanical Prince Of Tides

Lord Kelvin’s name comes up anytime you start looking at the history of science and technology. In addition to working on transatlantic cables and thermodynamics, he also built an early computing device to predict tides. Kelvin, whose real name was William Thomson, became interested in tides in a roundabout way, as explained in a recent IEEE Spectrum article.

He’d made plenty of money from his patents related to the telegraph cable, and after his wife died, he decided to buy a yacht, the Lalla Rookh, which he used as a summer home. If you live on a boat, the tides are an important part of your day.

Today, you could just ask your favorite search engine or AI about the tides, but in 1870, that wasn’t possible. Also, in a day when sea power made or broke empires, tide charts were often top secret. Not that the tides were a total mystery. Newton explained what was happening back in 1687. Laplace realized they were tied to oscillations almost a century later. Thomson made a machine that could do the math Laplace envisioned.

We know today that the tides depend on hundreds of different motions, but many of them have relatively insignificant contributions, and we only track 37 of them, according to the post. Kelvin’s machine — an intricate mesh of gears and cranks — tracked only 10 components.

In operation, the user turned a crank, and a pen traced a curve on a roll of paper. A small mark showed the hour with a special mark for noon. You could process a year’s worth of tides in about 4 hours. While Kelvin received credit for the machine’s creation, he acknowledged the help of many others in his paper, from craftsmen to his brother.
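At its heart, tide prediction by harmonic synthesis is just a sum of cosines, one per constituent, each with its own amplitude, angular speed, and phase; Kelvin’s gears and pulleys computed that sum mechanically. Here is a minimal sketch of the same calculation (the amplitudes and phases below are made-up placeholders, though the angular speeds are the standard ones for these constituents):

```python
import numpy as np

# Each tidal constituent is a cosine with its own amplitude (m),
# angular speed (degrees/hour), and phase lag (degrees). Amplitudes
# and phases here are illustrative placeholders, not the harmonic
# constants of any real port.
constituents = [
    # (name, amplitude_m, speed_deg_per_hr, phase_deg)
    ("M2", 1.20, 28.984, 45.0),  # principal lunar semidiurnal
    ("S2", 0.40, 30.000, 30.0),  # principal solar semidiurnal
    ("K1", 0.25, 15.041, 60.0),  # lunisolar diurnal
    ("O1", 0.18, 13.943, 10.0),  # lunar diurnal
]

def tide_height(t_hours, mean_level=2.0):
    """Sum the constituent cosines -- what Kelvin's machine did in brass."""
    h = mean_level
    for _, amp, speed, phase in constituents:
        h += amp * np.cos(np.radians(speed * t_hours - phase))
    return h

# Predict a day of tides at hourly intervals, like the pen on the drum.
for hour in range(25):
    print(f"{hour:02d}:00  {tide_height(hour):5.2f} m")
```

With real harmonic constants for a given port, the same few lines reproduce what took the machine a crank and four hours per year.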

We actually did a deep dive into tides, including Kelvin’s machine, a few years ago. He shows up a number of times in our posts.

Can You Hear Me Now? Try These Headphones

When you are young, you take it for granted that you can pick out a voice in a crowded room or on a factory floor. But as you get older, your hearing often gets to the point where a noisy room merges into a mishmash of sounds. University of Washington researchers have developed what they call Target Speech Hearing. In plain English, it’s an AI-powered headphone system that lets you look at someone and pull their voice out of the chatter. For best results, however, you have to enroll the target’s voice first, so it wouldn’t make a great eavesdropping device.

If you want to dive into the technical details, their paper goes into how it works. The prototype uses a Sony noise-cancelling headset, but since the system requires binaural microphones, additional microphones are attached to the outside of the headphones.
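That binaural requirement isn’t incidental: the enrollment step leans on the fact that a voice you’re looking straight at arrives at both ears at nearly the same instant, while off-axis talkers show a measurable interaural delay. The actual system feeds this cue into a neural network, but the direction cue itself can be illustrated with a simple cross-correlation sketch (synthetic signals, not the UW code):

```python
import numpy as np

def estimate_tdoa(left, right, fs):
    """Estimate the time difference of arrival (TDOA) between the two
    ear-mounted microphones via cross-correlation. A talker the wearer
    is facing reaches both ears almost simultaneously (TDOA near zero)."""
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)
    return lag / fs

fs = 16_000
rng = np.random.default_rng(42)
voice = rng.standard_normal(fs // 10)      # 100 ms broadband test signal

# A talker straight ahead: identical arrival at both ears.
print(estimate_tdoa(voice, voice, fs))     # 0.0 s

# A talker off to one side: one ear hears it ~0.6 ms later,
# roughly the ear-to-ear travel time of sound.
delay = int(0.0006 * fs)
delayed = np.concatenate([np.zeros(delay), voice])
print(estimate_tdoa(delayed, voice, fs))   # ~+0.0006 s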

Continue reading “Can You Hear Me Now? Try These Headphones”

The Genius Of Slide Rule Precision

Most people have heard of or seen slide rules, with older generations likely having used these devices in school and at their jobs. Purely analog computers, these ingenious devices use precomputed scales on slides which, when positioned to a specific input, give the output for a wide range of calculations, from simple division and multiplication to operations we would nowadays reach for a scientific calculator to perform. Even so, these simple devices are both very versatile and extremely precise, as [Bob, the Science Guy] demonstrates in a recent video.

Slide rules at their core are very simple: you have different scales (marked by a label) which can slide relative to each other. Simple slide rules have only the A through D scales, with an input provided by moving one scale relative to the relevant other scale (e.g. C and D for multiplication and division), after which the result can be read out. It seems reasonable that the larger your slide rule is, the more precision you can get out of it. Except that if you have, say, the W1 and W2 scales on a shorter (10″) slide rule, you can use them to get the precision of a much larger (20″) slide rule, as [Bob] demonstrates.
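The trick behind the C and D scales is that distance along the rule is proportional to a logarithm, so sliding one scale along another adds logarithms, which multiplies the numbers. A short sketch of that idea, including the precision loss from reading a finite-resolution scale (the 1/500 reading resolution is an illustrative guess, not a measured figure):

```python
import math

def read_scale(x, resolution):
    """Simulate reading a log scale: the cursor position, proportional
    to log10(x), can only be read to a finite fraction of the scale."""
    decade, mantissa_pos = divmod(math.log10(x), 1.0)
    mantissa_pos = round(mantissa_pos / resolution) * resolution
    return 10 ** (decade + mantissa_pos)

def slide_rule_multiply(a, b, resolution=1/500):
    """C/D-scale multiplication: sliding one log scale along another
    adds logarithms; the final answer is read off at finite precision."""
    return read_scale(a * b, resolution)

print(2.34 * 5.67)                              # exact: 13.2678
print(slide_rule_multiply(2.34, 5.67))          # 10" rule: ~13.24
print(slide_rule_multiply(2.34, 5.67, 1/1000))  # W1/W2 trick, i.e. an
                                                # effective 20" scale: ~13.27
```

Halving the reading step is exactly what splitting a double-length scale into the W1/W2 pair buys you, without the double-length rule.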

Even though slide rules have a steeper learning curve than punching numbers into a scientific calculator, it is hard to argue the benefits of understanding such relationships between the different scales, and why they exist in the first place.

Continue reading “The Genius Of Slide Rule Precision”

Whole-Fruit Chocolate: Skipping The Sugar By Using The Entire Cacao Pod

Images of whole-fruit chocolate formulations after kneading at 31 °C and subsequent heating to 50 °C. The ECP concentration in the sweetening gel and the added gel concentration in the CM are shown on the X and Y axes, respectively. (Credit: Kim Mishra et al., Nature Food, 2024)

It’s hard to imagine a world without chocolate, and yet it is undeniable that there are problems associated both with its manufacturing and its consumption. Much of this is due to the addition of sugar, as well as the discarding of a significant part of the cacao pod, which harbors the pulp and seeds. According to a study by [Kim Mishra] and colleagues in Nature Food, it might be possible to ditch the sugar and instead use a mixture of cacao pulp juice concentrate (CPJC) and endocarp powder (ECP), which are turned into a sweetening gel.

This gel replaces the combination of sugar and an emulsifier (lecithin or something similar) in current chocolate while effectively using all of the cacao pod except for the husk. The team ran a small-scale lab production, creating two types of whole-fruit chocolate with different levels of sweetness, which were given to volunteers for sampling. Samples had various ECP ratios in the gel and gel ratios in the chocolate mixture with the cacao mass (CM).

With too much of either, the chocolate becomes crumbly, while with too little, no solid chocolate forms. Eventually, they identified a happy set of ratios, leading to the taste test, which got an overall good score in terms of chocolate taste and sweetness. In addition to being able to skip the refined sugar addition, this manufacturing method also cuts out a whole supply chain while adding significantly more fiber to chocolate. One gotcha here is that this study focused on dark chocolate, but then some chocolate fans would argue vehemently that anything below 50% cacao doesn’t qualify as chocolate anymore, while others scoff at anything below 75%.

Matters of taste aside, this study shows a promising way to make our regular chocolate treat that much healthier and potentially greener. Of course, we want to know how it will print. Barring that, maybe how it engraves.

Recycling Of Portland Cement And Steel In Electric Arc Furnaces

The use of concrete and steel has become the bedrock of modern-day construction, which of course also means that a lot of both ends up as waste once said construction gets demolished. While steel is readily recyclable, the Portland cement that forms the basis of concrete so far is not. Although the aggregate from crushed concrete can be reclaimed, the remainder tends to end up in a landfill, requiring fresh input of limestone to create more cement. Now a team of researchers from the University of Cambridge claims to have found a way to recycle hydrated Portland cement by using it as flux during steel production in electric arc furnaces (EAFs).

Not only does this save a lot of space in landfills, it also stands to reduce a lot of the carbon dioxide produced during cement and steel production, which comes primarily from the use of limestone for cement and lime-dolomite flux for steel. The details can be found in the open-access paper in Nature by [Cyrille F. Dunant] and colleagues. Essentially, reclaimed cement paste is mixed with some fresh material to form the flux that shields the molten steel in an EAF from the atmosphere. The flux forms the slag layer that floats on top of the molten steel; after cooling, this slag is ground up and turned into cement clinker, which is then mixed to create fresh cement.

The process has been patented by Cambridge, who call the product ‘Cambridge Electric Cement’, with the claim that when using low-carbon power sources for the EAF, like hydro and nuclear, it would constitute ‘no emissions’ and ‘no landfill’ cement. We have to see how this works out on an industrial scale, of course, but it would definitely be nice to keep concrete and cement in general out of landfills, while cutting back on limestone mining, as well as on questionable practices like adding heavy-metal-laden fly ash as filler to concrete.

Thanks to [cscott] for the tip.

Mapping The Human Brain And Where This May Lead Us

In order to understand something, it helps to observe it up close and study its inner workings. This is no less true for the brain, whether it is the brain of a mouse, that of a whale, or the squishy brain inside our own skulls. It is, after all, what defines us as a person, containing our personality and all our desires and dreams. There are also many injuries, disorders, and illnesses that affect the brain, many of which we understand as poorly as the basics of how memories are stored and thoughts are formed. Much of this is due to how complicated the brain is to study in a controlled fashion.

Recently, a breakthrough was made in the form of a detailed map of the cells and synapses in a segment of a human brain sample. This collaboration between Harvard and Google resulted in the most detailed look at human brain tissue so far, contained in a mere 1.4 petabytes of data. Far from a full brain map, this particular effort covered only a cubic millimeter of the human temporal cortex, containing 57,000 cells, 230 millimeters of blood vessels, and 150 million synapses.
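Those numbers also make clear why a whole-brain map at this resolution is so daunting. A back-of-the-envelope extrapolation, taking roughly 1.2 million mm³ as a typical brain volume (an assumed round figure) and the published 1.4 PB per cubic millimeter, lands squarely in zettabyte territory:

```python
# Back-of-the-envelope scaling of the Harvard/Google dataset.
petabytes_per_mm3 = 1.4    # from the published 1 mm^3 sample
brain_volume_mm3 = 1.2e6   # ~1,200 cm^3, an assumed typical brain volume

total_pb = petabytes_per_mm3 * brain_volume_mm3
print(f"{total_pb:.3g} PB")                     # ~1.68e+06 PB
print(f"{total_pb / 1e6:.2f} ZB (zettabytes)")  # ~1.68 ZB
```

Storing, let alone analyzing, over a zettabyte for a single brain is well beyond today’s practical infrastructure, which is why the field is starting with cubic millimeters.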

Ultimately the goal is to create a full map of a human brain like this, with each synapse and other structures detailed. If we can pull it off, the implications could be mind-bending.

Continue reading “Mapping The Human Brain And Where This May Lead Us”

Nuclear Fusion R&D In 2024: Getting Down To The Gritty Details

To those who have kept tabs on nuclear fusion research over the past decades, beyond the articles and soundbites in news outlets, it’s probably clear just how much progress has been made, and how many challenges still remain. Yet since not that many people are into plasma physics, every measure of progress, such as the recent one by the South Korean KSTAR (Korea Superconducting Tokamak Advanced Research) tokamak, is generally met with dismissive statements about nuclear fusion always being a certain number of decades away. Coverage such as the Science Alert article about this achievement by KSTAR does, however, touch on quite a few of these remaining challenges.

Recently, KSTAR managed to generate a 100 million °C plasma and maintain it for 48 seconds, a significant boost over its previous record of 30 seconds from 2021, thanks in part to the new divertors that were installed. These divertors are essential for removing impurities from the plasma, yet much like the inner wall of the reactor vessel, these plasma-facing materials (PFMs) bear the brunt of the super-hot plasma and any plasma instabilities, as well as the constant neutron flux from the fusion products. KSTAR now features tungsten divertors, tungsten having become a popular material choice for this component.

Optimal PFMs, plasma confinement modes, and methods to suppress plasma instabilities are just some of the research challenges that form the road still ahead before commercial fusion can commence.

Continue reading “Nuclear Fusion R&D In 2024: Getting Down To The Gritty Details”