When A Record Player Doesn’t Work Due To Solid State Grease

Normally, mechanical devices like record players move smoothly, with well-greased contact surfaces enabling the tone arm to move automatically, the multi-record mechanism to drop a fresh disc, and the listener to have a generally good time. Unfortunately, the 1972-era ITT KP821 record player that a friend recently handed to [Mark] wasn’t doing a lot of moving, with every part of the mechanism seemingly frozen in place, though its owner wasn’t certain whether they were doing something wrong.

More after the break…

Continue reading “When A Record Player Doesn’t Work Due To Solid State Grease”

Elegoo Rapid PETG Vs PETG Pro: Same Price, Similar Specs, Which To Buy?

Even within a single type of FDM filament there is an overwhelming amount of choice. Take for example Elegoo’s PETG filament offerings, which include varieties like ‘Pro’ and ‘Rapid’. Both cost the same, but is there a reason to prefer one over the other, perhaps even just for specific applications? To test this, [Dr. Igor Gaspar] over at the My Tech Fun YouTube channel bought some spools of both filaments and subjected them to a series of tests.

Obviously, the Rapid filament is rated for higher extrusion speeds – <600 mm/s versus <270 mm/s for the Pro – while the website claims a higher required nozzle temperature that, confusingly, does not match the one listed on the spool. There are quite a few differences in the listed specifications, including the physical and mechanical properties, which makes it hard to draw any immediate conclusions. Could you perhaps just use Rapid PETG and forget about the Pro version?
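Those linear speed ratings only matter if the hotend can melt plastic fast enough to keep up. As a rough sketch – the layer height and line width below are assumed typical values, not Elegoo’s figures – a rated print speed translates into a required volumetric melt flow like this:

```python
# Convert a rated linear print speed into the volumetric melt flow
# the hotend must sustain. Layer height and line width are assumed
# typical values, not figures from Elegoo's datasheets.
def volumetric_flow(speed_mm_s, layer_height_mm=0.2, line_width_mm=0.42):
    """Return the required melt flow in mm^3/s."""
    return speed_mm_s * layer_height_mm * line_width_mm

print(volumetric_flow(270))  # Pro PETG speed rating
print(volumetric_flow(600))  # Rapid PETG speed rating
```

At these assumed settings the Rapid rating works out to roughly 50 mm³/s, more than many stock hotends can melt, so the rating is as much about the printer as about the filament.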

Test objects were printed with a Bambu Lab P1P with an AMS unit. After calibrating the ideal temperature for each filament, a tensile break test gave a win to the Rapid PETG, followed by a layer adhesion test win. This pattern continued across further tests, with Rapid PETG either matching or beating the Pro PETG.

There are only two advantages to the Pro version that can be seen here – lower moisture sensitivity and less stringing risk – and you of course get the luxury cardboard spool with the closed edges. Whether that’s enough to make you go ‘Pro’ remains to be seen, of course.

Continue reading “Elegoo Rapid PETG Vs PETG Pro: Same Price, Similar Specs, Which To Buy?”

Opening A Six-Lock Safe With One Key Using Brunnian Links

Brunnian links are a type of nontrivial link – or knot – where multiple linked loops become unlinked if a single loop is cut or removed. Beyond ‘fun’ disentanglement toys and being a tantalizing subject of academic papers on knot theory, they can also be used for practical applications, as demonstrated by [Anthony Francis-Jones] in a recent video. In it we get a safe that is locked with multiple padlocks, any one of which can unlock and open the safe by itself.

This type of locked enclosure is quite commonly used in military and other applications where you do not want to give the same key to each person in a group, yet still want to give each person full access. After taking us through the basics of Brunnian links, including Borromean rings, we are introduced to the design behind the safe with its six padlocks.

As a demonstration piece it uses cheap luggage padlocks and Perspex (acrylic) rods and sheets to give a vibrant and transparent view of its workings. During the assembly it becomes quite apparent how it works, with each padlock controlling one direction of motion of a piece, each of which can be used to disassemble the entire locking mechanism and open the safe.
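The ‘any one lock opens everything’ property can be captured in a toy model – this is a simplification for illustration, not [Anthony Francis-Jones]’s actual geometry: imagine the pieces arranged in a cycle, each pinned by its own padlock and also blocked by its neighbor, so freeing one piece lets the rest cascade out.

```python
# Toy model of the Brunnian-style lock (not the actual geometry):
# n pieces in a cycle, each pinned by its own padlock and also
# blocked by the previous piece. A piece can slide out once its
# padlock is open OR the piece blocking it has been removed.
def can_open(open_locks, n=6):
    removed = set()
    progress = True
    while progress:
        progress = False
        for i in range(n):
            blocked = i not in open_locks and (i - 1) % n not in removed
            if i not in removed and not blocked:
                removed.add(i)
                progress = True
    return len(removed) == n

# Opening any single padlock disassembles the whole mechanism,
# while leaving all six locked keeps the safe shut.
assert all(can_open({i}) for i in range(6))
assert not can_open(set())
```

The Brunnian character shows up in the cascade: no single piece is held by more than one padlock directly, yet removing any one dependency unravels the entire structure.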

Brunnian links are also found in the braids often made by children out of elastic bands, which together with this safe can be used to get children hooked on Brunnian links and general knot theory.

Continue reading “Opening A Six-Lock Safe With One Key Using Brunnian Links”

Caltech Scientists Make Producing Plastics From CO2 More Efficient

For decades there has been a tantalizing idea pitched of pulling CO2 out of the air and using the carbon molecules for something more useful, like making plastics. Although this is a fairly simple process, it is also remarkably inefficient. Recently, Caltech researchers managed to boost the efficiency somewhat with a new two-stage process involving electrocatalysis and thermocatalysis that achieves a CO2 utilization of 14%, albeit with pure CO2 as input.

The experimental setup with the gas diffusion electrode (GDE) and the copolymerization steps. (Credit: Caltech)

The full paper as published in Angewandte Chemie International Edition is sadly paywalled with no preprint available, but we can look at the Supplemental Information for some details. We can see for example the actual gas diffusion electrode (GDE) setup starting on page 107, in which the copper and silver electrodes react with CO2 in a potassium bicarbonate (KHCO3) aqueous electrolyte, producing carbon monoxide (CO) and ethylene (C2H4). These then react under the influence of a palladium catalyst in the second step to form polyketones, which is already the typical way that these thermoplastics are created on an industrial scale.

The novelty here appears to be that the ethylene and CO are generated in the GDEs, which require only the input of CO2 and the potassium bicarbonate, with the CO2 recirculated for about an hour to build up high enough concentrations of CO and C2H4. Even so, the researchers note a disappointing final quality of the produced polyketones.
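A back-of-envelope sketch of the carbon accounting: two CO2 go into each ethylene, one into each CO, and one ethylene plus one CO make a single polyketone repeat unit, so every repeat unit fixes three CO2 molecules’ worth of carbon. The feed amount below is an arbitrary illustration; only the 14% utilization figure comes from the paper.

```python
# Back-of-envelope carbon accounting for the two-step route:
# 2 CO2 -> 1 C2H4 and 1 CO2 -> 1 CO at the GDE, then
# 1 C2H4 + 1 CO -> one polyketone repeat unit (-CH2-CH2-CO-),
# i.e. 3 CO2 fixed per repeat unit. Feed amount is arbitrary.
co2_fed_mol = 100.0
utilization = 0.14  # 14% of fed CO2 ends up in product, per the paper
co2_into_polymer = co2_fed_mol * utilization
repeat_units_mol = co2_into_polymer / 3  # 3 CO2 per repeat unit
print(repeat_units_mol)
```

In other words, even at 14% utilization, 100 mol of pure CO2 yields under 5 mol of polyketone repeat units, which puts the efficiency challenge in perspective.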

Considering that a big commercial outfit like Novomer, which attempted something similar, just filed for Chapter 11 bankruptcy protection, it seems right to be skeptical about producing plastics this way on an industrial scale, before even considering the use of atmospheric CO2 at concentrations of less than 450 ppm.

View inside the vacuum vessel of Wendelstein 7-X in Greifswald, Germany. (Credit: Jan Hosan, MPI for Plasma Physics)

Wendelstein 7-X Sets New Record For The Nuclear Fusion Triple Product

Fusion product against duration, showing the Lawson criterion progress. (Credit: Dinklage et al., 2024, MPI for Plasma Physics)

In nuclear fusion, the triple product – the product of plasma density, temperature, and energy confinement time, central to the Lawson criterion – defines the point at which a fusion reaction produces more power than is needed to sustain it. Recently the German Wendelstein 7-X stellarator managed to hit new records here during its most recent OP 2.3 experimental campaign, courtesy of a frozen hydrogen pellet injector developed by the US Department of Energy’s Oak Ridge National Laboratory. With this injector the stellarator was able to sustain plasma for over 43 seconds as microwaves heated the freshly injected pellets.
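To put the figure of merit in context, here is a minimal sketch of the quantity involved. The plasma values are illustrative placeholders rather than W7-X’s actual numbers, and the D-T ignition threshold is the commonly quoted ballpark figure:

```python
# Fusion triple product n * T * tau_E, the quantity behind the
# Lawson criterion. All plasma values below are illustrative only,
# not measurements from Wendelstein 7-X.
def triple_product(density_m3, temperature_keV, confinement_s):
    """Return the triple product in keV·s/m^3."""
    return density_m3 * temperature_keV * confinement_s

LAWSON_DT_IGNITION = 3e21  # rough D-T ignition threshold, keV·s/m^3

tp = triple_product(1e20, 10, 1.0)  # 1e20 m^-3, 10 keV, 1 s
print(tp, tp >= LAWSON_DT_IGNITION)
```

The point of the sketch is that all three factors multiply: a machine can trade lower density for longer confinement, which is exactly why sustaining plasma for tens of seconds moves the needle.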

Although the W7-X team was informed later that the recently decommissioned UK-based JET tokamak had achieved a similar triple product during its last – so far unpublished – runs, it’s of note that the JET tokamak had triple the plasma volume. Having a larger plasma volume makes such an achievement significantly easier due to inherently less heat loss, which arguably makes the W7-X achievement more noteworthy.

The triple product is also just one of the many ways to measure progress in commercial nuclear fusion, with fusion reactors dealing with considerations like low- and high-confinement mode, plasma instabilities like ELMs and the Greenwald Density Limit, as we previously covered. Here stellarators also seem to have a leg up on tokamaks, with the proposed SQuID stellarator design conceivably leap-frogging the latter based on all the lessons learned from W7-X.

Top image: Inside the vacuum vessel of Wendelstein 7-X. (Credit: Jan Hosan, MPI for Plasma Physics)

Introducing PooLA Filament: Grass Fiber-Reinforced PLA

We’re probably all familiar with adding wood dust, hemp and carbon fibers to PLA filament, but there are so many other fillers one could add. During the completely unrelated recent heatwave in Germany, [Stefan] from CNCKitchen decided to give a new type of biodegradable filler a shot by scooping some freshly dried cow patties off the very picturesque grazing fields near his place. In the resulting video a number of questions are answered about this ‘PooLA’ that nobody was asking, such as whether it makes for a good filler, and whether it smells bad while printing.

Perhaps unsurprisingly to those who have spent any amount of time around large herbivores like cows, cow dung doesn’t smell bad, since it’s mostly composed of the grass fibers that are left over after the cow’s multiple stomachs and repeated chewing have done their thing. What [Stefan] and his colleagues thus found was that printing with PooLA smells like printing with grass.

As for the practical benefits of PooLA, it adds a nice coloring, but like other ‘reinforced’ PLA filaments it seems to trade flexibility for stiffness: at ratios of cow dung powder between 5 and 20% added to the PLA, the test parts would break more easily. Creating the filament was also a bit of a chore, for reasons that [Stefan] still has to figure out.

That said, aside from the technically unneeded bacterial corpses and other detritus in cow patties, using grass fibers in FDM filament isn’t a crazy idea, and might fit right in there with other fibers.

Continue reading “Introducing PooLA Filament: Grass Fiber-Reinforced PLA”

Measuring The Impact Of LLMs On Experienced Developer Productivity

Recently, AI risk and benefit evaluation company METR ran a randomized controlled trial (RCT) on a gaggle of experienced open source developers to gain objective data on how the use of LLMs affects their productivity. Their finding was that using LLM-based tools like Cursor Pro with Claude 3.5/3.7 Sonnet reduced productivity by about 19%, with the full study by [Joel Becker] et al. available as a PDF.

This study was also intended to establish a methodology to assess the impact from introducing LLM-based tools in software development. In the RCT, 16 experienced open source software developers were given 246 tasks, after which their effective performance was evaluated.

A large focus of the methodology was on creating realistic scenarios instead of canned benchmarks. This included adding features, fixing bugs, and refactoring, much as the developers would do while working on their respective open source projects. The observed increase in the time it took to complete tasks with an LLM’s assistance was found to be likely due to a range of factors, including over-optimism about the LLM tool’s capabilities, LLMs interfering with existing knowledge of the codebase, poor LLM performance on large codebases, low reliability of the generated code, and the LLM doing very poorly at using tacit knowledge and context.
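The headline 19% figure is a multiplicative slowdown estimated from task completion times. A minimal sketch of that kind of estimate – using made-up paired times rather than METR’s actual data, and a simple geometric mean of ratios as a stand-in for the study’s regression on log task times:

```python
import math

# Hypothetical completion times in hours for matched tasks, with and
# without LLM assistance. These are illustrative numbers only, not
# data from the METR study, which used a randomized (unpaired) design.
with_ai    = [2.4, 1.1, 3.8, 0.9]
without_ai = [2.0, 1.0, 3.0, 0.8]

# Geometric mean of the per-task time ratios yields a multiplicative
# slowdown estimate, analogous to a regression on log completion time.
ratios = [a / b for a, b in zip(with_ai, without_ai)]
slowdown = math.exp(sum(math.log(r) for r in ratios) / len(ratios))
print(f"{(slowdown - 1) * 100:.0f}% longer with AI")
```

Working in log space keeps one pathologically long task from dominating the average, which matters when task durations span minutes to hours.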

Although METR suggests that this poor showing may improve over time, it seems fair to question whether LLM coding tools are a useful coding partner at all.