The Great Windows 11 Computer Extinction Experiment

There was a time when a new version of Windows was a really big deal, such as the launch of Windows 95, for which the tones of the Rolling Stones’ “Start Me Up” could be heard across all manner of media outlets. Over the years that excitement gradually petered out, finally leaving us with Windows 10, which would, we were told, be the last ever version of the popular operating system and thereafter receive only continuous updates.

But here we are in 2021, and a new Windows has been announced. Windows 11 will be the next latest and greatest from Redmond, but along with all the hoopla there has been an undercurrent of concern. Every new OS comes with a list of hardware requirements, but those for Windows 11 seem to go beyond the usual in their quest to cull older hardware. Aside from requiring Secure Boot and a Trusted Platform Module (a requirement that has already caused a run on the devices), Microsoft has struck a load of surprisingly recent processors, including those in some of its current Surface mobile PCs, from the supported list, and it’s reported that laptops will even need front-facing webcams if they wish to run Windows 11.

Continue reading “The Great Windows 11 Computer Extinction Experiment”

Garage Semiconductor Fab Gets Reactive-Ion Etching Upgrade

It’s a problem that few of us will likely ever face: once you’ve built your first homemade integrated circuit, what do you do next? If you’re [Sam Zeloof], the answer is clear: build better integrated circuits.

At least that’s [Sam]’s plan, which his new reactive-ion etching setup aims to make possible. While his Z1 dual differential amplifier chip was a huge success, the photolithography process he used to create the chip had its limitations. The chemical etching process he used is a bit fussy, and prone to undercutting of the mask if the etchant seeps underneath it. As its name implies, RIE uses a plasma of highly reactive ions to do the etching instead, resulting in finer details and opening the door to using more advanced materials.
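To get a feel for why that undercut matters, here’s a rough back-of-the-envelope sketch of how an isotropic wet etch widens a feature compared to a mostly directional RIE step. The numbers are purely illustrative assumptions, not anything measured from [Sam]’s process.

```python
# Rough comparison of feature widening from an isotropic wet etch versus a
# mostly directional RIE step. All numbers are illustrative assumptions, not
# measurements from [Sam]'s process.

def etched_width_um(mask_opening_um, etch_depth_um, anisotropy):
    """Width of the opening at the surface after etching.

    anisotropy = 0.0 models a purely isotropic wet etch (lateral etch equals
    vertical etch); anisotropy = 1.0 models an ideal, fully directional RIE.
    """
    undercut_per_side = etch_depth_um * (1.0 - anisotropy)
    return mask_opening_um + 2.0 * undercut_per_side

mask_opening = 5.0  # um, assumed lithography opening
etch_depth = 2.0    # um, assumed etch depth

print(f"wet etch (isotropic): {etched_width_um(mask_opening, etch_depth, 0.0):.1f} um")
print(f"RIE (anisotropy 0.9): {etched_width_um(mask_opening, etch_depth, 0.9):.1f} um")
```

The takeaway is that an isotropic etch undercuts by roughly the etch depth on each side, which quickly swamps features that are only a few microns wide.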

[Sam]’s RIE rig looks like a plumber’s stainless steel nightmare, in the middle of which sits a vacuum chamber for the wafer to be etched. After evacuating the air, a small amount of fluorinated gas — either carbon tetrafluoride or the always entertaining sulfur hexafluoride — is added to the chamber. A high-voltage feedthrough provides the RF energy needed to create a plasma, which knocks fluorine ions out of the process gas. The negatively charged and extremely reactive fluorine ions are attracted to the wafer, where they attack and etch away the surfaces that aren’t protected by a photoresist layer.
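On the process-bookkeeping side, here’s a minimal sketch of the sort of etch-time arithmetic involved. The etch rates are placeholder assumptions; real numbers depend heavily on RF power, chamber pressure, gas flow, and loading, and would have to be calibrated on the actual rig.

```python
# Minimal etch-time estimate for a fluorine-based RIE step. The rates below
# are placeholder assumptions, not calibrated values; real etch rates depend
# strongly on RF power, chamber pressure, gas flow, and wafer loading.

ASSUMED_RATE_NM_PER_MIN = {
    "SiO2 in CF4 plasma": 30.0,   # illustrative only
    "Si in SF6 plasma": 250.0,    # illustrative only
}

def etch_time_minutes(target_depth_nm, rate_nm_per_min, overetch=0.10):
    """Time to reach the target depth, with a small over-etch margin."""
    return target_depth_nm * (1.0 + overetch) / rate_nm_per_min

for material, rate in ASSUMED_RATE_NM_PER_MIN.items():
    minutes = etch_time_minutes(target_depth_nm=200.0, rate_nm_per_min=rate)
    print(f"{material}: ~{minutes:.1f} min for 200 nm")
```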

It all sounds simple enough, but the video below reveals the complexity. There are a lot of details, like correctly measuring vacuum, avoiding electrocution, keeping the vacuum pump oil from exploding, and dealing with toxic waste products. Hats off to [Sam’s dad] for pitching in to safely pipe the exhaust gases through the garage door. This ties with [Huygens Optics]’s latest endeavor for the “coolest things to do with fluorine” award.

Continue reading “Garage Semiconductor Fab Gets Reactive-Ion Etching Upgrade”

Just How Did 1500 Bytes Become The MTU Of The Internet?

[Benjojo] got interested in where the magic number of 1,500 bytes came from, and shared some background on just how and why it seems to have come to be. In a nutshell, the maximum transmission unit (MTU) sets the maximum amount of data that can be transmitted in a single network-layer transaction, but 1,500 is kind of a strange number in binary. For the average Internet user, this under-the-hood stuff doesn’t really affect one’s ability to send data, but it has an impact from a network management point of view. Just where did this number come from, and why does it matter?
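As a quick illustration of how that 1,500-byte ceiling trickles down to what an application actually gets to send, here’s the usual arithmetic for a plain IPv4/TCP packet, assuming minimal headers with no IP or TCP options:

```python
# How a 1,500-byte MTU becomes usable TCP payload, assuming no IP/TCP options.
MTU = 1500          # bytes of IP packet that fit in one Ethernet frame
IPV4_HEADER = 20    # minimum IPv4 header
TCP_HEADER = 20     # minimum TCP header

mss = MTU - IPV4_HEADER - TCP_HEADER
print(f"TCP MSS over IPv4: {mss} bytes")  # -> 1460
```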

[Benjojo] looks at a year’s worth of data from a major Internet traffic exchange and shows, with the help of several graphs, that being stuck with a 1,500-byte MTU upper limit has a real impact on modern network efficiency and bandwidth usage, because bandwidth spent on packet headers adds up rapidly when roughly 20% of all packets are topping out at the 1,500-byte limit. Naturally, solutions exist to improve this situation, but elegant and effective solutions to the Internet’s legacy problems tend to require instant buy-in and cooperation from everyone at once, meaning they end up going in the general direction of nowhere.
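To put a rough number on that header tax, here’s a simple sketch comparing full-size packets at the standard 1,500-byte MTU against 9,000-byte jumbo frames. It only counts Ethernet, IPv4, and TCP headers, and ignores options, VLAN tags, the preamble and inter-frame gap, and the mix of smaller packets on a real link.

```python
# Rough header-overhead comparison for full-size TCP packets at two MTUs.
# Counts only Ethernet + IPv4 + TCP headers; ignores options, VLAN tags,
# preamble/inter-frame gap, and smaller packets mixed into real traffic.

ETH_HEADER_AND_FCS = 18   # 14-byte Ethernet header + 4-byte frame check sequence
IPV4_HEADER = 20
TCP_HEADER = 20

def frame_stats(mtu_bytes):
    """Return (payload bytes, header overhead fraction) for one full-size packet."""
    headers = ETH_HEADER_AND_FCS + IPV4_HEADER + TCP_HEADER
    wire_bytes = mtu_bytes + ETH_HEADER_AND_FCS
    payload = mtu_bytes - IPV4_HEADER - TCP_HEADER
    return payload, headers / wire_bytes

for mtu in (1500, 9000):
    payload, overhead = frame_stats(mtu)
    print(f"MTU {mtu}: {payload} payload bytes, ~{overhead * 100:.2f}% header overhead")
```

A few percent per packet doesn’t sound like much until it’s multiplied across the traffic volumes an exchange point moves.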

So where did 1,500 bytes come from? It appears to be a legacy value originally derived from a combination of hardware limits and the need to choose a value that would play well on shared network segments, without causing too much transmission latency when busy or bringing too much header overhead. But the picture is not entirely complete, and [Benjojo] asks that if you have any additional knowledge or insight about the 1,500-byte decision, please share it, because manuals, mailing list archives, and other context from that time are either disappearing fast or already entirely gone.

Knowledge fading from record and memory is absolutely a thing that happens, but occasionally things get saved instead of vanishing into the shadows. That’s how we got IGNITION! An Informal History of Liquid Rocket Propellants, which contains knowledge and history that would otherwise have simply disappeared.