Jenny’s Daily Drivers: FreeBSD 13.2

Last month I started a series in which I try out different operating systems with the aim of using them for my everyday work, and my pick was Slackware 15, the latest version of the first Linux distro I tried back in the mid-1990s. I’ll be back with more Linux-based operating systems in due course, but the whole point of this series is to roam as far and wide as possible and try every reasonable OS I can. Thus today I’m making the obvious first sideways step and trying a BSD-based operating system. These are uncharted waters for me, and there was a substantial choice to be made as to which BSD to try, so after reading around the subject I settled on FreeBSD, as it seemed the most accessible.

First, A Bit Of Context

[Image: a PC showing the FreeBSD boot screen. Caption: “Success! My first sight of a working FreeBSD installation.”]

Most readers will be aware that the BSD operating systems trace their heritage in a direct line back to the original AT&T UNIX, while GNU/Linux is a pretty good UNIX clone originating with Linus Torvalds in the early 1990s and Richard Stallman’s GNU project from the 1980s onwards. This means that for Linux users there’s a difference in language to get used to.

Where Linux is a kernel around which distributions are built with different implementations of the userland components, the various BSD operating systems are different operating systems in their own right. Thus we talk about, for example, Slackware and Debian as different Linux distributions, but by contrast NetBSD and FreeBSD are different operating systems, even if they have a shared history. There are BSD distributions such as GhostBSD which use FreeBSD as their core, but “distribution” is a far less common word in this context. So I snagged the FreeBSD 13.2 USB stick image from the torrent, and wrote it to a USB Flash drive. Out with the Hackaday test PC, and on with the show. Continue reading “Jenny’s Daily Drivers: FreeBSD 13.2”

Ask Hackaday: What’s The Deal With Humanoid Robots?

When the term ‘robot’ gets tossed around, our minds usually race to the image of a humanoid machine. These robots are a fixture in pop culture, and are often held up as some sort of ideal form.

Yet, one might ask, why the fixation? While we are naturally drawn to creating robots in our own image, are these bipedal machines really the perfect solution we imagine them to be?

Continue reading “Ask Hackaday: What’s The Deal With Humanoid Robots?”

Would We Recognize Extraterrestrial Technology If We Saw It?

There’s a common critique of science fiction series like Star Trek that their extraterrestrial species don’t look ‘alien’ enough, and that their technology is strangely similar to our own, not to mention compatible to the point where alien widgets can be integrated into terrestrial systems by any plucky engineer. Is this critique justified? Put more succinctly: if we came across real extraterrestrial life with real extraterrestrial technology, would we even notice? Would a widget borrowed from an alien spacecraft even work with our own terrestrial spacecraft’s systems?

Within the domain of exobiology there are still plenty of discussions about the formation and evolutionary paths conceivable within the Universe, but the overarching consensus seems to be that it’s hard to escape the herding effect of fundamental physics. For lifeforms, carbon-based chemistry is the only reasonable option, and when it comes to technology, it’s hard not to end up with technology based on the same physical principles we presume to hold across the Universe – which would practically guarantee some level of interoperability.

What’s notable here is that over the past few years, a number of people have claimed to have observed potential alien technology in our Solar System: in particular the interstellar object ʻOumuamua in 2017, and a more recent claim by astrophysicist Abraham Loeb regarding an interstellar meteor that impacted Earth in 2019, which he says could be evidence of ‘alien technology’. This raises the question of whether we are literally being pummeled by extraterrestrial spacecraft these days.

Continue reading “Would We Recognize Extraterrestrial Technology If We Saw It?”

How To Survive A Wet Bulb Event

Territories across the northern hemisphere are suffering through record-breaking heatwaves this summer. Climate scientists are publishing graphs with red lines jagging dangerously upwards as unprecedented numbers pour in. Residents of the southern hemisphere watch on, wondering what the coming hot season will bring.

2023 is hinting at very real changes to our climate, changes we can’t ignore. As the mercury rises to new heights, it’s time to educate yourself on the dangers of a wet bulb event. Scientists predict that these deadly weather conditions could soon strike in the hottest parts of the world. What you learn here could end up saving your life one day.

Hot Bodies

[Image caption: The body has methods of maintaining a set temperature. Credit: Wikimedia Commons, CNX OpenStax, CC BY-SA 4.0]

To understand the danger of a wet bulb event, we must first understand how our bodies work. The human body likes to maintain its temperature at approximately 37 °C (98.6 °F). That temperature can drift slightly, and the body itself will sometimes move its temperature setpoint higher to tackle infection, for example. The body is a delicate thing, however, and a body temperature above 40 °C (104 °F) can become life-threatening. Seizures, organ failure, and unconsciousness are common symptoms of an overheating human. Death is a near-certainty if the body’s temperature reaches 44 °C (111 °F), though in one rare case, a patient in a coma survived a body temperature of 46.5 °C (115.7 °F).

Thankfully, the body has a host of automated systems for maintaining its temperature at its chosen set point. Blood flow can be controlled across the body, and we instinctively seek to shed clothes in the heat and cover ourselves in the cold. However, the bare naked fact is that one system is most crucial to our body’s ability to cool itself. The perspiration system is vital, as it uses sweat to cool our body via evaporation. Water is a hugely effective coolant in this way, with beads of sweat soaking up huge amounts of heat from our skin as they make the phase change from liquid to vapor.
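That evaporative escape route is exactly what a wet bulb event takes away. The wet bulb temperature folds air temperature and humidity into a single number: the lowest temperature that evaporation can cool anything down to. Once it approaches skin temperature, sweating simply stops working, no matter how much you perspire. As a rough illustration, here’s a minimal Python sketch using Stull’s 2011 empirical approximation (valid for roughly 5% to 99% relative humidity and -20 °C to 50 °C at sea-level pressure); the weather values in the example are made up for demonstration:

```python
import math

def wet_bulb_c(temp_c, rh_pct):
    """Wet bulb temperature via Stull's 2011 empirical fit.
    temp_c: dry-bulb air temperature in Celsius; rh_pct: relative humidity in %.
    Valid roughly for 5 <= RH <= 99 and -20 <= T <= 50 at sea-level pressure."""
    return (temp_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(temp_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

# A dry 45 C desert day still lets sweat do its job...
print(f"{wet_bulb_c(45, 20):.1f} C wet bulb")  # ~26 C: hot, but survivable
# ...while a humid 35 C day leaves evaporation nowhere to go.
print(f"{wet_bulb_c(35, 90):.1f} C wet bulb")  # ~33.5 C: close to the danger zone
```

The oft-quoted survivability threshold is a wet bulb temperature of around 35 °C, at which point even a healthy person in the shade with unlimited water can no longer shed heat as fast as they absorb it.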

Continue reading “How To Survive A Wet Bulb Event”

ChatGPT, The Worst Summer Intern Ever

Back when I used to work in the pharma industry, I had the opportunity to hire summer interns. This was a long time ago, long enough that the fresh-faced college students who applied for the gig are probably now creeping up to retirement age. The idea, as I understood it, was to get someone to help me with my project, which at the time was standing up a distributed data capture system with a large number of nodes all running custom software that I wrote, reporting back to a central server running more of my code. It was more work than I could manage on my own, so management thought they’d take mercy on me and get me some help.

The experience didn’t turn out quite like I expected. The interns were both great kids, very smart, and I learned a lot from them. But two months is a very tight timeframe, and getting them up to speed took up most of that time. Add in the fact that they were expected to do a presentation on their specific project at the end of the summer, and the whole thing ended up being a lot more work for me than if I had just done the whole project myself.

My brief experience with interns came to mind recently, when a project came along that I needed a little help with. It’s nothing that would justify hiring anyone, but still, having someone to outsource specific jobs to would be a blessing, especially now that it’s summer and there’s so much else to do. But this is the future, and the expertise and combined wisdom of the Internet are but a few keystrokes away, right? Well, maybe, but as you’ll see, even the power of large language models has its limits, and trying to loop ChatGPT in as a low-effort summer intern leaves a lot to be desired.

Continue reading “ChatGPT, The Worst Summer Intern Ever”

PCIe For Hackers: Our M.2 Card Is Done

We started designing a PCIe card last week: an M.2 E-key to E-key adapter that adds an extra link to the E-key slot it carries – useful for fully utilizing a few rare but fancy E-key cards. By now, the schematic is done and the component placement has been figured out, and we only need to route the differential pairs – should be simple, right? Buckle up.

Getting Diffpairs Done

PCIe needs TX pairs connected to RX pairs on the other end, like UART – and this is non-negotiable. Connectors use host-side naming, so a socket’s TX pins carry the host’s transmit signals, destined for a device’s RX. As the diagram demonstrates, we connect the socket’s TX to the chip’s RX and vice-versa; if we ever get confused, the laptop schematic is there to help us make things clear. To sum up, we only need to flip the names on the link coming into the PCIe switch, since the switch acts as a device on the card; the two links from the switch go to the E-key socket, and for that socket’s purposes, the PCIe switch acts as a host.
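To keep the naming gymnastics straight, here’s a tiny Python sketch of the hookup just described, with host-side naming throughout. The net names are invented for illustration – they aren’t this board’s actual schematic labels:

```python
# Sanity-check of the lane hookup. Connector pins use host-side naming;
# these net names are invented for illustration, not from the schematic.
connections = [
    # Uplink: card edge -> PCIe switch. The switch is the *device* here,
    # so the names flip: the edge's TX pair lands on the switch's RX pins.
    ("EDGE.TX0", "SWITCH.UPSTREAM_RX0"),
    ("EDGE.RX0", "SWITCH.UPSTREAM_TX0"),
    # Downlinks: the switch acts as the *host* toward the E-key socket, and
    # the socket's pins are already named host-side, so TX connects to TX.
    ("SWITCH.DOWN_A_TX0", "SOCKET.LINK_A_TX0"),
    ("SWITCH.DOWN_A_RX0", "SOCKET.LINK_A_RX0"),
    ("SWITCH.DOWN_B_TX0", "SOCKET.LINK_B_TX0"),
    ("SWITCH.DOWN_B_RX0", "SOCKET.LINK_B_RX0"),
]
for pin_a, pin_b in connections:
    print(f"{pin_a:18} -> {pin_b}")
```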

While initially routing this board, I absolutely forgot about one more important thing for PCIe – series capacitors on every data pair, on the TX side of the link. We need three capacitor pairs here – one pair on the TX of the PCIe switch’s uplink, and two pairs on the TX sides of the switch’s downstream links – again, the naming is host-side. I only remembered this after having finished routing all the diffpairs, and, after a bit of deliberation, I decided that this is my chance to try 0201 capacitors. For that, I took the footprints from [Christoph]’s wonderful project, called “Effect of moon phase on tombstoning” – with such a name, these footprints have got to be good.

We’ve talked about differential pair calculations before in one of the PCIe articles, and there was a demo video too! That said, let’s repeat the calculations for this board – I’ll show how to get from “PCB fab website information” to “proper width and clearance diffpairs”, with a few fun shortcuts. Our setup is, once again, signals on the outer layers, referenced to the ground layer right below them. Sadly, I don’t yet understand how to calculate differential impedance for signal layers sandwiched between two ground planes – so if there are any commenters willing to share this knowledge, I’d appreciate your input tremendously! For now, I don’t see a tangible benefit to such an arrangement anyway.
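As a worked example, here’s the whole calculation as a short Python sketch, using the classic IPC-D-317A surface microstrip approximation plus the usual edge-coupled coupling correction. It’s a sanity check to compare against the fab’s own calculator, not a replacement for it, and the stackup numbers below are assumptions for illustration rather than this board’s actual stackup:

```python
import math

def microstrip_z0(w, h, t, er):
    """Single-ended surface microstrip impedance, IPC-D-317A approximation.
    w: trace width, h: height above the ground plane, t: copper thickness
    (all in the same units); er: substrate relative permittivity."""
    return (87.0 / math.sqrt(er + 1.41)) * math.log(5.98 * h / (0.8 * w + t))

def microstrip_zdiff(w, s, h, t, er):
    """Edge-coupled differential microstrip, with the common empirical
    coupling correction. s: edge-to-edge spacing within the pair."""
    return 2.0 * microstrip_z0(w, h, t, er) * (1.0 - 0.48 * math.exp(-0.96 * s / h))

# Assumed example stackup: 0.1 mm prepreg to the ground plane below,
# 1 oz (~0.035 mm) copper, FR-4 with er ~= 4.3.
w, s, h, t, er = 0.15, 0.15, 0.1, 0.035, 4.3
print(f"Z0    ~ {microstrip_z0(w, h, t, er):.0f} ohm")        # ~49 ohm
print(f"Zdiff ~ {microstrip_zdiff(w, s, h, t, er):.0f} ohm")  # ~87 ohm
```

Tweak width and spacing until Zdiff lands near PCIe’s 85 Ω differential target, then hold that geometry – and the reference plane underneath it – for the whole run.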

Continue reading “PCIe For Hackers: Our M.2 Card Is Done”

DisplayPort: Tapping The Altmode

Really, the most modern implementation of DisplayPort is the USB-C DisplayPort altmode, synonymous with “video over USB-C”, and we’d miss out if I were to skip it. Incidentally, our last two articles about talking USB-PD have given a few people a cool new toy to play with – people have commented on the articles, reached out to me for debugging help, and I’ve even seen people build the FUSB302B into their projects! Hot on the heels of that achievement, let’s reach further and conquer one more USB-C feature – one that isn’t yet openly available for us to hack on, even though it deserves to be.

For our long-time readers, it’s no surprise to see mundane capabilities denied to hackers. By now, we all know that many laptops and phones let you get a DisplayPort connection out of a USB-C port. Given that the USB-C specifications are openly available, and we’ve previously implemented a PD sink using those specifications, you’d expect that we could do DisplayPort with the same ease. Yet, the DisplayPort altmode specification sits behind a VESA membership paywall with a hefty price tag – a practice that has been widely criticized as running counter to VESA’s purpose as a standards organization, and one that has arguably contributed to some of their standards failing.

Not to worry, however – we can easily find an assortment of PDFs giving a high-level overview and some details of the DisplayPort altmode, and here’s my favorite! I also have a device running MicroPython with a FUSB302 chip connected, and a few DisplayPort altmode devices of mine that I can disassemble. This, it turns out, is more than enough for us to reverse-engineer our way into an open-source DisplayPort altmode library!
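To give a taste of what that involves: altmode negotiation runs over structured Vendor Defined Messages (VDMs), 32-bit headers with the SVID in the top half and a command in the bottom bits, as laid out in the public USB-PD spec. Here’s a small Python sketch that packs the headers you’d hand to a FUSB302 driver to discover and enter DisplayPort mode – an illustration of the message format, not the actual library this article builds:

```python
# Packing structured VDM headers per the USB-PD spec field layout.
# Illustrative sketch only - not the library being built in this article.

PD_SID  = 0xFF00   # standard PD SVID, used for Discover Identity / SVIDs
DP_SVID = 0xFF01   # the DisplayPort altmode SVID

CMD_DISCOVER_IDENTITY = 1
CMD_DISCOVER_SVIDS    = 2
CMD_DISCOVER_MODES    = 3
CMD_ENTER_MODE        = 4

def vdm_header(svid, command, obj_pos=0, cmd_type=0):
    """32-bit structured VDM header: SVID [31:16], structured flag [15],
    object position [10:8], command type [7:6] (0 = initiator request),
    command [4:0]."""
    return (svid << 16) | (1 << 15) | (obj_pos << 8) | (cmd_type << 6) | command

# Ask a device which DisplayPort modes it supports...
print(hex(vdm_header(DP_SVID, CMD_DISCOVER_MODES)))         # 0xff018003
# ...then enter the first mode it advertised.
print(hex(vdm_header(DP_SVID, CMD_ENTER_MODE, obj_pos=1)))  # 0xff018104
```

From there, the DP-specific Status Update and Configure commands – and the pin assignment fields they carry – are exactly the parts that the overview PDFs, plus a logic trace of a working adapter, let you reconstruct.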

Continue reading “DisplayPort: Tapping The Altmode”