Tech Support… Can AI Be Worse?

You can’t read the news today without another pundit excitedly reporting how AI is going to take every job you can imagine. Of course, AI will change the employment landscape. It will take some jobs and reduce the need for others. What about tech support? Is it possible that an AI might be able to help people with technical issues better than humans? My first answer was no way, but then I was painfully reminded of something. The question isn’t whether AI can help you better than any human can. The question is whether AI can help you better than the low-paid person on the other end of the phone you are likely to talk to. Sadly, I think the answer to that question is almost certainly yes.

In all fairness, if you read Hackaday, you probably don’t encounter many technical support people who can solve a problem you can’t. By the time you call them, it is a lost cause. But this is more than just “Hackaday folks are smarter than the tech support agents.” The overall quality of tech support at many companies is rock bottom no matter who you are.

It’s About Time

I’m pretty good with time zones. After all, I live in Germany, Hackaday’s server is in Los Angeles, and our writers are scattered all over the globe. I’m always translating one time into another, and practice makes (nearly) perfect. But still, it got me.

I was in the States visiting my parents when Daylight Saving Time struck, but only in the USA. Now all my time conversions were off by an hour, and once I’d worked through the way the sun travels around the globe, I thought I had it made. And then my cell phone started reporting a time that was neither CEST nor EDT, but a third time zone an hour off from both. Apparently some cell towers don’t transmit time zone information, and my phone defaults to UTC. Who knew? For a short while, my phone lied to me, the microwave oven clock in the hotel lied to me, and I felt like I was going nuts.

But this all got me thinking about clocks and human time, and about possibly the best advice I’ve ever heard for handling them in your own programs. Always keep time internally in something sensible like UNIX time – seconds elapsed since an epoch – because then you never have to worry about anything more than adding one to a counter every second. When and if you need to convert to or from human time, you can write the conversion function simply enough, if you don’t already have a library function that does it.

Want to set an alarm for 2 hours from now? That’s easy, because you only need to add 7,200 seconds, and you don’t need to worry about 59 wrapping around to 0 or 23:59 to 0:00. Time math is easy in seconds. February 29th? That’s just another 86,400 seconds. It’s only us humans who make it complicated.
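To make that concrete, here’s a minimal sketch in standard C: the two-hour alarm is the same 7,200-second addition, and all the messy human-facing conversion is delegated to the C library at the very last moment.

```c
#include <stdio.h>
#include <time.h>

int main(void)
{
    // Keep time as seconds since the epoch: one counter, no
    // calendar math, no time zones.
    time_t now = time(NULL);

    // An alarm two hours from now is plain addition.
    time_t alarm = now + 2 * 60 * 60;  // 7,200 seconds

    // Convert to human-readable local time only at the edge, and
    // let the library worry about time zones and DST.
    char buf[64];
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S %Z", localtime(&alarm));
    printf("Alarm fires at %s\n", buf);
    return 0;
}
```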

Why X86 Needs To Die

As I’m sure many of you know, the x86 architecture has been around for quite some time. It has its roots in Intel’s early 8086 processor, the first in the family. Indeed, even the original 8086 inherited a small amount of architectural structure from Intel’s 8-bit predecessors, dating all the way back to the 8008. The 8086 then evolved into the 186, 286, 386, and 486, and after that they got names: the Pentium would have been the 586.

Along the way, new instructions were added, but the core of the x86 instruction set was retained. And a lot of effort was spent making the same instructions faster and faster. This has become so extreme that, even though the 8086 and modern Xeon processors can both run a common subset of code, the two CPUs architecturally look about as far apart as they possibly could.

So here we are today, with even the highest-end x86 CPUs still supporting the archaic 8086 real mode, where the CPU can address memory directly, without any redirection. Having this level of backwards compatibility can cause problems, especially with respect to multitasking and memory protection, but it was a feature of previous chips, so it’s a feature of current x86 designs. And there’s more!
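For the unfamiliar, “directly” here really means directly: a real-mode address is just a 16-bit segment shifted left four bits plus a 16-bit offset, yielding a 20-bit physical address with no page tables or protection checks in between. A quick C sketch of the arithmetic:

```c
#include <stdio.h>
#include <stdint.h>

// In real mode, a 16-bit segment and a 16-bit offset combine
// directly into a 20-bit physical address: no paging, no MMU,
// no protection checks.
static uint32_t real_mode_addr(uint16_t segment, uint16_t offset)
{
    return ((uint32_t)segment << 4) + offset;  // segment * 16 + offset
}

int main(void)
{
    // The classic VGA text buffer at B800:0000 lands at 0xB8000.
    printf("B800:0000 -> 0x%05X\n",
           (unsigned)real_mode_addr(0xB800, 0x0000));
    return 0;
}
```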

I think it’s time to put a lot of the legacy of the 8086 to rest, and let the modern processors run free.

Hacking And Working On The Go

I’m off visiting my parents for a while, and have managed to bring nearly everything along with me that I need to get work done, and it all fit in a small backpack! This includes a portable audio interface to run my podcast mic, two (count them) two Linux computers, and all manner of simple hacking tools. Microcontrollers with USB/serial adapters built in are a godsend.

But putting together the minimal setup was no easy task! The USB cable assortment alone that I had to bring was astounding. And in the end, it looks like I forgot a USB mini-B cable, and good luck finding one of those at the local drug store. (I know! But the Zoom recorder wants mini. Don’t ask me why.)

And then there are the power adapters: a brick for the laptop, a USB-C fast charger for the Steam Deck, and another wall-plug USB for recharging the power banks. And of course, this silly custom keyboard that I’m so used to typing on, and that embodies so much muscle memory in its macros that I’m practically helpless without it.

So fundamentally, I’m astounded by the amount of functionality I could cram into my pack, but I’m also aghast at all the little things that add up around the edges. And I’m sure that I’ll find stuff that I’m missing in the next few weeks.

Do you need to travel for work with your full kit? What’s your approach? Minimal? Maximal? Leave us your hacker travel kit tips in the comments.

Sometimes It’s The Little Things

I had one of those why-didn’t-I-think-of-it moments this week, reading this article about multiplexing I2C on the ESP32 microcontroller. The idea is so good, and so simple, that it’s almost silly that it’s not standard hacker practice. And above all, it actually helps solve a problem that I’ve got. This is why I read Hackaday every day.

I2C is great in that it lets you connect up multiple devices to a pair of wires using a very simple bus architecture. Every device has its own address, the host calls them out, and hopefully all the other devices keep quiet while just the right one responds. But what happens when you want to use a few of the same sensors, where every IC has the same address? The usual solution is to buy a multiplexer chip.

But many modern microcontrollers, like the ESP32, have an internal multiplexer that lets you map the dedicated hardware peripherals to pins, usually at initialization time. Indeed, I’ve been doing it as an “init” task for so long that I never thought to do it otherwise. But that’s exactly the idea behind [BastelBaus]’s hack – if you dynamically reassign the pins, you can do the I2C multiplexing with the chip you’ve got. This should probably work for any other chips that have multiple assignable pins for hardware peripherals as well.
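Here’s a minimal sketch of the idea against ESP-IDF’s legacy I2C driver, just to make it concrete. The GPIO numbers and the 0x76 sensor address are made-up placeholders, and it assumes the I2C driver has already been installed once at startup – the trick is simply that the pin-mapping call you’d normally make during init works just as well at runtime.

```c
#include "driver/i2c.h"

// Route the single I2C0 peripheral to a different SDA/SCL pair on
// the fly. This is the same call that's normally made once at init.
static void i2c_select_bus(int sda, int scl)
{
    ESP_ERROR_CHECK(i2c_set_pin(I2C_NUM_0, sda, scl,
                                true, true, I2C_MODE_MASTER));
}

void poll_identical_sensors(void)
{
    i2c_select_bus(21, 22);   // sensor #1 hangs off GPIO 21/22
    // ... normal I2C transaction to address 0x76 here ...

    i2c_select_bus(25, 26);   // sensor #2: same address, different wires
    // ... and another transaction to the same address here ...
}
```

The same pattern should translate to the Arduino core by calling Wire.end() and then Wire.begin() with a new pin pair between transactions.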

Cool idea, but really simple. Why hadn’t I ever thought of it? I think it’s because I’ve always had this init / mainloop schema in my mind, which, for instance, the Arduino inherited and formalized in its setup() and loop() functions. Pin mappings go in the init section, right?

So what this hack really amounts to, for me, is a rethinking of what’s static and what’s dynamic. It’s always worth questioning your assumptions, especially when you’re facing a problem that requires a creative solution. Sometimes limitations are only in your mind. Have you had your mind opened recently by a tiny little hack?

The White House Memory Safety Appeal Is A Security Red Herring

In the Holy Programming Language Wars, the lingua franca of systems programming – also known as C – is often lambasted for being insecure, error-prone, and plagued with more types of behavior that are undefined than ones that are defined by the C standards. Many programming languages have been billed as ‘C killers’, yet C is still alive today. That didn’t stop the US White House’s Office of the National Cyber Director (ONCD) from putting out a report in which both C and C++ are branded ‘unsafe’ when it comes to memory management.

The full report (PDF) is pretty light on technical details, citing only blog posts by Microsoft and Google as its ‘expert sources’. The claim that memory safety issues are the primary cause of CVEs is not substantiated, or at least ignores the severity of those CVEs when you look at the CISA statistics on actively exploited vulnerabilities. Beyond this call for ‘memory safety’, the report goes on to effectively call for more testing and validation, kicking in doors that were already opened back in the 1970s with the Steelman requirements and the High Order Language Working Group (HOLWG) of 1975.

What truly is the impact and factual basis of the ONCD report?


Want To Learn Binary? Draw Space Invaders!

This was the week that I accidentally taught my nearly ten-year-old son binary. And I didn’t do it on purpose, I swear.

It all started innocently enough. He had a week of vacation, and on one of those days, we booked him into a day course for kids at our local FabLab. It was sold as a “learn to solder” class, and the project they made was basically a MiniPOV: eight LEDs driven by a museum-piece AVR ATtiny2313. The blinking lights make a pattern in your persistence of vision as you swipe it back and forth.

The default pattern was a heart, which is nice enough. But he wanted to get his own designs in there, and of course he knows that I know how to flash the thing with new code. So I got him to solder on an ISP header and start drawing patterns on grids of graph paper while I got the toolchain working and updated some of the 2000s-era code so it would compile.

There’s absolutely no simpler way to get your head around binary than to light up a row of LEDs, and transcribing the columns of his fresh pixel art into ones and zeros was just the motivation he needed. We converted the first couple rows into their decimal equivalents, but it was getting close to dinner time, so we cheesed out with the modern 0b00110100 format for the rest. This all happened quite organically; “unintentional parenting” is what we call it.
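For the curious, here’s roughly what the firmware side of that looks like in AVR C. The pattern bytes are illustrative stand-ins for his graph-paper art (one byte per column, one bit per LED), and the clock speed and port assignment are assumptions rather than the FabLab kit’s actual wiring.

```c
#define F_CPU 1000000UL    // assuming the stock 1 MHz internal clock
#include <avr/io.h>
#include <util/delay.h>

// One byte per column of the image, one bit per LED: transcribing
// graph-paper pixel art is exactly writing out these binary numbers.
static const uint8_t pattern[] = {
    0b00111000,
    0b01111100,
    0b11111110,
    0b01111110,
    0b11111110,
    0b01111100,
    0b00111000,
};

int main(void)
{
    DDRB = 0xFF;                       // all eight LED pins as outputs
    for (;;) {
        for (uint8_t i = 0; i < sizeof pattern; i++) {
            PORTB = pattern[i];        // light up one column
            _delay_ms(2);              // tune to your swiping speed
        }
        PORTB = 0x00;
        _delay_ms(10);                 // gap before the next repeat
    }
}
```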

While we were eating dinner, I got the strangest sense of déjà vu. When I was around ten or eleven, my own father told me about the custom fonts for the Okidata 24-pin printer at his lab, because he needed me out of his hair for a while, and I set out to encode all of the Hobbit runes for it. (No comment.) He must have handed me a piece of graph paper and explained how it goes, and we had a working rune font by evening. That was probably how I learned about binary as well.

Want to teach someone binary? Give them a persistence of vision toy, or a dot-matrix printer.

(Art is from a much older POV project: Trakr POV — a hack of an old kids’ toy to make a long-exposure POV image. But it looks cool, and it gets the point across.)