Learn Assembly The FFmpeg Way

You want to learn assembly language. After all, understanding assembly lets you see what your compiler is actually doing, and it is especially important for time-critical code. But most tutorials are — well — boring. So you can print “Hello World” super fast. Who cares?

Decoding video, though, is exactly the kind of job where assembly really pays off, so why not study a real project like FFmpeg to see how they do things? Sounds like a pain, but thanks to the FFmpeg asm-lessons repository, it’s actually quite accessible.

According to the repo, you should already understand C — especially C pointers — and some basic mathematics. Most of FFmpeg’s assembly code uses single instruction, multiple data (SIMD) instructions, which let you do something like “add 5 to these 200 data items” very quickly compared to looping 200 times.
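
As a taste of what that looks like, here’s a minimal sketch in C using SSE2 intrinsics. To be clear, this isn’t how FFmpeg does it (the asm-lessons material teaches hand-written assembly in FFmpeg’s own style, and the function name here is made up for illustration), but the core idea of processing several data lanes per instruction is the same:

```c
#include <emmintrin.h>   /* SSE2 intrinsics */
#include <stdint.h>
#include <stdio.h>

/* Add 5 to every element of an array, four 32-bit lanes at a time. */
static void add5(int32_t *data, size_t n)
{
    const __m128i fives = _mm_set1_epi32(5);
    size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128i v = _mm_loadu_si128((__m128i *)&data[i]);
        _mm_storeu_si128((__m128i *)&data[i], _mm_add_epi32(v, fives));
    }
    for (; i < n; i++)   /* scalar tail for leftover elements */
        data[i] += 5;
}

int main(void)
{
    int32_t d[6] = {0, 1, 2, 3, 4, 5};
    add5(d, 6);
    for (int i = 0; i < 6; i++)
        printf("%d ", d[i]);   /* prints: 5 6 7 8 9 10 */
    putchar('\n');
    return 0;
}
```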

Continue reading “Learn Assembly The FFmpeg Way”

[Image: demonstration of KiCad 9’s multichannel design feature, which lets you place identical blocks into your design, route only one of them, and have the routing duplicated across all the others.]

KiCad 9 Moves Up In The Pro League

Do you do PCB design for a living? Has KiCad been just a tiny bit insufficient for your lightning-fast board routing demands? We’ve just been graced with the KiCad 9 release (blog post, there’s a FOSDEM talk too), and it brings features of the caliber you’d expect from a professional-level, monthly-subscription PCB design suite.

Of course, KiCad 9 has delivered a ton of polish and features for all sorts of PCB design, so everyone will have some fun new additions to work with – but if you live and breathe PCB track routing, this release is especially for you.

Continue reading “KiCad 9 Moves Up In The Pro League”

A New 8-bit CPU For C

It is easy to port C compilers to architectures that look like old minicomputers or bigger CPUs. However, as the authors of the Small Device C Compiler (SDCC) found, pushing C into a typical 8-bit CPU is challenging. Lessons learned from SDCC inspired a new 8-bit architecture, F8. This isn’t just a theoretical architecture; you can find an example Verilog implementation in the SDCC project and on GitHub. The name choice may turn out to be unfortunate, as there was an F8 CPU from Fairchild back in the 1970s that apparently few people remember.
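
To get a feel for the problem, consider what even a trivial C function demands of a classic 8-bit machine. This is a generic illustration rather than anything from the SDCC or F8 sources:

```c
#include <stdint.h>
#include <stdio.h>

/* Innocent-looking C that is expensive on a classic 8-bit ISA: the
 * pointer is 16 bits wide while the registers and ALU are 8 bits, so
 * every dereference and pointer increment must be synthesized from
 * add/add-with-carry pairs, and stack-relative access to locals hurts
 * on architectures without a frame-pointer-plus-offset addressing mode. */
static uint16_t sum(const uint16_t *p, uint8_t n)
{
    uint16_t total = 0;
    while (n--)
        total += *p++;   /* a 16-bit add plus a 16-bit pointer bump */
    return total;
}

int main(void)
{
    uint16_t data[4] = {1, 2, 3, 4};
    printf("%u\n", (unsigned)sum(data, 4));   /* prints: 10 */
    return 0;
}
```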

In the video from FOSDEM 2025, [Philipp Krause] provides a nice overview of the how and why of F8. While it might seem odd to create a new 8-bit CPU when you can buy bigger CPUs for pennies, consider that 8-bit machines are more than enough for many jobs, and if you can squeeze one into an FPGA, it may beat stepping up to a larger FPGA just to hold a 32-bit CPU alongside your design.

Continue reading “A New 8-bit CPU For C”

Homebrew CPU Gets A Beautiful Rotating Cube Demo

[James Sharman] designed and built his own 8-bit computer from scratch using TTL logic chips, complete with a VGA adapter, and you can watch it run a glorious rotating cube demo in the video below.

The rotating cube is the product of roughly 3,500 lines of custom assembly code, and it looks fantastic, running at 30 frames per second with shading effects from multiple light sources. That’s a great result considering the computing power of his system is roughly on par with vintage 8-bit home computers, and its graphics capabilities are limited. [James]’s computer uses a tile map instead of a frame buffer, so getting 3D content rendered was a challenge.
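
For a flavor of the “basic math” such a demo needs, here is a generic fixed-point rotation sketch in C. It is purely illustrative, not [James]’s actual code (which is hand-written assembly for his own instruction set); the table-driven fixed-point approach is simply the standard trick on hardware without floating point:

```c
#include <math.h>
#include <stdint.h>
#include <stdio.h>

/* 8.8 fixed-point sine table: sin(angle) * 256 for 256 angle steps.
 * On a homebrew CPU this would be precomputed and stored in ROM; here
 * we build it with the host's math library for convenience. */
static int16_t sin_tab[256];

static void init_tables(void)
{
    for (int a = 0; a < 256; a++)
        sin_tab[a] = (int16_t)lround(sin(a * 6.283185307179586 / 256.0) * 256.0);
}

/* Rotate a 2D point about the origin; angle 0..255 is a full turn.
 * A 3D cube rotation is this applied once per pair of axes. The >> 8
 * assumes arithmetic right shift, as on all mainstream compilers. */
static void rotate(int16_t *x, int16_t *y, uint8_t angle)
{
    int16_t s = sin_tab[angle];
    int16_t c = sin_tab[(uint8_t)(angle + 64)];   /* cos is sin shifted 90 degrees */
    int32_t nx = ((int32_t)*x * c - (int32_t)*y * s) >> 8;
    int32_t ny = ((int32_t)*x * s + (int32_t)*y * c) >> 8;
    *x = (int16_t)nx;
    *y = (int16_t)ny;
}

int main(void)
{
    init_tables();
    int16_t x = 100, y = 0;
    rotate(&x, &y, 64);            /* a quarter turn */
    printf("(%d, %d)\n", x, y);    /* roughly (0, 100) */
    return 0;
}
```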

The video is about 20 seconds of demo followed by a detailed technical discussion on how exactly one implements everything required for a 3D cube, from basic math to optimization. If a deep dive into that sort of thing is up your alley, give it a watch!

We’ve featured [James]’s fascinating work on his homebrew computer before. Here’s more detail on his custom VGA adapter, and his best shot at making it (kinda) run DOOM.

Continue reading “Homebrew CPU Gets A Beautiful Rotating Cube Demo”

Get Ready For KiCad 9!

Rev up your browsers, package managers, or whatever other tool you use to avail yourself of new software releases, because the KiCad team have announced that, barring any major bugs being found in the next few hours, tomorrow should see the release of version 9 of the open source EDA suite. Who knows? Depending on where you are in the world, that may already have happened by the time you read this.

Skimming through the long list of enhancements brought into this version, one thing strikes us: this is now a list of upgrades and tweaks to a stable piece of software, rather than essential features dragging a rough-and-ready package towards usability. There was a time when using KiCad was a frustrating experience of quirks and interface annoyances, but successive versions have improved it beyond measure. We might wish that all open source software was this polished, but the fact is that much of the commercial software in this arena isn’t as good as this.

So head on over and kick the tires on this new KiCad release, assuming it passes those final checks. We look forward to the community’s verdict on it.

How Hard Is It To Write A Calculator App?

How hard can it be to write a simple four-function calculator program? After all, computers are good at math, and making a calculator isn’t exactly blazing a new trail, right? But [Chad Nauseam] will tell you that it is harder than you probably think. His post starts with a screenshot of the iOS calculator app with a mildly complex equation. The app’s answer is wrong. Android’s calculator does better on the same problem.

What follows is part history lesson and part math lesson. As you might realize, the inherent problem with computers and math isn’t that they aren’t good at it. Floating point numbers have finite precision, and that leads to problems, especially in operations that combine very large and very small numbers.
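
You can see the problem in a few lines of C; any language with IEEE 754 doubles behaves the same way:

```c
#include <stdio.h>

int main(void)
{
    /* At 1e16 the spacing between adjacent doubles is 2, so adding 1
     * changes nothing and the subtraction gives 0 instead of 1. */
    double big = 1e16;
    printf("%.1f\n", (big + 1.0) - big);   /* prints 0.0, not 1.0 */

    /* The classic surprise: 0.1 and 0.2 have no exact binary form. */
    printf("%.17f\n", 0.1 + 0.2);          /* 0.30000000000000004 */
    return 0;
}
```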

Indeed, any floating point format can represent only a finite set of values, so almost all of the real numbers it stands in for are unrepresentable. But the same is true of a calculator. Think about how many digits you are willing to type in, and how many digits you want out. All you need is for each of those to be correct, and that’s a much smaller set of numbers.

Continue reading “How Hard Is It To Write A Calculator App?”

Why AI Usage May Degrade Human Cognition And Blunt Critical Thinking Skills

Any statement regarding the potential benefits and hazards of AI tends to be divisive and controversial, as the world tries to figure out what the technology means and how to make the most money off it in the process. Whether you take AI to mean artificial inference or artificial intelligence, it has seen itself used mostly as a way to ‘assist’ people, whether in the form of a chat client answering casual questions, or generating articles, images, and code. Its proponents claim that it’ll make workers more efficient and remove tedium.

In a recent paper, researchers at Microsoft and Carnegie Mellon University (CMU) report survey findings suggesting that the effect is mostly negative. The general conclusion is that when people rely on external tools for basic tasks, they become less capable of doing those things themselves, should the need arise. Emanuel Maiberg provides a related example in his commentary on the study, noting how simple things like memorizing phone numbers and routes through a city are now deemed irrelevant. But what happens when you end up without a working smartphone?

Does so-called generative AI (GAI) turn workers into monkeys who mindlessly regurgitate whatever falls out of the Magic Machine, or is there true potential for removing tedium and increasing productivity?

Continue reading “Why AI Usage May Degrade Human Cognition And Blunt Critical Thinking Skills”