EmuDevz Is Literally A Software Game

The idea of gamifying all the things might have died down now that the current hype is shoving AI into all the things, but you’ve probably never seen it quite like EmuDevz, a game by [Rodrigo Alfonso] in which you develop an 8-bit emulator.

There’s a lot of learning you’ll have to do along the way, about programming and how retro systems work, including diving into 6502 assembly code. Why 6502? Well, the emulator you’re working on (it’s partially written at the start of the game; you need only debug it and finish the job) is for a fantasy system called the NEEES, “an antique game console released in 1983”. It’s the NEEES and not the NES for two reasons. One, Nintendo has lawyers, and they really, really know how to use them. Two, by creating a fantasy console that is not-quite-a-Famicom, the goalposts for EmuDevz can be moved a bit closer in.

Continue reading “EmuDevz Is Literally A Software Game”

A New Cartridge For An Old Computer

Although largely recognizable to anyone who had a video game console in the 80s or 90s, cartridges have long since disappeared from the computing world. These squares of plastic holding a few ROM chips were a major route for distributing software for a time, not only for consoles but for home computers as well. Perhaps most famously, the Commodore VIC-20 and Commodore 64 had cartridge slots for both games and other software packages. As part of the Chip Hall of Fame created by IEEE Spectrum, [James] found himself building a Commodore cartridge more than three decades after last working in front of one of these computers.

[James] points out that even by the standards of the early 80s, Commodore cartridges were pretty low on specs. They’re limited to 16 kB, which means programming in assembly and doing things like interacting with the video hardware directly. Luckily there’s a treasure trove of C64 documentation available nowadays, as well as a number of modern programming tools, in contrast to the 80s, when both were scarce or nonexistent. The hardware is cheap these days too; the cartridge PCB and the other parts cost only a few dollars, and the case can easily be 3D printed.
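If you’ve never been that close to the metal, here’s roughly what “interacting with the video hardware directly” means on a C64. This is an illustrative sketch only, written in C-style C++ rather than the 6502 assembly the actual demo uses; the VIC-II’s border and background colour registers live at the well-known addresses $D020 and $D021:

// Illustrative only -- not [James]'s demo code. The VIC-II video chip is
// memory mapped, so changing the border and background colours is just a
// write to $D020/$D021; in 6502 assembly that's LDA #$00 / STA $D020.
#include <cstdint>

volatile uint8_t* const VIC_BORDER     = reinterpret_cast<volatile uint8_t*>(0xD020);
volatile uint8_t* const VIC_BACKGROUND = reinterpret_cast<volatile uint8_t*>(0xD021);

int main() {
    *VIC_BORDER     = 0;   // colour 0 = black
    *VIC_BACKGROUND = 6;   // colour 6 = blue
    for (;;) { }           // cartridge code has nowhere to return to
}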

Burning the software to the $3 ROM chip was straightforward as well with a TL866 programmer, although [James] left a piece of memory-management code in on the first pass, which caused the C64 to lock up. Removing that code and flashing the chip again got the demo up and running, and it’ll be on display at their travelling “Chips that Changed the World” exhibit. If you find yourself in the opposite situation, we’ve also seen projects that cleverly pull the data off of ancient C64 ROM chips for preservation.

Behind The Bally Home Computer System

Although we might all fundamentally recognize that gaming consoles are just specialized computers, we generally treat them, culturally and physically, differently than we do desktops or laptops. But there was a time in the not-too-distant past when the line between home computer and video game console was a lot blurrier than it is today. Even before Microsoft entered the scene, companies like Atari and Commodore were building both types of machine, often with overlapping hardware and capabilities. But they weren’t the only games in town. This video takes a look at the Bally Home Computer System, a predecessor of many of the more recognizable computers and gaming systems of the 80s.

At the time, Bally was much more widely known in the pinball industry, but they seem to have had a bit of foresight that the computers running arcade machines would eventually make their way into the home in some form. The premise was a console that started out as a video game system and could expand into a much more full-featured computer with add-ons. In addition to game cartridges, it came with a BASIC interpreter cartridge for programming. It was also based on the Z80 microprocessor, which powered other popular computers of the era, so in theory it could have been a commercial success, but it never managed to climb to the top of the PC pack.

Although it maintains a bit of a cult following, it’s a limited system even by the standards of the day, as the video’s creator [Vintage Geek] demonstrates. The controllers are fairly cumbersome, and programming in BASIC is extremely tedious without a full keyboard available. But it did make clever use of the technology of its time: its graphics capabilities were ahead of competing systems and would go on to inspire later designs. It’s also not the last time a commercially unsuccessful game console would develop a following lasting far longer than anyone would have predicted.

Continue reading “Behind The Bally Home Computer System”

Hardware Built For Executing Python (Not Pythons)

Lots of microcontrollers will happily run Python these days, with CircuitPython and MicroPython becoming ever more popular in recent years. However, there’s now a new player in town. Enter PyXL, a project to run Python directly in hardware for maximum speed.

What’s the deal with PyXL? “It’s actual Python executed in silicon,” notes the project site. “A custom toolchain compiles a .py file into CPython ByteCode, translates it to a custom assembly, and produces a binary that runs on a pipelined processor built from scratch.” Currently there isn’t a hard-silicon version of PyXL, which is no surprise given the cost of spinning a custom chip. For now, it exists as logic running on a Zynq-7000 FPGA on an Arty-Z7-20 devboard. An ARM CPU helps out with setup and memory tasks, but the Python code itself is executed entirely in dedicated hardware.

The headline feature of PyXL is speed, and a comparison video demonstrates this with a measurement of GPIO round-trip latency. In this test, PyXL runs at 100 MHz and achieves a round-trip latency of 480 nanoseconds, while MicroPython running on a PyBoard at 168 MHz takes a much slower 15,000 nanoseconds. Based on this result, the project site claims PyXL can be 30x faster than MicroPython, or 50x faster when normalized for the clock speed difference; since the PyBoard actually has the faster clock, equalizing the clocks only widens the gap.
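The arithmetic holds up as a back-of-envelope check, using nothing but the numbers quoted above:

// Back-of-envelope check of the quoted speedup figures.
#include <cstdio>

int main() {
    const double pyxl_ns    = 480.0;     // PyXL round trip at 100 MHz
    const double micropy_ns = 15000.0;   // MicroPython on a 168 MHz PyBoard
    const double raw        = micropy_ns / pyxl_ns;    // ~31x
    const double per_clock  = raw * (168.0 / 100.0);   // ~52x with clocks equalized
    std::printf("raw: %.0fx, clock-normalized: %.0fx\n", raw, per_clock);
    return 0;
}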

Python has never been the most real-time of languages, but efforts like this push it in that direction. The hope is that it may finally be possible to write performance-critical code in Python from the outset. We’ve taken a look at Python in the embedded world before, too, albeit in very different contexts.

Continue reading “Hardware Built For Executing Python (Not Pythons)”

Vibing, AI Style

This week, the hackerverse was full of “vibe coding”. If you’re not caught up on your AI buzzwords, this is the catchy name coined by [Andrej Karpathy] for basically just YOLOing it with AI coding assistants. It’s the AI-fueled version of typing what you want into StackOverflow and picking the top answers. Only, with the current state of LLMs, it’ll probably work after a while of iterating back and forth with the machine.

It’s a tempting vision, and it probably works for a lot of simple applications, in popular languages, or generally where the ground is already well trodden. And where the stakes are low, as [Al Williams] pointed out while we were talking about vibing on the podcast. Can you imagine vibe-coded ATM software that probably gives you the right amount of money? Vibe-coding automotive ECU software?

While vibe coding seems very liberating and hands-off, it really just shifts the burden from writing the code yourself to making sure that the LLM is giving you what you want, and to refining your prompts when it doesn’t. It’s more like editing and auditing code than authoring it. And while we have no doubt that a stellar programmer like [Karpathy] can verify that he’s getting what he wants, write the correct unit tests, and so on, we’re not sure it’s the panacea being proclaimed for folks who don’t already know how to code.

Vibe coding should probably be reserved for people who are already expert coders, and for trivial projects. Just as you wouldn’t let grade-school kids use calculators until they’ve mastered the basics of math by themselves, you shouldn’t let junior programmers vibe code: it demands too much knowledge to corral the LLM while side-stepping all of the learning that would come from doing the work yourself.

And then there’s the security side of vibe coding, which opens up a whole attack surface. If the LLM isn’t up to industry standards on simple things like input sanitization, your vibed code probably shouldn’t be anywhere near the Internet.

So should you be vibing? Sure! If you feel competent overseeing what [Dan] described as “the worst summer intern ever”, and the stakes are low, then it’s absolutely a fun way to kick the tires and see what the tools are capable of. Just go into it all with reasonable expectations.

Programmer’s Macro Pad Bangs Out Whole Functions

Macro pads are handy for opening up your favorite programs or executing commonly used keyboard shortcuts. But why stop there?

That’s what [Jeroen Brinkman] must have been thinking while creating the Programmer’s Macro Pad. Based on the Arduino Pro Micro, this hand-wired pad is unique in that a single press of any of its 16 keys can virtually “type” out multiple lines of text. Here, that capability is used to save the user from manually entering commonly used functions, declarations, and conditional statements.

For example, in the current firmware, pressing the “func” key will type out a boilerplate C function:

int () { //
;
return 0;
}; // f 

It will also send the appropriate cursor movements to put the insertion point where it needs to be so you can type in the function name. The other keys, such as “array” and “if”, work the same way, saving the user from having to enter (and potentially even remember) the correct syntax.

The firmware is kept as simple as possible, meaning that the functionality of each key is currently hardcoded. Some kind of tool that would let you add or change macros without manually editing the source code and re-flashing the Arduino would be nice… but hey, it is a Programmer’s Macro Pad, after all.
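To give a feel for what that hardcoding might look like, here’s a hypothetical Arduino-style sketch for a single “func” key using the stock Keyboard library that the Pro Micro’s ATmega32U4 supports. To be clear, this isn’t [Jeroen]’s firmware: the pin number is a guess, and the cursor-key counts depend entirely on the editor you’re typing into.

// Hypothetical single-key macro in the spirit of the project (not the actual
// firmware). The ATmega32U4 on the Pro Micro can act as a USB keyboard.
#include <Keyboard.h>

const int FUNC_KEY_PIN = 2;   // assumed wiring: "func" key to pin 2, active low

void setup() {
  pinMode(FUNC_KEY_PIN, INPUT_PULLUP);
  Keyboard.begin();
}

void loop() {
  if (digitalRead(FUNC_KEY_PIN) == LOW) {
    // Type the boilerplate, then walk the cursor back to where the name goes.
    Keyboard.print("int () { //\n;\nreturn 0;\n}; // f");
    for (int i = 0; i < 3; i++) {        // back up to the "int () { //" line
      Keyboard.press(KEY_UP_ARROW);
      Keyboard.releaseAll();
    }
    Keyboard.press(KEY_END);             // jump to the end of that line...
    Keyboard.releaseAll();
    for (int i = 0; i < 7; i++) {        // ...then step back to just before "()"
      Keyboard.press(KEY_LEFT_ARROW);
      Keyboard.releaseAll();
    }
    while (digitalRead(FUNC_KEY_PIN) == LOW) { }  // wait for key release
    delay(50);                                    // crude debounce
  }
}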

Looking to speed up your own day-to-day computer usage? We’ve covered a lot of macro pads over the years, and we’re confident at least a few of them will catch your eye.

Moving Software Down To Hardware

In theory, any piece of software could be built out of discrete pieces of hardware, provided there are enough transistors, passive components, and time available. In practice, though, we’re much more likely to reach for a programmable computer or microcontroller for all but the simplest tasks, for several reasons: cost, effort, complexity, economics, and sanity. [Igor Brichkov] was working with I2C and decided he wanted to see just where the line between hardware and software could be drawn, by implementing the protocol itself directly in hardware.

One of the keys to “programming” a communications protocol in hardware is getting the timing right, and the first job is initiating communications between this device and another on the bus. [Igor] builds up the signal in parts and then ORs them together. The first part is the start condition, generated by one oscillator and a counter; this also creates a pause, at which point a second oscillator takes over and sends data out. The first data any I2C transaction needs is an address, which comes from a shift register and a counter preset to push the correct bits out onto the communications lines.

To build up the rest of the signal, including data from the rotary encoder [Igor] is using for his project, further pairs of shift registers and counters pass their data out over the I2C lines in sequence. You can think of the main loop of this hardware “program” as a counter that steps through each function in turn, sending out the contents of the shift registers one by one. We saw a similar project over a decade ago, though rather than automating the task of sending data over I2C, it let the user key in the data manually instead.
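For contrast, here’s the same opening sequence done the usual way, bit-banged in software. This is a hedged sketch rather than anything from [Igor]’s build, with the pin functions stubbed out as log statements so it runs anywhere:

// Software rendering of the I2C opening sequence that the hardware version
// builds from oscillators, counters, and shift registers. On a real device
// set_sda()/set_scl() would drive the bus lines; here they just log.
#include <cstdio>
#include <cstdint>

static void set_sda(int level) { std::printf("SDA=%d ", level); }
static void set_scl(int level) { std::printf("SCL=%d\n", level); }

// Start condition: SDA falls while SCL is held high.
static void i2c_start() {
    set_sda(1); set_scl(1);
    set_sda(0); set_scl(1);
    set_sda(0); set_scl(0);
}

// One byte, most significant bit first -- the job handed to a shift register
// and counter pair in the hardware build.
static void i2c_write_byte(uint8_t b) {
    for (int bit = 7; bit >= 0; --bit) {
        set_sda((b >> bit) & 1);   // data may only change while SCL is low
        set_scl(1);                // the receiver samples SDA while SCL is high
        set_scl(0);
    }
    // A complete implementation would now release SDA and clock in the ACK bit.
}

int main() {
    const uint8_t address = 0x3C;        // hypothetical 7-bit device address
    i2c_start();
    i2c_write_byte((address << 1) | 0);  // low bit 0 = write
    return 0;
}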

Continue reading “Moving Software Down To Hardware”