It’s MacOS. On An Unmodified Wii!

We’re used to the so-called “Hackintoshes”, non-Apple hardware running MacOS. One we featured recently was even built into the case of a Nintendo Wii. But [Dandu] has gone one better than that, by running MacOS on an unmodified Wii, original Nintendo hardware (French, Google Translate link).

How has this seemingly impossible task been achieved? Seasoned Mac enthusiasts will remember the days when Apple machines used PowerPC processors, and the Wii uses a PowerPC chip that’s a close cousin of those found in the Mac G3 series of computers. Since the Wii can run a Linux-based OS, it can also run Mac-on-Linux, which in theory provides an environment able to host one of the PowerPC versions of MacOS.

The installation sequence had more than its share of difficulties, but eventually [Dandu] was able to get the Wii running MacOS 9, the last of the classic MacOS releases. Even on such limited resources it runs DOOM, Internet Explorer 5, and iTunes, though that last package had display and sound issues. He also tried a MacOS X build, but without success.

It’s fair to say that this is not exactly a way to get your hands on a cheap Mac, and remains more of an exercise in pushing a console beyond its original function. But it’s still an interesting diversion, and maybe someone will in time make a MacOS X version work on the Wii too. If you’re curious about the Mac-in-a-Wii that inspired this work, you can see it here.

The 13.5 Million Core Computer

Having a dual- or quad-core CPU is not very exotic these days, and CPUs with 12 or even 16 cores aren’t that rare. The Andromeda from Cerebras is a supercomputer with 13.5 million cores. The company claims it is one of the largest AI supercomputers ever built (but not the largest) and that it can perform 120 petaflops of “dense compute.”

We aren’t sure about the methodology, but they also claim more than one exaflop of “AI computing.” The computer has a fabric backplane that can handle 96.8 terabits per second between nodes. According to a post on Extreme Tech, the core technology is a three-plane wafer-scale processor, the WSE-2. One plane handles communications, one holds 40 GB of static RAM, and the math plane has 850,000 independent cores and 3.4 million floating-point units.

The data is sent to the cores and collected by a bank of 64-core AMD EPYC 3 processors. Andromeda is optimized to handle sparse matrix computations. The company claims that performance scales “almost linearly”: as you double the number of cores used, you roughly halve the total run time.
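To put “almost linearly” into numbers, here’s a quick back-of-the-envelope sketch. The 95% parallel fraction is our own illustrative assumption, not a Cerebras figure:

```python
# Back-of-the-envelope scaling sketch. The 95% parallel fraction is an
# illustrative assumption, not a published Cerebras number.
def runtime(base_time_s, nodes, parallel_fraction=0.95):
    """Amdahl-style estimate: only the parallel part speeds up with more nodes."""
    serial = base_time_s * (1 - parallel_fraction)
    parallel = base_time_s * parallel_fraction / nodes
    return serial + parallel

base = 100.0  # hypothetical single-node run time in seconds
for n in (1, 2, 4, 8, 16):
    print(f"{n:2d} nodes: {runtime(base, n):6.1f} s")
# With a high parallel fraction, doubling the node count comes close to
# halving the run time -- hence "almost linear".
```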

The machine is available for remote use and cost about $35 million to build. Since it uses 500 kW at peak run times, it isn’t free to operate, either. Extreme Tech notes that the Frontier computer at Oak Ridge National Labs is both larger and more precise, but it cost $600 million, so you’d expect it to be more capable.

Most homebrew “supercomputers” we see are more about learning how to work with clusters than about hitting this sort of performance. Of course, if you have a modern graphics card, OpenCL and CUDA will let you do some of this too, but on a much smaller scale.
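If you want to play with that idea at desktop scale, a few lines of Python will push a hefty matrix multiply onto the GPU. This is purely an illustrative sketch, nothing Cerebras-specific, and it assumes you have a CUDA-capable card with the CuPy library installed:

```python
# Illustrative GPU matrix multiply (assumes CuPy and a CUDA-capable GPU).
import cupy as cp

a = cp.random.rand(4096, 4096, dtype=cp.float32)
b = cp.random.rand(4096, 4096, dtype=cp.float32)
c = a @ b                          # dispatched to cuBLAS on the GPU
cp.cuda.Stream.null.synchronize()  # wait for the kernel to finish
print(float(c.sum()))
```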

Home-Built CPU Runs With Home-Built Toolchain

A few years ago, [Takaya Saeki] and fellow students at the University of Tokyo were given a very brief instruction during their ‘CPU exercise’ class, along the lines of:

Take this ray-tracing program written in OCaml and run it on your CPU implemented on an FPGA

Splitting into groups to cover the CPU, FPU, simulator tool, and compiler toolchain, the students started by designing a RISC ISA, then designed a CPU around it. You can follow along with the retrospective writeup of the class, then dive into the GitHub pages for each of the components of the system, although the commentary is mainly in Japanese. Hey, you can Google Translate, right? Continue reading “Home-Built CPU Runs With Home-Built Toolchain”
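To give a flavor of what the simulator group was signing up for, here’s a toy fetch-decode-execute loop in Python for a made-up three-instruction ISA. It has nothing to do with the students’ actual instruction set; it just shows the shape of the problem:

```python
# Toy instruction-set simulator for a made-up 3-instruction ISA, purely illustrative.
# Each instruction is a tuple: (opcode, dest, src_or_imm_or_target).
def run(program, steps=100):
    regs = [0] * 4          # four general-purpose registers
    pc = 0                  # program counter
    while pc < len(program) and steps > 0:
        op, a, b = program[pc]           # fetch + decode
        if op == "li":                   # load immediate: regs[a] = b
            regs[a] = b
        elif op == "add":                # regs[a] += regs[b]
            regs[a] += regs[b]
        elif op == "bnez":               # branch to address b if regs[a] != 0
            if regs[a] != 0:
                pc = b
                steps -= 1
                continue
        pc += 1
        steps -= 1
    return regs

# Count down from 3: r0 = 3, r1 = -1, then loop "add, branch" until r0 hits zero.
print(run([("li", 0, 3), ("li", 1, -1), ("add", 0, 1), ("bnez", 0, 2)]))
```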

A breadboard with a few DIP chips

Minimalist 6502 System Uses A CPU And Not Much Else

A central processing unit, or CPU, is the heart of any computer system. But it’s definitely not the only part: you also need RAM, ROM and at least some peripherals to turn it into a complete system that can actually do something useful. Modern microcontrollers typically have some or all of these functions integrated into a single chip, but classic CPUs don’t: they were meant to be placed on motherboards along with dozens of other chips. That’s why [c0pperdragon]’s latest project, the SingleBreadboardComputer, is such an amazing design: assisting its 6502 CPU are just four companion chips.

The entire system takes up just one strip of solderless breadboard. Next to the CPU we find 32 KB of SRAM, 32 KB of flash, and a clock oscillator. The fifth chip is a 74HC00 quad two-input NAND gate, which serves as a tiny piece of glue logic to tie everything together. Two of its NAND gates provide the address decoding, selecting either the ROM or the RAM chip depending on the state of the CPU’s A15 line, and blocking the RAM during the low phase of the system clock. The latter function is needed because the address lines are not guaranteed to be stable during the low phase and could otherwise cause writes to random memory locations.
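Here’s our reading of that decoding scheme as a quick truth-table sketch. The memory map (ROM in the upper 32 KB, RAM in the lower 32 KB) and the exact gate wiring are assumptions based on the description, not a copy of [c0pperdragon]’s schematic:

```python
# Truth-table sketch of a two-NAND address decoder matching the description above.
# Assumed memory map (illustrative): ROM in the upper 32 KB (A15 = 1),
# RAM in the lower 32 KB (A15 = 0), with RAM qualified by the clock (PHI2).
def nand(a, b):
    return 0 if (a and b) else 1

def decode(a15, phi2):
    not_a15 = nand(a15, a15)          # first gate: invert A15
    rom_cs_n = not_a15                # /ROMCS: low (active) whenever A15 = 1
    ram_cs_n = nand(not_a15, phi2)    # /RAMCS: low only when A15 = 0 AND PHI2 = 1
    return rom_cs_n, ram_cs_n

for a15 in (0, 1):
    for phi2 in (0, 1):
        rom, ram = decode(a15, phi2)
        print(f"A15={a15} PHI2={phi2} -> /ROMCS={rom} /RAMCS={ram}")
```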

The remaining two NAND gates are connected as an RS flip-flop in order to implement a serial output. This is needed because the CPU cannot hold its outputs in the same state for multiple clock cycles, which a serial port requires. Instead, [c0pperdragon] uses the MLB pin, normally used to implement multiprocessor systems, to generate pulses two clock cycles long, and stores the state in the flip-flop for as long as needed. A few well-timed software routines can then transmit and receive serial data without any further hardware.
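To see what those well-timed routines have to produce, here’s the standard 8-N-1 framing a bit-banged transmitter walks through, sketched in Python rather than cycle-counted 6502 assembly. It illustrates the protocol only, not [c0pperdragon]’s actual code:

```python
# Sketch of 8-N-1 serial framing as a bit-banged transmitter would produce it.
# Purely illustrative of the protocol; the real routines are cycle-counted 6502 code.
def frame_byte(value):
    bits = [0]                                    # start bit (line pulled low)
    bits += [(value >> i) & 1 for i in range(8)]  # eight data bits, LSB first
    bits += [1]                                   # stop bit (line back to idle high)
    return bits

# Each bit must be held on the output for exactly one bit period,
# e.g. a 1 MHz CPU clock at 9600 baud is roughly 104 CPU cycles per bit.
print(frame_byte(ord("A")))   # [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]
```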

Currently, the only software for this system is a simple demonstration that sends back data received on its serial port, but if you fancy a challenge you could write programs to do pretty much anything. You could probably find some inspiration in other minimalist 6502 boards, or projects that emulate a complete motherboard in an FPGA.

A smartphone-sized PCB in a person’s hand. A large blue chip package houses the 486, and the board has SoundBlaster audio and a 40-pin Raspberry Pi connector along one edge for attaching a Raspberry Pi Zero.

TinyLlama Is A 486 In Your Pocket

We love retrocomputing and tiny computers here at Hackaday, so it’s always nice to see projects that combine the two. [Eivind]’s TinyLlama lets you play DOS games on a board that fits in your hand.

Using the 486 SOM from the 86Duino, the TinyLlama adds an integrated Crystal Semiconductor audio chip for AdLib and SoundBlaster support. If you populate the 40-pin Raspberry Pi connector, you can also use a Pi Zero 2 to give the system MIDI capabilities when coupled with a GY-PCM5102 I²S DAC module.

Audio has been one of the trickier things to get running on these small 486s, so it’s nice to see a simple, integrated solution available. [Eivind] shows the machine running DOOM (in the video below the break) and starts up Monkey Island at the end. There is a breakout board for serial and PS/2 mouse/keyboard, but he says that USB peripherals work well if you don’t want to drag your Model M out of the closet.

Looking for more projects using the 86Duino? Check out ISA Sound Cards on 86Duino or Using an 86Duino with a Graphics Card.

Continue reading “TinyLlama Is A 486 In Your Pocket”

Two Esoteric Programming Languages, One Interpreter

Many of you will have heard of the esoteric programming language Brainf**k. It’s an example of a language that’s nearly impossible to use because it’s too simple. It’s basically a Turing machine in code: you get an array of memory cells and a pointer, and you can move the pointer, increment or decrement the cell under it, read it, write it, and loop. The rest is up to you. Good luck!
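If you’ve never seen it, the entire language fits in a handful of interpreter lines. Here’s a minimal sketch in Python, with input (the ‘,’ command) and error handling left out for brevity:

```python
# Minimal Brainf**k interpreter sketch: a tape of cells, a data pointer, and
# the single-character commands. Input (',') is omitted for brevity.
def bf(code):
    tape, ptr, pc, out = [0] * 30000, 0, 0, []
    jump, stack = {}, []
    for i, c in enumerate(code):       # pre-compute matching bracket positions
        if c == "[":
            stack.append(i)
        elif c == "]":
            j = stack.pop()
            jump[i], jump[j] = j, i
    while pc < len(code):
        c = code[pc]
        if c == ">":   ptr += 1
        elif c == "<": ptr -= 1
        elif c == "+": tape[ptr] = (tape[ptr] + 1) % 256
        elif c == "-": tape[ptr] = (tape[ptr] - 1) % 256
        elif c == ".": out.append(chr(tape[ptr]))
        elif c == "[" and tape[ptr] == 0: pc = jump[pc]   # skip the loop
        elif c == "]" and tape[ptr] != 0: pc = jump[pc]   # repeat the loop
        pc += 1
    return "".join(out)

print(bf("++++++++[>++++++++<-]>+."))   # prints "A" (8 * 8 + 1 = 65)
```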

What could be worse? Befunge, a language that parses code not just left-to-right or top-to-bottom, but in any direction depending on the use of ^, v, >, and <. (We love the way that GOTO 10 looks like a garden path in the example.)
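The heart of any Befunge interpreter is an instruction pointer that’s a 2D position plus a direction vector, with ^, v, >, and < simply rewriting that vector. Here’s a toy sketch of just the movement logic, not a full interpreter:

```python
# Toy sketch of Befunge's 2D instruction pointer: a position plus a direction
# vector that ^ v > < rewrite. Only movement and '@' (halt) are handled here.
def trace(grid, steps=50):
    x, y, dx, dy = 0, 0, 1, 0          # start top-left, moving right
    visited = []
    for _ in range(steps):
        c = grid[y][x]
        visited.append(c)
        if c == ">":   dx, dy = 1, 0
        elif c == "<": dx, dy = -1, 0
        elif c == "^": dx, dy = 0, -1
        elif c == "v": dx, dy = 0, 1
        elif c == "@": break            # program end
        x = (x + dx) % len(grid[y])     # the playfield wraps around
        y = (y + dy) % len(grid)
    return "".join(visited)

print(trace([
    ">v",
    "@<",
]))   # ">v<@" -- right, down, left, halt
```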

Uniting the two, [rsheldiii] brings us BrainFunge, a Brainf**k interpreter written in Befunge. And surprisingly, the resulting write-up sheds enough light on both of the esoteric programming languages that they make a little bit of sense. If you try to read along, you’ll definitely be helped out by Esolang Park, which was new to us, and accommodates the non-traditional parsing while displaying the contents of the stack.

If you get a taste of the esoteric, and you find that you’d like a little more, we have a great survey of some of the oddest for you. After cutting your teeth on Befunge, for example, we bet you’ll be ready for Piet.

This Week In Security: Mastodon, Fake Software Company, And ShuffleCake

Due to Twitter’s new policy of testing new features in production, interest in Mastodon as a potential replacement has skyrocketed. And what’s not to love? You can host it yourself, it’s part of the Fediverse, and you can even run one of the experimental forks for more features. But there’s also the danger of putting a service on the internet, as [Gareth Heyes] illustrates by stealing passwords from, ironically, the infosec.exchange instance.
Continue reading “This Week In Security: Mastodon, Fake Software Company, And ShuffleCake”