If you ask someone old enough to have been a computer user in the 16-bit era what machine they had, you’ll receive a variety of answers mentioning Commodore, Atari, Apple, or even PC brands. If your informant was in the Commodore camp though, you’ll probably hear an impassioned tale about their Amiga, its capabilities, and how it was a clearly superior platform whose potential was wasted. The Amiga was for a while one of the most capable commonly available computers, and became something of a cult within its own lifetime despite the truly dismal performance of the various companies that owned it. Today it retains one of the most active retrocomputing scenes, has an active software community, and even sees new hardware appearing.
For Amiga enthusiasts without the eye-watering sums required to secure one of the new Amiga-compatible machines with a PowerPC or similar at its heart, the only option to relive the glory besides finding an original machine is to run an emulator. [Marco Chiapetta] takes us through this process using a Raspberry Pi, and produces an Amiga that’s close enough to the real thing to satisfy most misty-eyed enthusiasts.
He starts with a cutesy Amiga-themed Raspberry Pi case that, while not essential for the build, makes an entirely appropriate statement about his new machine. We’re taken through the set-up of the Amibian emulator distro, then through locating a set of Amiga ROMs. Fortunately that last step is easier than you might think, even without trawling for an illicit copy.
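Once you have legally obtained Kickstart images, the ROM step mostly comes down to putting the right files where the emulator expects them. As a minimal sketch, assuming a hypothetical directory layout (Amibian’s real paths may differ), here is a Python snippet that sanity-checks ROM dumps by size, since Kickstart 1.x images are 256 KiB and 2.x/3.x images are 512 KiB:

```python
from pathlib import Path

# Known Kickstart image sizes; anything else is probably a bad dump.
VALID_SIZES = {
    256 * 1024: "Kickstart 1.x (256 KiB)",
    512 * 1024: "Kickstart 2.x/3.x (512 KiB)",
}

def check_roms(rom_dir="~/amiga/kickstarts"):   # hypothetical path, not Amibian's
    """Report each .rom file and whether its size matches a known Kickstart."""
    for rom in sorted(Path(rom_dir).expanduser().glob("*.rom")):
        size = rom.stat().st_size
        label = VALID_SIZES.get(size, f"unexpected size ({size} bytes)")
        print(f"{rom.name}: {label}")

if __name__ == "__main__":
    check_roms()
```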
The result is an Amiga. OK, it’s not an Amiga, but without the classic Commodore logo, is it any more not-an-Amiga than some of the other non-branded Amiga-compatible boards out there? Less talking, more classic gaming!
We’ve covered quite a few Amigas on these pages. Getting an A500 online was the subject of a recent post, and we brought you news of a new graphics card for the big-box Amiga’s Zorro slot.
“For Amiga enthusiasts without the eye-watering sums required to secure one of the new Amiga-compatible machines with a PowerPC or similar at its heart, the only option to relive the glory besides finding an original machine is to run an emulator. [Marco Chiapetta] takes us through this process using a Raspberry Pi, and produces an Amiga that’s close enough to the real thing to satisfy most misty-eyed enthusiasts.”
Amiga Forever. Yeah it costs, but it’s reasonable.
But about how much might it cost (in U.S. $), as a guesstimate, please?
My rough guess, having clicked on the link, is <$100.
Why do people keep talking about the MC68000 as a 16 bit processor? Just because its internal data bus was 16 bit? What counts is the size of its data registers and that’s 32 bits. It also had 32 bit address registers but ignored a few upper bits (8 in the 68000) until the 68020 came along.
I know. It was frustrating when they said that back when the Amiga/ST were contemporary. “16 bit era”
The “ST” in the Atari ST stands for “Sixteen/Thirty-two”.
I’m also kind of annoyed that they call the Sega Genesis and Super Nintendo both 16 bit machines. The Sega Genesis had a 68000, so based on operation size, it would be 32 bits. The Super Nintendo had a 65816, so based on data bus size it would be 8 bits. So, if the Super Nintendo is a 16 bit machine, the Sega Genesis is a 32 bit machine. If the Sega Genesis is a 16 bit machine, then the Super Nintendo is an 8 bit machine.
Which is it?
Hot take: It doesn’t matter, it never mattered, it was always purely marketing.
Super Nintendo had Super Metroid. SEGA Genesis did not. End of story.
+1
The articles call them 16-bit computers. The original Amiga had a 16-bit bus, and all of the custom chips operated on data 16 bits at a time, so it’s fair to call the Amiga a 16-bit generation machine. The custom chips had as much influence on the system performance as the single CPU, or more.
The CPU in the Amiga was actually somewhat of an afterthought, since the machine was originally designed as a gaming console and the chipset was built for pushing pixels to the screen independently of the CPU. Hence the two-bus design with two different sets of RAM. The chipset was designed more like an arcade cabinet, with special functions aimed at generating period-typical game graphics – the CPU was there just for the game logic.
But as the video game market crash unfolded, they realized it wasn’t going to sell as a game console, so they put in a more powerful CPU and retconned it into a desktop computer.
The game console heritage, plus the Hold-And-Modify hack-turned-official-feature that enabled 4096-color graphics, made it more powerful in graphics than any other affordable desktop computer on the market, but that was also its limitation, because the hardware was geared towards very specific tasks.
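For readers who haven’t met Hold-And-Modify: HAM6 gets 4096 colors out of only 6 bits per pixel by letting each pixel either load one of 16 palette colors or hold two color channels and modify the third. A minimal decoding sketch (illustrative Python, not actual Amiga code; the hardware seeds each scanline from the background color, approximated here by palette entry 0):

```python
def decode_ham6(pixels, palette):
    """pixels: iterable of 6-bit values; palette: 16 (r, g, b) tuples, 0-15 each."""
    r, g, b = palette[0]              # start from a known color, as at line start
    out = []
    for p in pixels:
        ctrl, data = (p >> 4) & 0x3, p & 0xF
        if ctrl == 0b00:              # set: load a full color from the palette
            r, g, b = palette[data]
        elif ctrl == 0b01:            # hold R and G, modify blue
            b = data
        elif ctrl == 0b10:            # hold G and B, modify red
            r = data
        else:                         # hold R and B, modify green
            g = data
        out.append((r, g, b))         # 4 bits per channel: 4096 possible colors
    return out
```

The catch is that each pixel can change only one channel at a time, so sharp horizontal color transitions smear into the familiar HAM fringing.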
It’s like, you can generate pretty graphics on-screen, or produce fancy sounds out of the chips, but not actually compute the same thing in real time, if you know what I mean. It was “hardware acceleration” just for show, and most of the show was clever tricks to make the hardware do what it wasn’t designed to.
Similar to the early 3D accelerator cards or MPEG-1 accelerator cards that could overlay fancy graphics on screen but were totally useless if you wanted to actually do the stuff on any set of data you had – you could only display it – Amigas became obsoleted by actual general-purpose computers that didn’t rely on a bunch of hacks and tricks to do what they did.
I mean, people like to point out that Amigas were used for making stuff like the effects in the Babylon 5 TV series, but actually, the graphics were rendered on SGI workstations.
That’s not a binary. B5 absolutely started out modelling and rendering using Amigas and Lightwave/the Video Toaster. It’s only later on that they moved over to SGI when Lightwave was ported to it.
First Season = Amiga
Remainder Seasons = SGI/PC
Sure, you -could- render the graphics on an Amiga, but it just wasn’t powerful enough and it took too much time. Hence the SGIs.
I wouldn’t agree with the last paragraph – firstly, the Amiga wasn’t the only platform of that time period using hacks and tricks to improve itself, but secondly, that’s still the case – the hacks change as technology improves, but software and hardware still take shortcuts.
Amigas were obsoleted by Commodore going bust. Of course today’s technology is able to do more than decades ago – it’s nothing to do with hacks or the Amiga, it’s what you’d expect from massive improvements in technology.
Amigas were as general purpose as anything else. Moreover, today’s platforms have adopted the idea of having dedicated GPUs rather than doing everything on the CPU (true, today’s GPUs are themselves far more general purpose, but that’s a change that came later, nothing to do with Amiga vs anything else).
I would disagree. The history of computing seems to reveal a trend where machines “push the envelope” by clever tricks to get an edge over competitors, but those clever tricks are not universal and therefore not very useful. Then the next generation comes along with more computing power and does the same thing with hardware that can do the same tricks in a more general fashion.
The point is that the hacks didn’t really improve on the basic machinery – they merely enabled a party trick to sell a product. For example, doing multimedia by slapping on an MPEG-1 decoder card didn’t kickstart the multimedia revolution because the underlying hardware was still wheezy as fuck. All it could do was render a pre-canned video and overlay it on the monitor signal – the actual computer was still struggling to decode a simple JPEG.
See the transition from 3D accelerator cards that were simple state machines for hardware-accelerating certain OpenGL instructions, to GPUs that are actually giant vector processors that can be used for folding proteins as well as drawing fancy games. Now we have heterogeneous computing, which is similar to the “co-processors” of the Amigas and other period computers, but not really, as the lines between a GPU and a CPU are rather blurred and they are often the same unit anyway.
What I mean is, nowadays tinkering in assembly to use a glitch in the hardware to get extra colors on screen isn’t a viable option, because it would depend on very specific hardware and simply not function anywhere else. It’s just not done that way anymore, because the code has to function over a number of platforms and you can’t target a single product line of computer as you would in 1984.
You have to do it in a compatible fashion, and so the era of clever hacks is gone.
The 68000 is most definitely a 32-bit machine. All the internal address and data registers are 32-bit; on early chips only 24 address bits were brought out to pins (developers were asked not to use the upper 8 address bits in order to allow proper future expansion). Of course they used those bits anyway.
Maybe “definitely” by your definition. Otherwise no, it is not something generally agreed upon then or now. Most people called it a 16-bit processor back when it was used a lot. I guess most people were interested in the processing power rather than what type of data abstraction the processor used – and at the time ALU width was a reasonable way to measure that. Or not really, even then (before superscalar execution and instruction fusion), as some machines implemented their ALU(s) 4 or even 1 bit wide, using multiple cycles.
Personally I see it as a 16-bit implementation of a 32-bit ISA (Instruction Set Architecture). The data buses were 16-bit, the ALU was 16-bit, and operations requiring more than 16 bits took more cycles to compute. The register file was 32-bit, but using that as an indication of inherent processor width is problematic. There have been 8-bit processors with 64-bit ISA operations and (generally ganged) 64-bit registers – should one then call them 64 bits wide? How about the machines that use bit slicing with SIMD instructions and can conceptually do some operations with 4096+ bits at a time?
By dividing the definition into implementation width and ISA width, the worst problems are removed, but processor width is still hard to define. To return to the bit-sliced computer mentioned above – is it a 1-bit computer? All the ALUs are 1 bit and so are the registers, but it is obviously more powerful than a real 1-bit ISA (like some PLCs – Programmable Logic Controllers).
TL;DR Reducing processor (or computer system) bit width into a single number is hard and gives little information.
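The “16-bit implementation of a 32-bit ISA” idea is easy to picture: a 32-bit add on a 16-bit ALU becomes two passes linked by a carry, which is exactly why 32-bit operations cost extra cycles. A toy sketch (illustrative Python, not 68000 microcode):

```python
def add32_on_16bit_alu(a, b):
    """Add two 32-bit values using only 16-bit-wide operations."""
    lo = (a & 0xFFFF) + (b & 0xFFFF)                # first ALU pass: low halves
    carry = lo >> 16                                # carry out of the low pass
    hi = ((a >> 16) + (b >> 16) + carry) & 0xFFFF   # second pass: high halves
    return (hi << 16) | (lo & 0xFFFF)

# Two 16-bit passes produce the correct 32-bit result.
assert add32_on_16bit_alu(0x0001FFFF, 0x00000001) == 0x00020000
```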
OK, I concede. I wrote that from memory and without updating myself on the reality of it. I guess you probably can’t call it 32-bit with a 16-bit data bus. I was just remembering the registers and the 24-bit address bus with the 8 most significant bits missing from the original pinout.
But the OS was 32-bit from the start – in practice, “bits” can refer to many aspects of a platform, so it’s incorrect to say it’s one or the other without being more specific. But I note that these days, people seem to refer to the OS most commonly when they say 32-bit or 64-bit.
Well, the OS is used as the connotation because it’s been >10 years since 99% of x86 processors have supported 64-bit (the 1% being Atom chips that no one wants to use, VIA chips which only China uses, or a couple of one-off chips from Intel).
Yes, I’d say the Amiga was “32-bit ready”, but with a 16-bit main bus, 16-bit limits on ALL the custom chips, a 24-bit address space limit, and a processor that requires so many clock cycles to actually execute any 32-bit instruction – it feels wrong to call it a 32-bit computer.
32-bit processor? *maybe*.. 32-bit computer generation? no. IMO
Because for most useful aspects it was a 16 bit processor. But see my longer post below.
You sure you mean internal and not external bus??? IIRC the 68000 had a 16-bit external bus and an internal 32-bit bus.
And for extra marks, the 68008 had an 8-bit external bus…
From a programmer’s perspective it is 32-bit, but from an engineer’s perspective it is 16-bit. Annoyingly, the IBM PC was called a 16-bit computer when its external data bus was only 8-bit, until the IBM AT (80286) came along. Let’s start calling the IBM PC an 8-bit computer.
Much disappoint on the not working keyboard.
Time to dust off the Amiga sitting on the shelf, fire up the SCSI drives (if they will fire up), and see if it still works.
“For Amiga enthusiasts without the eye-watering sums required to secure one of the new Amiga-compatible machines with a PowerPC or similar at its heart, the only option to relive the glory besides finding an original machine is to run an emulator. [Marco Chiapetta] takes us through this process using a Raspberry Pi, and produces an Amiga that’s close enough to the real thing to satisfy most misty-eyed enthusiasts.”
Emulation isn’t the only option. There are a number of FPGA-based Amigas out there. If you want a standalone option, best one out at the moment is probably MiST running the Minimig AGA core, though a standalone version of the more powerful Vampire is planned.
http://amigastore.eu/en/318-mist-fpga-computer.html
http://somuch.guru/minimig/minimig-mist/
“For Amiga enthusiasts without the eye-watering sums required to secure one of the new Amiga-compatible machines with a PowerPC or similar at its heart, the only option to relive the glory besides finding an original machine is to run an emulator.” Original machines FTW. :)
So installing emulators on a computer is a hack now? Guess I’m an old-school hacker LMAO
I was an Amiga enthusiast and wasn’t in the Commodore camp. I was in the Jay Miner camp. Commodore snagged the machine from Tramiel and shortly thereafter started to grind it to a pulp. They managed to make every bad move they could possibly make as quickly as they could before finally deciding to cash out.
One program that violated the 24 bit address restriction was AmigaBASIC, supplied by Microsoft, making sure it would not work with future processors. Oversight or deliberate plan?
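The violation in question is the classic pointer-tagging trick: because the 68000 decoded only 24 address bits, software could hide data in the top byte of a 32-bit pointer. A small sketch of why that breaks on later CPUs (hypothetical Python illustration, not AmigaBASIC’s actual code):

```python
ADDR_MASK_24 = 0x00FFFFFF   # only these bits reach the 68000's address pins

def tag_pointer(addr, tag):
    """Stuff extra data into the top 8 bits of a 24-bit-decoded address."""
    return (tag << 24) | (addr & ADDR_MASK_24)

def dereference_68000(ptr):
    return ptr & ADDR_MASK_24   # top byte ignored: the trick "works"

def dereference_68020(ptr):
    return ptr                  # all 32 bits decoded: the trick breaks

p = tag_pointer(0x00043210, tag=0xA5)
assert dereference_68000(p) == 0x00043210      # fine on a 68000
assert dereference_68020(p) != 0x00043210      # the AmigaBASIC-style failure
```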
Jay Miner left the most impressive legacy in the computer industry with the Atari 2600, Atari 800, and the Amiga. BITD I considered myself a Commodore fan with my VIC-20, 64, and Amigas but today I consider myself a Jay Miner fan.
My favorite machines are also the Jay Miner machines… the 800 being the most special to me.
https://github.com/CrashSerious/PiUAE
https://twitter.com/CrashSerious/status/292489462045343744
Note, dates. ;-)
I was at a club meeting where Mr. Miner spoke. As project lead when Commodore took over, it was his job to fire the developers as they completed their parts of the project. This is probably why the Amiga froze: future improvements were warmed-over copies of the original machine. As for hacks, the Amiga was doing true multitasking when Apple and IBM machines had CGA/256-color displays. The Amiga with HAM did 4096 colours. The mouse encoders were tasks; the trackdisk device read whole tracks, not needing an index sector. I could go on. When I first heard about the Amiga, I thought “yeah, right”. When I got one and did speech generation, I was truly impressed. I developed a PC hard drive interface and helped develop the disk drivers, so I saw a little bit of the scope of the brilliance in this machine. It was able to load and run multiple OSes. It disappoints me to see all the negative comments about a machine that was light years ahead of the competition.
Donning Flame retardant suit now.