It Wasn’t DOOM That Killed The Amiga

If you were the type of person who might have read Hackaday had we been around in the late 1980s or early 1990s, it’s a reasonable guess that you would have had a 16-bit home computer on your desk, and furthermore that it might have been a Commodore Amiga. These machines gave the best bang for the buck in those days with their impressive multimedia capabilities, and they gained a fervent following which persists to this day. [Carl Svensson] was one of them, and he’s penned a retrospective on the demise of the platform with the benefit of much hindsight.

The heyday of the Amiga, from its 1985 launch until the days of the A1200 in the early-to-mid 1990s, saw Moore’s Law show perhaps its fastest effects for the consumer. In that decade the PC world jumped from the 8088 to the Pentium, and from a PC speaker and CGA if you were lucky, to a Sound Blaster 16 and accelerated SVGA. By comparison the Amiga didn’t change much except in model numbers and a few extra graphics modes, and when a faster processor came it was far too little, too late.

Defender of the Crown, released in 1986

There’s a well-worn path, with some justification, of blaming Commodore’s notoriously awful management for the debacle, but the piece goes beyond that into the mid ’90s. His conclusion is that what really killed the Amiga was that the CPU price reductions which defined the x86 world at that time never came to the 68k or PowerPC lines, and that this, along with the architecture zealotry of the fan base, meant that the much-longed-for revival would never come.

He also takes a look at the other home computer platforms of the era, including the “all its killer architecture managed to kill was, sadly, Atari itself” Atari Falcon, and the Acorn Archimedes, which also lives on for enthusiasts and is perhaps the most accessible survivor. From here, also with the benefit of hindsight, we can’t disagree with his assessment, so perhaps it’s best to look at the Amiga not as the platform we should rightfully still be using, but as the great stepping stone which provided us a useful computer back in the day without breaking the bank.

111 thoughts on “It Wasn’t DOOM That Killed The Amiga”

  1. >architecture zealotry

    Amiga painted themselves into a corner with the architecture, because it became impossible to upgrade the system without breaking compatibility. Spreading the different functions across the chipset and having different fast and slow RAM etc. made it so you couldn’t upgrade it piece by piece. On an x86 system you could have a faster processor or a better video card, whereas on an Amiga the moment you upgraded something all the old stuff just became a fancy motherboard to bootstrap a whole other computer on an expansion card.

    1. The fans justified the architecture as some sort of misunderstood example of distributed parallel processing, but that was just the problem: there wasn’t one video card you could pull out and replace, because the chip that drew (half of) the graphics was also doing stuff like reading the joysticks, or managing the floppy drive, etc.

      The machine started out as a sort of game console slash arcade machine that was geared to compete with a Nintendo or a Sega system, so the chipset alone was supposed to be the machine, and it wasn’t designed to be “universal” and extensible. It was designed in a way that you could in principle have just a ROM based state machine on a cartridge that would program the chipset to run games. After the video game crash of the 80’s they re-engineered it to be a general purpose computer with a CPU and a separate memory bus on the “cartridge side”, and that legacy is what made the system so unwieldy.

        1. It was how things were done in the era when your CPU wasn’t necessarily a single chip, because of low levels of integration and the high cost of IC manufacturing. The Amiga entered design at the tail end of that period, and was outmoded in that sense too.

          1. More precisely, the original idea was that they would push cost down by having the most bare-bones basic system they could manage for the specs they wanted, and then put in the necessary compute power in the cartridge – only as much as necessary to run the game like you would on a Nintendo or the like – but that plan was obviously thrown out the window when they had to make it into a general purpose computer.

      1. “on an Amiga the moment you upgraded something all the old stuff just became a fancy motherboard to bootstrap a whole other computer on an expansion card.”

        That pretty much describes a PC too. There was no real upgrade path between processors other than faster clock speeds; if you wanted to go from 286 to 386 you almost always needed a motherboard swap or a plug-in card where “all the old stuff just became a fancy motherboard to bootstrap a whole other computer on an expansion card.”

        I think you’re kind of right though, the PC had a reasonably well defined set of minimum standards and it did pretty well in respect of compatibility between competing manufacturers of PCs and add ons.

        You could buy a VGA card with a Tseng ET4000 chip or one with a Western Digital Paradise chip, plug it in, install the drivers and you got a standard set of graphics modes that worked the same way for all software.
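
        To illustrate (a minimal Turbo-C-style sketch of my own, not from any real driver): because every VGA-compatible chip answered the same BIOS call, software could set a standard mode without caring whose silicon was on the card.

            #include <dos.h>

            /* Set a standard video mode through the BIOS (INT 10h, AH=00h).
               Works the same on a Tseng, Paradise, or any other VGA-compatible card. */
            void set_video_mode(unsigned char mode)
            {
                union REGS r;
                r.h.ah = 0x00;   /* function 00h: set video mode */
                r.h.al = mode;   /* e.g. 0x12 = 640x480 16c, 0x13 = 320x200 256c */
                int86(0x10, &r, &r);
            }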

        I could be very wrong about this, but there never really seemed to be agreed-upon standards for Amiga cards: a graphics card from manufacturer A would not offer the same modes or even work with software written for a card from manufacturer B, C, D, etc.

        Ditto with things like samplers, MIDI interfaces, disk interfaces, even accelerator cards: they could be wildly different in operation, and that bit of cool software your mate had was often tied to one specific make and even model of add-on hardware, so it limited its own market.

        Maybe if Commodore had defined some minimum compatibility standards which had to be met, they would have been around a little longer, possibly even given Apple something to worry about for a while…

        1. That’s true. A CPU swap was a major upgrade of course, but if you went for the 386, chances are your old ISA graphics card would still work fine, since a motherboard was basically just a bus – it wasn’t the chipset’s job to also do the graphics. You could probably use the same RAM sticks as well.

          1. Some users had swapped their 8088/8086 for an NEC V20/V30, though! 🙂

            It was the equivalent of the 68010 upgrade that Amiga, Atari and Sega users did.

            There also were 486 upgrade chips for 286 and 386 users.
            You know, those MakeIt! upgrade CPU boards.
            The 286 version had a 386SX/486SLC type CPU with some special logic.

            Owners of an ordinary 80386 (386DX) PC motherboard could just install a 486DLC CPU in the early 90s. It was a 486 software compatible core with a little internal cache (1KB to 8KB), but without an FPU.

            For 486 PC owners, there were Overdrive CPUs. The Pentium OverDrive (POD) was a real 586 with a 486 pinout.

            Especially in the 486 days, PCs were evolving so fast.
            Sometimes, a PC had become outdated within 6 months!

            That’s why upgrade sockets were installed on that era’s 486 motherboards.

            Anyway, just saying this for sake of completeness.

            XT owners could also use CPU accelerator cards. Like TinyTurbo286.

            They more or less reduced the mainboard to a riser card with RAM.

            However, the 8088 could sometimes be kept in charge.

            Some cards had an 8088 socket and a switch on the metal bracket.

            By toggling it, the user could switch between 8088/V20 and the 80286.

          2. >They more or less reduced the mainboard to a riser card with RAM.

            That’s how a “PC” originally was, and some other “home computers” too. A CPU would sit at the end of a bus that was only meant for connecting peripherals, and that’s all the motherboard and the “base system” were for. Everything else was add-on modules, including the BIOS.

        2. > with Amiga cards, a graphics card from manufacturer A would not offer the same modes or even work with software written for a card from manufacturer B, C, D, etc.

          More to the point, the concept of an add-on graphics card would necessarily override the way the chipset functions, because the whole point of an Amiga was the specific way the different ICs in the chipset on the motherboard would function together.

          Having the add-on would transform the entire architecture to something else.

          1. Hence why this happened:

            >accelerator cards could be wildly different in operation and that bit of cool software your mate had was often tied to one specific make and even model

            They couldn’t standardize the way the accelerator or add-on cards should work, so each add-on that improved the function of the original system would transform the system into a different machine, leading to the software compatibility issues.

        3. “I could be very wrong about this but there never really seemed to be agreed upon standards with Amiga cards, a graphics card from manufacturer A would not offer the same modes or even work with software written for a card from manufacturer B, C, D, etc.”

          No, I don’t think you’re wrong here.
          Apple II and PC were open architectures using off-the-shelf parts; the Amiga was a closed one using custom chips. The modern term would be “proprietary”, maybe.

          And that’s why the Amiga wasn’t accepted by the majority, maybe.

          The Apple II and PC had compatibles and thus were the property of the people.

          They weren’t the best systems, but they meant freedom to the users.

          Same goes for poor home computers like ZX Spectrum, which had numerous compatibles.

          Computers made by Commodore, Atari, Acorn etc. were owned entirely by these manufacturers.

          1. Even so, other companies could have made compatible hardware under license – except for the part where the whole point of the system was revolving around the particular way the custom chips operated and what software hacks had evolved around them.

            To be compatible, the add-on hardware would have to replicate the function of the underlying system as well. A new graphics card would have to emulate the way the old chipset works to be software compatible, but since that also involves things like disk controllers that were mashed in with the graphics functions, you couldn’t then upgrade your disk controller separately.

          2. “Apple II and PC were an open architecture using off-the-shelf parts”

            Yes, they used off-the-shelf parts. No, they were not “open architecture” by any means. They were intentionally proprietary. Yes, lots of data was released in order to make expansion cards and write your own software. That did not make them “open architecture” as we now use the term. However, when other companies attempted to release clones many were immediately squashed. Some PC clone manufacturers obviously survived, despite the legal challenges by IBM, and by the time of the PS/2 era plenty of successful clones were on the market. Tandy was not 100% compatible, but it wasn’t terrible either, and as with others (such as AT&T/ Olivetti, who made the first PC clone I had used) they didn’t actually copy IBM’s IP so attempts to litigate them out of business didn’t succeed. What did IBM do when they lost those battles? Do you remember Micro Channel Architecture? That most certainly was not “open architecture” and it clearly showed IBM’s disdain for open architecture. That required a hefty license fee and non-disclosure agreements to develop compatible hardware. I, and many others, had doubts and stuck with EISA and later VESA local bus. I may not have made winning bets many times, but that was one of them.

        4. The issue with software not working between different models, or after upgrades, was more to do with programmers often hitting the hardware directly and not following Commodore’s guidelines to the letter. An interesting point is Commodore’s own game Mindscape, written for the Amiga 1000, I believe, and written to be compliant with the standards: it worked on every Amiga model made. In relation to graphics cards, Commodore was way too slow to introduce its own retargetable graphics (RTG) standard, which I think they should have done by 1990 for the Amiga 3000 and AmigaOS 2.0, considering that 3rd-party 24-bit graphics cards were just around the corner. Instead two competing standards emerged from private sources, CyberGraphX and Picasso96. I think Commodore were planning native RTG support in AmigaOS by 1995, but they were gone by 1994.

      2. “The machine started out as a sort of game console slash arcade machine that was geared to compete with a Nintendo or a Sega system, so the chipset alone was supposed to be the machine, and it wasn’t designed to be “universal” and extensible. ”

        The description also fits the Sharp X68000, I think.

        Just like the Amiga it was a graphics workstation with arcade hardware.

        Many Japanese game companies used it as a valuable development system for such platforms.

        However, unlike the Amiga, it was rather nicely expandable, supported quality graphics beyond the NTSC standard and used serious 5.25″ floppies.
        It also existed in a 68030 version.

        So the arcade-like nature wasn’t in the way.
        It was the implementation, rather.

        Programmers programmed the bare metal, also, which made any progress difficult.

        On the PC platform, this bare-metal approach originally wasn’t being encouraged, either.

        Programs ported from the CP/M era did still stick to programming guidelines and used proper OS calls.

        On PC and “MS-DOS compatibles” (not PC compatibles), it was originally being recommended to use BIOS and DOS functions, too, rather than talking directly to hardware.

        However, due to the limitations of DOS (no graphics API yet; CP/M had gotten GSX) and the PC (8088 CPU, slow hardware) at the time, programmers started to adopt the home computer coding style and began to bypass any high-level routines.

        That’s why PC/XT era applications and games were so unstable and timing sensitive. They behaved as if the PC was a C64 or something.

        Like with the Amiga, programmers gained a temporary benefit by sacrificing the forward compatibility of their applications.

        It was understandable that it was done this way, but it wasn’t wise.

        We can’t blame them, though, because the 4.77 MHz IBM PC 5150 and PC/XT 5160 were so incredibly weak and slow. They had to use workarounds to make things run.

        Using clean programming simply was too slow on that hardware.
        Ideally, they would have included two code paths or compiled two separate applications (compatible and optimized).
        Communication programs could use FOSSIL drivers, at least.

        IBM seemingly had recognized this whole dead-end situation early on and allowed for Option ROMs that could override BIOS routines and thus allow future hardware expansions to be used.
        That way, IBM wasn’t painting itself into a corner. It could continue to release new standard hardware for a broad user base.

        The VGA BIOS overrides the previous video routines (int 10h), for example.
        Or take SCSI controllers: they can carry an Option ROM that provides int 13h routines.
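
        The mechanism is simple enough to sketch (hypothetical code of mine, but the 55h AAh header is the documented convention): at boot, the BIOS scans segments C000h–EFFFh on 2KB boundaries and far-calls each ROM’s init entry at offset 3, which is how a VGA or SCSI BIOS gets its chance to hook int 10h or int 13h.

            #include <dos.h>
            #include <stdio.h>

            /* Walk the BIOS option-ROM area: headers sit on 2KB boundaries,
               marked 55h AAh, with the ROM's size (in 512-byte units) in the
               third byte and its init entry point at offset 3. */
            void list_option_roms(void)
            {
                unsigned seg;
                for (seg = 0xC000; seg < 0xF000; seg += 0x80) { /* 0x80 paragraphs = 2KB */
                    unsigned char far *rom = (unsigned char far *) MK_FP(seg, 0);
                    if (rom[0] == 0x55 && rom[1] == 0xAA)
                        printf("Option ROM at %04X:0000, %u KB\n", seg,
                               (unsigned)(rom[2] / 2));
                }
            }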

        By the late 80s, when the more powerful 16-bit PC/AT hardware was common, PC applications started to become more flexible and high-level again, just like in the days before the dog-slow PC/XT.

        The Amiga could have had this, too. There were smart mechanisms like Autoconfig, for example.
        But Amiga game programmers especially did not use Kickstart or AmigaDOS+Workbench to access the hardware. Instead, they used booter floppies, even!

        Being so leet and cool and such, they rather fiddled directly with registers and memory addresses.
        Worse than any DOS application ever possibly could!
        That made developing new hardware very difficult.

        Later DOS games of the late 80s/early 90s did at least check environment variables (the BLASTER variable), tried to auto-detect hardware, or used INI/CFG text files.
        They also shipped with separate binaries to make sure users/players weren’t left behind.
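
        For example, the BLASTER convention (SET BLASTER=A220 I5 D1 in AUTOEXEC.BAT) meant detection could be a single getenv() away. A rough sketch of my own, not taken from any real game:

            #include <stdio.h>
            #include <stdlib.h>

            /* Parse the conventional BLASTER variable, e.g. "A220 I5 D1 T3":
               Axxx = I/O port (hex), Ix = IRQ, Dx = DMA channel. */
            int main(void)
            {
                const char *s = getenv("BLASTER");
                long port = 0x220;           /* common defaults */
                int irq = 5, dma = 1;
                for (; s && *s; s++) {
                    if (*s == 'A') port = strtol(s + 1, NULL, 16);
                    if (*s == 'I') irq  = atoi(s + 1);
                    if (*s == 'D') dma  = atoi(s + 1);
                }
                printf("Sound Blaster at %lXh, IRQ %d, DMA %d\n", port, irq, dma);
                return 0;
            }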

        (Speaking under correction)

        1. >The Amiga could have had this, too.

          But it didn’t, because there was THE chipset that provided everything, and the hardware was so homogeneous that programmers simply standardized on the thing they had, creating hacks and tricks that didn’t carry over.

        2. >That’s why PC/XT era applications and games were so unstable and timing sensitive.

          Hence the “turbo” button on later machines – it didn’t overclock your CPU but instead slowed it down, to account for software that relied on the original 8086/8088 timing.

          1. Hm. Back in the day (the 90s) I never thought of the turbo button as a 4.77 MHz 8088 compatibility mode.
            By that time, the PC was a museum piece already.

            I rather saw the turbo button as a mechanism to prevent timing loops in older software from going crazy.

            And to throttle applications and games, if needed.
            Not just XT software, but also older AT software that ran too fast on a high-performance 386/486.

            The turbo button also seemed like a nice tool for developers who wanted to see how their software would run on outdated hardware. Must have been useful for debugging purposes, too.

            In later years (the post-Turbo-XT era), I believe the turbo button no longer changed the system or CPU frequency, but added wait states, disabled caches or changed the FSB speed.

          2. >to throttle applications and games, if needed.

            That’s exactly what it did. It either underclocked the CPU, or turned off the external cache to force the CPU to wait longer for memory, and that made the software run slower. It wasn’t a full-on “compatibility mode”, but simply a speed throttle, and much faster later machines would still run the oldest software too fast when throttled down.

    2. Meanwhile on the PC side, since it was a heterogeneous platform, software and games were designed to work with faster and slower machines, with different resources available (EGA, VGA, SVGA… PC speaker, sound blaster… etc.) so even people with older machines could enjoy newer games – and then pay to upgrade. The Amiga was just stuck doing the same thing.

      1. “The Amiga was just stuck doing the same thing.”

        Yes and no. It depends. It was stuck AND had compatibility issues.
        OCS/ECS/AGA etc. were all the same yet very different.

        Here’s a quote from the game “Beneath a Steel Sky”:

        “At the beginning the programmers were happy and did rejoice at their task, for the Amiga before them did shineth and was full of promise. But then they did look closer and did see’th the awful truth; it’s floppies were tiny and sloweth (rareth was its hard drive). And so small was it’s memory that did at first appear large; queereth also was its configuration(s). Then they did findeth another Amiga, and this was slightly different from the first. Then a third, and this was different again. All different, but not really better, for all were pseudo backward compatible. But, eventually, it did come to pass that Steel Sky was implemented on a 1meg os-legal CBM Amiga. And the programmers looked and saw that it was indeed a miracle. But they were not joyous and instead did weep for nobody knew just what had been done. ”

        https://www.mobygames.com/game/386/beneath-a-steel-sky/trivia/

        So programming for the Amiga wasn’t all sunshine and rainbows.
        The platform could be really mean, too, sometimes.

        And hard drives not being a standard piece of hardware was a PITA.

        PCs had them as standard since the mid-80s or so.
        Even Macintosh users weren’t cheap on HDDs.

        1. I don’t get where the notion comes from that the Amiga was a closed architecture. I don’t think anyone had to purchase a licence to build a Zorro board, for example. The RTG cards pretty much all used off-the-shelf PC or workstation graphics chips. Commodore lagged on RTG support in the OS. The PC had its own issues with incompatible graphics cards, with manufacturers adding additional modes that weren’t supported by others.

    3. No, the problem was with Commodore not actually continuing development of the system.
      The things you mentioned are not deal breakers, not even actual problems if you compare the Amiga design with other systems released at the same time or even later. But it was never actually updated.
      The only real update released was the AGA design, which kept the basic architecture but added some extra features, including increased chipset performance by widening the bus for a subset of operations.

      There were however plans from the hardware designers intended to continue development into a more modern system, including changing CPU architecture into a RISC design and more.
      But the Commodore corporate bosses decided to save money instead. Dumb and shortsighted? For sure.

      1. Yes and no. It was difficult to continue development because of the issues with the architecture. They could have kept adding hack-on-hack to it, but the subsequent versions were all more or less turning it into an entirely different machine, and continuing on that route would simply create backwards and forwards compatibility issues. To really go forward, they would have had to abandon the structure for a more generic PC-like architecture.

        The same problems were hurting all the other systems as well, and they’re a major part of why the PC x86 platform “won”.

        1. In a hardware sense, “abandon”*, but software is a more malleable creature.

          *I think both the commercial world and the retro community have shown “backwards compatibility” can be dealt with.

        2. You deeply misunderstand the Amiga architecture. There were no issues with it per se. Commodore chose a compromise where the CPU uses spare graphics-subsystem cycles and RAM; it worked because the 68000 is very slow and does one memory fetch every 4 CPU cycles (8 bus states), so an 8 MHz 68000 needs only 2 MHz of RAM. This hack is not a defining part of the Amiga architecture, as exemplified by the A3000/4000. Nothing forces you into using chipset RAM. The Amiga 3000/4000 shipped with Fast RAM – in the PC world we just call this RAM. You could totally ship the Amiga OCS chipset/RAM on an ISA card for PCs and call it a Gaming Accelerator, just as you always could, and many did, attach an accelerator card with a different, faster CPU and its own “fast RAM” to any Amiga.

          Commodore simply fired all the original Amiga engineers around 1987, retaining a few as contractors. The Amiga team was “let go” before the Amiga 1000 to 500 cost-optimization redesign. This meant the Amiga never got an HD floppy controller – there was no one left at Commodore capable of upgrading the PLL inside Paula. Buying an A600 in 1992, you received mostly the same 1985 technology.

          1. The OCS chipset WAS the machine at the very beginning, architecture wise. The CPU was added where the cartridge port would have been before they did a 180 with the design in the early stages. The architecture is fundamentally a mash-up of two different systems, and THAT was the defining “feature” of the Amiga.

            When your motherboard is essentially an “accelerator card”, and everyone programs for that particular set of hardware, extending the system on either end would mean the architecture would grow away from the original setup and break compatibility.

            Don’t get hung up on the fast/slow RAM difference.

          2. Well, Paula runs from a 3.5 MHz clock, which is fine for a 500 kbit/s MFM stream (divide by 7) but is a no-go for a 1 Mbit/s one. Moreover, there are not enough DMA slots to transfer the MFM stream to/from Chip memory. It would have been a major redesign: changing the clocking scheme of Paula and stealing two DMA slots from the CPU => Agnus has to be changed accordingly because of the DMAL signaling.

        3. The PC also relies on hacks to keep backward compatibility, and there’s no reason why an updated Amiga couldn’t have done the same. However the PC situation had many problems I don’t think you realize.
          For instance, the Super VGA situation required hacks to fit into the architecture, and there was no standard for how that worked, nor (for a long time) any software standard. This meant software had to ship its own driver support and generally supported only a subset of the many cards on the market.
          The same problem existed for sound cards.
          The Amiga actually had a solution to that already: OS-friendly programming against APIs that abstracted the hardware, which could have been used to transparently change the underlying hardware.

          But there’s no reason an updated Amiga architecture couldn’t have used the solution PC graphics card manufacturers adopted: simply keep an AGA hardware emulator integrated in the new chipset. Yes, that would still not be 100% backwards compatible, but the same applied to the PC.

          1. There is a fundamental difference in how the Amiga puts that legacy stuff at the bottom of the system architecture, so the machine itself has to carry it around to call itself an “Amiga”, whereas on a PC the legacy stuff is in the add-on modules where the legacy functions can be dropped off once they become unnecessary and unused. The machine still remains a “PC” in architecture because the legacy stuff doesn’t define the system architecture itself.

      2. More to the point, with an architecture that was not designed to be modular from the get-go, upgrading the machine was more expensive with expansions that bypassed the previous layers only to duplicate function, and maintaining backwards compatibility with newer machines would mean dragging the legacy stuff along, so subsequent machines would become more expensive.

        I suppose if Amiga had survived the intervening generations, they could have eventually implemented compatibility layers in microcode etc. to emulate all the legacy stuff – but why?

        1. The PC still keeps a lot of compatibility hardware around: things like legacy DMA and the PIT (programmable interval timer), the VGA emulation in video cards, PC speaker support.
          The same applies to the processor; there are a lot of things that aren’t really used anymore but are still kept working. For instance, how often is real mode or 16-bit protected mode used in a modern PC?
          Where’s the difference?

          An updated 68k-compatible processor would probably use microcode, sure. Thing is, unlike the x86 family, there would be less need for it, as mostly only the complex system instructions would warrant it.

          1. Those compatibility layers don’t exist at the core of the system, at the very bottom of the system architecture itself, so you’re not building up a pyramid of cruft.

            Some software still expects video cards to have standard VGA, but it’s not like you have to have VGA and its video ram built into your motherboard’s chipset to have a “PC”, because having VGA is not the same thing as having the OCS of an Amiga. This feature was never a defining thing for a PC, since it was an add-on module from the very beginning.

            I made a longer reply regarding this to your question below.

          2. A “PC” could have EGA, MCGA, VGA… and it would still be a PC, with or without the legacy emulation or the associated hardware. You could skip ahead so that you didn’t need to buy a PC with an EGA card and then add a VGA card to it. You could just buy the latest PC with a VGA card already. The new stuff would emulate the old, but this was becoming cheaper and cheaper with improvements in hardware.

            An original Amiga upgraded with better video would still carry the old OCS along, so you paid for both, because the OCS was at the core of the system and you had to have it to be an “Amiga”. This made it more complex and costly for the company and the users to have small incremental upgrades to keep up with competition in between major generation overhauls. That was the gist of the answer I made below a few posts down the line.

      3. This is the first correct answer in the comments. Going back to my actual experience of the period, this is what actually happened:

        1. In the 70s, the microcomputer market started up: there were the CP/M S-100 bus business computers, which business people bought because they were dull. But in essence they were like PCs, except no one company really owned the platform.
        2. The home-micro market thrived with dozens of proprietary designs. Importantly, they leveraged bespoke technology and designs to provide better capabilities at far lower prices (by several factors) than the CP/M machines.
        3. The IBM PC arrived in 1981. By the standards of the day it was dull, boring, slow, expensive and average. It was barely faster than a 4MHz Z80 CP/M machine. However, “No one ever got fired for buying IBM”, so business people bought it in droves, because it required no thinking to justify buying one. It was a safe decision.
        4. Because the IBM PC was chronically slow, PC programmers bypassed all the proper DOS APIs (which were inadequate anyway) and programmed to the metal. Standards were terrible; all the video cards, the virtually non-existent sound cards, serial cards and memory cards had to be basically 100% compatible at a hardware level or everything would crash. All the standards were de-facto standards defined by whatever company sold the most. It was as bad as could be. Importantly, though, it created a massive barrier for porting software to other potential platforms.
        5. At around the time of the IBM AT (the first 286), the home computer market started to produce 16-bit computers. Mostly they were based on the 16/32-bit 68000, which was a far better architecture. All of these computers had GUIs when PCs were text-only, hugely better in terms of sound, graphics, speed and much cheaper.
        6. The Business world didn’t give a toss and refused to support those platforms, because the PC was the standard. It wouldn’t have mattered if they were 10x or 100x faster and better than PCs, because they didn’t care: “No-one got fired for buying IBM”.
        7. Because the Atari and Amiga didn’t get the support they needed, they struggled except in more niche markets: games, graphics, audio.
        8. Because the PC had all the cash, effort and support, it slowly started to catch up with these other platforms, even though it remained far more expensive for similar capabilities. Just think about how poor and weedy the early AdLib sound cards were: whiny 2-op FM tosh vs. sampled 4-channel audio on an Amiga.
        9. Eventually the other platforms were starved into obsolescence.

        And that’s it really. The IBM PC standard held up computing by at least a decade while it starved competitors and eventually caught up. Good grief, they didn’t even have a mainstream 32-bit operating system until 1995, a full 10 years after the 80386 appeared! PC users of course had access to OS/2 2.x, but that wasn’t by Microsoft, so *they* *didn’t* *care*.

        It’s not the openness of PCs which meant they won. The example of the early home computer market showed that other platforms could deliver more, and more cheaply. If they *could* have competed, that is, if business people hadn’t been so dogmatic about the IBM PC ‘standard’, which in reality was just another proprietary platform like all the others, then these types of computers would have still stayed ahead.

        The whole mantra about PCs being easily updatable and expandable is just a post-hoc rationalisation based on the period after PCs had already caught up. At the time, the whole principle was that expansion cards were 100% hardware compatible with each other. There was little innovation; it was compatibility that drove dominance.

        1. You identified the same sticking points that were holding back the home computer market: proprietary bespoke hardware, and programmers bypassing high-level functions to standardize on particular chip resources.

          The PC was able to grow out of that by being modular and not having such low level hardware resources as the “core” of the system, while the other machines got stuck there because they were leveraging these hacks to get more performance out of their outmoded hardware architectures.

          >Standards were terrible; all the video cards; the virtually non-existent sound cards; serial cards; memory cards had to all be basically 100% compatible at a hardware level or everything would crash.

          And yet this is why they were able to improve and take over. They were crap at the beginning, but you could make them better because they had to be compatible, instead of every piece of hardware being its own unique snowflake with a balkanized user base.

        2. This statement seems to be a bit biased.

          The PC being an open architecture surely was important.

          It’s not too different to nowadays open-source and open-hardware movements.

          That’s why Apple II and various CP/M era systems (Altair, IMSAI, S100 bus, ECB bus) had been adopted by industry.
          (The IBM PC had adopted a similar slot design to the Apple II.)

          By comparison, Atari and Commodore “CBM” were proprietary and unpredictable.
          They changed course so often.

          Business, industry, science/research etc. rather favored something future-proof.

          That’s why DOS mattered, it took the role of CP/M.

          And like with CP/M, there wasn’t just one version of it.

          There were multiple software houses which created DOSes or DOS compatible OSes.

          And like with Digital Research CP/M v1.2 API/ABI being popular, most third-party DOSes were MS-DOS 2 API/ABI compatible.

          That means the platform was vendor-independent all along, in theory.

          Which, a few years later, turned out to be true after IBM lost the lead.
          IBM’s VGA was the last notable creation that caught on.

    4. Actually, it was 1985 when the Amiga’s fathers decided to include custom chips to speed things up and offload the CPU from moving large chunks of graphics memory, in favor of the Blitter. Remind me please, which platform had Scala MM running on most TV stations from the mid-eighties to the mid-nineties? Ah yes, the Amiga. Why? Because PCs of that time were either scrolling text or servicing interrupts to move the mouse pointer xd.

      Openness of the i386 wasn’t its selling point till the mid 90s, because the selection of cards that could be plugged in varied from shitty ones to crappy ones and cost much more compared to these dumb Amigas.

      And the argument about ‘slow and fast ram’ being an architecture flaw is at least funny, given that the holy PCs had base 640K, XMS, EMS, high memory and probably one other which I forgot.

      Amigas were never supposed to be ‘upgradable’; keep in mind that we’re discussing a 1985 platform whose competitors were the same form factor of keyboard plugged straight into a TV.

      The upgradable lines of Amigas were the 2000, 3000 and 4000, designed from the start as open architectures with Zorro slots for expansion cards. And I don’t recall stories where a Video Toaster or Picasso turned the whole unit into a bootstrap for another computer.

      1. The argument was rather about having a hardware compromise that forced two memory buses.

        In an Amiga, the entire motherboard was a do-it-all system much like the other machines that were designed as just keyboards under a TV. The CPU and its separate RAM, and its “expandability”, was just like a goiter growing on the neck of the architecture – it was a bodge rather than a clever design feature.

        1. So did the PC with main memory and video graphics memory?
          Some soundcards like the Gravis Ultrasound added sound card memory to that.

          I really don’t think you thought this through enough.
          There’s no reason the Amiga couldn’t have been made fully upgradable, but also no real need for it to be a competitive product.

          Also you could look at the DraCo systems to see what an Amiga like system without the Amiga chipset could look like.

          1. >There’s no reason the Amiga couldn’t have made fully upgradable

            That wasn’t my claim. You could bolt all sorts of things to it, and people did.

            The point is that upgrading an Amiga changed the whole layout to the point that it wouldn’t be the same “Amiga”. It could be expanded, but the two systems paradigm made your upgrades sit on either side of the divide and that made things increasingly difficult to program and maintain backwards and forwards compatibility with software.

          2. Think of a modern cheap PC. It has a motherboard with IGP, sound, disk controller etc. all on the same fully integrated chipset. This looks kinda like what the Amiga had. All the software has to do in order to access e.g. a dedicated GPU card instead of the IGP is to call a different bus address and ignore that the IGP even exists. The IGP is so cheap that you don’t mind having it even if you never use it.

            Upgrading an Amiga back in the day meant the same thing. The new video card would have its own video output, duplicating the function of the OCS. However, the hardware wasn’t cheap then, so you did mind that you had to pay for something that you didn’t use, not to mention the fact that your old software would still address the remaining OCS parts and not get video output from your new card.

            On a PC back then, the video card was already a separate module that you could swap out or buy the better video card from the start, so upgrading the system was cheaper for both the manufacturer AND the user/owner. The software didn’t have to change to use the new video card – so long as it provided the same standard video modes. Since the motherboard of a PC was only a bus controller, you could change each and every module without duplicating cost and adding complexity. Upgrading a PC meant just having a different PC that would be more or less functionally the same as the original, providing a smoother upgrade path towards the next generations.

            Upgrading an Amiga meant bolting stuff on top of the existing hardware without removing the original. With subsequent upgrades dragging that legacy along, it would snowball in complexity and cost until the next generation took all that junk and integrated it into a new chipset, and then the snowball would start growing again. This is what made it more difficult for the company and the users to have small incremental improvements on the basic system and keep up with the competition.

    5. That’s not actually true. Back in the Amiga days, I upgraded my A500 to a 68010 and, apart from a very small number of games, everything worked perfectly. My A1200 gained a GVP1230+ (a 68EC030 and even a 68882 FPU, which did almost nothing), then got transplanted into a tower case with a Zorro II breakout board and fitted with a 386SX Bridgeboard. Then I went serious and bought an A3000, popped in a 68060 (Cyberstorm) and a Cybervision 64/3D graphics card, and everything worked. I even added a Siamese card (basically a fancy KVM switch) so I could use my P60 and A3000 with a single mouse, monitor and keyboard.
      What killed the Amiga was the cost of the upgrades and the fact that so few people did them, so software would nearly always be written for the base A500/A1200. The pre-emptive multitasking was brilliant, but with no memory protection a badly written program would throw up a “guru error” and take down the system. Add to that the lack of regular new machines from Commodore, since the existing ones were selling so well they just wanted the cash to flow in.
      It remains one of my favourite personal computers.

      1. >that software would nearly always be written for the base A500/A1200

        Because an “Amiga” would always carry that along. Otherwise it wouldn’t be an Amiga. Targeting the platform would mean targeting the lowest common denominator instead of whatever weird add-on you had.

        What made it difficult to upgrade was the fact that you couldn’t just separate single functions away from the original chipset without changing the software that was written for the OCS, so the upgrades couldn’t carry that legacy with them. They stacked up on the original system. If any of the expansions should have become “standard”, the company itself would have had to roll that stuff into the next generation of the base system to steer it that way, excluding other options from the new lowest common denominator. On the PC side, that was a task for the add-on manufacturers to provide compatibility with the stuff that people were still using, so it was no headache for the companies that sold PCs. They would just pick from the available hardware modules and build a PC that met what the market wanted without having to choose and engineer that stuff into the base system itself.

  2. “These machines gave the best bang for the buck in those days with their impressive multimedia capabilities, and they gained a fervent following which persists to this day. ”

    Still pricey and not as available at the time.

  3. The videos of excellent talks by Joe Decuir provide insight to the design process and decisions.
    The whole Amiga was a hack fest enabling some stuff nearly impossible at the price level at that time, e.g. the Boing ball demo was not a rotating and bouncing object, but several clever tricks using the unique hardware to make it seem like one. When the commodity PC hardware prices dropped, the modularity became more of an advantage and it was over for Amiga.
    But the things that clever programmers and artists created with Amiga were just wonderful and I am thankful for it.

    1. Precisely. They pushed the old school chips to the breaking point with hacks like Hold-And-Modify to whip the system into doing cool stuff, but relying on such hacks also drove them into a corner and they just couldn’t get out without a complete paradigm shift.

      1. The IBM CGA board supported TVs, too (RGB/Composite).
        Even in 640×200 resolution. ;)

        In the early 90s, VGA to SCART cables appeared. By using a DOS TSR, the VGA could be reprogrammed for TV timings.
        VGA was very flexible. :)

        1. I admit I don’t know. There were other, similar devices available overseas, I believe. 🤷‍♂️

          The company’s popular DigiView was indeed available in both a PAL and NTSC version, at least. 👍

          1. “ECS Amigas have an (one!) oscillator either clocking at 28.375MHz for PAL or 28.636MHz for NTSC versions. The oscillator is soldered in and cannot be changed in software. Pressing both mouse buttons on an ECS Amiga when booting will re-initialize the chipset to the “opposite” mode within the limits of the system clock producing a “somewhat NTSC or PAL signal”.”

          2. No, there was never a Video Toaster in PAL. It’s impossible due to the way it works directly on the low-level NTSC video signal. People using Toasters in PAL land had to pump video through NTSC/PAL converters both ways, and this 50/60Hz conversion didn’t do the signal quality any favors.

          3. > there was never Video Toaster in PAL

            I believe they used the underlying Amiga system for sync with video output, so if the Amiga was clocked to generate PAL signals, the video toaster would too.

          4. >I believe they used the underlying Amiga system for sync with video output, so if the Amiga was clocked to generate PAL signals

            Just changing the Amiga clock will not help; the Toaster uses its own source. Even changing the Toaster clock won’t help, because the Toaster is intimately tied to NTSC encoding. The Toaster doesn’t decode video – there is no RGB, no YPbPr, no pixels, just raw RF samples. First a PLL syncs to the color burst, then the NTSC signal is sampled directly, then operations happen on those samples – amplitude for luminance, phase for hue. PAL encodes color differently; it’s in the name, “Phase Alternating Line”, and NewTek never bothered to redesign for it.

          5. I stand corrected, then.

            What were they using, though? There were Danish TV shows like Hugo that had graphics made in Deluxe Paint on the Amiga, and the picture overlay was provided by an Amiga 3000 using what I can only guess was a Video Toaster. The gist of the show was a video game you would play live on TV using touch tones on a telephone.

          6. An external genlock box allows one to put Amiga graphics together with another video source on the same screen. The Toaster did much more than just genlocking; it provided effects, cuts and transitions for linear video editing. Being Danish TV, they used a separate mixer for effects.

    1. How is Apple “too expensive”? If it was “too expensive” no-one would buy their products, but in fact millions of people do. Surely, all you mean is that they’re “more expensive” than you want to pay; you simply don’t see the value in the product for your needs or wants? That’s fine, because that’s how it works with all other products too, but no-one would ever say a Porsche is “too expensive” and therefore everyone should buy an Opel Corsa.

      1. Apple products are poorly built, overpriced maintenance nightmares.
        Comparing them to German cars is just cruel.
        They’re not THAT good at extracting money from the customer in daily use. I’m sure Cupertino is working hard to catch up with Wolfsburg.

  4. Back in the 80’s, personal computers were VERY EXPENSIVE. In 1985 a basic Amiga was $1,300, plus RGB monitor and software. The box alone is $3,700 in today’s money, and that at a time when an engineer’s wage averaged $35K/yr and a tech’s $23K/yr! It was a huge chunk of cash for a household, many months’ wages. Really too much, and the recession was coming…

    Timing – the computing bubble had popped, and there was a severe industry slowdown in Silicon Valley. Apple laid off 20% of their staff. Semiconductor sales dropped 30%. Hours were cut back. It left everyone quite worried.
    ref. “Slowdown in Silicon Valley #2” Computer Chronicles #2/85 https://archive.org/details/Slowdown1985_2
    The Amiga was released at the worst time as well.

    1. “Back in the 80’s, personal computers were VERY EXPENSIVE.”

      Everything was expensive. Technology, I mean.

      Anyway, it didn’t matter.
      Business users could get tax refunds when buying PCs, no big deal.
      A PC was dirt cheap compared to a minicomputer or mainframe.
      Many people seem to forget that.

      And home users, you may ask?
      They had their ZX81/Timex 1000, VIC20 and other game computers.
      They didn’t need to lay their hands on PCs in the first place.

    2. Consider the upgrade path between major generations of machines.

      Within the same generation of machines, a PC could have EGA graphics in the beginning, and VGA graphics later on. An Amiga _could_ be upgraded with accelerator cards, but you still had to buy the base system with the OCS and pay for both through the whole span of the generation.

      Then the next generation of Amiga machines could roll the upgrades into the chipset, but they had to choose which features to “standardize” on, whereas on a PC you could still have the (now cheaper) EGA card if you didn’t need the graphics, or the SVGA card if you wanted it. The PC wasn’t defined as a platform by what features it offered, so it was easier, smoother and cheaper, to keep up with progress.

  5. I forgot to mention the Atari 520ST was out in ’85 for $800 – much less than the Amiga.
    The computing and gaming crash back then was absolutely huge. Then the recession came and nobody could afford much. I lived it back then.

  6. Yup, when I got the 1200, upgrading from a 500, it felt like they had cut too much cost. It should have had a 68030 and 4 megs in it. But even then, it was clear to me that the CPU arch was a dead end. The PC video cards were pulling ahead, and it was easy to upgrade a PC bit by bit. I never looked back when I got my 486 system two years later. Loved the easy coding on the Amiga, but she was just a bit too slow for me.

  7. Doom did kill the Amiga.

    Doom laid the fatal blow, Commodore refused to resuscitate it.

    At least for me. I was a very faithful Amiga power user, even ran my own BBS, etc. With a heavy heart I never saw a reason to buy another Amiga unless it could provide me with a similar experience to Doom on the 486.

    Amiga never delivered, I never bought Amiga hardware again.

    This wasn’t a hardware problem, it was a problem of corporate greed. The people running the company weren’t there to make a great computer anymore, they robbed the company and took off to the Bahamas.

    1. I had only a single brief experience with an Amiga and found that I detested it. I hoped I never had to work on it again, and that is precisely how it turned out. It was simply not for me.

    2. “Doom did kill the Amiga.”

      Me, too.

      The rise of shooters (“Ballerspiele”) like Doom or Wolf3D did put an end to point&click adventures and jump&runs.

      To me, Doom and its spin-offs marked the end to my childhood (as far as video games are concerned).

      It was no longer about exploration and innocent fun, but about murdering on a TV screen. Sorry if I’m being a bit harsh here, but it is how it is.

      The N64 and Playstation were the final nails in the coffin of the 2D 16-Bit era.
      Rough 3D models had now replaced pixel art.

      That’s when I knew my era of gaming was over. Thanks, Doom.

    1. +1

      That’s a good point, indeed.

      Commodore software was traded in the school yard, literally.

      The whole cracking scene/demo scene originated on the Commodore platforms, too.

      Personally, I can’t recall ever seeing a cracktro on PC software, at least.

      Not that piracy didn’t exist on the PC, it did. Massively.
      But it was done more subtly, not in public.
      There wasn’t the same kind of stigma as with the Commodore platforms.

      So maybe former Commodore users shouldn’t always blame the Commodore company for everything, but also think a bit about their own guilt and take some responsibility?

      1. >But it was being done more subtle, not in the public.

        How do you reckon? I remember everyone in school trading PC games on diskettes.

        But on the other hand, the PC had the whole shareware culture going on, where you were supposed to share it and then mail in for the codes to unlock the full game, or for that one missing file that would add levels or something – which were then instantly leaked anyway.

      2. >Personally, I can’t recall ever seeing a cracktro on PC software, at least.

        That’s because the DRM was different and didn’t rely on diskette-based copy protection methods. Instead you had keygens for games that relied on activation keys, or you simply copied over the missing files that were omitted from shareware games to make them into full versions. Later, when CDs were used, you had no-CD crack executables, and those had the intros and animations.

        1. And there wasn’t really any point in doing cool intros on PCs, because it wasn’t difficult to make a cool plasma effect or flaming scrolling text across the screen. The PC had enough computing power to just do it.
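
          For the curious, a bare-bones plasma really is just a couple of summed sine waves per pixel (a from-memory sketch, nothing from an actual intro):

              #include <dos.h>
              #include <math.h>

              /* Bare-bones mode 13h plasma: two sine waves per pixel, written
                 straight to VGA memory at A000:0000. Real intros precomputed
                 lookup tables instead of calling sin() per pixel. */
              int main(void)
              {
                  unsigned char far *vga = (unsigned char far *) MK_FP(0xA000, 0);
                  union REGS r;
                  int x, y, t;

                  r.x.ax = 0x0013;              /* INT 10h: 320x200, 256 colors */
                  int86(0x10, &r, &r);

                  for (t = 0; t < 300; t++) {   /* a few seconds of animation */
                      unsigned char far *p = vga;
                      for (y = 0; y < 200; y++)
                          for (x = 0; x < 320; x++)
                              *p++ = (unsigned char)(128
                                  + 127 * sin(x / 16.0 + t / 8.0)
                                        * sin(y / 16.0 + t / 11.0));
                  }

                  r.x.ax = 0x0003;              /* back to 80x25 text mode */
                  int86(0x10, &r, &r);
                  return 0;
              }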

        2. “That’s because the DRM was different and didn’t rely on diskette based copy protection methods. ”

          Strictly speaking, Sierra On-Line games (Larry, King’s Quest etc.) had disk-based copy protection at some point.
          They used defective sectors a PC floppy drive couldn’t replicate so easily.

          But I get what you’re saying.
          They quickly switched over to having the user recite certain words from the manuals, or used code disks.
          Monkey Island and Star Flight come to mind…

          What I meant was the scene culture in the DOS/PC and Amiga days.
          There were crackers on the PC platform, but they weren’t trying to get popular so much.
          The cracked games/applications didn’t come with all those bells and whistles.

        3. “Later when CDs were used, you had no-CD cracking executables, and those had the intros and animations.”

          Makes sense. I assume most Amiga coders had switched over to modern 486 PCs with an SB Pro or Gravis Ultrasound by then.

          The CD-ROM and the sound card were related, in a way.
          Many sound cards had a CD-ROM interface by the time 16-bit audio appeared (GUS, PAS16, SB16, AWE, WSS etc.).

          That’s about when good MOD trackers got popular on PC, too.

          1. But then, when people were copying the games between themselves, they would copy the already-cracked games on diskettes and skip the part where you have the intro, so most people never got exposed to it.

            The scene evolved so that the intros did not matter to the wider public, only to the people who were directly involved with hacking the games.

  8. I think there is a combination of a few reasons why the Amiga died out in the 90s.
    1> Commodore management screwed up; yeah, took off to the Bahamas… plus they didn’t advertise enough…
    2> Doom helped make the PC look better at gaming than the Amiga. The Amiga had planar graphics and didn’t get chunky graphics till later on, by which time it was too late. I eventually went to PC in 1996 from my A500.
    As soon as DOOM came out it displaced the Amiga as the king of gaming. Before that, on the PC you just had games like Kings Quest and Wing Commander… that were somewhat decent. Funny, I noticed they are doing Doom-like raycasting 2.5D games on the Amiga recently with some tricks.
    3> There was a CPU transition from Motorola 68k to PowerPC; all I noticed was that PowerPC systems were always expensive. It is too bad the 68k CPUs weren’t extended the way Intel’s x86 was, through the Pentium to the i-series cores.
    4> Like someone else also said on here, the variety of card manufacturers helped the PC a lot with upgrading; that led to cheaper upgrades. Amiga or Apple was always a bit more expensive.

    But at the time the Amiga was quite a decent computer; it’s a shame what happened. There are some videos and text files about why the Commodore Amiga died if you want more info on this…

  9. In addition to the many solid comments noted above, one of the things that helped the Amiga platform rise up so quickly also helped lead to its downfall… and that is software piracy. I think piracy ultimately hurt the Amiga way more than it did the PC, since the PC had plenty of paying business customers to keep the platform alive and thriving.

    1. “2> Doom helped make the PC look better at gaming than the Amiga. The Amiga had planar graphics and didn’t get chunky graphics till later on, by which time it was too late. I eventually went to PC in 1996 from my A500.
      As soon as DOOM came out it displaced the Amiga as the king of gaming. Before that, on the PC you just had games like Kings Quest and Wing Commander… that were somewhat decent. Funny, I noticed they are doing Doom-like raycasting 2.5D games on the Amiga recently with some tricks.”

      Depends. You may be thinking of the PC’s infamous mode 13h at 320×200 (256c).
      Which strictly speaking isn’t VGA, but MCGA. Unless modified (see Mode X/Y; the MCGA circuit had a fixed palette/resolution).

      But that’s not all. *Real* VGA was mode 12h at 640×480 (16c), and it was planar.
      It’s Standard VGA, so to say.

      That’s what ambitious games used. Like SimCity or MS Flight Simulator. Or chess games, Star Trek simulators etc.

      Or text adventure games with graphics, like the famous Gateway series.
      – They even supported Super VGA in 800×600, making mode 13h look like children’s stuff.

      Same goes for JRPGs like Knights of Xentar or Mad Paradox.
      Or Japanese titles in general. Rusty, Princess Maker 2, Seasons of the Sakura..

      Just about everything related to simulation or writing used at least full EGA resolution (640×350).

      For this category of games, mode 13h wasn’t enough.

      But who cares, it’s all about killer games these days.
      No one plays chess or card games anymore. *sigh*

      Anyway, just saying. I’m in the minority here, likely. 😒

      1. Mode 13h had several advantages. One was that you had one byte per pixel in linear memory, so it was dead easy to address pixels in a memory buffer and then throw the whole thing at video memory with a tiny bit of assembly code. It was fast to the point that even games written in QBASIC would run at decent frame rates.

        Standard VGA mode 12h had you addressing the screen through four bit planes sharing the same addresses, with each byte covering eight pixels and a pixel’s four colour bits spread across the planes. That made sprites and frame buffering more difficult, and it wasn’t suited to fast-paced games until machines got faster.
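        To make the difference concrete, here’s a minimal sketch of a pixel write in each mode, assuming Borland-style real-mode C (MK_FP and outportb come from dos.h; nothing here is from any particular game):

          #include <dos.h>   /* MK_FP, outportb - Borland/Turbo C, real-mode DOS assumed */

          /* Mode 13h: 320x200x256, one byte per pixel, linear at A000:0000. */
          void putpixel_13h(int x, int y, unsigned char color)
          {
              unsigned char far *vram = (unsigned char far *)MK_FP(0xA000, 0);
              vram[y * 320 + x] = color;
          }

          /* Mode 12h: 640x480x16, planar. A pixel's four colour bits live in
             four planes, so we go through the VGA set/reset and bit-mask
             registers of the Graphics Controller (ports 3CEh/3CFh). */
          void putpixel_12h(int x, int y, unsigned char color)
          {
              volatile unsigned char far *vram =
                  (volatile unsigned char far *)MK_FP(0xA000, 0);
              unsigned offset = (unsigned)y * 80 + (x >> 3);        /* 80 bytes/line */

              outportb(0x3CE, 0); outportb(0x3CF, color);           /* set/reset = colour   */
              outportb(0x3CE, 1); outportb(0x3CF, 0x0F);            /* enable on all planes */
              outportb(0x3CE, 8); outportb(0x3CF, 0x80 >> (x & 7)); /* mask: one pixel      */

              (void)vram[offset];   /* dummy read loads the latches */
              vram[offset] = 0;     /* data comes from set/reset, not this value */

              outportb(0x3CE, 1); outportb(0x3CF, 0x00);            /* restore defaults */
              outportb(0x3CE, 8); outportb(0x3CF, 0xFF);
          }

        One array store versus half a dozen port writes per pixel: that gap is a big part of why fast action games gravitated to mode 13h.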

        1. Yes, it was convenient that each pixel was one byte, or 8 bits (= 256 colour entries).

          The whole 320×200 frame is 64,000 bytes, so it fit in a single 64 KiB 8086 segment without needing bank switching or segment arithmetic.

          That was very real-mode friendly indeed (QB45 could switch segments with DEF SEG, but it wasn’t very comfortable).

          Anyhow, the problem with mode 13h was its very low resolution and its 200-line limit.

          The “PS/2 graphics” (as MCGA was once called) looked right on IBM PS/2 hardware of the time.

          Those monitors had a 0.4 mm dot pitch, not much different from a Commodore monitor of the same era (the Commodore 1084).

          By the 90s, though, a 0.2 mm dot pitch was common among good monitors, making low-res games look pixelated and ugly.

          The result is our modern retro scene, with all its pixel fetish.

          In the 80s and early 90s, this wasn’t a thing yet.

          Games were played on cheap colour monitors and TVs, not RGB arcade monitors or CAD monitors.

          Back then, games looked just normal, perhaps a bit blurry.

          The same goes for early PC and Amiga games played on period-correct video monitors and early MCGA/VGA monitors (those 13″ to 15″ models with mechanical adjustment knobs rather than on-screen displays).

          1. “Weren’t the pixels also not square? The aspect ratio was off, or the picture was squished on the monitor as I remember.”

            Good point. On a 0.4 mm mask you didn’t notice, though.
            The big coloured pixels blended together instead, which prevented visible pixelation.

            That was the original intention of the PS/2 monitors from 1987, even.

            They were meant to display smooth, pretty graphics – photos – in 320×200 at 256 colours.

            Not to display pixelated graphics.
            But that’s exactly what the sharp VGA monitors of the 90s do!

            Playing “Mean Streets” in the late 80s was a much better experience than in 1995.

            Found this on the internet:

            “DESCRIPTION
            The IBM 8512 has been designed as a balance between the need for addressability and the ability to blend colors for near image-quality application output.
            The Model 001 has the following features:
            o 0.41 mm phosphor stripe format
            o 31.5KHz horizontal scan frequency non-interlaced
            o Etched screen for low glare
            o 320, 640, and 720 dot lines of horizontal addressability
            o 350, 400, and 480 lines of vertical addressability
            o 200 line modes run double scanned (400 lines) for improved solidity of lines, characters and shapes
            o 256K color capability
            o A 4:3 horizontal to vertical data area aspect ratio, which allows for the creation of more realistic picture images”

            Source: https://groups.google.com/g/comp.sys.ibm.ps2.hardware/c/ag96mTnYyqo

            Picture of an IBM PS/2 with original PS/2 monitor:

            https://antnik.wordpress.com/2015/07/19/ibm-ps2-77i/

      1. That’s an exception; “piracy” like this didn’t really exist when Mr. G. wrote the letter. And there’s an explanation for it!

        In the 70s, computer hobbyists were pioneers; there was no ready-to-use software yet.
        Software wasn’t being sold yet.

        The whole microcomputer scene consisted of electronics hobbyists, hams, university students, etc.

        In short, it was all hobbyist-grade and non-commercial at the time.

        The only things really “sold” were electronics kits, which are an exception – because they weren’t actual products, rather a set of loose parts in a plastic bag.

        Software was shared, just like scientific research papers or learning materials.
        That was still just normal in the 70s.

        Here’s a quote from Wikipedia:

        “Software was not considered copyrightable before the 1974 US Commission on New Technological Uses of Copyrighted Works (CONTU) decided that ‘computer programs, to the extent that they embody an author’s original creation, are proper subject matter of copyright’.”

        https://en.wikipedia.org/wiki/History_of_free_and_open-source_software

  10. I think the Amiga was already dead before Doom arrived – dead before Wolfenstein 3D, even. It died to the VGA graphics cards that became the default on business machines. It died to an Intel CPU that could do a 32-bit multiply in 3 cycles. It died to John Carmack, who was inspired by this crazy column graphics mode the PC could do (Mode X: 320×240, square pixels) to make Wolfenstein 3D – something that could have looked awesome on the Amiga years earlier, had anyone had the idea then. (There’s a sketch of the Mode X trick below.)

    The Amiga had some great features and it’s sad that it lost out: the Copper, video memory shared with the CPU (a Commodore concept), and library structures that prefigured C++ (arguably better than C++ vtables).
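    For the curious, the “crazy column graphics mode” works by unchaining VGA memory so that four pixels share each byte address, with the Sequencer’s map mask picking the plane. A minimal sketch, under the same Borland-style real-mode C assumptions as the mode 13h examples above (the mode-setting register dance itself is omitted):

      #include <dos.h>   /* MK_FP, outportb - Borland/Turbo C assumed */

      /* Mode X (320x240, unchained): four pixels share each byte address;
         the Sequencer map mask (index 2 at port 3C4h) selects which of
         the four planes a write lands in. */
      void putpixel_modex(int x, int y, unsigned char color)
      {
          unsigned char far *vram = (unsigned char far *)MK_FP(0xA000, 0);

          outportb(0x3C4, 2);              /* Sequencer index: map mask */
          outportb(0x3C5, 1 << (x & 3));   /* plane = x mod 4           */
          vram[y * 80 + (x >> 2)] = color; /* 320/4 = 80 bytes per row  */
      }

    Fix the map mask once and you can paint an entire vertical column with plain byte stores 80 bytes apart – exactly the access pattern a Wolfenstein-style column renderer wants.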

    1. The question is: dead to whom?
      Home users or non-home users?

      I’m asking this question, because in the comments sections around the globe it’s usually all about games.

      But only very seldom do users from other fields share their memories.

      So was the Amiga all about low-resolution games?
      Who are/were the other user types?
      Was there ever more than Deluxe Paint and TV gen-lock stuff?

      1. The home music scene was big on the Amiga because it was cheaper than having real samplers and MIDI-controlled synths.

        But then again, most of the tracker music was so low in quality, thanks to the system’s limited resources, that it never quite amounted to “real music”. It was more versatile than the bleep-bloop chiptunes of a C64, but really rough in terms of sample-based music and hardware synths, so it was rarely used for anything other than making tons and tons of crappy techno in the demo scene. Then the PC overtook the Amiga on that front as well, with more memory for better samples, more channels, better sound cards, etc.
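        The “limited resources” were quite concrete: Paula had four 8-bit DMA channels, and a note’s playback rate is simply the Paula clock divided by the MOD “period” value. A quick back-of-the-envelope check (the constants are the standard PAL figures; nothing here beyond the arithmetic):

          #include <stdio.h>

          int main(void)
          {
              const double PAL_CLOCK = 3546895.0;  /* Paula timing clock, Hz (PAL) */
              int period = 214;                    /* ProTracker note C-3 */

              /* ~16574 Hz at 8 bits: well short of CD quality */
              printf("playback rate: %.0f Hz\n", PAL_CLOCK / period);
              return 0;
          }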

      2. You’re right. There was the Video Toaster that gave us Babylon 5. There was the BattleTech arcade setup, with four networked pods.

        I think my point was that the business machines started to encroach on the Amiga’s media capabilities, maybe by eyeballing the Mac. And then the Amiga didn’t make the leap into business productivity.

        You could justify spending the money on a PC to be productive, maybe enjoy some games as well.

        1. The Video Toaster was used at first, but not for the majority of the production run: they used it for the pilot, and the show was afterwards rendered on PC and SGI machines for better production quality.

  11. Non-x86 systems, even ones with superior graphics and sound like the Atari ST and the Amiga, inevitably lost out to the x86 PC clones with their EXPANSION SLOTS, thanks to the sound and graphics expansion cards developed for the PC and the huge software library that came with its huge installed base. The Atari ST and the Amiga, incompatible with PC software and not even compatible with each other, were thereby absolutely doomed.

  12. I had an Amiga 1000 with all the bells and whistles. I even had an 8 MB RAM module with battery backup and a PLL for the clock that drove memory refresh. I could load the entire OS into RAM, turn off the computer, turn it back on, boot from RAM, and be fully booted before my finger left the power switch. It was a nice machine. The video capabilities were groundbreaking.

  13. AGA should have been started earlier, not delayed, and should have had chunky pixels. Then the Amiga could easily have survived until the PlayStation. The Amiga was cheap; Commodore failed to cross the boundary into making expensive computers for the consumer market.

  14. The only thing that killed Amiga was the cost.

    Windows succeeded because it ran on a variety of hardware of varying age and quality, and its DRM was comically pathetic, so entire neighbourhoods would just share the same product key/disks.

    You could use your old machine, your old programs, your old display, your old keyboard but still enter the future.

    Amiga cost a ton and stuff wasn’t compatible. They wanted you to buy a new Amiga with each update and it only worked on particular hardware (for the average person anyway).

    You could write a huge article, but really that’s it. They just cost too much and you couldn’t easily pirate it. The end.

  15. When id opened the Doom source code, the Amiga ports ran fine on a modest 68030 system. If id had wanted to at the time, they could have released an official Amiga version. It probably would have sold a lot of accelerator cards, too.
