Digital Images And The Amiga

There was a time in the late 80s and early 90s when the Amiga was the standard for computer graphics. Remember SeaQuest? That was an Amiga. The intro to Better Call Saul? That's purposefully crappy, to look like it came out of an Amiga. When it comes to the Amiga and video, the first thing that comes to mind is the Video Toaster, the hardware and software that turned an Amiga 2000 into a video production suite. But digital graphics, images, and video on the Amiga were so much more than the Video Toaster, and at this year's Vintage Computer Festival East, [Bill] and [Anthony] demonstrated what else the Amiga could do.

Today, getting an image onto a computer is as simple as taking a picture with a smartphone. Digital cameras were rare as hen's teeth in the late 80s and early 90s, though, so the options for putting digital stills on a screen were a bit weirder: scanners, capture cards, and bizarre video setups. Full-page flatbed scanners cost a small fortune in the bad old days, so the most common way to get pictures onto a computer was a strange class of devices built around four-inch-wide linear CCDs. These were hand-held digital scanners, a truly awful technology that deserves to be forgotten.

These handheld scanners had a single linear CCD and a small 'one-dimensional mouse' (a roller that tracked how far the scanner had been dragged) on the underside. Open up the included software, drag the scanner across an image, and eventually a picture would appear on the screen. These handheld scanners rarely worked well, and because no hand drags a sensor at a perfectly even speed, the image that came out was usually 'squished'. If you needed images in the early 90s, you could step up to a flatbed scanner. [Bill] and [Anthony] had the smallest and cutest flatbed scanner I've ever seen, a Sharp JX-100, which delivered color images over a serial connection. In 1990, this scanner cost $700.
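
To see why that roller matters, here is a toy sketch, in Python with NumPy, of the correction the bundled software had to make (the data layout is made up for illustration, not any real scanner's protocol): resample the unevenly spaced CCD readouts onto a uniform grid using the roller's position ticks. Skip that correction, or feed it bad encoder data, and you get exactly the 'squished' pictures these things were famous for.

    import numpy as np

    def resample_scan(lines, positions, dpi=100):
        # lines:     (n, width) array, one row per CCD readout
        # positions: roller position in inches at each readout, increasing
        # dpi:       desired output resolution along the drag axis
        lines = np.asarray(lines, dtype=float)
        positions = np.asarray(positions, dtype=float)
        # Uniformly spaced output coordinates along the drag direction.
        out_pos = np.arange(positions[0], positions[-1], 1.0 / dpi)
        out = np.empty((len(out_pos), lines.shape[1]))
        for col in range(lines.shape[1]):
            # Interpolate each pixel column against the uneven positions
            # the hand actually produced.
            out[:, col] = np.interp(out_pos, positions, lines[:, col])
        return out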

There’s more than one way to skin a cat, and if you didn’t have a scanner, you could take a picture instead. Consumer digital cameras were terrible, but that didn’t mean you couldn’t find a cheap TV camera and digitize the output. That’s what [Anthony] and [Bill] did, using a black and white security camera to take color images of an Amiga 500 board.

How does a black and white camera produce color images? With a color wheel, of course. [Bill] and [Anthony] brought out a piece of kit built by NewTek, creators of the Video Toaster, that's basically a black and white camera with a color wheel controlled by a servo. By taking three pictures through red, green, and blue color filters, this Amiga 1200 can take full color images. Sure, the resolution is only as good as standard definition TV, but if you need images on your Amiga, this is the cheapest way to go about it. The entire setup, sans Amiga, cost $200 when it was released.
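
The software side of that trick is almost embarrassingly simple. Here is a minimal modern re-creation in Python with Pillow (the filenames are placeholders, and this ignores the frame registration and exposure matching a real DigiView capture had to worry about):

    from PIL import Image

    # Three monochrome captures, one per filter position on the wheel.
    r = Image.open("capture_red.png").convert("L")
    g = Image.open("capture_green.png").convert("L")
    b = Image.open("capture_blue.png").convert("L")

    # Each grayscale frame becomes one channel of the result: full color
    # from a black and white camera.
    color = Image.merge("RGB", (r, g, b))
    color.save("composite.png")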

[Anthony] and [Bill] always have a great showing at VCF East, usually with an exhibit dealing with the artistic side of the Amiga. It’s a great look at how far technology has come, and a glimpse back at what the state of the art in computer video was 25 years ago. [Anthony] and [Bill] put together a video of them tearing through their old computer storage to find some of this hardware for the festival. You can check that out below.

54 thoughts on “Digital Images And The Amiga”

    1. Yep, the Toaster was a video switcher with some funky transitions and a great 3D modeling program (LightWave 3D). The failing sheep are my favorite! I would love to re-create that in After F/X. I was lucky to have a friend who had both a Toaster and a Flyer back then. I really need to add those to my collection. I vaguely remember the VLab Motion. I'll have to look that one up. Thanks! — Bill

    2. I had a VLab Motion. Cheaper, yes. Better quality, no. I had high-quality Micropolis AV SCSI hard drives and a Phase 5 68060 accelerator, and even at the best motion-JPEG capture settings, the JPEG artifacts made VLab Motion video look like a VHS on 6-hour record. I heard it was better if you bought the Amiga-clone-ish computer that they produced for the VLab Motion.

  1. I must be missing something here. Color camcorders using VHS-C or Video8 were widely available in the late 80s. Why use a B&W video camera?

    Also, the YouTube video named the Mac and IBM CGA as examples of poor color in the era. That is misleading, when the Apple II, released years before the Amiga, had more colors than CGA. VGA was also available by the late 80s, and there were ways to display more than 256 colors with VGA by changing color palettes between lines.

    1. B&W security cameras can actually have pretty high resolution sensors, sometimes 500-600 lines, and better sensitivity. If you want high res from NTSC you don't use a standard color CCD. I have a small C-mount Panasonic head that uses 3 monochrome CCDs to get higher res video. Fancy for their time.

    2. Yes, much of NewTek's reasoning for using a B&W camera was the higher resolution. Of course other variables such as lighting and optics also come into play. That is why you see higher quality Tiffen glass filters in the photos: I was experimenting with those vs. the gelatin filters that NewTek supplied with the DigiView. The gelatin filter color wheel also took a beating after 25 years in my dad's barn! Even though the modern LED lights I had in the exhibit are great, they were not placed in the ideal position due to our space limitations. However, they are substantially better than the fluorescent lights that people used to use with DigiView back then. Those lights did not have a continuous spectrum, spiked in the green channel, and were not as bright as they needed to be. If you look closely at Brian's photos you can also see a “color splitter” from SunRize Industries, which would split a composite video signal from a color camera into red, green, and blue and allow you to use it with DigiView. I am looking forward to experimenting with that vs. the B&W camera, but using a vacuum tube B&W camera with the Digi-Droid is just so much fun! Thanks for watching and the great discussion, guys! — Bill

    3. The Amiga was released in 1985, with DigiView coming out before the Amiga 500's release in 1987 (this is known since the first versions of DigiView are oriented vertically to sit neatly behind a 1000, and have the wrong gender of parallel connector to plug directly into the later 500/2000, which was not corrected until DigiView Gold was released).
      Yes, the Apple II with its 16 colors had a better display than the CGA PCs and Macs of the day. I even have a hand scanner for my Apple II! I could not mention every type of computer that was out there, but still, none came up to the Amiga's HAM mode with 4096 colors in an affordable machine, especially once the cost-reduced 500 replaced the 1000. VGA, when released, was very expensive and took a long time to catch on for home users on a budget. In fact, IIRC, VGA was first released in the PS/2 model line of computers from IBM, and not even in all of them, since the low-end models came without it. MCGA was accepted earlier by home users, but that was also released at the same time, in 1987, and on the same expensive PS/2 computers.
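
      If you have never dug into how HAM squeezes 4096 colors out of 6 bitplanes: each pixel either loads one of 16 base palette colors, or holds two channels from the previous pixel and modifies the third. A toy HAM6 decoder in Python (my own sketch for illustration, nothing that ever shipped with an Amiga) shows the trick:

          def decode_ham6(pixels, palette):
              # pixels:  6-bit values (2 control bits + 4 data bits each)
              # palette: 16 (r, g, b) tuples, each component 0-15
              out = []
              r, g, b = palette[0]       # the border color seeds the line
              for p in pixels:
                  ctrl, data = p >> 4, p & 0x0F
                  if ctrl == 0:          # load a base palette color
                      r, g, b = palette[data]
                  elif ctrl == 1:        # hold red and green, modify blue
                      b = data
                  elif ctrl == 2:        # hold green and blue, modify red
                      r = data
                  else:                  # hold red and blue, modify green
                      g = data
                  out.append((r, g, b))
              return out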

  2. Handheld scanners aren't exactly bad. You just need a steady hand. A flatbed scanner is basically moving the scanner for you, mechanically. There were also scanners made to replace the print head of a dot matrix printer (the ImageWriter?) and let the printer scroll the paper for a more accurate scan.

    1. Yeah, I had one for a while and it worked pretty well. As long as you did not go too fast it did fine, and you could do some pretty funny stuff by moving it in ways it was not supposed to be moved.

    2. That was the ThunderScan. It had a single-pixel sensor that fit in place of the printer's ribbon cartridge. There used to be CD-ROMs of 1-bit clip art that were scanned with those.

  3. Those were the good days, when computers were computers; you seemed to enjoy them more, and you felt like you learned to program them. The Amiga was years ahead of its time. Shame it's all gone now. I wonder what an Amiga would be like now if it was still around and had never ended like it did. Commodore was a great company run by arseholes who sank it; it should still be a top dog today.

    1. I blame the media more than the people running Commodore. The Amiga was so far ahead of the PC at the time it made your head hurt, but all the computer magazines had to keep the advertisers happy. Only Commodore would buy ads for the Amiga, while you had all the clone makers buying ads for PCs. As to what an Amiga would be like now, that is a hard question. Would we have 68k CPUs running at the same speed as Intel CPUs? Would we have seen a 68090? Honestly, take a look at a PC today and you are seeing what a modern Amiga would look like: hardware-accelerated graphics, stereo sound, and a multitasking OS. A PC of today owes a lot more to the Amiga than it does to a PC from 1985. The only thing it shares with the original PC is the x86 ISA.

      1. Except that today it's still impossible to find a system capable of mixing different horizontal resolutions on the same screen. Pretty sure the Amiga is the only retail hardware ever to pull off that feat. Sure, you could emulate it with scaling, but it isn't the same.

        1. The Amiga was certainly one of the only computers designed to do this: the display coprocessor did the split with no CPU overhead. Some others could hack it a little bit — you might take a vertical blanking interrupt, use that to set a quick timer interrupt, etc. Or a software timing loop. Not pretty. Some systems had horizontal blanking interrupts (aka raster interrupts), like the Atari 8-bit systems. Of course, Jay Miner designed that ANTIC chip before Agnus, Paula, and Denise. The Ataris also had a display list coprocessor, which, as on the Amiga, made this CPU-free.

          There are two big reasons that modern GPUs don't bother with this. One is simple: they got 24-bit color in the 90s, so there was much less reason to want to change resolutions. The second is that every Amiga resolution uses the same 35ns pixel clock or some multiple thereof, so every resolution is an even multiple as well: 360 pixels, 720 pixels, 1440 pixels... and a fixed refresh rate, until the AA chips. Modern GPUs have clock synthesizers that can vary the pixel clock, so you can set display resolution and refresh independently. But it usually takes a frame or two to lock that synthesizer's PLL. So no split screen.

          You may also find that many current display chips buffer their registers until VBI. So no changes on the fly, even if you don’t need to change the pixel clock.
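
          If you have never seen a display list, the mechanism is tiny. Here is a toy model in Python (purely an illustration of the concept; a real Copper list is WAIT/MOVE words acting on actual display registers, and a software version can only fake the mode switch by scaling):

              # A toy display list: at each listed scanline, switch the
              # horizontal mode, the way the Copper WAITs for a beam
              # position and MOVEs a value into a register, CPU-free.
              display_list = {0: "lores", 100: "hires"}

              def scan_out(lores_fb, hires_fb, height=200):
                  # lores_fb rows are 320 wide; hires_fb rows are 640 wide
                  frame, mode = [], "lores"
                  for line in range(height):
                      # The "beam" reaches this line; apply any change.
                      mode = display_list.get(line, mode)
                      if mode == "lores":
                          # Fake the switch by pixel-doubling to 640 wide.
                          frame.append([px for px in lores_fb[line]
                                        for _ in (0, 1)])
                      else:
                          frame.append(list(hires_fb[line]))
                  return frame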

      2. Blame the people running Commodore. Not all of them, certainly, but the folks at the top. In 1991, management changes at Commodore International, including in engineering, brought in a number of people who had no business running a computer company.

        Next in line is piracy. We had the video market here in the US, which was professional and so less affected by piracy. But the volume markets were the UK, Germany, and most of Europe. When major game software companies saw a few hundred sales against millions of Amigas, they moved to the PC.

        We were looking at moving to RISC. If there had been an Amiga 5000, it would have had a CPU module based on PCI. The processor in 1994 or 1995 might have been a 68K, but the one after that, probably not. The x86 stayed relevant because Intel had the money to make it a RISC processor deep down inside, and the money to keep it competitive against most others. Only ARM has challenged it. Apple didn't have the resources to keep the 68K or PowerPC competitive; even a very successful Commodore wouldn't have changed that.

        1. Piracy was a huge problem. A lot of really interesting software ended up dying because not enough people bought it. Loved my Amiga, and somewhere I still have my books on Exec. One big problem I feel with the Amiga 1000 was that they did not put sockets on the motherboard for extra memory, or include a hard drive controller that let people boot from the hard drive. PCs could boot from the HD since the XT. I feel that hurt the Amiga in the professional market. That, and I am still mad at Borland for not shipping Turbo Pascal for the Amiga. I had a lot of code in Turbo Pascal at the time and was excited to port those projects to the Amiga 1000 I bought.

          1. Sockets on the motherboard would have been a problem, particularly given that case. The A1000 did have the front-panel memory expansion... in 1985, 512K was a pretty decent amount of memory. Of course, once there was lots of software, Amigas needed memory more than other systems, since no one else was multitasking, and everything had to live in main memory. Memory modules like SIMMs and DIMMs were the correct way to deal with memory expansion; the problem is, we were going in different directions than the PC industry for a while. And of course, in the early days, an add-on memory card (on-the-side for an A1000 or A500, a Zorro II card for an A2000) was just as fast as any kind of built-in memory, and faster than chip memory.

            The autoboot stuff was added in 1.3. But I agree, it should have been worked out in the beginning. Of course, hindsight is 20/20, and you could make a long list of what was missing from any system. PCs weren’t designed for multitasking until the mid-1990s, regardless of which OS they ran. Macs weren’t designed for multitasking until the Power Macs. Hardware matters — of course, that’s what you’d expect a hardware guy to say :-)

  4. I remember paying >500 DM as a kid for a Microtek ScanMaker II 3-pass scanner with 8-bit linear samples. It took ages to scan a single page. That wasn't much of a problem, though, because in those days you didn't have enough RAM to scan a full page at 300 dpi anyway. Before that my dad bought a crappy Qtronix Sagitta grayscale handheld scanner.

    The next five scanners I was given for free (UMAX Astra 1200S, broken Lifetec LT9350, Microtek ScanMaker E3, AGFA SnapScan 1212P, Canon N670U), but I still bought an HP ScanJet 8250 that now sits unused on my desk 360 days a year.
    The 1200S was a good scanner. If only they hadn't chosen glue to attach the glass to the underside of the plastic…

    1. I had a parallel port version of the UMAX Astra, the 1200P. Same problem: the double-sided tape holding the glass failed. I then tried Liquid Nails adhesive. That let the glass drop too. The fix that worked was driving four tiny screws with washers into the two little round posts at each end of the glass. I had to trim a couple of notches in the thin plastic piece on top of the scan head. No more problems with the glass falling in!

  5. The Amiga's graphics were never really compatible with other computers, so while .iff files could be converted for other computers, the end result looked awful.

    The Amiga 500 was a lot of money for a computer that didn't come with a hard drive, and it had an uncontrollable following that kept making it obsolete by demanding expensive upgrades, while third parties were gouging customers because there wasn't enough money from base sales.

    My computer department head steered me away from the Amiga in those days because the business market belonged to IBM, where most programmers would make their money. Why would anyone go with design when business wants speed to get spreadsheets done? Compaq came out with a 33 MHz computer while a stock Amiga was running at 7.16 MHz.

    In the business world, quantity and speed always come before quality, because inspiration doesn't always come, and we're not all artists who can draw or make music, even though the Amiga could. So the real fault of Commodore was selling a high-priced computer for which a self-sustaining audience didn't exist, because the IBM clones won out, and not everyone uses their PC as a gaming rig, which means the market is not just games.

    The other answer is that the chips belonged to Motorola, which means you have a "computer" company that couldn't compete because it wasn't their design to speed up or simplify. Commodore didn't have any chip scientists, so basically they could never compete: they were offering off-the-shelf parts, and in the sense of research and development you were stuck with off-the-shelf parts, because they weren't a real company and didn't put any money into making a faster chip.

    You don't hire contractors like Microsoft and Motorola and then call yourself a computer company, because there were things Commodore couldn't do that other computer companies did. So basically their business model was not sustainable, and they didn't have the R&D or the products to compete with other companies that were in the game to win.

    The SID chip, which they only spent a month on, was never fundamentally or drastically redesigned after the Commodore 64, so the fact that there was no development meant that they didn't care about their customers.

    1. Your description also fits most of the PC clone makers, which were nothing more than cake bakers with a staff of assemblers. They were even worse than Commodore, since they had no in-house technical staff worth noting. They sourced their BIOS from one vendor, a motherboard from another, and LAN and video cards from another.

      Their OS came from one place: Microsoft, which often had draconian contracts to keep them from selling units with other OSes pre-installed.

      The CPU came from Intel.

      Microsoft also made sure the major software application houses didn't port their main products to other OSes as well. If they were ported, the product tended to be flawed in some manner. That's how they strangled IBM's OS/2.

      Companies they couldn’t threaten, they just hired away their top programmers and designers for insane salaries so as to cripple them.

      We used to call it the WinTel monopoly.

      And it hasn't been broken yet.

      1. It's been broken for a long time. The thing is, people finally realized that Microsoft is the basket to put all of your eggs in. Is it the best OS? Arguably so, arguably not, but that doesn't really matter. Both Mac OS and *nix have fundamental holes in their software libraries. Macs are overpriced and offer limited hardware configurations, while a Linux PC historically struggles to get good drivers for recent hardware.

        Long story short, if you want an affordable, modern PC that can actually play popular games and run popular software, you need Windows. This might change if developers supported the other operating systems better, but since the market is so small, no company wanting to make money is going to bother. It's a catch-22 like that.

      2. In fact, the extremely bizarre design of the Windows APIs was pretty effective at keeping development on Windows. You sort of built everything upside-down, so your program appeared as a series of subroutines called by the OS, rather than a proper program calling a series of OS subroutines. This was apparently intentional — Microsoft knew UNIX and MacOS in those days, at the least, and designed Windows to be oddly different.

        Some of it was insane. When I was at Scala, I was having to build some interface drivers between Scala's MMOS layer (basically, they wrote their own multimedia operating system that could live stand-alone or on top of some other OS) and actual Windows drivers. I did the usual thing you do when writing a complex I/O driver: set up a few threads to handle things, defined some signals between them... and then I was amazed when it all basically fell down. It seems that every asynchronous thing I was doing got serialized and run through the fricking Windows message queue.

        As far as there still being a monopoly... not as much. For one, Microsoft themselves support other platforms for many things... makes sense, given that most of their money is not made directly on Windows anymore. And in 2015, there were about 4x as many Android systems shipped as all of Windows. Not that many people use Android on the desktop — but you could (try RemixOS — it works well). And of course, Android runs on x86 too. Sure, both companies are still mighty (though Intel is currently in the midst of laying off about 10% of its workforce), but not as powerful as in the 1990s and 2000s.

    2. The Amiga did not use the SID chip.
      Motorola only built the CPU; the graphics and sound chips were by Commodore.
      “The Amiga 500 was a lot of money for a computer that didn’t come with a hard drive…” No, it was actually pretty cheap. I do not think you know how much computers cost back when the Amiga 500 came out.
      AKA you have no idea what you are talking about.

    3. PCs could handle some IFF files, once they got to VGA graphics. Before then, no, because they didn't have enough colors. They couldn't deal with HAM mode without at least 16-bit color. Don't blame the Amiga; that was the PC being behind until the mid-90s or so.
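
      IFF itself was a dead-simple chunked container, which is the only reason conversion was possible at all. Something like this little Python walker (a sketch of just the FORM/chunk framing, not the ILBM image decoding or HAM color recovery) is all it takes to read one:

          import struct

          def iff_chunks(path):
              # Yield (chunk_id, data) pairs from an IFF file (e.g. ILBM).
              with open(path, "rb") as f:
                  form, size, kind = struct.unpack(">4sI4s", f.read(12))
                  assert form == b"FORM", "not an IFF file"
                  remaining = size - 4      # FORM size includes the type
                  while remaining > 0:
                      cid, clen = struct.unpack(">4sI", f.read(8))
                      data = f.read(clen)
                      if clen % 2:          # chunks pad to an even length
                          f.read(1)
                      remaining -= 8 + clen + (clen % 2)
                      yield cid.decode("ascii"), data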

      The Amiga 500 was a $500 computer. Compaq didn't introduce a 33MHz PC until May of 1989... no surprise, since Intel didn't release a 33MHz 80386 until March of 1989. The base price for that system was $10,499. Also in 1989, you could buy an Amiga 2500/30, which ran at 25MHz, for around $4000 with monitor and hard drive. And it was considerably faster than the 33MHz 80386 systems, since PCs still ran slow VGA graphics cards with no GPUs, and the Motorola FPUs were faster than Intel's.

      As far as CPUs go, everyone bought an "off-the-shelf" CPU in those days. The only computer systems company in the personal computer industry making its own CPUs in those days was IBM... and they just made a rather crappy clone of the Intel 80386, which they didn't use in many of their own systems. Most other chip companies (National Semiconductor, AT&T, Texas Instruments, a bunch of them) had lost the 32-bit race to Intel and Motorola.

      Neither Motorola nor Intel were "contractors"; they were not making these chips for any specific company at the time. They were two of a very few companies on the planet that could make a competitive CPU. It was just too complicated and expensive for anyone else. Most of the workstation companies were making their own RISC processors in those days: IBM had Power (not PowerPC yet, that came later), HP had PA-RISC, Sun had SPARC, SGI had MIPS, DEC had Alpha. They were selling power at any price, so if it cost $2,000 or $4,000 per CPU, they could justify that in a $10,000 or $25,000+ workstation. But in the PC industry, you bought from Intel or you bought from Motorola. Period.

      As for the other chips, Commodore was actually one of the only personal computer companies that did have a chip design and manufacturing division, MOS Technology. That's why the Amiga's custom chips delivered about the performance of a 16MHz 68030 on graphics processing. There was nothing other companies were doing that Commodore couldn't do; you need to learn a little about the history of personal computing before you start writing about it. And of course, it was everyone else buying Microsoft's operating system, not Commodore.

      I'm not even sure what your point is about the SID chip. That was a Commodore 64 thing, and they made a revised version for the C128 that ran on 9V instead of 12V and used a more modern process. Audio on the Amiga was night-and-day more advanced than the SID: the SID was good back in the day, when a CPU wasn't powerful enough to do much for audio, so you made a chip that could do very, very basic ADSR envelopes on its own. Not that amazing, just better than most of the other chips doing the same thing and sold on the open market. Amigas used sample-based sound, the same as every computer these days.

      1. Just a nitpick, but the A500 didn’t come with an FPU (though I would imagine there was a socket for one; I never really owned one, so I’m not sure).

        Also, there were a few computer companies that designed their own microprocessors. Commodore is probably the most obvious, but there was also Texas Instruments (remember the TI-99/4A?), Acorn with their infamous ARM (though to be fair, it wasn't used as a main processor until the Archimedes, which came out much later), and most of the Japanese computer companies, such as Toshiba, NEC, Sharp, and Sony (though the majority of those companies used clones of Western designs for their microprocessors). If we were to talk about other ICs besides the processor, this list would just continue to grow.

        1. I didn't claim the A500 had an FPU... again, a $500 computer (well, $699 list when introduced, dropping to the $500 range in about a year or so on the street). There was no FPU socket; the basic 68000 didn't have the full coprocessor interface of the 68020/68030. But a $600 home computer wasn't expected to have an FPU in 1987; this was a very low-cost system... Commodore 64s still cost $300 in those days.

          All models of the high-end Amigas, like the Amiga 2500 and Amiga 3000, had FPUs.

          Texas Instruments wasn't a computer company; they were a chip company. They were THE chip company for a while... their man, Jack Kilby, invented the integrated circuit at TI in 1958! For a short time, they dabbled in a number of consumer products: calculators, the Speak & Spell, and that horrible TI-99/4A. They still are a chip company, number 7 in the world as of 2015. And they still make calculators... not many other consumer items.

          Yes, the first ARM processors were designed by Acorn and launched seriously in 1987 (the actual chips were made for them by VLSI Technology, but it was Sophie Wilson at Acorn leading the design). They had the unusual advantage of the backing of the BBC, which certainly helped in the cost analysis.

          The Japanese companies you mention were already major chip companies... some of them got into the PC business as a result. Of course, Motorola made their own systems too, and even Intel shipped finished systems for some markets, as well as main boards for the volume PC business. It's not unusual for a chip company to make systems; it is unusual for a systems company to make custom chips... less so now than in the 80s and 90s, before you had "pure play" IC fab companies like TSMC. Toshiba is still the #8 chip company in the world. NEC was a top-20 chip maker back in the 1980s/1990s... their chip operation merged into Renesas, which is currently the 14th largest chip company in the world. Sharp was the 19th largest semiconductor company in 2015. Sony was #16.

          Toshiba was one of the many companies in Japan supporting the MSX standard, essentially unknown outside of Japan. They used off-the-shelf Z-80 processors, and eventually Z-80 clones. This was 8-bit, a kind of late-entry competitor to the C64, first announced in 1983. The original design used all off-the-shelf parts, with a TI graphics chip and a GI sound chip. Toshiba put out a first-generation "MSX Engine" that included pretty much everything but the Z-80, then a year or so later a second-generation MSX Engine that did include the processor. So a "customized" system, if not particularly custom; the MSX Engine was register-compatible, so software still worked. Toshiba is a pretty big player in the PC business, I believe about #7, particularly in laptops. But they have always used off-the-shelf CPUs, from Intel and AMD.

          NEC is the one that really did push hard into personal computers. They launched an 8-bit personal computer in 1979 that included their own Z-80 clone, the µPD780C. But this was a licensed chip; in those days, it was pretty common for a CPU company to get other chip companies to second-source their processors. Zilog licensed the Z-80 to SGS-Thomson and Sharp as well as NEC; MOS Technology licensed the 6502 to Rockwell and GTE; even Motorola licensed the original 68000 to Hitachi. Again, this was a chip company making their own system, not a systems company all of a sudden deciding to make their own chips. NEC also had a series of x86-compatible chips, the V-series. These were originally reverse-engineered x86 chips, actually a little faster than Intel's, and they were briefly used in some personal computers. However, by the V60, NEC was tuning these for embedded systems, adding instructions, etc. No longer suitable for PCs; no one could keep up with Intel. To get a little perspective, about 250 engineers were involved in the V60. That's why you need a chip company for these things. This chip became popular for arcade machines... you'll find there's a V60 emulator in the current MAME distributions.

          NEC made PCs, but most had off-the-shelf Intel processors. They were also selected as a source for the MIPS R4000 series (MIPS, of course, didn't have their own fab), and NEC made a series of MIPS workstations based on these CPUs, licensing the system architecture from MIPS. So the CPUs were off-the-shelf, but it was actually their shelf :-). All of the PC/workstation-class MIPS processors in the early days, before SGI bought MIPS, were made to the same standards, so it didn't matter which vendor made them... there were six companies in total making the R4000/R4400.

          Sony's early computers followed the MSX standard. They also did a workstation in the 1990s based on Motorola's 68020. They got into the PC business in 1996, with Intel chips, and never made any CPUs for their PCs as far as I can tell. They did make their own SoCs for game machines in the early days; the "Emotion Engine" in the PS2 was a licensed MIPS processor with a Sony-designed vector engine, but this was the late 1990s. They of course switched to IBM and Toshiba for design work on the PS3, and AMD for the PS4. Sony's big chip business today is camera sensors... they sell close to half of those used worldwide.

  6. Apple had the bizarre little cartridge replacement for the ImageWriter II that did line scanning into the IIgs; all you had to do was make sure the cable had enough play when the faux printhead went slamming back and forth. Actually, I still have mine... wonder if it still works.

    1. The “ThunderScan”! Wow, does that bring back memories — though I swear I used it in my ImageWriter I back in the day. It produced some surprisingly decent bitmap scans, if I remember correctly...

    2. > Yeah, I would love an Amiga iPad, sitting there in the evenings relaxing and
      > playing some good old Amiga games.

      In case you own an Android tablet, check out UAE4Droid; it could be just what you're looking for.

  7. My first computer was a VIC-20 (a VC-20 for my friends in Germany), with a C2N cassette unit. In '85, I purchased a 64C with a 1541, bought with lawnmower money. When I went to high school (and became the A/V go-to guy for the faculty), I had an A2000 with a 1MB Fat Agnus, a 2084 video display, dual 3.5-inch floppies, and a Janus 2088 card with a 5.25-inch floppy (to do some PC stuff). Nothing like the leading ladies (Paula, Denise, and Agnus) doing the lovely things like audio and video, while Gary handled the other hardware. Since my A2000 passed away 2 years ago, I have the whole chipset laying on a breadboard on my workbench. It sits right beside my first audio digitizer (with an ADC0804 at the helm) and my serial MIDI controller, both used on that beast of a machine. I still have respect for the Commodore engineers and developers, as they sparked my interest at an early point in my life and made me the hacker I am today. 73, KC8KVA

  8. Don't forget Star Trek: Voyager. The people over at Foundation Imaging produced all the CG content for the first two seasons on an Amiga cluster running LightWave with custom extensions.

  9. Thanks Brian and Hackaday for posting this as well as all your support for all the other great exhibits and the Vintage Computer Festival East XI in general. I absolutely love the Hacker Space at InfoAge and watching those kids accomplish amazing feats was incredibly inspiring. It was an awesome weekend that I hope continues to flourish for years to come. Keep up the great work and inspiration. All the best, Bill

  10. Yeah, you're right, same here. I was the same with all the Commodore stuff. I loved playing games but loved programming too, and I learned a lot from all the Commodore computers. Good days for computers. I also take my hat off to the VIC-20, C64, Amiga 500, and the A1200: they are why I'm into electronics, modding, and hacking. Thanks, Commodore, for the good memories. Oh, I'm still waiting for someone to make the Amiga iPad lol.

  11. Babylon 5's graphics were done by the great Rich Payne, a fixture at the local user groups in central New Jersey. He'd blow everyone's minds with his artwork, and he parlayed that into a job doing advertising for Commodore near the end. Rich is now out in LA and has built up quite a resume!

    The Amiga was so popular that between 1985 and '89 or so, there were 3 huge user groups meeting monthly in NJ alone; the one at Rutgers, hosted by Perry and Eric, drew hundreds of people each month. This fervor was being replicated all over the country. It was quite a phenomenon. The peak had to be the NY Amiga fest in the early 90s: the place was packed and stuffed with development.

  12. I built a digital photography/composite business around an Amiga 3000. Even had an interview with me published in Amiga Format magazine.

    The Amiga 3000 was a great machine. I was using a desktop A3000 with a Micronik case that gave me a lot more Zorro slots and bridgeboard slots. At first, I used a video camera to capture still images, but that just wasn't high enough resolution for print. So as what I was doing expanded, I eventually ended up with a Phase 5 68060 accelerator, a CyberVision 64 graphics card, and a Polaroid digital camera that captured 800×600, tethered via the SCSI bus. A massive 128MB of RAM, a 1GB Seagate SCSI drive, and two 4GB Fast SCSI-2 Micropolis AV drives.

    There were no Amiga drivers for the camera, so I ran ShapeShifter and captured with Apple's Mac OS, System 6 (or 7, I don't recall now). Since no Mac ever had a 680x0 greater than a 68040, and the PowerPC Macs at that time emulated the 68k chip, my Amiga was faster as a Mac than any Mac.

    We went to sci-fi conventions and composited people, in their costumes, into custom backdrops. I'd have an inkjet printing out a high-res glossy, which took about 15 minutes then, while Real3D (a 3D modeling/rendering program) rendered several scenes or elements in the background and I worked on a multi-layered graphic at 2400×3000 pixels in ImageFX; then I'd pop over to ShapeShifter to snap someone in their costume while my wife posed them. Printing and rendering continued uninterrupted, and everything ran smooth as silk.

    At the first Norwescon we took this to, I had six different Microsoft programmers watch me and talk to me, and they'd suddenly say “Bill Gates ruined software!” after finding out that I was doing all of this on one computer running at 60MHz.

    I had a Primera Pro dye-sub printer that required an absolutely uninterrupted stream of data from the computer, or the print would be ruined. If you were running Windows, it was best to make sure nothing was running but the print job, and not to touch the computer. I talked to other people using this printer with Windows 95 who had prints ruined because they moved the mouse, and it could take 20 minutes to over an hour for the computer to process the file for printing.

    It took my Amiga 3000 almost exactly 60 seconds between when I hit Print and when the printer started. I tested it really hard one day: I hit Print, then quickly started Real3D and set three different high-res 2400×3000 pixel images rendering, dialed into my ISP, started my web browser, and opened a dozen tabs on different websites. The result? It took 2 minutes before it started printing, and the print itself was flawless. At no time did my typing or mouse pointer get jerky.

    My old website is still up, although the digital photography business is gone. I never took orders online, so no shopping cart. Before you look at it, keep in mind that I never claimed to be an artist.

    http://www.polyphoto.com
