The IBM PC That Broke IBM

It was the dawn of the personal computer age, a time when Apple IIs, Tandy TRS-80s, Commodore PETs, the Atari 400 and 800, and others had made significant inroads into schools and people’s homes. But IBM, whose name was synonymous with computers, was nowhere to be seen. And yet within a few years, the IBM PC would be the dominant player.

Those of us who were around at the time cherished one of those early non-IBM computers, and when the IBM PC came out, we either respected it, looked down on it, or did both. But now, unless your desktop machine is a Mac, you probably own a computer that owes its basic design to the first IBM PC.

The Slow-Moving Elephant

IBM System/360 Model 30 mainframe
IBM System/360 Model 30 mainframe by Dave Ross CC BY 2.0

In the 1960s and 1970s, the room-filling mainframe was the leading computing platform, and the IBM System/360 held a strong position in that field. But personal computer sales reached $150 million in 1979 and were projected to grow 40% in 1980. That was enough for IBM to take notice. And they’d have to come up with something fast.

Fast, however, wasn’t something people felt IBM could do. Decisions were made through committees, resulting in a decision process so slow that one employee observed it would take “at least nine months to ship an empty box.” And one analyst famously said, “IBM bringing out a personal computer would be like teaching an elephant to tap dance.”

And yet, in just a few short years, IBM PCs dominated the personal computer market, and the majority of today’s desktops can trace their design back to the first IBM PC. Despite even more built-in barriers, which we cover below, how did the slow-moving elephant make this happen?

Doing It In A Non-IBM Way

IBM 5120
IBM 5120 by Marcin Wichary CC BY 2.0

Much of the push for a personal computer came from executive Bill Lowe and other IBMers who had been working on the IBM 5100 and 5120, IBM minicomputers that fit on a desktop but were too pricey for consumers. When Lowe consulted computer dealers about PCs, he was told that store employees would have to be able to make repairs and that the machine must be made of standard parts. This was very much in contrast to the way IBM did things. IBM’s approach had long been a cradle-to-grave bundling of engineering, sales, installation, software design, and maintenance.

Despite IBM’s reputation for moving slowly, bursts of rapid development were possible within the company. The IBM 5110 had gone from conception to production in only 90 days. But even with the 5100 and its other minicomputers, IBM had failed to dominate that market, giving way to Digital Equipment Corp (DEC) and Data General. And so IBM began to establish what amounted to little companies within IBM, called Independent Business Units (IBUs).

In 1980, Lowe discussed a proposal with IBM President John Opel, CEO Frank Cary, and the Corporate Management Committee (CMC) to base a PC on the Atari 800. Instead, the committee allowed Lowe to form an independent group within IBM to develop IBM’s own PC from scratch. The group consisted of a dozen employees in Boca Raton, Florida, led by Bill Sydnes, who had managed the rapid 90-day development of the 5110. They were tasked with developing a prototype in 30 days.

The first wire-wrapped prototype was completed in less time than that and presented to the CMC in August 1980. The group also delivered a business plan, and it was the plan that convinced the committee to go further. Some of its main points were as follows:

IBM PC motherboard with 5 expansion slots
IBM PC motherboard with 5 expansion slots by German CC BY-SA 3.0
  • It would have an open architecture, a decision influenced by many IBM employees who were themselves users of the Apple II, which had expansion slots inside.
  • It would be sold in retail stores.
  • It would use industry-standard parts. This was dictated by the need for something that would work well, carry low risk, and be repairable by store employees. That also meant using an off-the-shelf processor.
  • That processor would be the Intel 8088, with its 8-bit external data bus, instead of the 16-bit-bus 8086. The 8086 was deemed too powerful and would have competed against existing IBM products. Also, Intel could provide the 8088 at a lower price and in larger quantities, and the 8-bit bus made the computer cheaper to build.

Most of this flew in the face of IBM’s cradle-to-grave approach, but the committee agreed it was the plan most likely to succeed. And so in October, the group was turned into an IBU codenamed “Project Chess” tasked with producing the “Acorn” by August 1981, less than a year later. Lowe was promoted, and Don Estridge took over leading the group.

The first IBM PC
“The first IBM PC” by Ruben de Rijcke CC BY-SA 3.0

The approach to the software also departed from the IBM way of doing things. They decided to license Microsoft BASIC instead of the BASIC IBM used on its mainframes. One reason was that Microsoft’s BASIC already had hundreds of thousands of users worldwide.

The operating systems came from Microsoft and Digital Research. Microsoft licensed the rights to 86-DOS from Seattle Computer Products and offered to supply it to IBM; the IBM-branded version was named PC-DOS, while Microsoft’s was MS-DOS. From Digital Research came CP/M-86, but IBM sold PC-DOS for far less, and so PC-DOS won the market.

On August 12, 1981, the Boca Raton group met their deadline and IBM announced the IBM PC.

Marketing Like Apple

It was priced to compete with Apple and other personal computers. The basic machine with 16K of RAM and the Color Graphics Adapter but no disk drive cost $1,565. The 48K Apple II with a disk drive was around $2,800, and a comparably equipped 48K IBM PC with a disk drive and game adapter came to a similar $2,630.

To take advantage of the IBM name, the initial ads used phrases such as “Presenting the IBM of Personal Computers” and “My own IBM computer. Imagine that.” Those were followed by far more personal and friendly ads featuring Charlie Chaplin’s Little Tramp character and based on Chaplin’s films. They were simple, most with a clean white background, with Chaplin, the IBM PC, and a red rose dominating the frame.

In keeping with the non-IBM approach, IBM didn’t produce its own software for the PC until 1984. Instead, launch titles included leading third-party software of the time: the spreadsheet VisiCalc, the accounting package Peachtree, and EasyWriter, which was originally written for the Apple II.

Distribution was done through IBM Product Centers, but after studying Apple’s distribution network, IBM also sold through outside retail outlets such as ComputerLand and Sears, Roebuck.

In the first year, fewer than 100,000 units were sold. By the end of 1982, one PC was being sold every minute of the business day, and by the end of 1983, 750,000 had been sold. IBM’s PC revenue had reached $4 billion by 1984, a far cry from the $150 million total personal computer market of 1979. By 1984, the IBM PC and the clones other companies made by copying it had become the number one personal computer.

Why Did It Sell?

Did the IBM name make it sell, or was it the non-IBMness of the product, the open hardware and software approach?

For corporate customers, it seems to have been the name. Many of them had IBM mainframes and were used to IBM’s high-quality service, so they were comfortable going with the IBM PC. A popular saying was that “No one ever got fired for buying IBM.”

On the other hand, thanks to the quality and quantity of documentation and support IBM provided for the PC, there were 753 software packages available one year after it hit the market, and most of the major software houses were adapting their products to run on it. On the hardware side, IBM didn’t release a hard disk for two years, but third-party hard disk subsystems were available in ComputerLand stores at the PC’s introduction in 1981. By November, Tecmar, a maker of PC enhancement products, had 20 products available, including memory expansion, data acquisition, and external hard drives.

Getting Personal

Drawing from personal experience: I was a TRS-80 Color Computer owner in 1981. When taking a break from university, I could be found at Radio Shack with other CoCo enthusiasts, showing off software we’d written and examining the latest games and hardware. But invariably I’d head a few doors down to the IBM Product Center to check out the latest on the IBM PC. Why? Because even though the screen was a boring green-on-black, and the case was nowhere near as sleek as the CoCo’s or the Apple II’s, giving it all the attractiveness of a brick, I felt that IBM was a force that could not be ignored. Within a few years, I was programming on contract in Pascal and dBase on an IBM PC AT. Other than some small PowerPC, ARM, and AVR based boards, IBM PCs, their clones, and their descendants are all I’ve used since.

Why do I continue to buy PCs? There’s a huge variety of available hardware and software, both for sale and for free. If I have a problem, the solution from the online support community is only a few clicks away. All of that results from IBM’s initial decision to go with openness, and so I see no reason to change.

What about you? Do you have stories of the days before or during the introduction of the IBM PC? If so, which of the pre-PC personal computers did you use, and how did you feel when IBM entered the market? Do you use a desktop at all these days? You being a hardware hacker, we have to assume you have a keyboard machine of some sort for that occasional coding.

128 thoughts on “The IBM PC That Broke IBM”

  1. One big reason they got to market so fast is that the 5150 is a straight-out-of-the-app-note implementation of the 8088. The Intel datasheets describe that design in some detail. IBM also got tons of help from Intel, who realized what having IBM use their parts would mean for them.

  2. You missed an important part of the story. The initial design and manufacturing actually came from a manufacturer in Huntsville, Alabama, named SCI. And without SCI, the project would have failed.

  3. From friends who worked on the original PC and the PC Jr, I’ve been told that the original proposal was a 6809 based system. However, IBM had several Intel blue-box development systems from the development of the word processing system, and those were rather expensive boxes. Rather than buy 6809 development systems, they elected to re-use those blue boxes in an effort to sell the project to management.

      1. Of course, the CoCo was a fairly direct copy of what Motorola showed in the datasheet or application note.

        And it ran Microware OS-9, a multitasking, multiuser operating system for the 6809. It was even “Unix-like”, and we were reading so much about Unix around that time. I know I went for the CoCo because of the 6809 and OS-9, and it was the only 6809 system (Dragon excepted) that was within my price range. I could only drool over the Gimix and SWTP 6809 systems.

        Michael

        1. I used Color BASIC. I didn’t become aware of OS-9 until around two years after getting my CoCo when I moved to the “big city” for uni. Would have been cool to have that RTOS experience earlier.

      2. The CoCo was a very underrated machine. It lacked the pizazz of the C64, but it had a good CPU and a great BASIC.
          The CoCo 3 solved many of the shortcomings of the first two models, and some say it was the best 8-bit machine ever produced.

      2. We were developing color text displays for mainframes and serial terminals using the 6809E. It was a great processor; I built a computer around one at home and (eventually) got it to boot Forth. There was a 6809 monitor program published in the back of one of the Motorola books, and I copied that faithfully. The monitor program provided something like user exits for reading a keyboard and writing to a display, and that’s where I started with debugging the hardware and software at the same time. I still have bits of it in my museum box.

  4. I attended engineering school in the fall of 1983 and was *required* to buy the $1,800 DEC Pro-350 desktop computer with dual floppy drives, a hard disk, and P/OS, an RSX-11M based OS. It was a PDP-11 in simple desktop form, required no tools to repair, and ran almost the entire library of RSX-11M software… we also bought it at a 50% discount off list price.

    There were struggles: we were essentially a beta test site, and all the computers soon needed hardware upgrades and repairs. They were short-lived machines.

    One quirk of these wholly unique systems was that their keyboard had a “Do” key, intended to take the place of the non-existent but often-cited ‘any’ key.

    1. I once scored a surplus Pro-350 cheap, but I was an RT-11 partisan rather than RSX-11. The hardware wasn’t exactly friendly to RT-11. It didn’t have a standard console port, so RT-11 had to load a big console driver that included terminal emulation. That ruled out running RT-11SJ, my favorite version (although there is much to be said in favor of running RT-11XM, loading up VM: with a system image, and booting from it).

      There is a console port available on the printer connector if you have a special cable, but it was strictly to allow diagnostic access to MicroODT; it couldn’t generate interrupts, so RT-11 couldn’t use it as a console port.

  5. Let’s not forget that IBM contracted Tandon to help produce the PC, because IBM wasn’t able to fill all the orders by itself. Tandon, knowing the BIOS and its patents, developed a compatible BIOS without infringing on them.
    During those early days of the IBM PC, it was necessary to know “how compatible” a clone was. Some were barely compatible, but those based on Tandon’s BIOS were VERY compatible.

    1. I was following PCs closely at that time, and this is the first I’ve heard of Tandon. A quick web search indicates they didn’t get into PCs until after selling their disk business in the late ’80s. It was Compaq that did the first clean-room replication of the BIOS in the early ’80s and fought the legal battles over it, which led to others producing copies of the BIOS. The BIOS was the stumbling block in producing PC clones; everything else was what we’d call today open source hardware.

      1. If my memory serves me, Tandon did not design the IBM PC but did design the Tandy IBM clone (internal Tandon factory code name: Project UTAH?). However, Tandon did build the 5.25″ full-height floppy disk drives used in the first IBM PC (shown in the picture above) and then in the XT. The lore was that the FDD contract was done on a napkin and a handshake.
        Tandon’s Magnum (West Lake division) supplied half-height 8″ disk drives to both Tandy and IBM. Other players in the 5.25″ FDD game at the time were Seagate and PERTEC (PEC). (Background: I worked at both Pertec and Tandon during this formative “personal computer” era.)

        1. Tandon built not only the drives but the floppy disk controller card, which had one of those unique IBM modules with a square aluminum case and a ceramic substrate from the IBM “SLT” (Solid Logic Technology) days.

    2. IBM’s legal people in Winchester, UK wrote a program a long time ago that would run on a vendor’s machine and assess how compatible its BIOS was with IBM’s. They then went and promptly sued people for illegally copying the IBM BIOS, quite successfully in some cases. Less so in others: Apricot in the UK were surprised to be threatened with court action, since the motherboards they were using in their computers were manufactured by IBM themselves .. a very embarrassed IBM OEM EMEA General Manager had a meeting with Apricot to beg forgiveness (and to offer indemnity).

      1. Compatible was not actionable; code-for-code copies stand out. A guy at IBM told me ways he would rewrite the IBM BIOS so it appeared to be developed separately. But he also disassembled DOS 4.1 and made an IBM-Internal-Use-Only version that many people used because it supported stuff like 4 disk drives and “other stuff”. I was at IBM Research for an Internal PC Coordinator’s meeting and some guy from Microsoft was there. One of the guys showed him something at lunch, and he said “How did you do that?”.. “Oh, that’s easy with DOS 4.13,” he was told. Wait… “There IS no such thing as DOS 4.13!” “Sure there is,” said the IBM guy, “everybody here uses it.”

    3. Dad had a second-hand 386 Tandon laptop from later years. The battery pack was toast and it weighed 14 pounds so we called it the transpushable. I tried using it as a school computer in middle school in the late 90s.

      1. My brother used a Toshiba T4800CT in high school in the late 2000s… It was pretty heavy, and it had a clicky keyboard. 486 DX4-75 with Windows 3.11, and the battery was garbage so he always had to have it plugged in.

  6. This is the first time I’ve read that IBM considered basing a PC on the Atari 800. A cursory web search didn’t turn up any convincing references. The mind boggles at how this would have changed personal computing history.

  7. I heard that the Moto 68k was the first choice, but yield was too low (it was only experimental at the time) and that the 808x was actually IBM’s last choice. I also heard that there was a business issue, that IBM would not buy Moto CPUs because of some sort of anti-trust thing. So, in the end, we had to suffer with shmegma-offset addressing until about 1995 while Amigas and Apples were running rings around us poor PC folk.

    1. Actually, IBM had several efforts going on at the same time, almost >.< competing.

      The IBM Lab computer, AKA System 9000, came out about the same time. It was Motorola 68k based, used VersaBus, ran Xenix (or an IBM OS of some sort). Could hold ~5MB of RAM iirc.

      One of my college friends worked on it in Danbury, Ct, then went over to the AIX group in Fishkill.

    2. I met with the Motorola guys who were competing for the PC business from IBM. Their 68000 CPU at the time was a huge 64-pin gold-pinned ceramic DIL package, and they couldn’t guarantee that the multiplexed-bus version, the 68008, would be there in time for IBM’s intended announcement. There was also a gentle nudge from IBM Corp to use the Universal Controller family from IBM (UC.5 probably), but the costs were too high for that as well. The 8086 was already in use in the Displaywriter from Austin, TX, so it seemed a reasonable fit. Again, the 8088 variant was chosen since its data bus was 8 bits wide, and that cut down on external glue logic. Realistically, it wasn’t until the 386 arrived that we could get past the dumb idea of the Intel segmented address space and get on with the business of developing software sensibly.

      1. Remember, at the time Intel was most concerned with backwards compatibility with the 8080 and 8085, so that the new 8086 could take advantage of all the 8-bit software out there, and quite reasonably so. The segmented address space allowed the 8086 to have it both ways: 8-bit assembly-source compatibility with little or no change, and memory beyond 64 KB for programs written to support it.
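        The “both ways” trick is plain shift-and-add arithmetic. As a minimal sketch (Python, with a made-up function name; nothing here beyond the standard segment:offset rule): a 16-bit segment register is multiplied by 16 and added to a 16-bit offset to form a 20-bit physical address, so a converted 8-bit program can keep everything inside one 64 KB segment while bigger programs juggle several.

```python
def real_mode_address(segment: int, offset: int) -> int:
    """Physical address an 8086/8088 forms in real mode.

    The 16-bit segment is shifted left 4 bits (multiplied by 16) and
    added to the 16-bit offset; the result is truncated to the CPU's
    20 address lines.
    """
    return ((segment << 4) + offset) & 0xFFFFF

# A program that never changes its segment registers sees a flat 64 KB:
assert real_mode_address(0x1000, 0xFFFF) == 0x1FFFF

# ...while many different segment:offset pairs alias the same byte:
assert real_mode_address(0x1234, 0x0005) == 0x12345
assert real_mode_address(0x1230, 0x0045) == 0x12345
```

        That aliasing, plus the 64 KB granularity, is much of what programmers came to resent about the scheme.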

        Also, I believe the 80286 supported linear addressing (in protected mode) before the 80386. The big catch was that the 80286 couldn’t get back to real mode without a hardware reset, so an OS supporting both real-mode programs (of which there were many) and protected-mode programs (nowhere near as many at first) had to pull some tricky maneuvers.

        1. There was some magic in the keyboard controller, including a gate that masked A20, address line 20, in real mode, since the 286 didn’t replicate the 8088’s address wrap at 1 MB. Then the OS/2 developers discovered the LOADALL instruction, but it didn’t work on all 286 stepping levels .. though I think the AIX guys in Austin used something similar on the 386 to support their DOS box?
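          The wrap being patched around is easy to show with segment arithmetic (a hedged sketch in Python; `physical_address` is an invented helper, not a model of IBM’s actual keyboard-controller circuit): with A20 masked, as on a 20-address-line 8088, 0xFFFF:0x0010 wraps back to address 0, while a 286 with A20 enabled lands just past 1 MB, in what later became known as the high memory area.

```python
def physical_address(segment: int, offset: int, a20_enabled: bool) -> int:
    """Real-mode segment:offset to physical address, with/without A20.

    Masking address line 20 reproduces the 8088's wrap at 1 MB, which
    is what the PC/AT's keyboard-controller gate emulated for old code.
    """
    addr = (segment << 4) + offset      # up to 21 bits
    if not a20_enabled:
        addr &= 0xFFFFF                 # drop bit 20: wrap at 1 MB
    return addr

# 8088 behaviour (A20 masked): wraps to the bottom of memory
assert physical_address(0xFFFF, 0x0010, a20_enabled=False) == 0x00000
# 286 with A20 enabled: reaches just past the 1 MB boundary
assert physical_address(0xFFFF, 0x0010, a20_enabled=True) == 0x100000
```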

    3. While in college, a friend had an Amiga, while I had the original 5150 (8088-based) PC that was sold at our university for a ridiculously low price, including an IBM dot matrix printer so overbuilt you could drive a car over it. Anyway, I used to acknowledge the superior architecture of the Amiga, but point out that, because of market share, the C compiler on my snail-slow 8088 had such superior optimization that it was churning out numerical simulations twice as fast as his 68000. Looking at the assembler listings for the same C code compiled on his Amiga vs. my PC, it was obvious: the 8088 C compiler was producing decently optimized assembly code, while the Amiga compiler was producing disastrously bad assembly code and leaning heavily on library calls for the most basic tasks (like addition). I guess the moral is that the ecosystem can make or break any computer.

      1. The original PC had a (usually) empty socket for the 8087 numeric co-processor.

        There was a really cool board that some of the internal folks built for the Amiga that had 4 (I think) 68881/2 numeric co-processors on it for doing serious graphics/ray-tracing.

    4. One thing I remember reading about around then was that IBM was influenced by the fact that Intel had a complete chipset (not just the CPU), while Motorola did not. I remember seeing Zilog ads for their Z8000 series chips stating that their support chipset would work well with the Motorola 68K (and offered the design docs to do so).

      In addition, Intel had a roadmap for their CPU tech going forward while Motorola did not. IBM could use this to plan the future – a big deal for IBM execs.

  8. Remember the dozens of ‘MS-DOS compatibles’? They were fully IBM compatible at the BIOS level, but so many programmers were bypassing the BIOS or relying on timings unique to the hardware that quite a bit of software just wouldn’t run on anything but a genuine IBM PC.

    I worked for Tandy/Radio Shack in Ft. Worth in ’84/’85 when we brought out the Tandy 1200, our first true PC clone. Prior to that (’83), it was the Model 2000, which was ‘compatible’, but had an 8 MHz 80186 (almost 2x the speed of the 8088 in the IBM PC), supported 768K of RAM, had an optional hard disk, thinline floppies (80-track vs. the 40-track drives in the PC), and some rocking 640x400x16 color graphics, all long before the PC/AT. While it ran MS-DOS quite nicely (and rocked AutoCAD), it also ran Xenix, the Microsoft (via a little company called The Santa Cruz Operation, SCO) version of System III Unix. I still have a copy on floppy here someplace.

    Remember the PC Junior?

        1. Tandy didn’t make the keyboard mistake IBM did. The Tandy PCs are NOT PC Jr compatible despite using the same Texas Instruments audio chip (the one TI used in the 99/4A) and essentially identical “enhanced CGA” video modes.

          A PC Jr. can be made Tandy 1000 compatible with a couple of simple hardware mods so it will support both the PC Jr and Tandy 1000 modes of accessing the sound and video. There isn’t much software that was written to support both. If it’s written specifically for how the PC Jr does things, you won’t get the TI sound and 320×200 16-color graphics on a Tandy. Likewise, software written specifically for the T-1000 way of doing things won’t get TI sound and 16-color graphics on the Jr.

          I did the Tandy video mod on my PC Jr. I never could find the info on how to hack the sound. I found instructions for the video hack on IBM’s Canadian website: it’s simply piggybacking a common TTL chip onto one under the floppy drive, cutting one or two traces, and soldering a couple of wires. TADA! The hardware then presents itself as both Jr- and T-1000-video compatible.
          I figured the sound hack had to be just as easy and didn’t want to pay PC Enterprises’ inflated price for whatever it took. :P IIRC they wanted the same price for the video mod, which included the cheap chip, a bit of wire, and the instructions IBM was providing for free.

      1. There was a company called PC Enterprises that made lots of upgrades for the PC Jr., and some other companies did too. It was possible to jack up a Jr to 640K of RAM and add EMS, an 8-bit Sound Blaster, VGA, a hard drive, a second floppy on top (typically along with RAM expansion), and more.

        PC Enterprises also made custom 486 upgrade motherboards for some of the pre-486 models of IBM PS/2 and other brands of proprietary PCs. They briefly flirted with the idea of an upgrade motherboard for the PC Jr. but decided against it because it would’ve required replacing everything except the bare case and floppy drive.

        It was just that one really stupid decision to use the chiclet keyboard… had IBM gone with a normal keyboard from the start and left out the pretty much useless IR wireless, it could’ve been slightly cheaper and never gotten stuck with the ‘toy computer’ label its original keyboard earned it.

        IBM wasn’t done with it; they made a black version called the PC JX, with dual 720K floppies in the base case. It was never sold in North America.

        I had one with a NEC V20 CPU. It sped up DOS a bit, but with 22-Nice it could run CP/M software as fast as or faster than any Z-80 system.

  9. I decided that my current ATX tower will be my last. Just no reason to have one anymore when I have gigabit Internet and can spin up an instance from any of a dozen cloud computing services to do the work those two quad core Xeons from 2007 used to do.

    1. Or have one’s personal “cloud” somewhere in the house and a much smaller computer on the desktop. Bonus in that one can migrate from cloud to local, and back with a good connection.

      1. I just built two ‘itty-bitty’ servers based mainly on stuff I had lying around:

        4x 16 Gig DDR3 Ram (bought when RAM was cheap)
        2x 2 TB SATA drives
        2x Marvell NICs
        2x mATX cases

        To which I added:

        2x 500w ATX PS ($25/ea AR)
        2x 120 GB SSD ($40/ea)
        2x AMD FX-8230e (“sorta 8-core”) CPU ($40/ea AR)
        2x Asus AM3+ MB ($Free/ea after rebate)

        So I have 2x reasonably well-powered VM hosts that are fairly low-power and allow me to spin up local VMs at will without the possibility of cloud charges.

        Of course, I do pay for electricity, but these systems are reasonably frugal. They’re going to let me shed some older server-grade hardware (PE1950, PE2950) that sounds like a jet engine and sucks power like it’s nobody’s business. I liked my server-grade hardware, but these boxes represent a great compromise between speed, power, and cost.

    2. I upgraded my tower about a year ago; it was about 13 years old. It had more than enough grunt for software development and CAD/CAM applications, but I had to upgrade because I needed USB 3 and none of the add-on cards I tried worked correctly.

      1. Maybe, and you’re probably being somewhat facetious, but the similarity is superficial. A big difference was that back then if you used the mainframe you had to do it the mainframe way. Now with the cloud, you can do it the platform’s way (PaaS) or do it your way (IaaS) as much or as little as you want and have it available anywhere in the world with someone else building and maintaining the servers, the datacenter, the network, etc.

        1. Depends if you’re a user or a developer.
          The cloud is all about lock-in. The phrase “it’s just someone else’s computer” exists for good reason.
          A lot of cloud “solutions” are driven by businesses looking at MRR (monthly recurring revenue), i.e. you never own the data or the hardware, you just keep forking out for it every month. And if you total that up year on year, you’ll realise you’re a schmuck for paying it.
          Until the company gets bored and turns the tap off because that business model didn’t quite pan out, and then you’re screwed.

          The users experience is very much do it the mainframe way, until your shared access is terminated.

  10. Actually, there’s a major missing part of the story. I saw the “White Room” with an earlier “personal computer” based on an IBM processor (I believe it was the Woodstock). Probably 1979? It worked, but there was no operating system and no applications. It was killed, and the Boca guys like Lewis Eggebrecht built the Intel-based system.
    The reason Tandon (and others) were able to write a compatible BIOS quickly was that the BIOS listing was in the IBM PC TechRef. I was part of the push to keep it there, when the “OMG! IBM never gives out ANY source code” contingent was made to realize that there were DISASSEMBLERS out there, and IBM guys had (on their own time and equipment) disassembled, modified (adding different CRT and printer drivers), and reassembled the Ohio Scientific OS-65D disk operating system. They were shown the listing: “Get that OUT of this building right now!!” .. The BIOS stayed in the TechRef.

    1. Woodstock 2 .. yes, that is the one I couldn’t remember. There was UC.5, UC1, Thebes at the higher end and Woodstock 2 at the lower end. Really interesting architecture that allowed fast context switching (for controllers, I believe) where it was always running at one of 8 interrupt levels; priority determined by the interrupt number. I recall working on the 3790 and 3740 families in Hursley which were UC based, when we were developing the NDS terminals.

      There was a huge set of convergence projects in IBM at the time. Terminals were being developed for 3270 in Hursley and in Yamato, Japan. The 5100 was being developed in Rochester, MN, and the Displaywriter in Austin, TX. IBM Corp decided that we should have one common architecture supporting all of those projects. We had 68000 boards, 8086 boards, and Woodstock 2 boards, all plug-compatible, so we could build whichever they wanted. And then, quietly and without ceremony, the IBM PC happened, and our projects all got turned into a series of adapters that plugged into the IBM PC adapter sockets. No more processor board development for us in Hursley. A shame; we could have built a really good family of displays on the 68K family.

      1. I coded on the 5100 back in high school. It had an 8mm storage cartridge and a switch for BASIC (yawn) and APL (wtf is dat?). A Byte contest for matrix multiplication later, I had code that would do math on an n-dimensional matrix. I got disqualified because APL was interpreted.

        The 5100 was portable in the same way tanks with handles were portable.

        1. APL was (IS) an amazing language. I met Ken Iverson at IBM when we were using mainframe APL to run some chip testers (I/O was through an IBM-internal “Device Coupler”). Iverson wrote the book “A Programming Language”, laying out the concepts of a mathematically based language using Greek letters, as in formal math. He was hired by IBM Research in Yorktown, NY. Legend has it that in his first week there he was called to a meeting. The discussion was “How can we implement APL on a 360, and can we make a Selectric ball with the APL characters?” Iverson said, “What?? You’re actually going to MAKE it??” Print a 10 by 10 matrix of the numbers from 0 to 99? “10 10 rho iota”
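          For anyone who doesn’t read APL: the quoted expression (with iota applied to 100) reshapes the first hundred integers into a 10-by-10 matrix. A loose NumPy translation, assuming origin-0 indexing so iota counts from 0 (the variable name is mine):

```python
import numpy as np

# Roughly "10 10 rho iota 100" with origin-0 iota: generate 0..99
# (iota) and reshape (rho) into a 10-row, 10-column matrix.
matrix = np.arange(100).reshape(10, 10)

assert matrix.shape == (10, 10)
assert matrix[0, 0] == 0
assert matrix[9, 9] == 99
```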

          1. We used to log into the Santa Teresa APL machine from Hursley, UK to run chip delay programs .. we modelled critical paths in the silicon (clock trees, latches and so on) to make sure that at the temperature extremes allowed in the final product the designs would still work. The algorithms used in APL were a little more sophisticated than the simpler delay models built into EDS (IBM’s Engineering Design System, once upon a time the second largest suite of programs in the world, second only to CAEDS/CADAM). When I started, we were using IBM 2741 golfball consoles with APL golfballs on. Later on, it was all on 3278/3279 displays with a real APL font. I wrote a full-screen editor in APL using the canonical representation functions .. worked pretty well if I say so myself. I still have APL on the Mac here, though these days I scarcely use it for anything other than doing a random set of numbers for the lottery (in about 7 bytes). I still hanker after a golfball typewriter with a serial interface .. I still have my APL golfball *somewhere*.

        2. Nick said ” I wrote a full screen editor in APL using the canonical representation functions .. worked pretty well if I say so myself.”
          I used to have a printout of that editor on my wall at IBM. It was about 10 lines of dense greek characters. I got so I could read most (some?) of it.
          APL has a character that looks like a little lamp shining down, called “Illuminate” which meant “comment”.
          At the end of the code was a line with the Illuminate character, followed by
          “APL is inherently self-documenting ”
          :-)

  11. Boca’s reward for all this was to be shut down a few years later. Further “innovation” in the form of the Micro Channel bus managed to end IBM’s PC adventure (and also kill off Tandy’s computer adventure).

      1. Oh, it was a great architecture, I don’t deny that!
        It was just that IBM had punitive licensing fees on MCA.
        Someone “way up there” thought that since IBM had the greatest market share, but was losing it to clones, they could force the whole PC industry into submission by offering the next great thing, and that in order to survive, clone makers would have to comply.
        But by then the clone makers were large enough to form their own standard 32-bit architecture (which was compatible with the existing ISA) and go their separate way.

        1. It’s an irony that PCI went on to be the proper replacement for the AT bus (EISA being a small excursion) and used the same connector. Probably a lot of the same people on the standards body as well.

          The main architectural difference between Boca and Austin was that the address on the PC version of MCA was a physical address. On the PC, the address translation between CPU and slot happened in a memory management unit. That effectively meant that any MCA bus masters were a little crippled, or very complex, since they would either have to work with pinned pages, or bus snoop and try to keep in sync with the main MMU. In Austin’s version, the addresses on the bus were virtual, and were then translated into physical addresses. This meant that 32-bit bus masters could operate very effectively, increasing the performance of disk adapters, display adapters, and so on.

          1. MCA’s major problem, especially later, after IBM gave it the axe, was requiring special setup floppies for every board. You couldn’t even move a card to another slot without having to boot from the special floppy. Thanks to the MCA Mafia, most MCA peripherals now have their information and configuration floppies available.

            But wait, there’s worse! You didn’t dare allow Windows 95 to look at an MCA configuration floppy. Download the disk making/image software and write the disk on 95 *but do not look at it with Explorer*. Dunno what it would do to the disk but it did *something* that would make it not work properly in the PS/2.

            The mess with the floppies could have been avoided by stealing Texas Instruments’ method as used in their 99-4 and 99-4/A. Put the firmware expansions 100% on the peripheral. That made the TI essentially infinitely forward compatible for new hardware. Anything you can conceive of can be made to work if you can write a Device Service Routine to interface with the core console firmware. Plug in new hardware and its functions seamlessly merge with the core system, accessible via commands and CALLs built into the peripheral. The core system knew nothing about any hardware not built into the main console. Plug in the speech synthesizer and along with it come all the system routines for using it, without having to install any software or using any of the system’s RAM for drivers.

            IBM could’ve had MCA so after plugging in or moving a board it would automatically boot to configuration, mostly to confirm changes. Same with removing a device, boot to configuration to confirm that its removal was desired. That’d also help with troubleshooting should something fail and ‘go missing’. If it’s plugged in but the firmware insists it’s not, that’s a problem. That would’ve avoided possible lawsuits from TI because it wouldn’t have been a direct copy of the TI method.

        1. I liked Jumper and Stay. Knowing how the hardware had to be set, I could set the jumpers to what they had to be and it would work. Set every computer the same for sound, video, parallel and serial ports etc. Toss in the same optimized CONFIG.SYS and AUTOEXEC.BAT (the OAK universal CD-ROM driver made this so easy) then install Windows 3.1x followed by whatever special drivers it needed for sound and video. Easy peasy. It ‘just worked’ and the software couldn’t screw it up.

          Then came Plug-n-Pray. When Windows 95 decided to play nice, it worked great. But it had a special antagonism towards many soundcards, insisting on ignoring jumper settings and trying to force them to IRQ, DMA, and memory settings in direct conflict with other resources. It was possible to use manual settings to force 95 to obey the jumper settings, but with some of the early PnP soundcards where there were no jumpers, 95 could reprogram the card to the wrong settings it insisted upon, and if one tried using manual settings in Device Mangler it would go into a sulk and refuse to allow the soundcard to work because its settings were ‘wrong’.

          One could either report the problem to the card manufacturer and hope for a fix to the drivers, or try a different soundcard – while the problem one might work fine on a different motherboard.

          What worked in some instances was disabling the PnP OS option in the BIOS setup, but then that required using the manual settings for more things if 95 failed to detect them at their proper settings.

          1. Never bothered with cheap PNP sound cards. An actual Soundblaster 16 in an ISA slot moved from computer to computer. By the time I got one with no ISA, the sound chip was built into the motherboard, and millions like it. No driver problems there.

      1. OS/2 wasn’t bad compared to Desqview on DOS. OS/2 worked fine except for that damn DOS box. By version 4 it was a superior OS, but no one was using it. It was more reliable than any Windows system to this day. We had multitasking code that ran for decades on OS/2 on P80s with only 4MB, without the dog box (not a typo). And you could set up a classroom and say “build yourself like that machine,” go get dinner, come back, and you had a working classroom or grid system.

        And I loved coding in OS/2 REXX. Probably why I love Python today. But I digress.

        I had a front-row seat to the IBM and Microsoft spats over OS/2, and even caused one or two of them. I had the largest OS/2 install outside IBM for a while in the early 1990s. What killed OS/2 and PS/2 was memory cost: RAM was $100/meg, and over $1k for some of the ECC memory needed for servers. And installing Token Ring wasn’t cheap.

        Still… a glorious time to be in computers. Much like data science is now. This time there is so much info. Back then answers were hard to find.

        And now I have a drawer full of $9 Linux computers on a chip.

          1. I was in the room when Ed Lucente gave a state of the union speech, and told us that we’d spent more money on developing OS/2 than it cost to launch the Hubble space telescope, AND go repair it. When IBM went on a marketing blitz, bundling OS/2 with a couple of memory sticks for less than the retail price of the memory alone, it sat on the shelves. Ed Lucente said we couldn’t even give it away. Warp was the closest we ever got to the success that Windows enjoyed. But that paled in comparison with the launch of Windows 95, when there were people queuing up to buy it who didn’t own a computer.

            Warp sold well to some industry verticals, indeed. There was a time when pretty much every ATM shipped in Europe had some version of OS/2 on it, and Warp did well as they introduced multi-media.

  12. Part of this is the “Goldilocks” thing.

    Zilog’s Z8000 was a small hand-packed die, so inexpensive, but took too long to design, so was late to market.

    Motorola’s 68000 was architecturally superior and a well-structured design, but it was a huge die and very expensive. The 68008 version had various problems at the time.

    Intel’s 8086 was in-between. Less structured than the Motorola part, so a smaller die, and cheaper. But not so over-optimised as the Zilog part, so they hit the market window. The 8088, with its 8-bit bus, allowed for a cheap low-end entry product with architectural expansion to 16 bits and beyond.

  13. Back in the 1970s, I did my master’s thesis on alphanumeric and graphic CRT display terminals. Consequently, through one of his contacts, my thesis advisor hooked me up with a summer job at IBM Boca Raton to do similar work there in the summer of 1979. He said it was kind of hush-hush, but didn’t say why.

    Anyway, on arrival, I was suddenly re-assigned to a *different* project, and spent most of the summer writing APL programs on an IBM 5100 to control some GPIB factory automation stuff. Interesting and rewarding, but not so challenging. Now and again I’d hear snippets of what was going on in the other building.

    Upon graduation, I went to work for Hewlett-Packard in California (I was much impressed at the time with HP’s high-resolution monochrome, and color, graphics products).

    Anyway, when the IBM PC was announced, I got a call from my old professor. Apparently, the PC was the “secret” project at Boca Raton that I was supposed to work on that summer. Now, at the time, IBM made some world-class photocopiers at their Boulder, Colorado facility. Apparently, some jerk student-hire there had run off a few months earlier with a bunch of confidential material and tried to sell it to a competitor. Consequently, an edict came down from on high that no temporary hires would be allowed to work on certain sensitive projects, and Project Chess at Boca Raton was apparently one of them. Rats.

    Later, when I saw the IBM PC, I thought the monochrome alphanumeric display and controller were respectable. I’d seen a few IBM 3101 “green screen” terminals around Boca Raton in testing, and apparently these monitors were the starting point for the IBM PC monochrome display.

    In contrast, the IBM PC color graphics display was a pathetic joke: just 320×200 resolution and only 4 colors at a time. Damn, graphics have come a long way since then! I’ve often wondered about the choices that dictated that design. From my thesis work, it was clear that graphics frame buffers (being memory intensive) would be very expensive at all but the lowest resolutions. Likewise, existing CPUs were pretty limited at the time. But it wasn’t too long before the PC got respectable color graphics with the EGA, then VGA (and even PGA) displays.

    One thing I’d learned in that era was “never bet against memory”. By that I meant it was always getting denser and cheaper. So it was worth taking a risk on larger-format, full-color frame buffers and displays. The prototype cost might kill you, but a year into production memory chips had quadrupled from 16-kbit to 64-kbit at half the price. How I wished I’d gotten on that project. I can only imagine what might have been!
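    The memory-cost argument is easy to check with back-of-envelope arithmetic (pixel formats assumed here just for illustration: CGA’s 4 colors = 2 bits/pixel, versus a hypothetical 256-color 640×480 buffer):

    ```python
    # Back-of-envelope framebuffer sizes for the displays discussed above.
    cga = 320 * 200 * 2 // 8      # CGA: 2 bits/pixel -> 16,000 bytes
    hires = 640 * 480 * 8 // 8    # 256-color 640x480 -> 307,200 bytes
    print(cga, hires, hires // cga)
    ```

    At roughly 19× the RAM, and at late-70s memory prices, it’s not hard to guess why the CGA ended up where it did.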

    1. Having been working on developing display adapters for IBM since the late 70s, we were astonished that this little maverick skunk works managed to produce anything half reasonable on the monochrome side. Shame they went for an 8-pixel-wide character box, and not 9 wide as we grown-ups knew was the right answer. We were working on proportional spacing in hardware, multiple fonts and font sizes in hardware, scan-line scrolling of text within a partitioned display, multiple attributes for each character – foreground and background colors separately encoded, fonts, multiple highlights/underscores, etc. The IBM PC font was such a mess as well, with its half-hearted attempt at national language support but a bunch of single- and double-stroke box drawing characters, its complete avoidance of APL (some people in IBM thought that was important at the time), and a font design that looked like it had been done by children. We had Human Factors Engineers with Ph.D.s out the wazoo who cringed at the IBM PC font.

      It was therefore a huge, huge shock when this maverick kid kicked our asses, and we found ourselves relegated to developing mere adapter cards for it, rather than our own intelligent high functioning alphanumeric and graphics displays.

      But as you say, the CGA was an embarrassment. It wasn’t really until the PGA – and the VGA copying its 640×480 mode – that color graphics came to the PC properly.

        1. Thanks for that! I shall be geeking out on that one. I found a .ttf file that was purportedly an IBM 3270 character set – thought I would use it for the Mac command line. I did use it for several minutes, looking fondly at the familiar shapes of the characters from 30 years ago. Then I put Monaco back, because it was easier on the eyes. Some things are best left fondly remembered, rather than revisited.

        1. No they don’t, they add up to 133% super-blue. And black and white are the two other colours in the CGA black / white / cyan / magenta palette. The alternative palette has red, green, yellow, and blue. No black or white.

          And then there’s the trick palettes but they’re not an intended part of CGA.

          So why, in the useful palette that has black and white, pick the 2 horriblest eye-searingest colours least useful to any sort of artwork or graphics? They even look ugly in graphs.

  14. Remember the first IBM PC keyboard? It was terrible. Then IBM got smart and built the AT “Clicky” keyboard, in Lexington I believe. The intent was to make a quality keyboard that would last. Today in 2017, we have five original IBM Clickys on all our personal computers.

    1. I have a buckling spring keyboard that says IBM on it. I learned to type on IBM Selectric typewriters that used the same keyboard, and to this day it’s the only keyboard that I like to use. This one is from the mid-90s and is still working as good as new. I love ’em!

  15. Of course the computers that really broke IBM were the PS/2 systems that had the MCA bus.

    The IBM v. Compaq court decision on reverse-engineering the BIOS had made it possible for everyone to produce clones, and because even the best clone was still cheaper than a real IBM PC, IBM was rapidly losing market share by the end of the 1980s.

    Meanwhile, processors were getting faster and the old ISA bus from the PC, PC/XT and PC/AT wasn’t designed for high speed. So IBM thought it would be a good idea to introduce a new system bus to replace ISA and bring customers back. Most of the PS/2 systems (the second generation of IBM PCs) used the MCA bus, which made it easier to configure hardware: no more jumpers and DIP switches; all you needed was a floppy disk that booted into a configuration program.

    But IBM charged high licensing fees, so clone manufacturers (led by Compaq, I think) ignored MCA and came up with alternatives: first EISA, then VLB, then PCI. As it turned out, the MCA bus ended up making IBM irrelevant in the PC industry halfway into the 1990s: IBM systems were regarded as expensive, slow, and difficult to get parts for. The Aptiva line of PCs that came after the PS/2s only made IBM’s reputation worse. Eventually IBM only made ThinkPad laptops, and ultimately they sold that division to Lenovo.

    ===Jac

    1. Well, I think there was also a trend toward integrating more and more functionality onto the motherboard, and using the bus for relatively simple devices that didn’t need high falutin’ 32-bit multi bus master DMAing adapters on the channel. So the choice of a bus became a non-issue, and the problem that MCA was trying to solve really didn’t exist.

      Well, until 3D graphics cards arrived – and then, even PCI couldn’t hack their demands. So, we invented a whole new kind of slot dedicated to graphics cards.

  16. Does anyone else remember Seequa? They were a clone manufacturer out of Annapolis, Maryland. They made an 8088 clone of the PC that was luggable (direct competition for Compaq). They had two things going for them that never really took off: the systems ran CP/M as well as MS/PC-DOS, and an ad campaign with a Charlie Chaplin character and a tag line that read “Paying $5000 for a computer would make a tramp out of anyone”. Unfortunately CP/M wasn’t much of a draw, and IBM’s lawyers jumped ALL OVER them for the Tramp. I *so* wish I had gotten my hands on one of those Tramp posters.

  17. Modern Macs are PC-compatible x86-64 boxes, so they too owe their basic design to the IBM PC. They just run a different OS, and can run Windows pretty easily.

    Also, the 8088 is 8-bit, not 16 (at least on the outside). I suspect this is a typo, as you basically state that in the next sentence.

  18. I don’t recall how, precisely, IBM PCs started invading my workplace. We were a DEC and CP/M shop, running VAXen, PDP-11s and a mix of CP/M machines that consisted mostly of Televideo 802s. I don’t recall any PCs at work before the AT.

    Home, however, was different. I had picked up a DEC Rainbow at a bankruptcy auction for about $20. The Rainbow had an 8088 and a Z-80; it could run generic MS-DOS, CP/M-86, or CP/M-80. Problem with MS-DOS was that the Rainbow was not PC compatible. This became a problem with Turbo C 2.0.

    A friend had bought Turbo C 1.5, and I decided that it was pretty cool and I should pick up a copy of Turbo C myself. However, by the time I bought, Turbo C 2.0 was what I was able to get my hands on. It turned out that even the *command line* version of the compiler required BIOS compatibility so that it could poll for ^C during compilation. That didn’t work on the Rainbow, so it was useless to me. I traded my copy of 2.0 for my friend’s 1.5 and was able to use it, but the handwriting was on the wall. It wasn’t long after that I had to give up on the Rainbow as an MS-DOS machine.

    Unfortunately, it also wasn’t a terribly good CP/M machine, either. I had a DECmate II with a Z-80 auxiliary processor that was much, much better; it remains my favorite CP/M machine of all time (mine also had a 5MB hard drive which, oddly enough, is the only hard drive I never managed to fill). So the Rainbow fell by the wayside.

    I also had OS/8 to run on the PDP-8 side of the DECmate II. By the time I scored the DECmate II, I was no longer doing any serious PDP-8 work; it was more a novelty that I fiddled with once in a while.

    I didn’t score a PC personally until I managed to get a cheap surplus unit. My wife was the primary user. I had a Kaypro 10 and a MicroVAX 2000 into which I had gummed a TZ-30 and an ST-251; made a nice portable package for computing on the go.

    1. Oh lordy no! Average computers had 4 or 16K of memory, expandable to 48K (though CP/M machines went to 64K), the original IBM PC would have required multiple memory cards to hold 512K, if one could afford such a thing.

      I remember one software package (Lotus 1-2-3?) was briefly offered bundled with an add-in memory card.

  19. Nice blast from the past. Original-twelve principal Dr. Dave Bradley wrote a retrospective article for Byte Magazine titled “The Creation of the IBM PC”, listed in a footnote of the Wikipedia article’s Project Chess section. Wire-wrap boards and “maker” techniques from Big Blue – who knew? Ultimately, chipset companies like Chips and Technologies and VTI, in combination with clean-room BIOSes from suppliers like Award and Phoenix, made clone efforts so attractive that even original engineering powerhouses IBM and Compaq eventually produced “clone” designs.
    AFAIR Intel originally drove (sorry) the PCI bus, which abandoned both the MCA and (E)ISA busses in mainstream consumer PCs. https://ia802707.us.archive.org/21/items/byte-magazine-1990-09/1990_09_BYTE_15-09_15th_Anniversary_Summit.pdf

      1. AFAIR both the Thinkpad L40 and 700T tablet used VTI chipsets with lower cost monochrome STN displays. I can’t recall the BIOS provider. Around the same time as Value Point IBM also had an education market “Eduquest PC” line that even used IBM designed x86 compatible processors.

        L40
        http://www.zdnet.com/pictures/photos-from-the-first-pcs-to-the-thinkpad-classic-ibm-machines/21/
        700T
        https://www.youtube.com/watch?v=99l1YR3DhCo

  20. I had several of the first IBM AT computers. Pulled out my 8-bit Gateway network cards and plugged them into the 16-bit bus, flipped the switch and… smoke! Sent the machine back, got the replacement – ditto! WTF? Figured out later that the first few had a metal clip between the 8-bit and the 16-bit sections of the AT slot. If an 8-bit card was a bit oversized, it pushed the clip in such a way as to short the 12v to the 5v line and dump 17v of smoky goodness right into the motherboard.

  21. Cool stuff here. Still have a working IBM PC-XT put together from yard sale parts, original CRT (original kbd got lost somewhere during a move) – oh – and a whopping 10MB HD – seem to recall it took days to low-level format back then. Also a box of 5 1/4″ floppies with various software incl. the first MS Windows. What a treat. Kinda brings tears to your eyes when spinning it up for show and tell….

  22. I didn’t have time to read all 115 comments and someone else may have mentioned this, but IBM was a thriving monopoly until they got envious of Apple’s closed-house approach and tried to sell Micro Channel after the superb success of the XT. Within one year they lost pretty much all their mind share and brand loyalty, and IBM-clone AT motherboard sales skyrocketed. Folks never trusted IBM hardware after that. While a more loyal and less experimental Apple consumer base let Apple get away with it, IBM forgot that a major portion of their customers were the pirating and hacker crowd. Remember, Norton copy utilities (and freeware like PC File) were among the biggest sellers for IBM machines!

  23. After a brief dalliance with CP/M machines (with Polyforth, UCSD Pascal, and MS Basic), I worked on MS-DOS compatible PCs that in theory used higher clock speed processors (V30) and better bus designs, but sacrificed full “bug for bug” compatibility. There were even compatibility tricks to intercept BIOS calls and direct hardware access and redirect them to the corresponding target design. Another handicap of the MS-DOS compatible systems was large NRE payments to independent software companies for custom ports of popular applications like 123 to said compatibles.

    In the end, “customers don’t want to be fooled,” so bug-for-bug compatibility emerged, enabled somewhat by IBM having published schematics and source code in its technical references, and then accelerated via chipset and BIOS providers with legally permissible clean-room designs allowing affordable systems of acceptable quality to be built in a dorm room. Later, “teardown”/reverse engineering mechanisms advanced clone system development so that (often under NDA) a third-party clone option would typically materialize within weeks of any new “PC Architecture” enhancement or pivot – EGA, VGA, MCA, VESA, EISA, PCI.

    A fun and predictive aspect came from late-80s 386 machines running “Interactive Unix” – full-featured Unix systems developed to host the CAD/CAE tools used to design PC VLSI chips and boards, running reliably on PCs rather than RISC or 68K workstations. AFAIR the first mainstream 386 *ix system had emerged a few years prior in the Sun 386i RoadRunner running SunOS. After using Unix on a PC, it seemed retrograde to observe VLSI developers using legacy mainframe tools and terminal emulation software to complete designs that easily ran on a suitably equipped 386 or 486 (by then even dual processor) desktop PC.

    1. I recall MCA adapter cards with much less PCB real estate, so MCA boards demanded ASIC components, while the much larger ISA bus / AT Bus cards could use traditional PALs and MSI logic, as FPGAs were not yet widely deployed in volume business/consumer PC applications. MCA clearly improved over ISA with mechanisms such as interrupt activation and logic level, the System Configuration Utility and Programmable Option Select (POS) register based configuration and resource assignment without switches and jumpers, the ability for system software to disable a board, DMA arbitration, and other advances – but no schematics or “logics” in the Technical Reference. Deep-dive bus design debates discussed the absence of a cycle-governing clock signal in MCA, Fetch-and-Deposit DMA rather than “Fly By” DMA, etc.

  24. Actually, the reason that IBM developed the PC was strictly cost and serviceability. I worked for Cargill at the time, and they were using Series/1s in their meat processing plants. The overhead weigh scales used a variety of devices and needed better quick-fix solutions when the computers failed. The availability of replacement parts in the field was one of the main objectives. IBM never intended to use the 6809. Their plan was to produce an inexpensive PC and then add to it boards that would run older minicomputer systems. One of these boards was a 68000 board that completely replaced a Series/1 minicomputer and ran various versions of Unix along with EDL and RPS. The 8088 was chosen for the PC because of its input and output capabilities. MS-DOS was running on the wire-wrapped version by April of 1981, as I was to fly down to Boca Raton at that time to try out my recommendation. I also got to try out the 68000 board inside the PC, which was so advanced at the time that there was not enough Unix software written to merit its sale.

  25. One other reason that IBM chose the 8088 for the CPU was that it gave them the option of very rapidly porting the impressive collection of CP/M software that was available. If you look at CP/M’s list of BDOS functions accessed via CALL 0005H, they are pretty much identical to the list of MS/PC-DOS calls available via INT 21H.
    This meant that the assembly source of an 8080 CP/M program could be run through a converter that switched the 8080 opcodes to 8088/8086 opcodes, made a few necessary tweaks to operating system calls and any other pain points that came up, and you had a functioning MS/PC-DOS program with very little effort.
    This immediately gave them the option of a large library of software at or very soon after launch: WordStar, Excel, Lotus 1-2-3, you name it …
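    The mechanics of that conversion are easy to sketch. Here’s a toy, purely hypothetical Python illustration of the mechanical substitution involved – real converters such as Intel’s CONV86 handled the full instruction set, flags, and addressing modes, while this only knows two patterns plus the BDOS call (the register map follows the usual 8080-to-8086 convention):

    ```python
    # Hypothetical sketch of the 8080 -> 8086 source translation described
    # above. Register map: A->AL, B->CH, C->CL, D->DH, E->DL, H->BH, L->BL.
    REG_MAP = {"A": "AL", "B": "CH", "C": "CL", "D": "DH", "E": "DL",
               "H": "BH", "L": "BL"}

    def translate(line: str) -> str:
        """Translate one (very) simple 8080 source line to 8086 syntax."""
        line = line.strip()
        if line == "CALL 0005H":       # CP/M BDOS entry point ...
            return "INT 21H"           # ... becomes the DOS system-call interrupt
        op, _, args = line.partition(" ")
        if op == "MVI":                # MVI r,imm  ->  MOV r8,imm
            reg, _, imm = args.partition(",")
            return f"MOV {REG_MAP[reg]},{imm}"
        if op == "MOV":                # MOV r,r  ->  MOV r8,r8
            dst, _, src = args.partition(",")
            return f"MOV {REG_MAP[dst]},{REG_MAP[src]}"
        return line                    # pass anything else through untouched

    print(translate("MVI C,09H"))     # -> MOV CL,09H
    print(translate("CALL 0005H"))    # -> INT 21H
    ```

    A real converter also had to move the BDOS function number from CL into AH for INT 21H, patch flag-sensitive sequences, and so on – the “few necessary tweaks” mentioned above.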
