Programming 1949 Style!

What was it like to program an early digital computer? [Woven Memories] wanted to know and wants you to know, too. [Maurice Wilkes] and his team wrote a book about their EDSAC and the 18 instructions that it used. These days, you can even run an EDSAC program on a number of emulators.

It is hard to realize how many things we take entirely for granted had to be invented by [Wilkes] and his colleagues. The book, “The Preparation of Programs for an Electronic Digital Computer,” contains, among other things, the first recorded use of a software library and the first API. Even the subroutine needed inventing, by [Wilkes’] student [David Wheeler], and the calling mechanism was known for a while as the “Wheeler Jump.”

Like many things in old computers, the Wheeler Jump required code to modify itself. Even indexing modes were often implemented by changing an address inside the program.
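The trick is easy to sketch in modern terms. Below is a toy Python interpreter with an invented instruction set (nothing like real EDSAC orders): before jumping to the subroutine, the caller plants the return address inside the jump instruction at the subroutine’s end, rewriting the program as it runs.

```python
# Toy illustration of the Wheeler Jump: the program patches itself so the
# subroutine knows where to "return" to.

memory = [None] * 32            # each cell holds an (opcode, operand) pair

# Subroutine at address 10: double the accumulator, then jump "somewhere".
memory[10] = ("DOUBLE", None)
memory[11] = ("JUMP", 0)        # operand gets overwritten by the caller

# Main program at address 0.
memory[0] = ("LOAD", 21)        # acc = 21
memory[1] = ("PLANT", 11)       # self-modify: memory[11] = ("JUMP", pc + 2)
memory[2] = ("JUMP", 10)        # "call" the subroutine
memory[3] = ("HALT", None)

def run(memory):
    pc, acc = 0, 0
    while True:
        op, arg = memory[pc]
        if op == "LOAD":
            acc = arg
        elif op == "DOUBLE":
            acc *= 2
        elif op == "PLANT":
            # The Wheeler Jump: store the return address into the
            # instruction word at address `arg`.
            memory[arg] = ("JUMP", pc + 2)
        elif op == "JUMP":
            pc = arg
            continue
        elif op == "HALT":
            return acc
        pc += 1

print(run(memory))              # prints 42
```

With no call stack and no index registers, writing the return address into the code itself was the natural way to get back from a subroutine.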

While we frown on techniques like this today, you have to start somewhere. We are big fans of EDSAC, and [Dr. Wilkes] had a long and distinguished career well after it, too. The original plans for ENIAC led to EDSAC, EDVAC, and a slew of other early machines. You can see a video of the machine with an introduction by [Wilkes] below.

If you want to try your hand with the EDSAC, try your browser. There’s also a very nice Windows emulator that runs fine under WINE.

28 thoughts on “Programming 1949 Style!”

  1. Was the ENIAC a real computer in the conventional sense, even?
    It didn’t use binary yet, like the Z3 did. 🤷‍♂️
    It’s more like a big calculator/cash register to me.

    1. Anyway, the EDSAC is interesting.
      There’s a game named OXO (tic-tac-toe) that can be played on it. ^^
      There are even two vintage emulators, one for classic Mac OS and one for Windows 3.1/95.

    2. “Conventional sense?” That doesn’t mean anything in the context of the era. Binary doesn’t make a computer, analog computing was a big thing for quite a while.
      The term “computer” has always roughly translated to “big calculator.”

      1. The decimal part was the weird one, I mean.
        Analog and binary were both common concepts.

        Computer… Yes, that’s one definition. A good mathematician was also called a “computer,” maybe.
        Historically, though, computers weren’t good at math so much as at arithmetic (hence the ALU).

        They can merely count up/down, with addition being the main method for everything.
        Multiplication and division are done through repeated addition, if I remember correctly. Subtraction, too, sometimes.

        That’s at least what I remember about the basic principles of pocket calculators. Speaking under correction, thus. 😅
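That recollection is right in outline, and easy to sketch in Python. Real hardware uses the much faster shift-and-add method, but the principle is the same:

```python
def multiply(a, b):
    """Multiply non-negative integers by repeated addition."""
    total = 0
    for _ in range(b):
        total += a
    return total

def divide(a, b):
    """Divide by repeated subtraction; returns (quotient, remainder)."""
    count = 0
    while a >= b:
        a -= b          # subtraction itself can be done as complement addition
        count += 1
    return count, a

multiply(6, 7)   # 42
divide(17, 5)    # (3, 2)
```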

        1. ENIAC was an automated calculator built to calculate ballistic tables for the US Army. If you have decimal input and want decimal output, a decimal architecture is not a bad choice. It also allows re-using proven principles of existing mechanical and electro-mechanical decimal gear, like adding machines, tabulating machines, and automated telephone exchanges. By the way, Babbage’s difference engine was also decimal.

    3. ENIAC wasn’t a stored-program computer, reprogramming involved changing sets of jack plug cables and even modifying some of the circuitry.

      The first stored-program computer was the SSEM, or Manchester Baby, built at Manchester University and running its first program in June 1948. However, the SSEM wasn’t very practical: it was designed to test the Williams–Kilburn CRT-based memory tubes and could store only 32 × 32-bit words. EDSAC, the second stored-program computer, on the other hand had a usable amount of memory, in the form of mercury delay lines!

    4. If you mean “a von Neumann architecture computer,” it wasn’t originally, but it was retrofitted to use the constant tables as program storage. That made it slower, since it then had to do sequentially things that, before, could be done in parallel, but it was far easier to program.

  2. I wonder if those old computers could emulate the Nintendo GB or GBC. It would really show how far our progress in science and computing has gone, with the ability to virtualize different architectures.

      1. I don’t think you appreciate just how miserable the GB architecture is. It’s pretty much a trimmed down Intel 8086, the same thing that powered various toys for pre-teen children. Before you speak, buy books and learn something about computers.

        1. Wasn’t the DMG CPU some Z80 derivative with a built-in sound generator?

          I’ve always found the Z80 to be more sophisticated than the 8080 and 8088.

          Next best thing would be NEC V20, perhaps. In terms of sophistication, I mean. The 8086 is respectable, but I don’t know what to make of it. The STS (Space Shuttle) had used it, right?

        2. “It’s pretty much a trimmed down Intel 8086” is not something you should say before “Before you speak, buy books and learn something about computers.” I do admire the confidence though.

          That said: Z80, and they probably could, but would be so unfathomably slow as to be nothing more than the niche-est of niche curiosities.

          You greatly overestimate how powerful these old computers are. A Z80 is closer to emulating an EDSAC than vice versa.

  3. “Was the ENIAC a real computer in the conventional sense, even?
    It didn’t use binary yet, like the Z3 did. 🤷‍♂️
    It’s more like a big calculator/cash register to me.”

    Use of particular number-base is not a requirement for any computing machine; some early machines used base-10.
    Use of electronics–even relays–is not a requirement, either.
    A calculator is a computer, as is any modern-day (electronic) cash register.
    Old(er) cash registers are mechanical computers, as were Charles Babbage’s Difference Engine and Analytical Engine–which used base-10, by the way.

    1. Hm. By that logic, the early automated telecommunications exchanges were big computers, too?
      They worked on relays (Striwger switches, etc.), used logic circuits, and established/disconnected telephone connections. Or so I heard. This was before my time.

      1. Actually that’s more true than you realize. My father worked for AT&T; he was hired in 1969 as a configuration technician for the “electro-mechanical” switches. Within a couple of years he went from jumpers and electronics tools to computer programmer. He worked on some of the first production UNIX systems in the world because of it, too!

      2. Yes.
        Automatic telephone exchanges have always been big special-purpose computers ever since the SXS (‘step-by-step’) switch you reference replaced the totally-manual exchange; even more so ever since the Common Control office (using the now common ‘digital switch’) replaced the ‘step office’ (SXS office).
        [And the correct term is ‘Strowger’ switch, named after the man who invented it–an undertaker]

    1. Yes, COBOL included the now-deprecated ALTER statement. It was popular for switching the outermost logic: first initialization, then the unit-record loop, then final cleanup and termination.
      Fortran was even more fun. The target of a Go To was usually an integer literal, corresponding to a numeric statement label — not an actual line number as in BASIC. But the target could also be an integer variable, computed to be, one hoped, some valid statement label.

      Source: I was there, and worked with the RCA Spectra compiler developers.
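For readers who never met it: Fortran’s computed GO TO picked a statement label by the value of an integer expression. A loose Python model (the labels and handlers here are invented for illustration) is an index into a table of jump targets:

```python
# "GO TO (10, 20, 30), I" in Fortran jumps to label 10, 20, or 30
# depending on the integer I. Modeled here as a dispatch table:

def label_10(): return "initialize"
def label_20(): return "process record"
def label_30(): return "finish up"

targets = [label_10, label_20, label_30]   # the (10, 20, 30) label list

def computed_goto(i):
    # Fortran indexes from 1; an out-of-range i was, one hoped, avoided.
    return targets[i - 1]()

computed_goto(2)   # "process record"
```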

  4. I remember using Hollerith card stacks, and a Hollerith card reader (eater). If you accidentally dropped a stack on the floor, you would have to re-sort them. They usually had a card number printed on one of the corners. Some stacks had hundreds, even over a thousand, cards in them. There were special boxes made to handle and move large stacks. Think SD card × 1,000,000 in size and weight. Yeah, a large stack could get quite hefty. Oh, those were the bad old days.

    1. Clarification for young readers: A card sequence number was normally punched into each card, in reserved columns which varied with the language or application. If you dropped a deck, you scooped up the cards, got them all facing properly — helped by a one-corner cut — and handed the tray to the tab-machine people for sorting.
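The tab-room sorter’s job can be sketched in a few lines, assuming the Fortran convention of sequence numbers punched in columns 73–80 (as the comment notes, the reserved columns varied by language and application):

```python
def card(text, seq):
    """Make an 80-column card image: statement text, sequence in cols 73-80."""
    return text.ljust(72)[:72] + f"{seq:08d}"

# A "dropped deck": three cards out of order.
deck = [
    card("      X = X + 1", 30),
    card("      INTEGER X", 10),
    card("      X = 0",     20),
]

# Restore the deck to order by sorting on the sequence columns.
deck.sort(key=lambda c: c[72:80])
```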

  5. The subroutine was actually invented by the ENIAC programmers to run the first programs on the ENIAC… otherwise the complex computation would not have fit on the device, so they were forced to come up with a way to do it. See Kathleen McNulty and the ENIAC Programmers.
