John Horton Conway, Creator Of Conway’s Game Of Life, Has Died

Programmers everywhere are familiar with Conway’s Game of Life: whether they’ve written a version themselves or simply seen the mesmerizing action resulting from the cellular automaton, it’s a household name in all homes where code is spoken. On Saturday, April 11th, 2020, its inventor and namesake, John Horton Conway, passed away from COVID-19 at the age of 82.

Born in Liverpool, Conway received his PhD in mathematics in 1964 from Gonville and Caius College, Cambridge. He accepted a position at Sidney Sussex College, Cambridge, which he held until joining the faculty of Princeton University in 1987. A brilliant mathematician, he received numerous awards and was well known for his work in combinatorial game theory, group theory, and theoretical physics.

Many readers will be familiar with his Doomsday algorithm, which can be used to deduce the day of the week for any given date in your head. But it is the rockstar mathematics moment of developing Conway’s Game of Life in 1970 that cements his perpetual place of legend in computing lore. His original work on the concept used pencil and paper, as the computing revolution had yet to make digital resources easily available, even to mathematics researchers like Conway.
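For the curious, the Doomsday method can be sketched in a few lines of Python. This is a hedged illustration rather than Conway’s own presentation: the century anchors and the monthly “doomsday” dates below are the standard Gregorian values, and the function names are our own.

```python
# Conway's Doomsday method: every year has a "doomsday" -- the weekday
# shared by easy-to-remember dates like 4/4, 6/6, 8/8, 10/10, 12/12,
# and the last day of February. Find the year's doomsday, then count
# to the target date from the nearest anchor.

DAYS = ["Sunday", "Monday", "Tuesday", "Wednesday",
        "Thursday", "Friday", "Saturday"]

# Century anchor days for the Gregorian calendar (repeat every 400 years).
CENTURY_ANCHOR = {17: 0, 18: 5, 19: 3, 20: 2}

def doomsday(year):
    """Weekday (0-6, Sunday=0) of the doomsday for years 1700-2099."""
    anchor = CENTURY_ANCHOR[year // 100]
    y = year % 100
    return (anchor + y + y // 4) % 7

def leap(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def day_of_week(year, month, day):
    # Day-of-month that falls on doomsday for each month, Jan..Dec.
    anchors = [3, 28, 14, 4, 9, 6, 11, 8, 5, 10, 7, 12]
    if leap(year):
        anchors[0], anchors[1] = 4, 29
    return DAYS[(doomsday(year) + day - anchors[month - 1]) % 7]
```

With a little practice the whole thing is done mentally, which was Conway’s point: for instance, April 11th, 2020 comes out to a Saturday.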

The game is played on a grid of squares: infinite in theory, though practical implementations typically use a finite grid whose edges wrap around. Four simple rules (which can be boiled down to three if you’re clever) determine which cells live and which cells die during each frame of the “game”. The only parameters needed are the number and position of living cells at the start of the game, and the delay between each game frame. But the effect of this simplicity should not be understated. The game can be coded by a novice — it’s become a common challenge in university coursework. Small errors, or intentional tweaks, in the implementation have profound effects on the behavior of the game. And the effect on the person programming it for the first time can be long lasting. You could call it a mathematics gateway drug, grabbing the curiosity of the unsuspecting mind and pulling it down the rabbit hole of advanced mathematics discovery.
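Those rules translate almost directly into code. Here is a minimal Python sketch (matching the language of the example implementation cited below), assuming a small finite grid whose edges wrap; the names are our own:

```python
# One Game of Life generation on a finite grid whose edges wrap
# around, as described above.

def step(grid):
    """Return the next generation of a 2D list of 0/1 cells."""
    rows, cols = len(grid), len(grid[0])
    new = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Count the eight neighbors, wrapping at the edges.
            n = sum(grid[(r + dr) % rows][(c + dc) % cols]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0))
            # Survival with 2 or 3 neighbors, birth with exactly 3;
            # every other cell dies (or stays dead).
            new[r][c] = 1 if n == 3 or (grid[r][c] and n == 2) else 0
    return new
```

Seed it with a row of three live cells (a “blinker”) and it oscillates forever with period two, which makes a handy first test case.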

We’d love to celebrate his life by hearing your own stories of programming the Game of Life in the comments below. If you haven’t yet had the pleasure, now’s a great time to take on the challenge.

[Game of Life example shown in this article is John Conway’s Game of Life – 1.0 written in Python by Nick Jarvis and Nick Wayne]

56 thoughts on “John Horton Conway, Creator Of Conway’s Game Of Life, Has Died”

  1. I’ve programmed Conway’s Game of Life numerous times on different platforms. My fondest memory is the most recent adventure. On my way home from the 2018 Hackaday Belgrade conference I had the retrocomputing badge in hand. It ran a BASIC emulator and had 16 save/load slots. I used those slots like revision control as I programmed the game on the badge itself.

    It worked, but definitely had a few bugs. Debugging was very difficult, and working with memory was very interesting to me as I was dealing with swapping out arrays for buffering each frame.

    The game is simple, it’s everything else you have to think of along the way that gets to be hard. But I love the game and the feeling of magic you get from seeing it play out through each iteration.

    1. I’m sorry but the day of week formula is about 1000x more relevant and interesting: politics and physics smashed together. Caesar’s vanity and Kepler’s laws as a coherent whole.

    2. “I’ve programmed Conway’s Game of Life numerous times on different platforms.”

      I was introduced to his work in one of the popular science magazines of the day (name escapes me). It certainly looked fun, if not useful at the time.

  2. The game of life belongs to a wider category called cellular automata.

    https://en.wikipedia.org/wiki/Cellular_automaton

    It was invented before Conway, but his work popularized the topic. By varying the rules of the game, it can be shown that some systems break down to chaos while others always stop, and a subset of possible rules that exist “on the edge of chaos” produce systems that are capable of universal computation. Conway’s Game of Life is one of them. Rule 110 is another.
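    For comparison, Rule 110 is a one-dimensional automaton: each cell’s next state is looked up from the states of itself and its two immediate neighbors, with the binary digits of the number 110 serving as the rule table. A minimal Python sketch (the wrapping edges are our simplification; the usual definition uses an infinite row):

```python
# One generation of an elementary (1D) cellular automaton such as
# Rule 110. The 3-cell neighborhood (left, self, right) forms a
# number 0-7, and that bit of the rule number is the next state.

def eca_step(cells, rule=110):
    """Advance a list of 0/1 cells one generation, wrapping at the ends."""
    n = len(cells)
    return [(rule >> (cells[(i - 1) % n] * 4 +
                      cells[i] * 2 +
                      cells[(i + 1) % n])) & 1
            for i in range(n)]

# Start from a single live cell and watch the pattern grow.
row = [0] * 31
row[15] = 1
for _ in range(10):
    print("".join(".#"[c] for c in row))
    row = eca_step(row)
```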

  3. It was 199[56], when I grabbed a free (incapable of producing exe files) version of Visual Basic off a cover disk of maybe PC Format. I had a not too comprehensive book and little knowledge of Windows so I created some kind of grid (I don’t remember how) and put different characters in the cells (X and O) to mark their states. It worked. Although, I am not sure I programmed edge wrapping properly.

  4. When I was a child, I did a science fair project on Conway’s Game of Life, where certain cells served as fixed input/output nodes. I thank him for creating something that was inspirational in my interest in computer science. Rest in peace.

  5. In 1972, I was attending Rice University, taking an intro to computing course. The course language was APL – a curious and annoying language – but, with its matrix operations, just right for writing the Game of Life. It was the first program I ever wrote for my own amusement…followed by many many more in the decades that followed.

    1. You mentioned APL in another thread so I looked it up and it’s quite interesting.

      If I had known about it back in the day, I would have written a Z80 asm implementation of it.

      Sadly back then I had no access to literature or software so everything I knew came from reverse engineering.

      The software I had to write myself in machine code; I had no assembler.

    1. Knowing full well that this is real, possible and believable, I call bullshit. Next thing you’re going to tell me that you can build a flying machine that is not dependent on a lighter-than-air gas and which can carry humans across oceans.

    1. But if you have an infinite number of squares, then the side of your square playfield is root infinity, which is less than infinity, so you’re good, wrap at root infinity plus one.

    2. Sure it can. The wrapping around in both dimensions is accomplished by mapping it onto the surface of a torus, and your torus can have a major radius of infinity (which makes it indistinguishable from a hollow cylinder, but I digress)

  6. Life

    Life is a game computers play.
    Rules on a grid by John Conway.
    With tiny cells, simplicity
    Grows into great complexity.

    Math is a way to model life.
    Dissect it with a subtle knife.
    But life will grow; without a doubt
    You can’t predict how it turns out.

  7. FORTRAN class in College, oh, 1977-78 or so. Debugging is an exercise in patience when you have a 2-3 hour cycle for each iteration: get time on the card punch, punch cards, carry cards to the IO window, wait, wait, wait, get cards and printout, try to figure out why it didn’t work, rinse and repeat.

    But try and tell the young kids that :-)

    1. We had mark sense cards which were even more fun.

      If the scribe you used wasn’t dark enough then the cards would be misread.

      The solution was to capture a black hole, form it into a cylindrical shape and force it into a pencil.

      Then there was no way any light could escape the markings so they were a little more reliable.

    2. When my mum learnt to program, cards were collected from students and taken to the computer in another facility. A few days later, you got your output back.
      Explains why she was always so good at debugging my code without seeing it run.

    3. Go through cards to find typo, fix card, rerun, wait, wait, get printout, fix, rerun, Bad cards from running 3x, start over, complete cards. oops dropped cards everywhere!…..

  8. Yep, I remember I coded the Conway Game of Life back then too. I used Borland Turbo Pascal at the time, I think. Back in the middle 80s anyway…. RIP. Back in the day it took all night for the Mandelbrot set to generate on the ol’ DEC Rainbow, one pixel at a time. Yes, I was lucky to just ‘miss’ the punched card era :) . Seniors were the last to ‘punch cards’. Yessss. But in high school we used keyboards with a box of paper under ’em. Still, Fortran and Cobol programming at the time was ‘card’ oriented. Column 7 for special char, don’t go past column 78, etc.

  9. Sad to hear this, especially due to the cause.

    I remember having to write Life in a FORTRAN class on a teletype; reams of paper were printed. With about 20 or so people in that class, I am sure we emptied out a small field of trees.

    Rest in peace, John.

  10. A sad day in our game of ‘life’.

    Not many rise to history like Conway. And for good reason. We learn from the shoulders of giants. He set a pattern for the rest of us to follow and learn from.

  11. The first time I tried the code out was as an 8th grader on a dial-up-connected Teletype model 33 ASR and an HP mini run by the school system around 1975 or so…. Program was in a book, maybe BASIC Computer Games or something like that. Had a robot on the cover. If anyone remembers the book, please fill me in.

    Trying to get programs that weren’t written for your particular system was half the fun.

  12. I came across the Game of Life in Scientific American around 1990, and proceeded to write an implementation in Borland’s Turbo Pascal on my 10MHz 286 PC which boasted 640kB of RAM. I was a bit crestfallen by the slow speed of execution, which took around 30 seconds per iteration.

    With necessity being the mother of invention, I realised I could use the 80×24 character video RAM to not only display a solid cell, but could use one half of the byte as a state variable for the cell, i.e. I could use the video RAM as the cell array structure too. I rewrote the code in x86 assembler and achieved 15 iterations per second on the same hardware.
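    The packing trick described above can be sketched in Python, with a plain bytearray standing in for the text-mode video RAM (the exact grid geometry and bit layout here are our illustration, not the commenter’s original code):

```python
# In-place double buffering: keep each cell's current state in the
# low bit of its byte and stage the next state in bit 1, so a single
# array serves as both display buffer and working storage.

W, H = 80, 24  # text-mode grid, as described in the comment above

def step_in_place(vram):
    """Advance one generation using two bits of each cell's byte."""
    # Pass 1: compute next state into bit 1, reading current from bit 0.
    for y in range(H):
        for x in range(W):
            n = sum(vram[((y + dy) % H) * W + (x + dx) % W] & 1
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                    if (dy, dx) != (0, 0))
            alive = vram[y * W + x] & 1
            if n == 3 or (alive and n == 2):
                vram[y * W + x] |= 2
    # Pass 2: shift the staged bit down; it becomes the visible state.
    for i in range(W * H):
        vram[i] >>= 1
```

Doing it in two passes over one array is what made the trick fast on a 286: no second frame buffer to allocate or copy.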

    I managed to try the same assembler code on an unaffordable 486 33DX (or something like that) in a computer shop at the time and if I recall correctly, achieved around 45 iterations per second.

    Conceptually, the stochastic models being used right now by the epidemiologists at Imperial College London are not that different to Conway’s life….

    RIP J.H.C.

  13. To say the least, I am not exactly a religious person, but if there’s ever a religion which praises the four simple mathematical rules which create life, perhaps I will change my mind. It really touched me when I saw and understood the elegance of the Game of Life for the first time. RIP

  14. There’s some missing information in the article. I quote from i-programmer.info:

    As we reported back in 2014 (see Does John Conway Hate Life), the popularity of GOL had been something of a millstone to Conway himself – he regretted the way it overshadowed his other, more important, achievements.

  15. Back in 1979, I was just starting my career as a software engineer. I worked in a medical lab where we had a PDP-11/34 running Multiuser Basic on top of RT-11. I knew next to nothing about computers or coding at the time, but managed to get a version of Life running on the DEC machine that output to our only device: a line printer. About 30-45 seconds between iterations. I wish I had saved that first version … I’d love to see just how horrible my code was back then!!

    I believe it was a Martin Gardner column in Scientific American that introduced me to Conway and the Game of Life, years before I wrote that first implementation. Anyway.

    And even though Conway regretted the way Life overshadowed his other work, I’m sure he’d appreciate the way it has brought together so many people in this life.

  16. I tried to figure out a concept for a modular (one module per cell), hardware-based GoL, but I could not figure out how. And AFAIK there is no such HW implementation of GoL.

    Microcontrollers driving a pixel array, yes, there is, a lot.
    Modular HW GoL? Never seen it.

    Did anyone try to figure out an HW version of GoL?

    1. I’m not entirely sure what you mean, but it is mathematically impossible to represent a GoL cell using combinational logic alone; state feedback is absolutely required.
      That will put some “limitations” on how a cell can be represented in hardware, in that its prior state must be kept track of by some means.
      A microcontroller, or even a CPU, does that with latches made from gates built out of transistors. Ofc there are other options, but humanity has gotten pretty good at building out of transistors so that’s going to be your best bet :P
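      The compute-then-latch structure can be modeled in software. This toy Python sketch (our own illustration, not a real HDL design) shows why the latch matters: every cell reads only latched outputs, then a “clock edge” commits all the staged states at once.

```python
class Cell:
    """One hardware cell: combinational next-state logic plus a latch."""
    def __init__(self, alive=0):
        self.q = alive  # latched output, visible to neighbors
        self.d = 0      # staged next state (the latch's D input)

    def compute(self, neighbors):
        # Combinational logic: reads only latched neighbor outputs.
        n = sum(c.q for c in neighbors)
        self.d = 1 if n == 3 or (self.q and n == 2) else 0

    def clock(self):
        # Clock edge: latch the staged state.
        self.q = self.d

def tick(grid):
    """One clock cycle: all cells compute, then all latch at once."""
    h, w = len(grid), len(grid[0])
    for y in range(h):
        for x in range(w):
            nb = [grid[(y + dy) % h][(x + dx) % w]
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                  if (dy, dx) != (0, 0)]
            grid[y][x].compute(nb)
    for row in grid:
        for c in row:
            c.clock()
```

Without the separate compute and clock phases, early-updating cells would feed their new states into neighbors still being evaluated, which is exactly the hazard the latch prevents.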

  17. Raise your hand if one of your early programming experiences was finding a listing for the Game of Life in a magazine and discovering it had nothing to do with the board game.

  18. I, like many here, wrote a GOL on my C-64 in BASIC way back then. I can’t remember if I ever coded it in any of the other languages I learned since then. I think that might be my next project: GOL in many languages on many platforms. I might start with my HP-42S, real and emulated.

    John Conway was a positive influencer.

  19. I was first introduced to Conway’s Game of Life by a Piers Anthony Sci Fi book called OX. It had a life pattern at the beginning of each chapter. It was published in 1975.
    I later wrote a Conway’s Life emulator on a Commodore Pet, first in ROM BASIC (written by Microsoft when they had less than 10 employees) and later in 6502 assembly language (which was actually quite fast).
    I have nothing but fond memories of John Conway’s game of life.
    Rest in Peace, John.
    You will always be remembered fondly.

  20. I remember coding this on my ZX80! After that I think it was an Apple II, Commodore Pet and then a VAX 11 (not the vacuum cleaner!) and an ICL 2966 mainframe including an awesome 32MB of RAM. Things have moved on a bit since then but the ‘Game of Life’ is a timeless masterpiece, as relevant now as it always has been. It’s a great way to illustrate how there can be hugely complex hidden consequences to making ‘simple’ changes to something you think you understand: take note, those who advocate gene editing is safe!

  21. Ah, Life! I read about it in 1970 in the Scientific American article, but it wasn’t until I started university in 1974 that I could program it. Did it with punched cards in FORTRAN (!) and the result would print on fanfold paper on the huge IBM chain printer. Ran it until my paper or CPU limit was reached for some really long runs like the R-pentomino.

    Then in 1978 I built my first single board 1802 microprocessor-based computer at home. It had 4k(!) of wire-wrapped RAM, a surplus KSR35 teletype and Tiny Basic. I was able to run yards and yards of printouts! I still have the microcomputer but donated the KSR35 to UCLA’s Boelter Hall. They were recreating the original Arpanet node from which the first message was sent (origin of the Internet) and apparently it was sent from a KSR35.
