Visualizing A Nanosecond

We’re so glad to have run across this video where [Rear Admiral Grace Hopper] explains how to visualize a nanosecond. Now we had never heard of [Grace Hopper] before, but once you watch the clip (also embedded after the break) you’ll want to know who this person is. We work with divisions of seconds all the time when developing with microcontrollers. But those concepts are so abstract we never had a need to think about them as a physical distance. After all they’re a measure of time, right?

You can’t make it out, but she’s holding a length of wire between her hands. It is 11.8 inches long and represents how far electricity can travel in one nanosecond (one billionth of a second). She goes on to explain that this is a calculation of the distance light can travel in one nanosecond, then really drives the concept home when she uses it to explain latency in satellite communications. For us, the connection came from thinking about the waste of not putting a chip into sleep mode when it’s just stuck in a loop waiting for an interrupt.
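Her figure is easy to check for yourself: one nanosecond of light travel is just under a foot. A quick sketch in Python, from which the 11.8-inch figure in the video falls straight out:

```python
# Speed of light in vacuum, metres per second (exact by definition of the metre).
C = 299_792_458

metres = C * 1e-9          # distance light covers in one nanosecond
inches = metres / 0.0254   # 1 inch = 0.0254 m exactly

print(f"One light-nanosecond = {metres:.4f} m = {inches:.1f} in")  # 0.2998 m = 11.8 in
```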

So back to the woman herself. We think you’ll really enjoy reading through her Wikipedia biography page. [Grace] was a computer science pioneer. She is credited with writing the very first computer compiler. She postulated and articulated the concepts that led to the development of COBOL, and popularized the term ‘debugging’. In short, she is one of the giants whose shoulders we all stand upon.

[via Reddit]

98 thoughts on “Visualizing A Nanosecond”

    1. It’s quite possible to never have heard of the woman who wrote BASIC, because John George Kemeny and Thomas Eugene Kurtz both appear to be men.

      Grace Hopper’s achievements were about a decade earlier, with FLOW-MATIC, CODASYL and COBOL. As for not knowing her, shame indeed.

  1. Man, I guess this shows the line between present day geeks and us. We knew about people like Von Neumann and Turing and Grace Hopper (who I used to kid was close friends with the countess Ada). Good for you to post this, perhaps it will help new hackers today learn a bit more about the brilliant people who created what we have today.

    1. Here, you can have it. With silly rules like this I don’t need to be in your club.

      You know what, I’m not a history geek. I take my hat off to the pioneers, but I don’t need to know them all and what they did.

      1. A computer geek not knowing who Grace Hopper was is akin to a plane buff staring blankly at you when you mention the Wright Brothers.

        There may not be a need to know them all, but is it too much to ask to know the handful of pioneers?

      2. You’re right Daid, you don’t have to know about historical figures like Grace. However spending part of one’s life learning the history of topics they find interesting has lots of benefits. It often gives you some perspective on the topic — why certain things are the way they are. It may help you resolve some issue because someone has faced the same problem. It’s just fun to learn about the people who created the things we often take for granted, usually in what we would consider a primitive environment — and that can be encouraging in our own efforts.

      3. Let me say I think you are wrong on this assumption. Knowledge of the past is the base that the present and future sit on. To understand the structure and syntax of current programming languages, you have to take a tour of what was done in the past.

        You might care more to know that the very device you use to communicate has ties to past programming. Grace Hopper worked for DEC. I don’t suppose you realize that part of micro$soft’s NT kernel came to be by former DEC programmers who borrowed parts of the VMS operating system. I would not say she had direct ties to it, but most likely gave direction.

        Don’t get angry, but do some research. It’ll make the trip more enjoyable.

      4. “Let me say I think you are wrong on this assumption. Knowledge of the past is the base that the present and future sit on. To understand the structure and syntax of current programming languages, you have to take a tour of what was done in the past.”

        No, not really. If you know the present well enough you don’t need to know any past to predict the future. And I don’t see why you would need to “take a tour of what was done in the past” to understand current systems.

        Logic is timeless.

        “It’ll make the trip more enjoyable.”

        Sure, but how much “more enjoyable” varies from person to person. For me, it’s not enough to warrant explicit study. I only know history from random bits I stumble upon during my quest to understand the present.

      5. Easy there, Daid. I didn’t mean to be overly judgemental, just trying to express my astonishment at modern geeks/nerds not knowing about this important figure in the history of computing.

    2. Not to know who Grace was shows how far the iPad nerds and their beloved fans have gotten from their roots. I knew Grace, as did Steve Jobs and Bill Gates, among others who received her graciously while she lived.
      Long live the Queen.

      1. No, but Pavlov does…

        The reason Grace Hopper sticks in my mind is the story of the “debugging” of an early computer, by removing (and taping to the computer log) a moth from the mechanisms.

    3. Agreed. Because knowing WHO created/perfected a scientific achievement is much more important than the achievement itself.

      She seems like an interesting person, though.

    4. Daid, you can’t claim to be ‘anything’ of any category if you have no willingness or desire to learn the history of it. For example, should I say I’m a fantasy buff yet not know who Tolkien is? Shall I declare myself to be a physicist, but not know who Newton was? Am I anything serious if I have no desire to actually study that subject? NO! I’m a poser. Daid, you’re not a geek or nerd, just ignorant.

    1. What do you mean by electron movement? In what context?

      Because half an hour just doesn’t seem right; if it were, cathode ray tube televisions would never have worked properly.

      1. This illustrates exactly why some of you need to learn some damn history!

        What you are asking about is called…

        Thermionic effect or thermionic emission

        It has NOTHING to do with current flow through a piece of friggen wire!

        And because I’m a prick that ENJOYS “throwing a cat amongst the pigeons”…

        What you smugboys call “current” is actually called,

        CONVENTIONAL CURRENT

        IE current flow from positive to negative.

        Actual current flows from negative to positive!

        THIS ILLUSTRATES EXACTLY WHY YOU “BOYS” NEED TO LEARN SHIT!

      1. Current flows at the speed it does, because it’s one electron pushing against the next electron, which then pushes against the next, and so on. This happens really really astonishingly blindingly (add more superlatives to taste) fast because it’s electromagnetic forces at work.

        Were you to follow one particular electron, you’ll find that it barely moves at all, just hopping to the next atom once in a while.

        1. An analogy I always enjoy is that it’s the difference between the time it takes for a traffic jam to travel a given distance down a motorway, and the time it takes for a wave of angry honking to travel the same distance.

      2. Actually the information in the wire travels at almost the speed of light, but the electron itself cannot go that fast, simply due to the density and randomness of the atoms.
        It’s kind of like the 100,000 years it takes a photon to escape the sun’s internal layers.

        A simplified explanation would be Newton’s cradle: the impulse travels far faster than the medium (the steel balls).
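The gap between the two speeds is enormous. As a rough sketch (the 1 A current, 1 mm² copper cross-section, and the textbook free-electron density of copper are all assumed figures here), the drift velocity of the electrons themselves comes out at a small fraction of a millimetre per second:

```python
# Electron drift velocity v = I / (n * q * A) for a copper wire.
# Assumed figures: 1 A through a 1 mm^2 wire; n is copper's free-electron density.
I = 1.0          # current, amperes
A = 1e-6         # cross-sectional area, m^2 (1 mm^2)
n = 8.5e28       # conduction electrons per m^3 in copper (roughly one per atom)
q = 1.602e-19    # elementary charge, coulombs

drift = I / (n * q * A)   # metres per second; a few hundredths of a mm/s
print(f"drift velocity ~ {drift * 1000:.3f} mm/s")
```

Meanwhile the signal itself propagates at an appreciable fraction of c, which is the whole point of the analogy above.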

      3. The one that kept squirrelling my brain was “hole flow”, the opposite of electron flow.
        I learned about time = distance while doing work for the DNA (Defense Nuclear Agency), the nice folks out at Yucca Flats making things go boom.
        Our instruments were built and calibrated to about 28 ns, about the time it took the EMP to go past the gauge, before the shock wave hit.
        We got to the point where we would refer to small distances in nanoseconds instead of fractions of an inch.

  2. For the newbies here…

    I was fortunate to meet Admiral Hopper…twice. I was her military escort for a day while I was in the Marine Corps, and the second was when she spoke to a DPMA conference in Denver, CO. I still have my “nanosecond” she passed out at the end of her speech. She was probably the most brilliant person I’ve ever met…and was able to remember me at that conference…at the age of 75 or so, ELEVEN years after spending one day with me. I was flabbergasted.

  3. Oh…and BTW…she also coined a word us geeks use EVERY DAY…she was the first to DEBUG a program by scraping a moth off the logic board. She made an entry in the log, and taped the moth in there too. It was originally in a Naval Historical museum, but now I hear it is displayed at the Smithsonian.

    1. It wasn’t a “circuit board”!

      She pulled a moth out of a relay!

      Damn I feel old now!

      Before PCB’s it was a horrible mess of point to point wiring on the back of a chassis!

      If you have ever restored an old radio, TV or electro-mechanical pinball machine it’s freakin culture shock!

      :-)

      1. If you have ever looked inside a fairly modern piece of military equipment you’d know that wirewrap is still alive and well. Wirewrap has a lot of give and flex and can keep a contact; a blastwave going through a PCB can shatter it or pull components off the board. These days you mostly see wirewrap behind the card-edge connectors that PCBs plug into. But yeah, we still use vacuum tubes too.

    2. I’m glad someone mentioned this.

      She’s credited with inventing the term “bug”, ffs!

      I’m with whoever said that not knowing of her should be punishable by geek death. Of course, you can recover it by watching the video…

  4. Cool anecdote Rich! Yes, she was the real deal. I don’t know many in the computer and info science fields who don’t know about her.

    And my exposure to the speed of light through a medium came with my amateur radio license. All that coax has a thing called the velocity factor. In essence the electron on the skin of the wire goes at only a certain percentage of the speed of light due to attenuating factors.

    And from physics all those years ago I still remember two things, f = m*a and c = 3×10^8 m/s.

    1. Close…the electromagnetic wave propagates at some percentage of the speed of light in free space. As someone pointed out earlier, the electrons actually transition through the medium much, *much* more slowly. Visualize it like a bus: a new rider gets on and everybody moves back a seat. The last seat becomes occupied far before the actual rider who boarded gets there.
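The velocity-factor arithmetic is simple to sketch. The 0.66 figure below is the commonly quoted velocity factor for solid-polyethylene coax such as RG-58; other cables differ:

```python
C = 299_792_458  # speed of light in vacuum, m/s

def delay_ns_per_metre(velocity_factor: float) -> float:
    """One-way propagation delay, in nanoseconds, per metre of line."""
    return 1e9 / (velocity_factor * C)

print(f"free space:   {delay_ns_per_metre(1.0):.2f} ns/m")   # ~3.34 ns/m
print(f"VF 0.66 coax: {delay_ns_per_metre(0.66):.2f} ns/m")  # ~5.05 ns/m
```

So a signal in that coax covers roughly two thirds of a foot per nanosecond rather than the full foot of Hopper’s wire.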

  5. I, too, was privileged to hear her “nanosecond” talk, as a graduate student at UMass/Amherst in the late 70s. She was sharp as a tack, and the nanoseconds handed out were single strands of telephone 25-pair cable, such as found in PBX systems.

    The significance of the length of wire is that it represents the distance electricity can travel (to an order of magnitude) in a nanosecond. “A nanosecond a foot” is a good rule of thumb.

    “Now we had never heard of [Grace Hopper] before”

    Oh what a sheltered life you must lead at HaD.

    As Grace herself would point out, the velocity of a signal on a wire (transmission line) is somewhat less than “c”.

    “But those concepts are so abstract we never had a need to think about them as a physical distance. After all they’re a measure of time, right?”

    Gawd. You didn’t KNOW you had the need. “Sleep mode”? Where it really connects with micrologic is the need to place bypassing capacitors as close as possible to the devices being bypassed, right across the chip supply pins, not somewhere at the far end of the board.

    {and Kemeny and Kurtz were at Dartmouth College while Rear Admiral Hopper was mainly at Harvard}

    Okay, hands up everybody who hasn’t heard of Alan Turing, the Turing Machine, or the Turing Test, and doesn’t know what his contributions to computing (or winning the Second World War) were.

    Sure @Daid you don’t have to know anything about the history of electronics and computing, but you will be the poorer for it as a tech or programmer – “Look what I’ve just invented! A round thing I call a ‘wheel'”. {Newby PIC programmers read “triple-five”.}

        1. LOL

          I’m 49! Built mah first IMSAI 8080 back in ’74.
          Think I was 12, can’t recall, gettin’ old.

          Pig of a thing it were, no bootstrap ROM, only “input” was dang switches.
          Then went to a SC/MP with an 1802, then A Signetics 2506.

          Then I got me a Synertek SYM-1, which I just dragged a’kicken and a’screamin’ into the 21st century.

          No we was all wearin’ turnips on our belts, that were the fashion of the day….

  7. And from one of the other points of view, that’s the peak-to-peak length of a 1 GHz wave she’s holding (983.6 MHz if you want less rounding). From there, you can observe that 4 GHz CPUs come in a package that’s about half a clock tick across. Which hopefully helps clear up the driving logic behind LGA/BGA/chip-on-board packaging, the race for the smallest transistors, and the increasing importance of thread-safe libraries.
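The conversion behind that number is just f = c/λ; taking the wavelength as exactly one foot reproduces the 983.6 MHz figure:

```python
C = 299_792_458   # speed of light in vacuum, m/s
FOOT = 0.3048     # metres per foot, exact

f_hz = C / FOOT   # frequency whose free-space wavelength is one foot
print(f"{f_hz / 1e6:.1f} MHz")   # 983.6 MHz
```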

    @Andr0id: sounds like you’re accusing geeks of being nearly as low context as end lusers. Was that your intent?

  8. All praise be to talented lecturers. I feel so lucky to have had professors who cared enough to instill a passion for engineering in their students. I credit Rear Admiral Grace Hopper not just with having a great mind but also with having a great heart for sharing knowledge.

    – Robot

  9. As a young Naval officer, my Father had the opportunity to work for Adm. Hopper and assisted in writing some of her earliest budget requests to Congress. I think she was a big part of his decision to go into computers after his time in the military. In the ’80’s (i think) my dad had the opportunity to go to a presentation of hers and he came back with a few of those pieces of wire. He explained her presentation to me and I was fascinated by it. Thanks for the video and reminding me of this story.

  10. Wow, up next you guys will be putting up a post that says you don’t know who Doug Engelbart is… Really? Never heard of “Amazing” Grace? Every time you use the term ‘bug’ to represent an error in a program, you’re calling her name. Learn your history. It’s like not knowing about Univac.

  11. When I first read Mike Szczys’s post, I thought he was joking when he suggested this was the first he’d heard of Adm. Hopper. When I entered college ‘waaaaay back in ’79, her mug was in the first few pages of just about any comp sci intro book. But, as best as I can tell, Mike wasn’t a comp sci major.

    As I typed this, it occurred to me that this might be fodder for a subsection for the next Beloit College Mindset list. Should stop by my local college to see if Grace Hopper rings a bell for the tech faculty, never mind the students.

    http://www.beloit.edu/mindset/

  12. I remember meeting her as well. She used to hand out packets of picoseconds. These were pepper packets, a grain of pepper representing about the distance light travels in a picosecond.

    I have always remembered that.

    Her lecture was great.

  13. Classic Grace Hopper quote,
    “If it’s a great idea,
    Go ahead and do it!
    It’s much easier to apologise than get permission!”
    On YouTube
    http://www.youtube.com/user/ComputerHistory

    Or just do a search for Grace Hopper.

    For those thinking ancient computer history is not that interesting or relevant, think again. Seeing how stuff got done with limited resources (little RAM, slow I/O, etc.) can teach you a lot, especially if you are working with microcontrollers, which have more grunt now than they did then; you may even pick up some ideas.

    P.S.

    One of my dogs’ names was Ada Lovelace
    http://i464.photobucket.com/albums/rr3/andrew_t1000/2008-07-27044.jpg

    A Force of nature!

    RIP Sweety!

  14. I remember when I got my first Athlon processor, at 1.2 GHz, and my dad was saying “But… light only travels a little bit over a foot in a nanosecond!” to which I responded that “That’s why it’s so damn tiny!”. I don’t think either of us would believe that there were production chips clocking along at 3.8 GHz less than a decade later.

    I wonder how long it will be before they HAVE to integrate memory, graphics processing, sound processing and so on onto a single chip, in order to maintain the steady advance in processing power?

    It’s kind of amusing looking at little wiggly traces on motherboards, where it’s clear that they just needed a teeny tiny delay to keep things in order, and the designers used pure distance to make it.
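Those meanders can be sized with the same nanosecond-a-foot arithmetic. A sketch (the 0.5 velocity factor is a rough assumption for a trace on FR-4; real values depend on the stackup and geometry):

```python
C = 299_792_458  # speed of light in vacuum, m/s

def meander_length_mm(delay_ps: float, velocity_factor: float = 0.5) -> float:
    """Extra trace length needed to add delay_ps of propagation delay.

    velocity_factor ~0.5 is only a ballpark for FR-4 microstrip.
    """
    return delay_ps * 1e-12 * velocity_factor * C * 1e3

print(f"{meander_length_mm(100):.1f} mm of extra trace for 100 ps")  # ~15 mm
```

Which is why the wiggles for matching a few tens of picoseconds are small enough to tuck between other traces.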

    1. When you read about how the engineers at Cray had to use propagation delay to get everything in sync, it’s amazing; some of it was wire traces, some of it 7400s as NOT gates, in long, long chains.

      As an aside have you seen the old way of storing a video line, before SAW devices were used?

      It’s a coil, about 3-4″ long, 5mm in diameter, sorry about mixing units there!

      The other place this is relevant is mercury “memory”, long spiral tubes of mercury, with pick up points all along it.

    2. The wiggly traces are there for several reasons. One, as you say, is as a delay, used to get the signal to show up at the right time. Another is to change the impedance, capacitance or inductance of the trace (you can wiggle in 3D using vias) to reduce ringing and/or reflection in the signal when it hits the other end of the path. Lastly is to reduce RF radiation (resulting in RFI), which can cause all sorts of problems both within your circuit and off board.

      Any straight length of wire (or trace) that is 1/4 or 1/2 wavelength, or a multiple thereof, can act as an antenna or as a shorted stub, killing off your signal. At 1 GHz, a free-space 1/4λ is about 7.5 cm, and less on a board due to the dielectric.

      RF board designers do this sort of thing all the time. And any signal running at better than a few MHz has a significant RF component to it. :)
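For reference, the quarter-wave figure is λ/4 = c/4f, scaled by the line’s velocity factor (the 0.7 used below is only an illustrative value; boards vary):

```python
C = 299_792_458  # speed of light in vacuum, m/s

def quarter_wave_cm(freq_hz: float, velocity_factor: float = 1.0) -> float:
    """Quarter wavelength, in centimetres, at freq_hz for a line with the given VF."""
    return velocity_factor * C / freq_hz / 4 * 100

print(f"free space: {quarter_wave_cm(1e9):.1f} cm")       # ~7.5 cm
print(f"VF 0.7:     {quarter_wave_cm(1e9, 0.7):.1f} cm")  # ~5.2 cm
```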

    3. At one point in my career I was working for a company that built Cray class supercomputers. The machine was incredibly dense to avoid speed of light delays. You had to take it into account for just about every aspect of design. For instance:

      The backplane (actually a mat of wire) was more than 3 clock cycles long. That is, if you issued a single-clock instruction, it was done before the clock signal got to the other end of the wire. If one of the wires broke, fixing it might make it shorter, and that had to be accounted for; otherwise the mat had to be replaced. Over $100k, plus downtime.

      The clock was distributed to the CPU modules via fiber while the data came via wire. The speed of light is different in these two media. That had to be accounted for. Same for wire vs silicon.

      Certain structures in PCB design act as RF filters. You have to change the speed of the signal, avoid the structure, or the signal doesn’t come out the other end.

      The wire in the backplane mat has a different impedance than the wire on a pcb trace, which is different than a trace on a ceramic substrate, which is different from a path on silicon. Take it all into account.

      The system, when fully implemented, called for up to 512 CPUs, all at different distances from the memory the next program to run was going to be loaded into. The operating system had to know if it was faster to schedule a job in a distant CPU, including the time to move the code to memory near that CPU, or if it should wait for a closer CPU to free up. And the scheduler itself might be running in any CPU. :)

      Great stuff. I kept one of the Rear Admiral’s nanoseconds in my desk as a frequent touch point.

  15. I teach Information Technology at a career college in Northern California. We certainly cover the accomplishments of Grace Hopper, Alan Turing and Charles Babbage in our Introduction to Computers class.
    I had the opportunity to meet Admiral Hopper when I was in High School. She was a very nice person.

  16. Even if the name Grace Hopper didn’t ring a bell, you’ve certainly heard this quote (or one of the variations):

    It is much easier to apologize than it is to get permission.

    So, you can keep your geek card, you are just bad with names.

    pix

  17. Loved the post! I actually had the privilege of hearing her speak when I was in college in 1972. At the end of her talk, she passed out “nano-seconds” to everyone who was there. I kept mine on the wall of my office for years to remind me of her story.

    Geek or not – we all have a debt of gratitude to pay to Grace Hopper. She also spurned the statement: “but we’ve always done it this way” and threatened to materialize beside us if we ever said that. She was a true innovator and rebel in the fledgling computer industry. If you don’t know who she was, do yourself a favor & read her bio – you will be enlightened too.

  18. You have never heard of the woman who coined the terms “bug” and “debugging”? Who introduced the concept of high-level languages to allow machine-independent programming? (Things were written in machine code in those days.) Shame on you!

    P.S.: For the fellows who said she developed the BASIC language: no! BASIC was developed at Dartmouth College by John George Kemeny and Thomas Eugene Kurtz. Grace was involved in the development of COBOL.

  19. This has come up from time to time in discussions, namely about why you can’t just replace the hard drive with an internet connection. I usually word it: “it will always be faster for data to travel a smaller distance than a larger one” and “you are fundamentally limited by the laws of physics”. Barring some weird quantum mechanics or space-time bending, it will always be faster to travel a smaller distance than a larger one, and no increase in internet speed can overcome this. (There may come a time when internet speed exceeds hard drive I/O speed for a while, but then hard drives will get faster and it will still be faster to read from the hard drive.)

    But this gives me a simple rule of thumb to highlight the issue: a nanosecond is a foot. Fewer feet means fewer nanoseconds and more feet means more nanoseconds. Very simple and concise…

  20. I hated COBOL; what a brain-dead language. To think people still program in that crap… aaaaaaaaahhhhh, it makes my blood boil.

    But still, I like a woman who tells me to just do it, and not ask for permission, just forgiveness after the event.

  21. From Peter Norvig’s Teach Yourself Programming in Ten Years:

    “Approximate timing for various operations on a typical PC:

    execute typical instruction 1/1,000,000,000 sec = 1 nanosec
    fetch from L1 cache memory 0.5 nanosec
    branch misprediction 5 nanosec
    fetch from L2 cache memory 7 nanosec
    Mutex lock/unlock 25 nanosec
    fetch from main memory 100 nanosec
    send 2K bytes over 1Gbps network 20,000 nanosec
    read 1MB sequentially from memory 250,000 nanosec
    fetch from new disk location (seek) 8,000,000 nanosec
    read 1MB sequentially from disk 20,000,000 nanosec
    send packet US to Europe and back 150 milliseconds = 150,000,000 nanosec “
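A common way to make those magnitudes tangible is to rescale them so one nanosecond reads as one second. A sketch using a few of the figures quoted above:

```python
# Selected figures from the table above, rescaled so 1 ns -> 1 s:
# a cache hit is instant, a disk seek takes about three months.
latencies_ns = {
    "L1 cache fetch": 0.5,
    "main memory fetch": 100,
    "disk seek": 8_000_000,
    "US-Europe round trip": 150_000_000,
}

for name, ns in latencies_ns.items():
    rescaled_s = ns                      # after rescaling, each ns reads as a second
    print(f"{name:22s} {rescaled_s:>13,.1f} s  (~{rescaled_s / 86_400:,.1f} days)")
```

On that scale the transatlantic round trip works out to several years, which is a vivid way to see why caches exist at all.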
