Being easier to write than assembly is often seen as the biggest advantage of high-level programming languages. The other benefit that comes with them is portability: with high-level languages, algorithms can be developed independently of the underlying hardware. This allows software to live on once the hardware becomes obsolete.
The compiler was a concept that met with resistance when it was first introduced. This was at a time when computers were custom-built machines bearing individual names like ENIAC, UNIVAC, and Mark I; a time when the CEO of IBM reportedly estimated the global demand for computers at around five units. In that climate, it took a visionary to foresee a future in which the number of computers would outgrow the number of programmers, and hardware would evolve so much faster than software that a compiler would make sense. One such visionary was [Grace Hopper].
The Compiler
Compiled code often runs slower than optimized assembly code. Surprisingly, that isn’t what got in the way of implementing compilers: they were regarded as technically impossible. One could argue that computer science was too young at the time to fully understand the potential of its subject. One could also conclude that it was already old enough for its practitioners to be set in their beliefs. In this atmosphere, it took a special attitude to turn the idea of programming languages from a challenged concept into one of the most influential technologies of the following decades. That attitude is captured in a quote attributed to [Hopper], which today is also used to explain the exception-handling idiom in Python: “It’s easier to ask forgiveness than it is to get permission.”
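For readers who haven’t met the idiom: in Python, “asking forgiveness” means attempting an operation and handling the failure afterwards, rather than checking every precondition first. A minimal sketch, with a made-up config dictionary purely for illustration:

config = {"host": "localhost"}

# "Permission" style (look before you leap): check first, then act.
if "port" in config:
    port = config["port"]
else:
    port = 8080

# "Forgiveness" style (EAFP): just try it and handle the failure.
try:
    port = config["port"]
except KeyError:
    port = 8080

Both versions leave port bound to a value; the second is the style the quote lends its name to.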
Although a pioneer, [Hopper] wasn’t that far ahead of the curve. [Alick Glennie] developed another compiler in the same year she did. [Glennie]’s Autocode compiler was much less successful than [Hopper]’s A-0, which would later evolve, by way of FLOW-MATIC, into COBOL, the Common Business-Oriented Language. There are two probable reasons for this. The first is that Autocode was relatively close to machine code and tailored to the British Mark 1 computer at the University of Manchester; portability was not part of its design.
The second reason is that [Hopper] explicitly wanted her languages to provide enough abstraction to give more people access to the power of computing. In an interview from 1980, she identifies two distinct groups among the early users of computers: mathematicians and people interested in data processing. She found the data processors to be word-centric in their thinking, as opposed to the symbolic approach of the mathematicians. For her, developing the first compiled language was a way to make computers useful to a bigger audience. COBOL was aimed at business applications in an era when computers were a mostly military and academic affair.
The Person
The list of outstanding achievements in her life is long. [Grace Hopper] was one of the first women to receive a Ph.D. in mathematics from Yale, which is even more impressive given that she didn’t want to study mathematics in the first place. She entered the Navy after the attack on Pearl Harbor and had to retire three times before they finally let her go in 1986. Yale has since named one of its residential colleges after her. All of this puts another quote of hers into perspective:
“If you ask me what accomplishment I’m most proud of, the answer would be all the young people I’ve trained over the years; that’s more important than writing the first compiler.”
She began teaching at Vassar College in 1931, after receiving her master’s degree, and continued to lecture at different institutions throughout her career. Her lectures to professional audiences have been featured here before.
Compared to her outstanding professional career, her personal life is much less documented. Wikipedia covers a marriage of fifteen years in a single sentence, and other sources are sparse on details as well. We know that she battled alcoholism in her forties and overcame it with professional help. She was also known to take her work home and to start the morning by presenting solutions to problems that had been discussed on the way home the day before.
[Grace Hopper] was a person who managed to get herself into the right place at the right time. She did so with ingenuity and the drive to overcome obstacles, not by smashing through them but by looking for ways around them. She also approached her work with an eye for how useful it would be.
Heroes are a difficult thing. In the wrong hands, they set standards that are impossible to reach. Nobody can be Grace Hopper; only she could. The compiler has been invented, and high-level programming languages have replaced machine code in most applications. We can, however, strive to be like Grace Hopper: aspire to be good mentors, build things that are useful to others, and reach out for help when we meet our limits.
Further Reading
The interview mentioned above is a good starting point to get a glimpse of who [Grace Hopper] was. It’s good to keep in mind that the conversation took place two years before the ZX Spectrum was released; otherwise the technological parts don’t make sense. At 54 pages it’s a long interview, and reading it might take even longer than expected, because it’s easy to get distracted by reading up on all the people who are mentioned.
Her biography, ‘Grace Hopper and the Invention of the Information Age’ by [Kurt W. Beyer], is often considered a bit light on the biographical side, but it is nevertheless a sound historical account of the era she shaped.
Never forget… She was the killer of an innocent moth!
it was bugging her.
Like fall leaves it ended up in a book.
I heard she FOUND it, already dead.
That’s what the establishment wants you to believe, man. #BUGLIFE
Yes, squashed by the relay contacts it had been preventing from making contact.
if she hadn’t killed it you wouldn’t be able to write with your computer right now
I wonder how computers would be different today if we didn’t have compilers.
Computers that teach themselves how to program.
“If God did not exist, it would be necessary to invent him.”
The idea of ever-greater abstractions simplifying complex, but repetitive, systems is so fundamental to computation at every level, you might as well ask what people would be like without a concept of language.
you would be reading this article on a monochrome text terminal in 2017
the color values for each line would be helpfully printed alongside them. you are expected to imagine the color
We did get compilers, they came along sort of naturally. We were lucky that a really good example was one of the first and thus not squeezed into obscurity by a multitude of also-rans.
I saw her speak in person back in the 1970s when I was in college. I got a “nanosecond” of wire that she handed out to each person. She was a gifted speaker.
I remember her on “The Tonight Show, Starring Johnny Carson” and she handed him a “nanosecond” after explaining it to him. One hell of a smart lady!
Cool! I am jelly.
I took a blood sample from her for a physical. I was a lab tech in the Navy, stationed in Bethesda, MD. She gave me a nanosecond. I knew who she was, and that was a high spot in my life. We talked for about an hour.
Listened in on her chit-chat with John Backus of FORTRAN fame at the 1977 ACM Conference in Seattle.
“Compiled code often runs slower than optimized assembly code”
This statement requires some clarification, because in many cases (you could say “often”) it is not true.
There are lots of examples where compiled code is smarter than a piece of assembly code.
Also, it says “optimized assembly code”, but why not optimized compiled code? Otherwise the comparison isn’t fair. There are lots of good compilers out there.
However, I have to agree that when speed matters, it never hurts to open up the result files from the compiler and check what it created; sometimes you can learn from it, sometimes you can improve it with ease. As always, check everything you make and everything that is being made for you. Don’t assume you can build it better by hand (assembler), and don’t take things for granted that are built for you (compiler).
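As a minimal sketch of that habit in Python, where the “result file” is bytecode rather than assembly (the clamp function here is made up purely for illustration), the standard dis module prints what CPython actually compiled a function into:

import dis

def clamp(x, lo, hi):
    # Constrain x to the closed range [lo, hi].
    return max(lo, min(x, hi))

# Dump the bytecode CPython generated for clamp().
dis.dis(clamp)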
As desired tasks and hardware become more complex, it definitely seems that “hand-optimized code is faster” grows less and less true. The compiler will know tricks that operate in more dimensions than you can hold in your mind at one time. And, the projects are huge…you can’t take a week to hand-optimize 0.001% of the code base. The compiled code is going to be faster than the theoretical hand-optimized assembly code that never existed.
It would be really fast to just go to your cupboard and pull out ingredients, stir them together in a bowl, put them on a cookie sheet, and have a batch of warm chocolate chip cookies in under an hour. But what if you want to make 10,000 chocolate chip cookies per day? The end-to-end process of making each cookie might take 10 times longer as it passes through all the factory automation, and they might not taste homemade, but at the end of the day you got 100 times more work done.
Reminds me of The Story of Mel.
+1 – Was about to leave the same comment. About the only time nowadays that hand written ASM is faster than compiled code is for very simple programs (less than ~1000 lines of ASM). So, if you are writing anything beyond a small driver or some embedded code for a calculator, it’s wise to use a compiler instead.
“The other benefit that comes with them is portability. With high-level languages, algorithms can be developed independently from the underlying hardware. This allows software to live on once the hardware becomes obsolete.”
Like going from x86 to ARM. ;-)
My first paid programming job was to write COBOL and I have spent many years turning business requirements into application designs. During that time, I never ceased to be amazed at how large the “impedance mismatch” between programmers and users often is. Compilers should allow solutions to be described in terms of the user requirements but even relatively “high level” languages and frameworks usually require an immense level of knowledge about the technical environment. When I started my career, I expected by now that coders would no longer be required and “programmers” would become solution designers mainly focusing on accurately and clearly specifying the requirements and how to verify that they had been met. Then the immense power of modern computing would take over. Sadly, we are still a long way from that. Machine learning is moving in the right direction to be able to do that but it is still at least one major step from being able to do so. Of course, computers that can create systems themselves directly from user requirements could have serious unintended consequences until they have a level of world understanding similar to us humans.
amazing Grace
/stands up a little straighter/
o7
She used to give out “nanoseconds”, lengths of wire about a foot long (the distance light travels in a nanosecond). Somewhere I have a copy of a videotape of her giving a talk that I really ought to dig out and get digitized.
As a COBOL programmer, I tend to get misty eyed around this time of the year.
I work at Aberdeen Proving Ground, Hopper Hall. I see her bronze face every day on my way to the lab. My job alternates between finding bugs and writing new ones!
https://www.army.mil/article/66745/new_campus_built_on_tradition_of_excellence
A computer program is a mathematical proof and should have been treated as such.
Instead (in the name of practicality), computer languages evolved into “bug”-prone, glorified text editors that made software development an incredibly expensive and unreliable activity. Slowly, the computer industry is coming back to its senses by employing mathematical methods in software development and verification, prompted by the proliferation of “smart phones”.
Hooooly fuck. Computer science is more than Visual Basic and JavaScript, man.
In the near future, we still won’t escape the fact that “someone” still needs to write the software buffer between what the “hardware is” and what the “user wants from it”. What we have managed to accomplish is an acceptance of sloppiness that was never tolerated in the days of, say, the Whirlwind or early PDP systems. Smartphones don’t demand better code, since every year we toss more CPU power and memory at the problem to make up for layers and layers of abstraction, and the laziness continues.
She was an *amazing* lady. Years ago, I wrote this little ode to her.
http://www.sunrise-ev.com/poems/hopper.htm