Like many Victorian gentlemen of means, Richard Carrington did not need to sully himself with labor; instead, he turned his energies to the study of natural philosophy. It was the field of astronomy to which Carrington would apply himself, but unlike other gentlemen of similar inclination, he began his studies not as the sun set, but as it rose. Our star held great interest for Carrington, and what he saw on its face the morning of September 1, 1859, would astonish him. On that morning, as he sketched an unusual cluster of sunspots, the area erupted in a bright flash as an unfathomable amount of energy stored in the twisted ropes of the Sun’s magnetic field was released, propelling billions of tons of star-stuff on a collision course with Earth.
Carrington had witnessed a solar flare, and the consequent coronal mass ejection that would hit Earth just 17 hours later would result in a geomagnetic storm of such strength that it would be worldwide news the next day, and would bear his name into the future. The Carrington Event of 1859 was a glimpse of what our star is capable of under the right circumstances, the implications of which are sobering indeed given the web of delicate connections we’ve woven around and above the planet.
Continue reading “The 1859 Carrington Event”
Somewhere in the recesses of my memory there lives a small photograph, from one of the many magazines that fed my young interests in science and electronics – it was probably Popular Science. In my mind I see a man standing before a large machine. The man looks awkward; he clearly didn’t want to pose for the magazine photographer. The machine behind him was an amazing computer, its insides a riot of wires all of the same color; the accompanying text told me each piece was cut to a precise length so that signals could be synchronized to arrive at their destinations at exactly the right time.
My young mind was agog that a machine could be so precisely timed that a few centimeters could make a difference to a signal propagating at the speed of light. As a result, I never forgot the name of the man in the photo – Seymour Cray, the creator of the supercomputer. The machine was his iconic Cray-1, the fastest scientific computer in the world for years, which would go on to design nuclear weapons, model crashes to make cars safer, and help predict the weather.
Very few people get to have their name attached so firmly to a product, let alone have it become a registered trademark. The name Cray became synonymous with performance computing, but Seymour Cray contributed so much more to the computing industry than just the company that bears his name that it’s worth taking a look at his life, and how his machines created the future.
Continue reading “Seymour Cray, Father of the Supercomputer”
When Marie and Pierre Curie discovered the natural radioactive elements polonium and radium, they did something truly remarkable: they uncovered an entirely new property of matter. The Curies’ work was the key to unlocking the mysteries of the atom, which was previously thought to be indivisible. Their research opened the door to nuclear medicine and clean energy, and it also led to the development of nuclear weapons.
Irène Joliot-Curie, her husband Frédéric, and many of their contemporaries were completely against the use of nuclear science as a weapon. They risked their lives to guard their work from governments hell-bent on destruction, and most of them, Irène included, ultimately sacrificed their health and longevity for the good of society.
Continue reading “Irène Joliot-Curie and Artificial Radioactivity”
If you’re like me, chances are pretty good that you’ve been taught that all the elements of the modern computer user interface — programs running in windows, menus, icons, WYSIWYG editing of text documents, and of course, the venerable computer mouse — descended from the hallowed halls of the Xerox Corporation’s Palo Alto Research Center in the early 1970s. And it’s certainly true that PARC developed these technologies and more, including the laser printer and object-oriented programming, all of which would grace first the workplaces of the world and later the homes of everyday people.
But none of these technologies would have existed without first having been conceived of by a man with a singular vision of computing. Douglas Engelbart pictured a future in which computers were tools to sharpen the human intellectual edge needed to solve the world’s problems, and he set out to invent systems to allow that. Reading a Twitter feed or scanning YouTube comments, one can argue about how well Engelbart’s vision worked out, but there’s no arguing with the fact that he invented almost all the trappings of modern human-computer interaction, and bestowed them upon the world in one massive demonstration that became known as “The Mother of All Demos.”
Continue reading “The Mother of All Demos, 50 Years On”
Here’s a fun exercise: take a list of the 20th century’s inventions and innovations in electronics, communications, and computing. Make sure you include everything, especially the stuff we take for granted. Now, cross off everything that can’t trace its roots back to the AT&T Corporation’s research arm, Bell Laboratories. We’d wager heavily that the list would still contain almost everything that built the electronics age: microwave communications, data networks, cellular telephony, solar cells, Unix, and, of course, the transistor.
But is that last one really true? We all know the story of Bardeen, Brattain, and Shockley, the brilliant team laboring through a blizzard in 1947 to breathe life into a scrap of germanium and wires, finally unleashing the transistor upon the world for Christmas, a gift to usher us into the age of solid state electronics. It’s not so simple, though. The quest for a replacement for the vacuum tube for switching and amplification goes back to the lab of Julius Lilienfeld, the man who conceived the first field-effect transistor in the mid-1920s.
Continue reading “Julius Lilienfeld and the First Transistor”
For most of human history, musical instruments were strictly mechanical devices. The musician either plucked something, blew into or across something, or banged on something to produce the sounds the occasion called for. All musical instruments, the human voice included, worked by vibrating air more or less directly as a result of these mechanical manipulations.
But if one thing can be said of musicians at any point in history, it’s that they’ll use anything and everything to create just the right sound. The dawn of the electronic age presented opportunities galore for musicians by giving them new tools to create sounds that nobody had ever dreamed of before. No longer would musicians be constrained by the limitations of traditional instruments; sounds could now be synthesized, recorded, modified, filtered, and amplified to create something completely new.
Few composers took to the new opportunities offered by electronics like Daphne Oram. From her earliest days, Daphne lived at the intersection of music and electronics, and her passion for pursuing “the sound” led to one of the earliest and hackiest synthesizers, and a totally unique way of making music.
It’s easy to forget how much illness and death was caused by our food and drink just one hundred years ago. Our modern food systems, backed by sound research and decent regulation, have elevated food safety to the point where outbreaks of illness are big news. If you get sick from a burger, or a nice tall glass of milk, it’s no longer a mystery what happened. Instead we ask why, and “who screwed up?”
In the early 20th century though, many food-borne illnesses were still a mystery, and microbiology was a scientific endeavor that was just getting started. Alice Catherine Evans was an unlikely figure to make a dent in this world at the time, but through her research at the United States Department of Agriculture (USDA), and later at the Hygienic Laboratory (now the National Institutes of Health), she had a huge impact on the field of bacteriology, the dairy industry, and consumer safety.
Continue reading “Alice Evans: Brucellosis, or Why We Pasteurize Milk”