How Did We Get To The Speed Of Light?

Every high school physics student knows c, the speed of light: it’s 3 x 10^8 metres per second. More advanced or more curious students will know that this is an approximation, and that the officially accepted figure of 299,792,458 metres per second rests on the definition of the second, which is itself derived from a resonance of the caesium atom.

Galileo Galilei, whose presence in this story should come as no surprise. Justus Sustermans, Public domain.

But for those who are really curious about measuring the speed of light the question remains: Just how did we arrive at that figure and how long have we been measuring it? The answer contains some surprises, and some exceptionally clever scientific thought and experimentation over the centuries.

The nature of light and whether it had a speed at all had been puzzling philosophers and scientists since antiquity, but the first experiments performed in an attempt to measure it were, you will not be surprised to hear, performed by Galileo sometime in the early 17th century. His experiment involved observing assistants uncovering lanterns at known distances, but it failed to arrive at a figure.

Later that century, in 1676, the first numerical estimate of the speed of light was made by the Danish astronomer Ole Rømer, who observed an apparent variation in the period of one of Jupiter’s moons depending upon whether the Earth was approaching it or moving away from it. From this he was able to estimate the time taken for light to cross the Earth’s orbit, and from there the mathematician Christiaan Huygens was able to produce a figure of 220,000,000 metres per second.
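Huygens’ arithmetic is easy enough to reproduce. Here’s a minimal sketch in C, assuming the commonly quoted 22-minute figure for light to cross the diameter of the Earth’s orbit and the modern value of the astronomical unit; the 22 minutes was an overestimate, which is largely why the answer lands near Huygens’ figure rather than the modern one.

```c
#include <stdio.h>

int main(void)
{
    /* Assumed inputs: the commonly quoted ~22 minute delay for light to
       cross the diameter of the Earth's orbit, and the modern value of
       the astronomical unit in metres.                                  */
    const double delay_s = 22.0 * 60.0;
    const double au_m = 1.496e11;
    const double orbit_diameter_m = 2.0 * au_m;

    printf("Estimated speed of light: %.3g m/s\n", orbit_diameter_m / delay_s);
    /* Prints about 2.27e8 m/s, in the same ballpark as the 220,000,000 m/s
       Huygens arrived at with the less certain distances of his day.      */
    return 0;
}
```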

Spinning Cogs And Mirrors: Time Of Flight

The mile-long evacuated tube used in Michelson’s time-of-flight experiment. H. H. Dunn, Public domain.

The experiments with which we will perhaps be the most familiar are the so-called time-of-flight measurements, which take Galileo’s idea of observing the delay as light travels over a distance and bring to it ever higher precision. This was first done in the middle of the 19th century by the French physicist Hippolyte Fizeau, who reflected a beam of light from a mirror several kilometres away and used a toothed wheel to chop it into pulses. The wheel could be spun faster and faster until the round trip from wheel to mirror and back took just long enough for a tooth to move into the place of the gap the light had left through, at which point the returning pulse could no longer be seen. His calculation of 313,300,000 metres per second was successively improved upon through the work of a succession of others, including Léon Foucault, culminating in the series of experiments by the American physicist Albert A. Michelson in the 1920s and 30s. Michelson’s final figure stood at 299,774,000 metres per second, measured through a multi-path traversal of a mile-long evacuated tube in the California desert. In the second half of the 20th century the techniques shifted to laser interferometry, and in the quest to define the SI units in terms of constants, eventually to the definition mentioned in the first paragraph.
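To get a feel for Fizeau’s method, here’s a back-of-the-envelope sketch in C using the commonly quoted (and approximate) figures for his setup: a 720-tooth wheel, an 8,633 metre path to the mirror, and the first extinction at about 12.6 revolutions per second, the speed at which the returning light meets a tooth rather than a gap.

```c
#include <stdio.h>

int main(void)
{
    /* Commonly quoted figures for Fizeau's 1849 setup (approximate). */
    const double distance_m  = 8633.0;  /* wheel to mirror               */
    const double teeth       = 720.0;   /* teeth on the chopper wheel    */
    const double rev_per_sec = 12.6;    /* spin rate at first extinction */

    /* At the first extinction the returning light meets the next tooth,
       so the round trip takes the time for the wheel to advance half a
       tooth spacing: t = 1 / (2 * teeth * rev_per_sec).                 */
    double round_trip_s = 1.0 / (2.0 * teeth * rev_per_sec);
    double c_estimate   = 2.0 * distance_m / round_trip_s;

    printf("c is roughly %.4g m/s\n", c_estimate);  /* ~3.13e8 m/s */
    return 0;
}
```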

The most fascinating part of the story probably encapsulates the essence of scientific discovery: while arriving at a result takes the work of many scientists building on each other’s efforts, it can then often be rendered into a form that can be understood by a student who hasn’t had to pass through all that effort. We could replicate Fizeau and Michelson’s experiments with a pulse generator, laser diode, and oscilloscope, which, while of little scientific value nearly a century after Michelson’s evacuated tube, is still immensely cool. Has anyone out there given it a try?
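If you’re tempted to try the bench-top version, the quick calculation below shows what your oscilloscope has to resolve; the 30 metre path is just an assumed example.

```c
#include <stdio.h>

int main(void)
{
    const double c = 299792458.0;   /* the defined speed of light, m/s  */
    const double one_way_m = 30.0;  /* assumed laser-to-mirror distance */

    /* Round-trip delay the oscilloscope has to resolve. */
    double delay_ns = 2.0 * one_way_m / c * 1e9;
    printf("Round trip over %.0f m: about %.0f ns\n", one_way_m, delay_ns);
    /* Roughly 200 ns, or about 6.7 ns for every metre to the mirror. */
    return 0;
}
```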

Header image: Tommology, CC BY-SA 4.0.

Linux Fu: Fusing Hackaday

Unix, and by extension Linux, has a mantra of making everything possible look like a file. Files, of course, look like files. But devices, network sockets, and even system information also show up as things that appear to be files. There are plenty of advantages to doing that, since you can use all the nice tools like grep and find to work with them. However, making your own programs expose a filesystem can be hard. Filesystem code traditionally works at the kernel module level, where mistakes can wipe out lots of things and debugging is difficult. Fortunately, there is FUSE — the file system in user space library — which allows you to write more or less ordinary code and expose anything you want as a file system. You’ve probably seen FUSE used to mount, say, remote drives via ssh or Dropbox. We’ve even looked at FUSE before, even for Windows.

What’s missing, naturally, is the Hackaday RSS feed, mountable as a normal file. And that’s what we’re building today.

Writing a FUSE filesystem isn’t that hard, but there are a lot of tedious jobs. You essentially have to provide callbacks that FUSE uses to do things when the operating system asks for them. Open a file, read a file, list a directory, etc. The problem is that for some simple projects, you don’t care about half of these things, but you still have to provide them.
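To give a flavour of what those callbacks look like, here’s a minimal read-only sketch against the plain libfuse C API (the FUSE 2.x high-level interface) that exposes a single hard-coded file. It isn’t the library-assisted C++ approach used below, and the fs_* names are just our own; it’s only meant to show the kind of boilerplate involved.

```c
// Minimal read-only FUSE filesystem exposing one static file; a sketch of
// the callbacks, not the RSS code discussed in this article.
#define FUSE_USE_VERSION 26
#include <fuse.h>
#include <string.h>
#include <errno.h>
#include <fcntl.h>
#include <sys/stat.h>

static const char *hello_path = "/hello.txt";
static const char *hello_body = "Hello from user space!\n";

static int fs_getattr(const char *path, struct stat *st)
{
    memset(st, 0, sizeof(*st));
    if (strcmp(path, "/") == 0) {              /* the root directory */
        st->st_mode = S_IFDIR | 0755;
        st->st_nlink = 2;
    } else if (strcmp(path, hello_path) == 0) {
        st->st_mode = S_IFREG | 0444;          /* a read-only regular file */
        st->st_nlink = 1;
        st->st_size = strlen(hello_body);
    } else {
        return -ENOENT;
    }
    return 0;
}

static int fs_readdir(const char *path, void *buf, fuse_fill_dir_t filler,
                      off_t offset, struct fuse_file_info *fi)
{
    (void)offset; (void)fi;
    if (strcmp(path, "/") != 0)
        return -ENOENT;
    filler(buf, ".", NULL, 0);
    filler(buf, "..", NULL, 0);
    filler(buf, hello_path + 1, NULL, 0);      /* strip the leading '/' */
    return 0;
}

static int fs_open(const char *path, struct fuse_file_info *fi)
{
    if (strcmp(path, hello_path) != 0)
        return -ENOENT;
    if ((fi->flags & O_ACCMODE) != O_RDONLY)
        return -EACCES;                        /* we only support reading */
    return 0;
}

static int fs_read(const char *path, char *buf, size_t size, off_t offset,
                   struct fuse_file_info *fi)
{
    (void)fi;
    if (strcmp(path, hello_path) != 0)
        return -ENOENT;
    size_t len = strlen(hello_body);
    if ((size_t)offset >= len)
        return 0;                              /* read past end of file */
    if (offset + size > len)
        size = len - offset;
    memcpy(buf, hello_body + offset, size);
    return (int)size;
}

static const struct fuse_operations fs_ops = {
    .getattr = fs_getattr,
    .readdir = fs_readdir,
    .open    = fs_open,
    .read    = fs_read,
};

int main(int argc, char *argv[])
{
    return fuse_main(argc, argv, &fs_ops, NULL);
}
```

Build it with gcc fuse_hello.c $(pkg-config fuse --cflags --libs) -o fuse_hello, mount it on an empty directory with ./fuse_hello /tmp/mnt, and unmount with fusermount -u /tmp/mnt.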

Luckily, there are libraries that can make it a lot easier. I’m going to show you a simple C++ program that can mount your favorite RSS feed (assuming your favorite one is Hackaday, of course) as a file system. Granted, that’s not amazing, but it is kind of neat to be able to grep through the front page stories from the command line or view the last few articles using Dolphin.

The Legend Of Zelda: Decompiled

Keeping the source code to programs closed is something that is generally frowned upon here, for plenty of reasons. Closed source code is less secure and less customizable, but unfortunately we won’t be able to convince everyone of the merits of open source code any time soon. On the other hand, it is possible to decompile some of those programs whose source remains behind locked doors in an attempt to better understand that code, and one of the more impressive examples of that of late is this project, which has fully decompiled The Legend of Zelda: Ocarina of Time.

To get started with the code for this project, one simply needs to clone the Git repository and then use a certain set of software tools (depending on the user’s operating system) to compile the ROM from the source code. From there, though, the world is your rupee-filled jar. Like we’ve seen from other decompiled games, any number of enhancements to the original game can be made including increasing the frame rate, improving the graphics, or otherwise adding flourishes that wouldn’t otherwise be there.

The creators of this project do point out that this is still a work in progress, as only one of the 18 versions has been completed, but the fact that the source code they have been able to decompile builds a fully-working game when recompiled speaks to how far along it’s come. We’ve seen similar processes used for other games before, which also helps to illustrate how much improvement is possible when rewriting old games from their source code.

Thanks to [Lazarus] for the tip!


Classic 80s Text-To-Speech On Classic 80s Hardware

Those of us who were around in the late 70s and into the 80s might remember the Speak & Spell, a children’s toy with a remarkable text-to-speech synthesizer. While it sounds dated by today’s standards, it was revolutionary for the time and was riding a wave of text-to-speech functionality that was starting to arrive on various computers of the era. While a lot of them used dedicated hardware to perform the speech synthesis, some computers were powerful enough to do this in software, while others were not quite able to. The VIC-20 was one of the latter, but thanks to an ESP8266 it has been retroactively given this capability.

This project comes to us from [Jan Derogee], a connoisseur of this retrocomputer, and builds on the work by [Earle F. Philhower] who ported the retro speech synthesis software known as SAM from assembly to C which made it possible to run on the ESP8266. Audio playback is handled on the I2S port, but some work needed to be done to get this to work smoothly since this port also handles the communication with the VIC-20. Once this was sorted out, a patch was made to be able to hear the computer’s audio as well as the speech synthesizer’s. Finally, a serial command interface was designed by [Jan] which allows for control of the module.

While not many of us have VIC-20s sitting at home, it’s still an interesting project that shows the broad capabilities of a small and inexpensive chip like the ESP8266, which would have had a hefty price tag back in the 1980s. If you have other 80s hardware lying around waiting to be put to work, though, take a look at this project which brings new vocabulary words to that old classic Speak & Spell.


FPGA Retrocomputer: Return To Moncky

Part of the reason that retrocomputers are still so popular despite their obsolescence is that it’s possible to understand the entire inner workings of a computer like this, from the transistors all the way up to the software. Comparatively, it will likely be a long time (if ever) before anyone is building a modern computer from discrete components. To illustrate this point, plenty of 8-bit computers are available either to restore from original 80s hardware or to build from kits. And if you’d like to get even deeper into the weeds, you can design your own computer, including the instruction set, completely from the ground up on an FPGA.

This project, called the Moncky project, is a step above the usual 8-bit computer builds as it is actually a 16-bit computer. It is built around an Arty Spartan-7 FPGA dev board running at around 20 MHz and has access to 2 x 128 kB of dual-port RAM for memory. To access the outside world there are a VGA output, PS/2 capability, and SPI, with an SD card serving as a hard drive. The project really shines in the software, though, as its creator [Kris Demuynck] builds everything from scratch for educational purposes, to illustrate how it all works, and is currently working on a C compiler to make programming the computer easier.

All of the project files, as well as all of the code, are available on the project’s GitHub page if you’d like to follow along or build on this homebrew 16-bit computer. It’s actually the third iteration, with the Moncky-1 and Moncky-2 having been used to develop the more basic building blocks. While it’s not the first 16-bit computer we’ve seen implemented on an FPGA, it is one of the few that builds its own RISC instruction set and associated software rather than cloning a known existing processor. We’ve also seen some interesting x86 implementations on FPGAs.

Thanks to [koen-ieee] for the tip!

Ello Is A Tiny Computer With A C — Interpreter?

When we talk about a retrocomputer, it’s our normal practice to start with the hardware. But with [KnivD]’s ELLO 1A, while the hardware is interesting enough, it’s not the stand-out feature. We are all used to microcomputers with a BASIC interpreter, but how many have we seen with a C interpreter? The way C works simply doesn’t lend itself to anything but a compiler and linker, so even with a pared-down version of the language it still represents a significant feat to create a working interpreter.
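To appreciate why, consider that even a toy interpreter for nothing more than C-style integer arithmetic already needs a scanner and a recursive-descent parser. The sketch below is purely illustrative and has nothing to do with how ELLO actually does it; a real interpreter also needs types, statements, functions, a symbol table, and much more.

```c
// Toy evaluator for C-style integer expressions: numbers, + - * /, parentheses.
// Illustrates the parsing machinery even a pared-down C interpreter needs.
#include <stdio.h>
#include <ctype.h>
#include <stdlib.h>

static const char *p;                 /* cursor into the source text */

static long expr(void);               /* forward declaration */

static void skip_ws(void) { while (isspace((unsigned char)*p)) p++; }

static long factor(void)              /* number or parenthesised expression */
{
    skip_ws();
    if (*p == '(') {
        p++;
        long v = expr();
        skip_ws();
        if (*p == ')') p++;
        return v;
    }
    return strtol(p, (char **)&p, 10);
}

static long term(void)                /* handles * and / */
{
    long v = factor();
    for (skip_ws(); *p == '*' || *p == '/'; skip_ws()) {
        char op = *p++;
        long rhs = factor();
        v = (op == '*') ? v * rhs : v / rhs;
    }
    return v;
}

static long expr(void)                /* handles + and - */
{
    long v = term();
    for (skip_ws(); *p == '+' || *p == '-'; skip_ws()) {
        char op = *p++;
        long rhs = term();
        v = (op == '+') ? v + rhs : v - rhs;
    }
    return v;
}

int main(void)
{
    p = "2 + 3 * (7 - 4)";
    printf("2 + 3 * (7 - 4) = %ld\n", expr());   /* prints 11 */
    return 0;
}
```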

The hardware centres around a PIC32MX, and has an onboard SD card, VGA, sound, and a PS/2 keyboard port. The PCB is a clever design allowing construction with either through-hole or surface-mount components, for maximum accessibility for less advanced solderers. Full information can be found on the project’s website, but sadly for those wanting an easy life, only the PCB is as yet available for purchase.

We’re privileged to see a huge array of retrocomputing projects here at Hackaday, but while they’re all impressive pieces of work it’s rare for one to produce something truly unexpected. This C interpreter certainly isn’t something we’ve seen before, so we’re intrigued to see what projects develop around it.


Hackaday Links: January 24, 2021

Code can be beautiful, and good code can be a work of art. As it so happens, artful code can also result in art, if you know what you’re doing. That’s the idea behind Programming Posters, a project that Michael Fields undertook to meld computer graphics with the code behind the images. It starts with a simple C program to generate an image. The program needs to be short enough to fit legibly into the sidebar of an A2 sheet, and as if that weren’t enough of a challenge, Michael constrained himself to the standard C libraries to generate his graphics. A second program formats the code and the image together and prints out a copy suitable for display. We found the combination of code and art beautiful, and the challenge intriguing.
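As a flavour of the sort of program that fits in a poster’s sidebar, here’s a hypothetical stand-in (not one of Michael’s actual posters) that sticks to the standard C library and writes a plain-text PPM image.

```c
/* Tiny standard-library-only C program that writes a PPM image (out.ppm). */
#include <stdio.h>
#include <math.h>

#define W 512
#define H 512

int main(void)
{
    FILE *f = fopen("out.ppm", "w");
    if (!f) return 1;
    fprintf(f, "P3\n%d %d\n255\n", W, H);               /* plain-text PPM header */
    for (int y = 0; y < H; y++) {
        for (int x = 0; x < W; x++) {
            double dx = x - W / 2.0, dy = y - H / 2.0;
            double r = sqrt(dx * dx + dy * dy);         /* distance from centre */
            int v = (int)(127.5 * (1.0 + sin(r / 8.0)));  /* concentric rings   */
            fprintf(f, "%d %d %d\n", v, 255 - v, (x ^ y) & 255);
        }
    }
    fclose(f);
    return 0;
}
```

Compile with gcc rings.c -lm -o rings, and most Linux image viewers will open the resulting out.ppm directly.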

It always warms our hearts when we get positive feedback from the hacker community when something we’ve written has helped advance a project or inspire a build. It’s not often, however, that we learn that Hackaday is required reading. Educators at the Magellan International School in Austin, Texas, recently reached out to Managing Editor Elliot Williams to let him know that all their middle school students are required to read Hackaday as part of their STEM training. Looks like the kids are paying attention to what they read, too, judging by KittyWumpus, their ongoing mechatronics/coding project that’s unbearably adorable. We’re honored to be included in their education, and everyone in the Hackaday community should be humbled to realize that we’ve got an amazing platform for inspiring the next generation of hardware hackers.

Hackers seem to fall into two broad categories: those who have built a CNC router, and those who want to build one. For those in the latter camp, the roadblock to starting a CNC build is often “analysis paralysis” — with so many choices to make, it’s hard to know where to start. To ease that pain and get you closer to starting your build, Matt Ferraro has penned a great guide to planning a CNC router build. The encyclopedic guide covers everything from frame material choice to spindle selection and software options. If Matt has a bias toward any particular options it’s hard to find; he lists the pros and cons of everything so you can make up your own mind. Read it at your own risk, though; while it lowers one hurdle to starting a CNC build, it does nothing to address the next one: financing.

Like pretty much every conference last year and probably every one this year, the Open Hardware Summit is going to be virtual. But they’re still looking for speakers for the April conference, and just issued a Call for Proposals. We love it when we see people from the Hackaday community pop up as speakers at conferences like these, so if you’ve got something to say to the open hardware world, get a talk together. Proposals are due by February 11, so get moving.

And finally, everyone will no doubt recall the Boston Dynamics robots that made a splash a few weeks back with their dance floor moves. We loved the video, mainly for the incredible display of robotic agility and control but also for the choice of music. We suppose it was inevitable, though, that someone would object to the Boomer music and replace it with something else, like in the video below, which seems to sum up the feelings of those who dread our future dancing overlords. We regret the need to proffer a Tumblr link, but the Internet is a dark and wild place sometimes, and only the brave survive.

https://commiemartyrshighschool.tumblr.com/post/640760882224414720/i-fixed-the-audio-for-that-boston-dynamics-video