Australia Didn’t Invent WiFi, Despite What You’ve Heard

Wireless networking is all-pervasive in our modern lives. Wi-Fi technology lives in our smartphones, our laptops, and even our watches. Internet is available to be plucked out of the air in virtually every home across the country. Wi-Fi has been one of the grand computing revolutions of the past few decades.

It might surprise you to know that Australia proudly claims the invention of Wi-Fi as its own. It has good reason to, as well, given the money that would surely be due to the creators of the technology. However, dig deeper, and you’ll find things are altogether more complex.

Continue reading “Australia Didn’t Invent WiFi, Despite What You’ve Heard”


Olympic Sprint Decided By 40,000 FPS Photo Finish

Advanced technology played a crucial role in determining the winner of the men’s 100-meter final at the Paris 2024 Olympics. In a historically close race, American sprinter Noah Lyles narrowly edged out Jamaica’s Kishane Thompson by just five-thousandths of a second. The final decision relied on an image captured by an Omega photo finish camera that shoots an astonishing 40,000 frames per second.
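To put those numbers side by side, a quick bit of arithmetic (an illustrative sketch, not Omega’s actual processing pipeline) shows how finely a 40,000 fps sensor slices time compared with the winning margin:

```python
# Temporal resolution of a 40,000 fps photo-finish camera,
# compared with the five-thousandths-of-a-second winning margin.
fps = 40_000
frame_interval_s = 1 / fps   # time between successive frames
margin_s = 0.005             # Lyles' margin over Thompson

print(f"Frame interval: {frame_interval_s * 1e6:.0f} microseconds")
print(f"Frames spanning the margin: {margin_s / frame_interval_s:.0f}")
```

At 25 microseconds per frame, the 0.005-second gap spans about 200 frames, which is why the image could separate two runs that both clocked 9.78 seconds.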

This cutting-edge technology, originally reported by PetaPixel, ensured the accuracy of the result in a race where both athletes recorded a time of 9.78 seconds. If SmartThings’ shot pourer from the 2012 Olympics were still around, it could once again fulfill its intended role of celebrating US medals.

Omega, the Olympics’ official timekeeper for decades, has continually innovated to enhance performance measurement. The Omega Scan ‘O’ Vision Ultimate, the camera used for this photo finish, is a significant upgrade from its 10,000 frames per second predecessor. The new system captures four times as many frames per second and offers higher resolution, providing a detailed view of the moment each runner’s torso touches the finish line. This level of detail was crucial in determining that Lyles’ torso touched the line first, securing his gold medal.

This camera is part of Omega’s broader technological advancements for the Paris 2024 Olympics, which include computer vision systems that use AI and high-definition cameras to track athletes in real time. For a closer look at how technology decided this historic race, watch the video by Eurosport that captured the event.

Continue reading “Olympic Sprint Decided By 40,000 FPS Photo Finish”

For years, the first Air Force One sat neglected and forgotten in an open field at Arizona’s Marana Regional Airport. (Credit: Dynamic Aviation)

The First Air Force One And How It Was Nearly Lost Forever

Although the designation ‘Air Force One’ is now commonly understood to refer to the airplane used by the President of the United States, it wasn’t until Eisenhower that the US President made significant use of a dedicated airplane. He had a Lockheed VC-121A kitted out to act as his office as commander-in-chief. Called the Columbine II after the Colorado columbine flower, it served a crucial role during the Korean War and would result in the coining of the ‘Air Force One’ designation following a near-disaster in 1954.

This involved a mix-up between Eastern Air Lines 8610 and Air Force 8610 (the VC-121A). After the Columbine II was replaced with a VC-121E model (Columbine III), the Columbine II was mistakenly sold to a private owner, and got pretty close to being scrapped.

In 2016, the plane made a “somewhat scary and extremely precarious” 2,000-plus-mile journey to Bridgewater, Virginia, to undergo a complete restoration. (Credit: Dynamic Aviation)

Although nobody is really sure how this mistake happened, it resulted in the private owner stripping the airplane for parts to keep other Lockheed C-121s and compatible airplanes flying. Shortly before scrapping the airplane, he received a call from the Smithsonian Institution, informing him that this particular airplane was Eisenhower’s first presidential airplane and the first ever Air Force One. This led him to instead fix up the airplane and try to sell it off. Ultimately, [Karl D. Stoltzfus], CEO of the airplane maintenance company Dynamic Aviation, bought the partially restored airplane after it had spent another few years baking in the unrelenting sun.

Although the airplane was in a sorry state at this point, [Stoltzfus] put a team led by mechanic [Brian Miklos] to work, and after a year of effort they had it in flying condition by 2016, allowing them to fly it over to Dynamic Aviation’s facilities for a complete restoration. At this point the ‘nuts and bolts’ restoration is mostly complete after a lot of improvisation and manufacturing of parts for the 80-year-old airplane, with restoration of the Eisenhower-era interior and exterior now in progress. This should take another few years and another $12 million or so, but will result in a fully restored and flight-worthy Columbine II, exactly as it would have looked in 1953, plus a few modern-day safety upgrades.

Although [Stoltzfus] recently passed away unexpectedly before being able to see the final result, his legacy will live on in the restored airplane, which after so many years will once again be able to join the Columbine III, now on display at the National Museum of the USAF.

A Modern Take On An Old Language

Some old computer languages are destined to never die. They do, however, evolve. For example, Fortran, among the oldest of computer languages, still has adherents, not to mention a ton of legacy code to maintain. But it doesn’t force you to pretend you are using punched cards anymore. In the 1970s, if you wanted to crunch numbers, Fortran was a good choice. But there was another very peculiar language: APL. Turns out, APL is alive and well and has a thriving community that still uses it.

APL has a lot going for it if you are crunching serious numbers. The main data type is a multidimensional array. In fact, you could argue that a lot of “modern” ideas like a REPL, list types, and even functional programming entered the mainstream through APL. But it did have one strange thing that made it difficult to use and learn.

[Kenneth E. Iverson] was at Harvard in 1957 and started working out a mathematical notation for dealing with arrays. By 1960, he’d moved to IBM and a few years later wrote a book entitled “A Programming Language.” That’s where the name comes from — it is actually an acronym for the book’s title. Being a mathematician, [Iverson] used symbols instead of words. For example, to create an array with the numbers 1 to 5 in it and then print it, you’d write:

⎕←⍳5

Since modern APL has a REPL (read-eval-print loop), you could remove the box and the arrow today.
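If you don’t have an APL keyboard to hand, a rough plain-Python analogy of that one-liner might look like this (purely illustrative; in APL, ⍳ builds the vector directly and ⎕← displays it):

```python
# Plain-Python sketch of the APL expression ⎕←⍳5:
# ⍳5 ("iota 5") generates the vector 1 2 3 4 5, and ⎕← prints it.
def iota(n):
    # APL indexes from 1 by default, hence range(1, n + 1)
    return list(range(1, n + 1))

v = iota(5)
print(v)  # [1, 2, 3, 4, 5]

# APL applies arithmetic to whole arrays at once; here we emulate
# that element-wise style with an explicit comprehension.
squares = [x * x for x in v]
print(squares)  # [1, 4, 9, 16, 25]
```

The second half hints at why number-crunchers liked the language: in APL the element-wise multiply needs no loop at all, a style that array libraries later made mainstream.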

What Key Was That?

Wait. Where are all those keys on your keyboard? Ah, you’ve discovered the one strange thing. In 1963, CRTs were not very common. While punched cards were king, IBM also had a number of Selectric terminals. These were essentially computer-controlled typewriters that used easily replaceable type balls instead of type bars.

Continue reading “A Modern Take On An Old Language”

Australia’s Controlled Loads Are In Hot Water

Australian grids have long run a two-tiered pricing scheme for electricity. In many jurisdictions, regular electricity was charged at a standard rate, while cheaper electricity was available for certain applications if your home was set up with a “controlled load.” Typically, this involved high-energy equipment like pool heaters or water heaters.

This scheme has long allowed Australians to save money while keeping their water piping-hot at the same time. However, the electrical grid has changed significantly in the last decade. These controlled loads are starting to look increasingly out of step with what the grid and the consumer need. What is to be done?

Continue reading “Australia’s Controlled Loads Are In Hot Water”

Laser Cutters: Where’s The Point?

It is funny how when you first start doing something, you have so many misconceptions that you have to discard. When you look back on it, it always seems like you should have known better. That was the case when I first got a low-end laser cutter. When you want to cut or engrave something, it has to be in just the right spot. It is like hanging a picture. You can get really close, but if it is off just a little bit, people will notice.

The big commercial units I’ve been around all had cameras that were in a fixed position and were calibrated. So the software didn’t show you a representation of the bed. It showed you the bed. The real bed plus whatever was on it. Getting things lined up was simply a matter of dragging everything around until it looked right on the screen.

Today, some cheap laser cutters have cameras, and you can probably add one to those that don’t. But you still don’t need it. My Ortur Laser Master 3 has nothing fancy, and while I didn’t always tackle it the best way, my current method works well enough. In addition, I recently got a chance to try an XTool S1. It isn’t that cheap, but it doesn’t have a camera. Interestingly, though, there are two different ways of laying things out that also work. However, you can still do it the old-fashioned way, too. Continue reading “Laser Cutters: Where’s The Point?”

The Long, Slow Demise Of DVD-RAM

While CDs were still fighting for market share against cassettes, and gaming consoles were just starting to switch over from cartridge storage to CD, optical media companies were already thinking ahead. Only two years after the introduction of the original PlayStation, the DVD Forum had introduced the DVD-RAM standard: 2.58 GB per side of a disc in a protective caddy. The killer feature? Essentially unlimited re-writeability. In a DVD drive that supports DVD-RAM, the discs act more like removable hard drive platters. You can even see the hard sectors etched into the media at the time of manufacture, giving DVD-RAM its very recognizable pattern.

At the time, floppy drives were still popular, and CD-ROM drives were increasingly available pre-installed in new computers. Having what amounted to a hard drive platter with a total of 5 GB per disc should have been a killer feature for consumers. Magneto-optical drives were still very expensive, and by 1998 were only 1.3 GB in size. DVD-RAM had the same verify-after-write data integrity feature that magneto-optical drives were known for, but with larger capacity, and after the introduction of 4.7 GB size discs, no caddy was required.
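For a sense of scale, here’s a quick back-of-the-envelope comparison using the capacities mentioned above (the floppy and CD-ROM figures are typical values of the era, added here for context rather than taken from the article):

```python
# Rough capacity comparison of late-1990s removable media.
MB = 1_000_000
GB = 1_000_000_000

floppy = 1.44 * MB            # 3.5-inch floppy (typical, for context)
cd_rom = 650 * MB             # common CD-ROM capacity (typical, for context)
mo_1998 = 1.3 * GB            # magneto-optical drive, per the article
dvd_ram_side = 2.58 * GB      # first-generation DVD-RAM, per side
dvd_ram_disc = 2 * dvd_ram_side

print(f"Double-sided DVD-RAM: {dvd_ram_disc / GB:.2f} GB")
print(f"Equivalent floppies: {dvd_ram_disc / floppy:,.0f}")
print(f"Equivalent CD-ROMs: {dvd_ram_disc / cd_rom:.1f}")
print(f"Versus 1998 magneto-optical: {dvd_ram_disc / mo_1998:.1f}x")
```

Even against contemporary optical media, a double-sided cartridge held nearly eight CD-ROMs’ worth of storage, and rewritable at that.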

Continue reading “The Long, Slow Demise Of DVD-RAM”