Making YouTube Work In The Netscape 4.5 Browser On Windows 98

The World Wide Web of the 90s was a magical place, where you couldn’t click two links without getting bombarded with phrases such as the Information Superhighway and the Multimedia Experience. Of course, the multimedia experience you got on your Windows 9x PC was mostly limited to low-res, stuttery RealMedia and Windows Media clips, but what if you could experience YouTube back then, on your ‘multimedia-ready’ Celeron PC, running Netscape 4.5?

Cue the [Throaty Mumbo] bloke over on that very same YouTube, and his quest to make this dream come true. Although the idea seems somewhat ridiculous on the face of it, the biggest problem is actually the era-appropriate hardware, which was never meant to decode and display full-HD, VP9-encoded video.

Because the HTTPS requirement means that no 1990s or early 2000s browser will ever browse the modern WWW, a proxy was going to be needed no matter what. This Python-based proxy got kitted out with not just the means to render down the convoluted HTML-CSS-JS mess of a YouTube page into something a civilized browser can display, but also to fetch YouTube videos with yt-dlp and transcode them into MPEG-1 in glorious SD quality for streaming to Netscape on the Windows 98 PC.
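For a flavor of what the video leg of such a proxy might look like, here’s a minimal Python sketch that shells out to yt-dlp and ffmpeg; the function name, format selection, and bitrates are illustrative assumptions, not the project’s actual code:

```python
# Minimal sketch of a proxy's transcoding leg: fetch a YouTube video
# with yt-dlp, then re-encode it as SD MPEG-1 that a period-correct
# Netscape plugin can handle. All parameters here are illustrative.
import subprocess

def transcode_for_netscape(video_url: str, out_path: str = "video.mpg") -> str:
    # Grab a modest-quality source so we aren't decoding full-HD VP9
    subprocess.run(
        ["yt-dlp", "-f", "best[height<=480]", "-o", "source.mp4", video_url],
        check=True,
    )
    # MPEG-1 video plus MP2 audio, in glorious 640x480 SD
    subprocess.run(
        ["ffmpeg", "-y", "-i", "source.mp4",
         "-c:v", "mpeg1video", "-b:v", "1500k", "-s", "640x480",
         "-c:a", "mp2", "-b:a", "128k", out_path],
        check=True,
    )
    return out_path
```

Serving the resulting file over plain HTTP is then trivial, and anything speaking HTTP/1.0 can fetch it.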

Because the same civilized browsers support plugins, via Netscape’s NPAPI, decoding and rendering the video was the easy part: the browser just has to load the plugin and let the latter do all the heavy lifting. Perhaps unsurprisingly, with some tweaks even Netscape 2.0 can be used to browse YouTube and play back videos this way, complete with fullscreen playback and seeking support.

Although these days only a rare few modern browsers like Pale Moon still support NPAPI, it’s easy to see how the introduction of browser plugins paved the way for the multimedia-rich WWW that we find ourselves in today.

Continue reading “Making YouTube Work In The Netscape 4.5 Browser On Windows 98”

Making WiFi Sound Like Dial-Up Internet

Dial-up modems had a distinctive sound when connecting, with the glittering, screeching song becoming a familiar melody to those jumping online in the early days of the Internet. Modern digital connections don’t really have an analog to this, by virtue of being entirely digital. And yet, [Nick Bild] decided to make WiFi audible in a pleasing tribute to the modems of yore.

The reason you could hear your dial-up modem is that it was actually communicating in audio over old-fashioned telephone lines. The initialization happened at a low enough speed that you could hear the individual sections of the handshake, each of which sounded quite distinct. Once a connection was established at higher speeds, though, particularly 33.6 k or 56 k, the sound of transmission became hard to distinguish from static.

Modern communication methods like Ethernet, DSL, and WiFi all occur purely digitally, at frequencies far above the audible range. Thus, you can’t really “listen” to a WiFi signal any more than you can listen to the rays of light beaming out from the sun. However, [Nick] found an anachronistic way to turn WiFi signals into something vaguely reminiscent of old-school modems. He used a Raspberry Pi 3 equipped with a WiFi adapter, which sniffs network traffic, homing in on the data going to one computer. The packet data is then sent to an Adafruit QT Py microcontroller, which uses it to vary the amplitude of a sound wave that’s fed to a speaker through a digital-to-analog converter. [Nick] notes this mostly just sounds like static, so he adjusts the amplitude and frequency to make it more reminiscent of old modem sounds, but it’s all still driven by the WiFi data itself.
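To make the concept concrete, here’s a rough Python sketch of the same idea, using scapy to sniff packets and writing the result to a WAV file instead of driving a DAC; this is not [Nick]’s code, and the interface name and scaling factors are assumptions:

```python
# Rough sketch of the concept (not [Nick]'s actual code): sniff network
# traffic and turn packet sizes into the amplitude envelope of a tone.
# Needs root and a real wireless interface; "wlan0" is an assumption.
import math
import struct
import wave
from scapy.all import sniff

RATE = 8000      # audio samples per second
TONE = 1200.0    # carrier tone in Hz, vaguely modem-ish
samples = []

def packet_to_audio(pkt):
    # Map packet length to loudness: bigger packets, louder bursts
    amp = min(len(pkt) / 1500.0, 1.0)
    for _ in range(RATE // 50):  # ~20 ms burst per packet
        t = len(samples) / RATE
        samples.append(amp * math.sin(2 * math.pi * TONE * t))

# Sniff for ten seconds, converting each packet into a sound burst
sniff(iface="wlan0", prn=packet_to_audio, store=False, timeout=10)

with wave.open("wifi_song.wav", "wb") as wav:
    wav.setnchannels(1)
    wav.setsampwidth(2)       # 16-bit signed samples
    wav.setframerate(RATE)
    wav.writeframes(b"".join(
        struct.pack("<h", int(s * 32767)) for s in samples))
```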

It’s basically WiFi-driven synthesis rather than listening to WiFi itself, but it’s a fun reference to the past. We’ve talked a lot about dial-up of late: from the advanced technology that made 56 k possible to the downfall of AOL’s long-lived service. Video after the break.

Continue reading “Making WiFi Sound Like Dial-Up Internet”

The Internet We Didn’t Get

Collective human consciousness is full of imagined, mythical, dream-like utopias, hidden away behind mountains, across or under oceans, shrouded in mist, or deep in the jungle. From Atlantis and Avalon to El Dorado and Shangri-La, we have never stopped imagining these secret, fantastical places. One of them, Xanadu, is actually a real place, but it has been embellished over the years into legend and myth, and so became the namesake of an Internet we never got to see, just like all those other mystical, hidden places.

The Xanadu project got its start in the 1960s, at around the same time the mouse and what we might recognize as a modern computer user interface were created. At its core was hypertext, with the ability to link not just pages but references and files together into one network. It also had version control, rights management, bi-directional links, and a number of additional features that would be revolutionary even today. Another core feature was transclusion: including portions of one document inside another by reference, which would also make sure original authors were compensated when their work was linked. However, Xanadu was hampered by a number of issues, including a lack of funding, infighting among the project’s contributors, and the development of an almost cult-like devotion to the vision, not unlike some of today’s hype around generative AI. Surprisingly, despite these faults, the project received significant funding from Autodesk, but even with this support it ultimately failed.

Instead of this robust, bi-directional web imagined as early as the 1960s, what we ended up with is the much simpler World Wide Web, in which only some of Xanadu’s features are recognizable. Not only was it less complex to implement, it famously received institutional backing from CERN immediately rather than stagnating for decades. The article linked above contains a tremendous amount of detail around this story that’s worth checking out. For all its faults and lack of success, though, Xanadu is an interesting image of what the future of the past could have been like if just a few things had shaken out differently; instead it will remain a mythical place like so many others.

A Love Letter To Internet Relay Chat

Although kids these days tend to hang out on so-called “social media”, Internet Relay Chat (IRC) came first, by decades. IRC is a real-time communication technology that allows people to socialize online in both chat rooms and private chat sessions. In a recent video, the [The Serial Port] channel looks at IRC and why all of this makes it such a great piece of technology, not to mention a great part of recent history. IRC is a decentralized protocol: anyone can set up an IRC server and connect multiple servers into networks, the source code for these servers has been readily available ever since its inception by a student, and IRC clients are correspondingly very easy to write, as the sketch below shows.
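To back up that last claim, here’s a minimal client sketch in Python; the server, nickname, and channel are placeholders:

```python
# Minimal IRC client sketch: the protocol is just lines of text over a
# TCP socket. Server, nickname, and channel below are placeholders.
import socket

HOST, PORT = "irc.libera.chat", 6667
NICK, CHANNEL = "hadtestnick", "#test"

sock = socket.create_connection((HOST, PORT))
sock.sendall(f"NICK {NICK}\r\nUSER {NICK} 0 * :Just visiting\r\n".encode())

buf = b""
while True:
    data = sock.recv(4096)
    if not data:                              # server closed the connection
        break
    buf += data
    while b"\r\n" in buf:
        line, buf = buf.split(b"\r\n", 1)
        msg = line.decode(errors="replace")
        print(msg)
        if msg.startswith("PING"):            # answer keepalives or get dropped
            sock.sendall(msg.replace("PING", "PONG", 1).encode() + b"\r\n")
        elif " 001 " in msg:                  # welcome numeric: safe to join now
            sock.sendall(f"JOIN {CHANNEL}\r\n".encode())
```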

Because of the straightforward protocol, IRC will happily work on even a Commodore 64, while also enabling all kinds of special services (‘bots’) to be implemented. Even better, the very personal nature of individual IRC networks and channels on them provides an environment where people can be anonymous and yet know each other, somewhat like hanging out at a local hackerspace or pub, depending on the channel. In these channels, people can share information, help each other with technical questions, or just goof off.

In this time of Discord, WhatsApp, and other Big Corp-regulated proprietary real-time communication services, it’s nice to pop back onto IRC and be reminded, as the video puts it, of a time when the Internet was a place to escape to, not escape from. Although IRC isn’t as popular as it was around 2000, it’s still alive and kicking. We think it will be around until the end days.

Continue reading “A Love Letter To Internet Relay Chat”

Engineering For Slow Internet Even When Not Stuck In Antarctica

With the days of dial-up and pitiful 2G data connections long behind most of us, it would seem tempting to stop caring about how much data an end-user is expected to suck down that big and wide broadband tube. This is a problem if your respective tube happens to be a thin straw and you’re located in a base somewhere in the Antarctic. Take it from [Paul Coldren], who was stationed at a number of Antarctic research stations as an IT specialist for a total of 14.5 months starting in August of 2022.

Prepare for hours of pain and retrying downloads. (Credit: [Paul Coldren])

As [Paul] describes, the main Internet access at these bases is via satellites, which are effectively just relay stations. With over a thousand people at a station like McMurdo during parts of the season, Internet bandwidth is a precious commodity, and latency is understandably high.

This low-bandwidth situation led to highly aggravating moments, such as when a web app would time out on [Paul] while downloading a 20 MB JavaScript file, simply because things were going too slowly. Upon timing out, it would wipe the cache, redirect to an error page, and leave [Paul] retrying over and over to try to squeeze within the timeout window. Instead of just letting the download complete in ~15 minutes, it took nearly half an hour this way, just so that [Paul] could send a few kB worth of text in a messaging app.
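The engineering lesson here is to let a download resume rather than start over. As a hedged illustration (not code from [Paul]’s article), here’s how a client can pick up where it left off using an HTTP Range request; the URL and the `requests` dependency are assumptions:

```python
# Sketch of client behaviour that survives a thin pipe: resume an
# interrupted download with an HTTP Range request instead of wiping
# everything and starting over. Requires the `requests` package.
import os
import requests

def resume_download(url: str, dest: str, chunk: int = 64 * 1024) -> None:
    # Ask the server for only the bytes we don't already have
    have = os.path.getsize(dest) if os.path.exists(dest) else 0
    headers = {"Range": f"bytes={have}-"} if have else {}
    with requests.get(url, headers=headers, stream=True, timeout=60) as r:
        r.raise_for_status()
        # 206 means the server honored the range; otherwise start fresh
        mode = "ab" if r.status_code == 206 else "wb"
        with open(dest, mode) as f:
            for part in r.iter_content(chunk_size=chunk):
                f.write(part)

# Called in a retry loop, each attempt picks up where the last one died.
```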

In addition to these artificial timeouts that ignore continuing download progress, there’s also the issue of self-updating apps, whose downloaders don’t allow you to schedule, pause, resume, or do anything else that would make fetching a massive update somewhat feasible. Another problem is duplicated downloads, such as when hundreds of people at said Antarctic station all try to update macOS simultaneously. Here [Paul] ended up, painfully and slowly, downloading the entire 12 GB macOS installer image once to distribute it across the station, though a Mac might still try to download a few GB of updates regardless.

Continue reading “Engineering For Slow Internet Even When Not Stuck In Antarctica”

End Of The Eternal September, As AOL Discontinues Dial-Up

If you used the internet at home a couple of decades or more ago, you’ll know the characteristic sound of a modem connecting to its dial-up server. That noise is a thing of the past, as we long ago moved to fibre, DSL, or wireless providers that are always on. It’s a surprise then to read that AOL are discontinuing their dial-up service at the end of September this year: partly as a reminder that AOL are still a thing, and partly because, in 2025, they still operate a dial-up service at all.

There was a brief period in which, instead of going online via the internet itself, the masses were offered online services through walled gardens of corporate content. Companies such as AOL and Compuserve bombarded consumers with floppies and CD-ROMs containing their software, and even Microsoft dipped a toe in the market with the original MSN service before famously pivoting the whole organisation in favour of the internet in mid-1995. Compuserve was absorbed by AOL, which morphed into the most popular consumer dial-up ISP over the rest of that decade. The dotcom boom saw them snapped up for an exorbitant price by Time Warner, only for the expected bonanza never to arrive, and by 2003 the AOL name had been dropped from the parent company’s letterhead. Over the next decade AOL dwindled into something of an irrelevance, and it is now owned by Yahoo! as a content and email portal. This dial-up service seems to have been the last gasp of its role as an ISP.

So the eternal September, so-called because the arrival of AOL users on Usenet felt like an everlasting version of the moment a fresh cadre of undergrads arrived each September, may, at least in an AOL sense, finally be over. If you’re one of the estimated 0.2% of Americans still using a dial-up connection, don’t despair, because there are a few other ISPs still (just) serving your needs.

Microsoft’s New Agentic Web Protocol Stumbles With Path Traversal Exploit

If the term ‘NLWeb’ first brought to mind an image of a Dutch internet service provider, you’re probably not alone. What it actually is, or tries to become, is Microsoft’s vision of a parallel web protocol with which website owners and application developers can integrate whatever LLM-based chatbot they desire. Unfortunately for Microsoft, the NLWeb protocol has just suffered its first major security flaw.

The flaw is an absolute doozy: a basic path traversal vulnerability that allows an attacker to use appropriately formatted URLs to traverse the filesystem of the remote LLM-hosting system and extract keys and other sensitive information. Although Microsoft has already patched it, no CVE was assigned, which raises the question of just how many more elementary bugs like this may be lurking in the protocol and its associated software.
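For illustration, this is roughly what the bug class and the usual fix look like; a generic Python sketch, not the actual NLWeb code or Microsoft’s patch:

```python
# Generic illustration of path traversal (not the actual NLWeb code):
# a naive handler joins user input straight onto a base directory, so
# a request for "../../../etc/passwd" walks right out of it. The fix
# resolves the path and checks it still lives under the intended root.
from pathlib import Path

BASE = Path("/srv/nlweb/static").resolve()  # hypothetical web root

def open_unsafe(user_path: str) -> bytes:
    # Vulnerable: "../" segments (or an absolute path) escape BASE
    return (BASE / user_path).read_bytes()

def open_safe(user_path: str) -> bytes:
    target = (BASE / user_path).resolve()
    if not target.is_relative_to(BASE):      # Python 3.9+
        raise PermissionError("path traversal attempt blocked")
    return target.read_bytes()
```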

As for why a website or application owner might be interested in NLWeb, the marketing pitch appears to be as an alternative to integrating a local search function. This way any website or app can have its own ChatGPT-style search that is theoretically restricted to just that site, instead of chatbot-loving customers heading off to ChatGPT or an equivalent to ask their questions there.

Even aside from the strong ‘solution in search of a problem’ vibe, it’s worrying that right from the outset it introduces serious security issues that suggest a lack of real testing, never mind an apparent ignorance of the fact that missing input sanitization is the root cause of many widely exploited CVEs. Whether GitHub Copilot was used to write the affected codebase is unknown.