In the days of yore, computers would scream strange sounds as they spoke to each other over phone lines. This was, of course, dial-up, the predecessor to modern internet connectivity, offering laughable speeds compared to today's connections. But what if dial-up had more to offer? Perhaps it could even stream a YouTube video. That's what the folks over at The Serial Port set out to find out.
The key to YouTube over dial-up is a little-known extension to the protocol, added right around the time broadband was taking off: Multilink PPP. It allows multiple modems connected to a PC to be bonded in parallel into one faster link. With no theoretical limit in sight, and YouTube's lowest quality requiring a mere 175 kbps, the goal was clear: find out whether Multilink PPP has a practical limit, and watch YouTube over dial-up in the process.
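As a rough sanity check, a back-of-the-envelope sketch in Python (the 53.3 kbps figure is the regulatory cap on V.90 downstream speeds; real lines often trained lower):

    import math

    youtube_min_kbps = 175   # lowest-quality YouTube stream, per the article
    v90_cap_kbps = 53.3      # regulatory cap on V.90 downstream, per line

    # Minimum number of bonded lines under ideal conditions
    lines_needed = math.ceil(youtube_min_kbps / v90_cap_kbps)
    print(lines_needed)      # -> 4

So even in the best case, at least four bonded modems are needed just to hit YouTube's floor.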

On the ISP side, a Cisco IAD VoIP gateway was set up with a T1 connection to a 3Com Total Control modem chassis. On the client side, an IBM NetVista A21i running Windows ME was chosen for its period-correct nature. First tests with two modems proved promising, but Windows ME dials only one modem at a time, making the connection process somewhat slow.
But faster speeds need more ports, so an Equinox multiport COM card was added to the machine. However, drivers for Windows ME were unavailable, so a Windows 2000 computer was tried instead. Unfortunately, this setup was also unusable, as no browser capable of running YouTube could be installed on Windows 2000. The final client-side computer was therefore an IBM ThinkCentre A50 from 2004 running Windows XP.
But a single Equinox card was still not enough, so a second eight-port COM card was installed. However, the COM ports showed up in Windows numbered three through ten on both cards, and the driver was unable to change the addresses on the second card. A four-port Digi card was used instead, giving a total of thirteen COM ports, including the one on the motherboard.
Testing with a mere four modems showed that Windows XP had far better multilink support, with all the modems dialing simultaneously in a cacophony of sound. Unfortunately, this first four-modem test failed due to numerous issues, ranging from dial-tone problems to hardware failures. As it turned out, the DIP switches on the bottom of the modems needed to be set identically. After re-terminating a few cables, three of the four modems worked.
The next step was eight modems. Despite persistent connection issues, five modems connected in this test, delivering just over 200 kbps, roughly broadband speeds circa 2000. But a neat feature of multilink is the ability to selectively re-dial: by retrying just the three unconnected modems, all eight eventually worked in parallel, reaching over 300 kbps.
But still, this was not enough. After adding more phone lines and scrounging up some more modems, another four modems were added to the computer. With twelve modems connected, a whopping 668.8 kbps was achieved over dial-up, well in excess of what's needed for YouTube playback, and even beating the broadband of the era. Despite this logical extreme, there is still no theoretical limit in sight, so stay tuned for the next dial-up speed record attempt!
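For scale, a quick check of that figure against the ideal case (a sketch assuming each modem trains at the full 56 kbps, which a lab setup with a digital T1 head-end can approach):

    modems = 12
    per_line_kbps = 56.0                  # ideal V.90 downstream per modem
    measured_kbps = 668.8                 # aggregate reported by The Serial Port

    theoretical = modems * per_line_kbps  # 672.0 kbps
    print(measured_kbps / theoretical)    # ~0.995, i.e. ~99.5% of ideal

In other words, the multilink bonding overhead here was nearly negligible.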
If you too enjoy the sounds of computers screaming for their internet connection, make sure to check out this dial-up-over-Discord hack next!
Absolute lunacy. And I mean that in a good way. Well done!
Only because people didn’t bother to fix the modem settings to turn off the external speaker.
The screeching got on my nerves the first few times, so I looked up how to change the settings. Blessed silence while using the internet.
I also had ours set up to connect on demand. No dialing in manually before using the internet. Want to check your email? Open Eudora, hit "receive," and the computer dialed in automatically and silently. Searching for something on the internet? Open Navigator, and within seconds the computer was online, looking for whatever you were after.
The screeching was a sign of:
Parents listening to know when the kids were on the internet.
Lack of knowledge among the supposedly computer-literate, who were on the internet all the time but didn't actually bother to learn about the technology they were using.
Yes, just prepend M0 to the dial string and the speaker is off.
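For anyone who wants to poke at a real modem, a minimal sketch using pyserial (the port name and phone number are placeholders, and init strings vary by modem):

    import serial  # pip install pyserial

    # Hypothetical port; adjust for your machine
    with serial.Serial("/dev/ttyS0", 57600, timeout=2) as modem:
        modem.write(b"ATM0\r")           # M0: speaker always off
        print(modem.read(16))            # expect an "OK" back from the modem
        modem.write(b"ATM0DT5551234\r")  # or fold M0 into the dial string itself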
Also, many didn't know of the +++ATH0 DoS, where a ping payload containing the Hayes escape sequence and hang-up command would be echoed back and could drop the connection on modems that didn't properly enforce the escape guard time.
I was poor in 2000 after having been fired from tech support at a local cable ISP, which meant I lost my cable modem. All I had was dial-up, but multiple people needed to get online. So I built a small Linux box with a 56k modem, running (I think) IPCop, that automagically connected whenever anyone on the LAN requested something from the internet (see the sketch below). Worked swimmingly. We didn't stream or download large files, though.
I was able to finally leave dial-up behind when I got a better job in 2001.
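For reference, that kind of dial-on-demand is only a few lines in a pppd options file. A hedged sketch under assumed values (the device, number, and idle timeout are examples, and IPCop wraps all of this in its own UI):

    # /etc/ppp/options.ttyS0 -- demand-dial sketch; values are examples
    /dev/ttyS0 115200
    demand                  # bring the link up when outbound traffic appears
    idle 600                # hang up after 10 minutes of silence
    defaultroute
    connect '/usr/sbin/chat -v "" ATDT5551234 CONNECT ""'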
I put together a dial-up gateway machine at a startup where I was working with some friends in 1998, on Linux on a used 486 that I got cheap. Broadband wasn't available. We had 100 Mb Ethernet switches (which seemed like a bit more than we really needed at the time), so I needed to find a Fast Ethernet ISA card for the 486: a 3c515. I also installed the Squid proxy on it, so that frequently accessed pages and images were cached on the hard drive. It also ran the company mail server, Cyrus; at the time I had some experience with the same kind of setup at home. The result was that the office had simultaneous internet for several people at once, dialed up on demand, and it actually felt fast enough to be useful. We got by with that for the year and a half or so that the business survived.
Around the same time, I put together a similar gateway machine for my dad, because I had gotten a cable modem at home, already had a second phone line that I had been using for dial-up before, and wanted to try giving him dial-up for free so that he would stop hunting around for whatever dial-up was cheapest (he had actually been using ad-supported Juno for a long time). He complained that it was dialing up too often, for no apparent reason, often interrupting voice calls. He was a Linux skeptic and claimed that the two machines simply had a love affair with each other. So that was discouraging, and I never did figure out which process thought it needed a connection periodically.
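For what it's worth, with pppd-style demand dialing the ppp0 interface exists even while the line is down, so one way to catch the culprit (interface name assumed) would have been:

    # Watch for whatever packet triggers the dial-out
    tcpdump -i ppp0 -n

Common suspects in setups like this were things like periodic DNS lookups from background services.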
Or +++ATH0M0DT
Growing up in the country, our telephone line was pretty rubbish (when we later switched over to ADSL, we managed a grand total of 500 kbps down), so I'd always leave the modem sound on to make it easier to diagnose connection glitches.
Leaving the speaker on was also necessary where the phone line was a shared resource. If you didn't hear a dial tone first, you'd immediately hit "cancel" to avoid disrupting someone else's voice call. If you interrupted someone else's data call, it would usually kill it, at least until modem recovery algorithms got better and more widespread.
sight instead of site ?
{Sightation needed}
“sigh”…
Proof of concept, but it would have required twelve telephone lines, at a not inconsiderable cost. Still, a very good use for all that now-obsolete equipment (except for the VoIP server).
… at which point the telco sales rep would be asking you, "Can I interest you in an ISDN package, or a fractional T1 line?" (A T1 line is 24 channels wide and would be more cost-effective to install, even in a residence.)
Honestly, I wonder which would be faster: using the data circuit of a PRI in its ~1 Mbps state, or using the same number of time-division channels to do multiple bonded dial-up connections over all the voice circuits a PRI could provide.
We had a T1/PRI at work for our internet connection, until the IT department compared what we were paying per month (over $1k) with the cost of a Comcast Business connection (a couple hundred dollars). I can only imagine how much effort it would have taken to get a T1 to your house during the dial-up era. ISDN never made it as a residential service here in Massachusetts, though I know some people who managed to get their home connected (with corporate help and funding) for testing purposes.
Never mind, it comes out to about the same: 23 voice channels comes to about 1.2 Mbps, while the PRI, if using all channels for data and none for voice, can do about 1.5 Mbps raw theoretical throughput.
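The arithmetic, for anyone checking (a quick sketch assuming the usual V.90 cap per bonded voice call versus clear-channel 64 kbps timeslots):

    bonded = 23 * 53.3          # V.90-capped dial-up over 23 voice channels
    clear_channel = 24 * 64.0   # full T1 payload, all timeslots as data

    print(bonded)               # ~1225.9 kbps, i.e. ~1.2 Mbps
    print(clear_channel)        # 1536.0 kbps, i.e. ~1.5 Mbps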
I used to listen to streaming radio stations and watch streaming video over dial-up. A YouTube-like site over dial-up absolutely could have existed. The bigger challenge would have been the storage, not the bandwidth.
How?
Back in the day there was a company called RealNetworks. First they came out with a product called "RealAudio" and later "RealVideo," which were proprietary codecs (I think) for highly compressed audio and highly compressed video, plus the players and creation software for each.
With RealAudio, on a good day I could listen to stereo music at roughly the quality of FM radio over a 28.8 or 56k modem, which was still common at the time. With a 14.4k modem you might have to fall back to mono.
Video, again on a good day, could be not much worse than the standard-definition television of the time, but with a lower framerate. Sure, it might have artifacts here and there, but then our TVs had static too!
I'm pretty sure "kids these days," and adults who have become used to the present, would turn up their noses at such streams. Well, OK. But you really don't need theater-quality, 20-foot-tall video just to follow the plot of a story or to watch a newscaster. You really can do a lot with less if you aren't demanding that your image look accurate under a microscope!
I've listened to RealAudio and SHOUTcast (Winamp) via dial-up, too. And it wasn't via 56k or V.90/V.92, but the slightly older 33.6k (V.34) standard.
Even watched low-res films via SHOUTcast TV stations.
About YouTube: in the early days, YouTube used 240p videos in Flash Video format (FLV).
That’s when many of us had cheap 640×480 camcorders with SD card slots, at best.
Many used digital cameras that could record videos, too, but at 320×200.
And I’m not talking about 1995 here, but 2005! ;)
It's hard to believe, considering that DVD had existed for years by then and that Blu-ray/HD DVD appeared around that same time.
But if we take a look at the CCD-based webcams of the time, it makes sense. Up until the 2010s, HD-resolution webcams weren't a thing yet. Rather, 176×144 pixels (QCIF) or 352×288 pixels (CIF) were common resolutions in the early-to-mid 2000s, with 640×480 (full VGA) being quite good.
Long story short, before the 2010s, video quality on free video platforms was usually below TV standard. An analog PAL/NTSC video camera from the 1970s and a cheap video grabber would have been high-end for YouTube video production. Back when it still used 240p Flash Video, I mean, in the 2000s.
In the 90s, digital video that could be decoded without a decoder card was often 160×120. Take Microsoft Video-1, as used in Video for Windows (.AVI container), for example, or Apple's QuickTime format (.MOV). Only IBM's Ultimotion codec stood out here: it could do 320×240 on a bog-standard 386/486 PC running OS/2. (There was also a Windows 3.1x port of the player, but it wasn't as smooth.)
Hi, back then in the 90s the default desktop resolution was also 640×480 pixels. Macintosh users, by comparison, had previously used just 512×384 pixels. So 640×480 was a common base resolution at the time, one that most 14″, 15″, or 16″ CRT monitors and LCD panels (notebooks etc.) could handle. On such a screen, a video at about 320×240 (or 160×120) took up quite a bit of screen space even without scaling.
Of course, there were also other resolutions in wide use, such as 800×600 or 1024×768 pels. In fact, 800×600 in 16 colors (Super VGA) was quite popular for productivity software such as Excel or MS Word. Windows 3.1 (not WfW 3.11) shipped in 1992 with an 800×600 16c driver that worked out of the box with most VGA cards with 256 KB of RAM.
WfW 3.11 then included 256 color SVGA/VBE drivers for 640×480, 800×600 and 1024×768 resolutions.
1024×768 was common for DTP and photo editing, I think, be it in 256c or high color/true color.
Professional CRT monitors could do up to 1600×1200 pixels at the time.
But the average 386/486 user was already happy with 640×480 or 800×600 in 256 colors, which was about good enough to render GIF pictures on the World Wide Web. Lossy video codecs also tried not to exceed 256 colors, I think, before MPEG-1/Video CD became more widespread.
Video CD used 352×240 pixels (NTSC) or 352×288 pixels (PAL),
which looked quite big on a bog-standard PC monitor at 640×480 (VGA resolution).
Speaking under correction.
The default for Windows 3.11 was 800×600 with 16 colors. That was usually available even with older hardware.
640×480 was a bit too small for practical use and was mostly a fallback if nothing else worked. With Windows 95, not even all of the default UI elements and dialogs would fit on screen at that resolution.
Hi there! Yes, Windows 3.11 was able to run in 800×600 16c. The SVGA driver was included from Windows 3.1x onwards; in WfW 3.11, the 256c SVGA driver was added (with the old 16c driver still in place). But that's not even so important, since VGA cards had shipped with native Windows drivers since the late 80s, which generally worked much better and provided 800×600 16c and beyond, too.
The reason I mentioned the SVGA drivers in Windows 3.1x at all was that many PC owners had lost their driver diskettes, or got their VGA card in a plastic bag without anything else. This happened when they got a cheap VGA card from a friend or second-hand. The driver disks were always the first victims here, I think; they had been forgotten or couldn't be found anymore. In such a case, the SVGA drivers in Windows were a lifesaver, because even with no modem and no shareware CDs at hand (to get drivers), you could still run Windows in a meaningful way.
The two Super VGA drivers had built-in code to handle popular chips from Paradise, S3, ATI, Trident, OAK, Tseng, etc. In addition, they could set a VESA VBE video mode if the VGA card had a VESA BIOS in ROM, or if a VESA TSR was loaded in memory. That's why the 16c and 256c Super VGA drivers in Windows 3.1x kept functioning for years to come, even as the once-classic ISA graphics cards fell out of fashion.
About 640×480 being a bit too small: it depends, I think. At home in the mid-90s we had older "single-frequency" VGA monitors from the late 80s that couldn't go past 640×480 without manual adjustment of the trim pots on the back. Using interlacing, 1024×768 at a slow refresh rate (43 Hz?) might have been possible, though, like with the original IBM PS/2 monitors from 1987 or so. Not sure.
I suppose many of the older, entry-class 14″ or 15″ VGA monitors without an on-screen display, but with physical knobs, were like that. They were dumb monitors without any microcontroller.
(They still used VGA pin 12 for monochrome monitor detection: if it was grounded, the monitor was mono. The pin was later re-used for DDC, which in turn caused confusion with newer monitors on old VGA cards, because users saw no color. Making a short adapter cable with pin 12 cut often solved it, AFAIK.)
So in short, unless the monitors were so-called "multi-sync" monitors, they could do merely Standard VGA without a headache. (Multi-sync monitors were often the professional type that could also handle digital RGB and EGA timings, and had switches on the back.)
That's why some ISA VGA cards had DIP switches for monitor type on the back: you know, those little red or blue switches with their little white "piano keys" that had to be flipped with a screwdriver. You could configure them for IBM VGA, 8514/A, NEC, etc., and with the default setting, any Super VGA mode was sometimes disabled. This was all in the manual, which was often lost along with the driver/utility diskettes. 🥲
If you grew up on Windows 3.11/95 at a time when PCI graphics cards, Pentium PCs, and more recent 15″ or 17″ monitors with OSDs and DDC/EDID features were common, this was already more or less a thing of the past.
Then there were some Windows 3.1x games that used 640×480 in a "fake" fullscreen mode: games like Myst or The Daedalus Encounter, I think, or the shareware game "Fortress." If you had been running at 800×600, such a game would play in a small, centered box with a big dark border around it, unless it re-sized itself, which some games surely could, I admit.
Personally, I saw a few Windows 95 installations at 640×480 myself, but with 256 colors and up. The original Windows 95, I mean, without Active Desktop. It still had a very space-efficient GUI; Windows NT 4 was similar here.
Anyway, I don't mean to say you're wrong at all.
Laptops such as the Compaq Armada had 800×600 panels installed, while the older Aero and Contura series had 640×480 panels.
By the time Windows 98 was out, 800×600 was the new practical minimum resolution, while 1024×768 was already considered normal. Windows XP even tried to auto-switch to 800×600 (in 256c or up) during installation, as soon as it could.
As an example of an entry-class Windows 3.1 PC, the Amstrad Mega PC comes to mind.
Specs:
https://segaretro.org/Mega_PC
https://en.wikipedia.org/wiki/Amstrad_Mega_PC#Technical_specifications
Someone's video: https://www.youtube.com/watch?v=95TkFdvnQls
It combined a 386SX mainboard with a Sega Genesis/MD card. The little monitor could do normal Standard VGA at 31 kHz, as well as 15 kHz video for the Sega side. That's the type of PC generation I most often associate with Windows 3.1x. 😅
Of course, advanced users had been using 800×600 since the days of EGA already. 800×600 was a popular Super EGA resolution from even before VGA was released ('87). It required the universal, "multi-sync" kind of pro monitor at the time, since normal CGA/EGA digital RGB monitors couldn't handle the frequencies.
LAUNCH (LAUNCHcast) started streaming music videos in 2000. That is where I got a lot of my music videos early on. https://www.streamingmedia.com/Articles/News/Online-Video-News/LAUNCH.com-Integrates-Music-Videos-Into-LAUNCHcast-Music-Service-62287.aspx
Back in the day (and really even after 56k became obsolete), video players would buffer an entire video, given enough time. I recall sitting patiently many times, waiting for a video to buffer far enough to watch it uninterrupted. Now it seems you only get about 10-30 seconds of buffer before it stops; rather annoying when you're on crappy hotel wifi.