The BBC Microcomputer System (or BBC Micro) was an innovative machine back in the early 1980s. One feature that impressed reviewers was a “tube” interface that allowed the machine to become an I/O processor for an additional CPU. When the onboard 6502 became too slow, it could act as a slave to a Z80 or even an ARM processor. The bus was actually useful for any high-speed device, but its purpose was to add new processors, a feature Byte magazine called “innovative.”
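Roughly speaking, the tube is a handful of byte-wide FIFO channels with status flags on each side, and the host and the second processor just hand bytes back and forth through them. Here is a toy software model of one such channel, just to illustrate the handshake; the type name, flag names, and FIFO depth are illustrative assumptions for this sketch, not the real Tube ULA register layout.

// A toy model of one tube-style channel: the host only writes a byte when the
// "not full" flag is set, and the second processor only reads when "data
// available" is set. Names, depth, and layout are assumptions, not real hardware.
#include <cstddef>
#include <cstdint>
#include <deque>
#include <iostream>
#include <string>

struct TubeChannel {                          // hypothetical model, not the real ULA
    static constexpr std::size_t depth = 24;  // assumed FIFO depth for this sketch
    std::deque<std::uint8_t> fifo;

    bool data_available() const { return !fifo.empty(); }
    bool not_full() const       { return fifo.size() < depth; }

    void host_write(std::uint8_t b) { if (not_full()) fifo.push_back(b); }
    std::uint8_t parasite_read() {
        std::uint8_t b = fifo.front();
        fifo.pop_front();
        return b;
    }
};

int main() {
    TubeChannel r1;                           // one of several such channels
    for (char c : std::string("HELLO"))       // host side queues bytes...
        r1.host_write(static_cast<std::uint8_t>(c));
    while (r1.data_available())               // ...second processor drains them
        std::cout << static_cast<char>(r1.parasite_read());
    std::cout << '\n';
    return 0;
}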
[Hoglet67] has released a very interesting set of FPGA designs that allows a small board sporting a Xilinx Spartan 3 to add a 6502, a Z80, a 6809, or a PDP-11 to a BBC Micro via the tube interface. There’s something satisfying about a classic computer acting as an I/O slave to a fairly modern FPGA that implements an even older PDP-11.
There is a set of lengthy threads on this project, but the easiest place to start is probably this pathfinder thread. You may also want to read more about the tube interface. There’s also an interview with one of the BBC Micro’s designers in the video below.
In our recent roundup of classic computer emulators, we neglected the BBC Micro, but [PKM] provided a link. If you don’t have a BBC Micro lying around, maybe building a whole 6502 computer on an FPGA will do. Maybe you can even add your own version of the tube interface to it.
Thanks to [Ed S] for the tip!
Kudos to the people that can do things like this, but…
I lived through the era of slow, nearly impossible to use computers, and I never want to go back.
Yeah… they’re unbearably slow, nowadays, and a pain to use, relatively speaking. But you can learn a shitload from these old designs.
Something modern um…. hobbyists desperately need to learn about.
Props for the PDP-11. It’s probably the only one of the bunch that has somewhat functional GCC support, although 6809 GCC is also buildable.
Writing code on them should be a compulsory part of Computer Science. It teaches you to think about how the machine works, and not to throw memory around willy-nilly in languages like Java and C# and expect Moore’s law and the garbage collector to do all the work for you. I learnt to program on the machine that became the MK14 (NS Introkit) and am very grateful for it.
It is remarkable, for example, that given 30 years in which processors have become much faster, gained multiple cores, and offloaded most of the graphics work onto other processors, and backing storage has become massively faster and uses DMA and so on, things like Windows aren’t actually that much faster than they were.
People always bring up Windows this way, as if the only difference between something like GEM on the Atari ST and Windows 10 is colours. Aside from pretty decent multitasking, there is a full network stack for every process, a complex driver system to support thousands of third-party hardware vendors including hot-plugging, and abstractions for most everything else to allow things like networked filesystems or external authentication… and colours! I also think you are not remembering early-90s GUIs very well: I distinctly remember watching screen redraws during outline-drag window resizing, and waiting tens of seconds for JPEGs to render on screen. We also deal with much more data than 1985 computers did. This comment is over the memory limit of my first computer (a Sinclair ZX81), but the first hard drive I bought myself in 1995 was 420MB, which was plenty for a Linux 1.2 system with X, and now isn’t enough for an episode of Adventure Time.
I agree about making CS students learn some computer architecture though: build a Z80 SBC and know why it works, even if it’s done in VHDL and an FPGA instead of real components.
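Not VHDL, but in the spirit of “know why it works”, here is a toy fetch-decode-execute loop in C++ for a made-up 8-bit machine (not a real Z80; the opcodes are invented for this sketch). It is just the shape of the thing a student ends up understanding by building an SBC.

// Toy fetch-decode-execute loop for a made-up 8-bit machine (not a real Z80).
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <vector>

int main() {
    // Tiny program: LDA 5; ADD 7; OUT; HLT  (opcodes invented for this sketch)
    std::vector<std::uint8_t> mem = {0x01, 5, 0x02, 7, 0x03, 0xFF};
    std::uint8_t a = 0;        // accumulator
    std::size_t pc = 0;        // program counter

    for (;;) {
        std::uint8_t op = mem[pc++];                                        // fetch
        switch (op) {                                                       // decode and execute
            case 0x01: a = mem[pc++]; break;                                // LDA #imm
            case 0x02: a = static_cast<std::uint8_t>(a + mem[pc++]); break; // ADD #imm
            case 0x03: std::printf("A = %u\n", static_cast<unsigned>(a)); break; // OUT
            case 0xFF: return 0;                                            // HLT
            default:   std::fprintf(stderr, "bad opcode\n"); return 1;
        }
    }
}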
I don’t know what they teach now, but when I went through intro C++ we were forbidden from using most of the STL until after we got through data structures (well, we were allowed cin, cout, and strings). Our programs ran on “The Curator”, which could impose time limits, memory limits, I/O limits, monitor memory leaks, whatever the professors had decided to limit. So while a program might run just fine on our own systems, the use of something like Valgrind became essential.
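For anyone who hasn’t used it, this is roughly what that looks like: a deliberately leaky little C++ program and the stock Valgrind invocation that catches it (the program is just an illustration, nothing to do with the actual course assignments).

// leak.cpp: deliberately never frees its allocation, so a leak checker has
// something to report.
#include <iostream>

int main() {
    int* data = new int[1000];    // allocated with new[] but never delete[]'d
    data[0] = 42;
    std::cout << data[0] << '\n';
    return 0;                     // the pointer goes out of scope: memory leaked
}

Build with debug info and run it under Valgrind:

    g++ -g leak.cpp -o leak
    valgrind --leak-check=full ./leak

and it will report the lost block along with the allocation stack.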
The downside was that by the time we got to third-year courses like Operating Systems, the dragon book, or Intro AI, we’d had no exposure to the STL and would write our own linked lists, trees, bitmaps, whatever. Sure, we’d have our own libraries that had been tested by the previous two years of courses, but it’s just bad design to use a custom list just because no one had pointed us towards the STL. And I honestly can’t remember any point at which the STL was introduced; it should have been a class running in parallel with data structures (learn to write a bunch of structures and understand pointers and memory leaks) where we learned what everyone else had cooked up before we came along.
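For the curious, the contrast is roughly this (a minimal sketch, nothing from the actual coursework): the hand-rolled list every data-structures student writes, next to the standard container that was sitting in <list> the whole time.

// A hand-rolled singly linked list, the kind every data-structures course makes
// you write, next to the STL container that was there all along.
#include <iostream>
#include <list>

struct Node {
    int value;
    Node* next;
};

int main() {
    // Hand-rolled: build 1 -> 2 -> 3 by pushing onto the front, then clean up manually.
    Node* head = nullptr;
    for (int v = 3; v >= 1; --v)
        head = new Node{v, head};
    for (Node* n = head; n != nullptr; n = n->next)
        std::cout << n->value << ' ';
    std::cout << '\n';
    while (head != nullptr) {                  // forget this loop and Valgrind complains
        Node* next = head->next;
        delete head;
        head = next;
    }

    // The STL version: same result, no manual memory management.
    std::list<int> xs = {1, 2, 3};
    for (int v : xs)
        std::cout << v << ' ';
    std::cout << '\n';
    return 0;
}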
My experience went in the opposite direction. First-year students were required to learn Visual Basic. In the first semester of class, I remember the professor made it a point to tell us to disregard any constraints on RAM, CPU, or HDD.
It pissed me off that writing small, efficient code would get lower marks than bloated crap. Students were even given extra points for adding functional code that “expanded the user experience”. I finally settled on a small routine to animate the icon (using the dragon from Wonder Boy in Monster World) to ensure I got the extra marks.
The entire course was complete crap. XML was shoved on us, Java replaced C and C++, and Access was used for the SQL. After I left, I had to do a complete memory dump to get rid of all the nonsense that had been taught.
Slow?
Beep and it was on within a second.
Don’t know of any modern computer that will boot up within 15 seconds, never mind one second.
OK, it was single-tasking until the Archimedes, but speed was never a problem for me.
Also, ADFS (the Advanced Disc Filing System) on the Acorn Electron and BBC Master was a lot better than the DOS filing system with its 8-character file names and 3-character extensions instead of file types.
It also supported the DOS filing system if you wanted to save to or read from a DOS disc.
The worst thing was the colour (or COLOR if you bought a BBC Micro in the USA).
It only supported 8 colours plus 8 more flashing colours.
In the high-resolution MODEs this was reduced to just 2 colours (the foreground and background, which could each be any one of the 8 colours or flashing colours).
By high resolution, I mean 80 characters wide by 32 lines.
One of the best things is that the OS was in a ROM (Read-Only Memory) chip and not on a hard drive.
You can’t upgrade the operating system unless you change or add a ROM chip,
but neither you nor a computer virus can corrupt or edit the OS.
I don’t know why today’s computers don’t stick the OS on a chip.
“I don’t know why today’s computers don’t stick the OS on a chip.”
Do it yourself. Put a SSD in your computer and put the OS on it.
That does completely miss the point, you know that, right? ;P