Digital Video From The Amiga’s DB23 Socket

Back in the days of 16-bit home computers, the one to have if your interests extended to graphics was the Commodore Amiga. It had high resolutions for the time in an impressive number of colours, and thanks to its unique video circuitry, it could produce genlocked broadcast-quality video. Here in 2023 though, it’s all a little analogue. What’s needed is digital video, and in that, [c0pperdragon] has our backs with the latest in a line of Amiga video hacks. This one takes the 12-bit parallel digital colour that would normally go to the Amiga’s DAC, and brings it out into the world through rarely-used pins on the 23-pin video connector.
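A converter sitting on that 12-bit bus would typically widen each 4-bit channel to the 8 bits per channel that HDMI carries. As a minimal sketch (the function names are illustrative, not from the project), the usual trick is bit replication, which maps 0x0 to 0x00 and 0xF to 0xFF exactly:

```python
def expand_4bit_channel(v4: int) -> int:
    """Expand a 4-bit colour channel (0-15) to 8 bits by bit replication."""
    assert 0 <= v4 <= 15
    return (v4 << 4) | v4  # 0x0 -> 0x00, 0x8 -> 0x88, 0xF -> 0xFF

def amiga_rgb12_to_rgb24(rgb12: int) -> tuple:
    """Split a 12-bit Amiga colour word (0xRGB nibbles) into 8-bit R, G, B."""
    r = expand_4bit_channel((rgb12 >> 8) & 0xF)
    g = expand_4bit_channel((rgb12 >> 4) & 0xF)
    b = expand_4bit_channel(rgb12 & 0xF)
    return (r, g, b)

print(amiga_rgb12_to_rgb24(0xF80))  # -> (255, 136, 0)
```

Bit replication is preferred over a plain left shift because it spreads the 16 input levels evenly across the full 0–255 output range.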

This follows on from a previous [c0pperdragon] project in which a Raspberry Pi Zero was used to transform the digital video into HDMI. This isn’t a hack for the faint-hearted though, as it involves extensive modification of your treasured Amiga board.

It is of course perfectly possible to generate HDMI from an Amiga by using an external converter box from the analogue video output, of the type which can be bought for a few dollars from online vendors. What this type of hack gives over the cheap approach is low latency, something highly prized by gamers. We’re not sure we’re ready to start hacking apart our Amigas, but we can see the appeal for some enthusiasts.

5 thoughts on “Digital Video From The Amiga’s DB23 Socket”

  1. Amiga always had digital video output on pins 6-9. In theory it should work with a CGA monitor; in reality I have never seen an Amiga connected to one, not even a clip on YT of someone trying for the lulz. Great idea to pull the rest of the digital signals out.

      1. 16 colors? Ah yes. One of the colors was ‘black’, another one was ‘black at half the brightness’, IIRC.

        Yes. Back in the day we would solder our own printer cables, MIDI adaptors, 5xBNC adaptors (yep, separate HSync and VSync BNCs so we could hook up the Amiga to a three-tube video projector), so why not digital RGB?

        It’s mind-blowing that someone actually did that. Even more so that this fellow mentioned it two hours (and four minutes) later in response to some Hackaday comment…

  2. Even a Pi might be overkill if low latency is desired. If I remember my trawl through TI’s datasheets correctly, there are HDMI transceiver chips that just take in 24 bits of colour in parallel along with sync and some sort of pixel clock, and generate an HDMI output. I guess a resolution mismatch (and possibly recreating a pixel clock) would make that a bit more difficult. Still, I think with the right hardware someone could probably get a retrocomputer to output to HDMI with sub-one-frame latency.

    The problem I see might be weird analog video tricks. Like can’t the Amiga switch graphics modes in the middle of a horizontal line? Or was that the ST?

    1. DVI (and HDMI) doesn’t officially support a pixel clock slower than 25 MHz, and most TVs are obnoxiously pedantic about what the incoming video signal is allowed to do (strict restrictions on permitted pixel clocks, h/v sync rates, and resolutions). Feeding through the RGBtoHDMI, with the delay that entails, seems to be mandatory for compatibility reasons.

      That said, RGBtoHDMI is already less than one vsync of latency.
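The 25 MHz floor mentioned above can be checked with quick arithmetic. Assuming the approximate PAL Amiga pixel clocks (~7.09 MHz lores, ~14.19 MHz hires; illustrative figures, not from the article), a line-doubling converter like RGBtoHDMI must multiply the output pixel rate until it clears the DVI minimum:

```python
# Check (approximate, assumed figures) which multiples of the Amiga's
# pixel clocks clear the 25 MHz minimum that DVI/HDMI sinks expect.
DVI_MIN_CLOCK_HZ = 25_000_000

amiga_clocks_hz = {
    "lores (~7.09 MHz)": 7_093_790,
    "hires (~14.19 MHz)": 14_187_580,
}

for name, clk in amiga_clocks_hz.items():
    # Doubling pixels horizontally and doubling scanlines each
    # multiply the effective output pixel clock.
    for factor in (1, 2, 4):
        mhz = clk * factor / 1e6
        status = "OK" if clk * factor >= DVI_MIN_CLOCK_HZ else "below minimum"
        print(f"{name} x{factor}: {mhz:.2f} MHz -> {status}")
```

By this back-of-the-envelope check, the native clocks fall short, which is why some form of pixel repetition or line doubling (and hence a small fixed delay) is unavoidable before an HDMI sink will accept the signal.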
