[Ben Cox] found some interesting USB devices on eBay. The Epiphan VGA2USB LR accepts VGA video on one end and presents it as a USB webcam-like video signal on the other. Never have to haul a VGA monitor out again? Sounds good to us! The devices are old and abandoned hardware, but they do claim Linux support, so one BUY button mash later and [Ben] was waiting patiently for them in the mail.
But when they did arrive, the devices didn’t enumerate as a USB UVC video device as expected. The vendor has a custom driver, support for which ended in Linux 4.9 — meaning none of [Ben]’s machines would run it. By now [Ben] was curious about how all this worked and began digging, aiming to create a userspace driver for the device. He was successful, and with his usual detail [Ben] explains not only the process he followed to troubleshoot the problem but also how these devices (and his driver) work. Skip to the end of the project page for the summary, but the whole thing is worth a read.
The resulting driver is not optimized, but will do about 7 fps. [Ben] even rigged up a small web server inside the driver to present a simple interface for the video in a pinch. It can even record its output to a video file, which is awfully handy. The code is available on his GitHub repository, so give it a look and maybe head to eBay for a bit of bargain-hunting of your own.
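For a taste of what “digging” at an unknown USB device from userspace looks like, here is a minimal sketch using pyusb that finds a device and dumps its endpoints. The vendor and product IDs below are placeholders, not the real VGA2USB values, and the actual capture protocol is far more involved than this – see [Ben]’s repository for the real driver.

```python
# A minimal userspace "hello, device" sketch using pyusb (pip install pyusb).
# The vendor/product IDs below are placeholders, NOT the real VGA2USB values;
# check your own `lsusb` output for the actual IDs.
import usb.core

VENDOR_ID = 0x1234   # placeholder -- replace with the ID reported by lsusb
PRODUCT_ID = 0x5678  # placeholder

dev = usb.core.find(idVendor=VENDOR_ID, idProduct=PRODUCT_ID)
if dev is None:
    raise SystemExit("Device not found -- is it plugged in?")

# Detach any kernel driver so the interface can be claimed from userspace.
if dev.is_kernel_driver_active(0):
    dev.detach_kernel_driver(0)
dev.set_configuration()

# Dump the descriptors: the interfaces and endpoints hint at where bulk
# video data is likely to arrive once the device is configured.
for cfg in dev:
    for intf in cfg:
        for ep in intf:
            print(f"interface {intf.bInterfaceNumber} "
                  f"endpoint 0x{ep.bEndpointAddress:02x}")
```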
Actually, I just bought a component video to USB capture stick for about $18 a little while back. I can’t guarantee it would work with VGA – there could be issues with getting the sync right – but VGA is little different from component video, with the biggest difference being the connector.
Little different, except for the fact that colour data is sent for each individual channel (as R, G and B intensity) rather than a colour burst, and that it’s a progressive rather than interlaced signal at twice the line rate?
Scratch the first point – on second reading I see you’re talking about component rather than composite. The second still stands, though. Chances are you don’t have the bandwidth.
I believe component video can also be progressive, though that scan mode might not be supported on many component devices. The real issue is that component video is in the YPbPr color encoding system – one wire for luminance (total brightness), one for the difference between blue and luma, and one for the difference between red and luma – whereas VGA uses plain RGB. The signal will have to be converted to RGB. There are cheap hardware converters that can do the job. It is also possible that it could be done in software, but I don’t know, not being a video expert.
Sixth generation consoles (dreamcast, xbox, ps2, gs with stupid expensive dongle) all supported progressive component 480p.
RGB to YCbCr conversion in software is one matrix multiplication. Your computer does it millions of times every time a JPEG pops up on screen :)
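For the curious, a rough sketch of what that one-matrix-multiply conversion looks like going the other way (YPbPr samples to RGB). The coefficients are the common BT.601 ones; an HD source would use the BT.709 matrix instead, and the scaling assumes normalized analog-style values.

```python
# Converting YPbPr samples to RGB really is one matrix multiply per pixel.
# BT.601 coefficients assumed; Y in 0..1, Pb/Pr in -0.5..0.5.
import numpy as np

# Rows: R, G, B. Columns: Y, Pb, Pr.
BT601 = np.array([
    [1.0,  0.0,       1.402],
    [1.0, -0.344136, -0.714136],
    [1.0,  1.772,     0.0],
])

def ypbpr_to_rgb(frame):
    """frame: H x W x 3 array of (Y, Pb, Pr); returns H x W x 3 RGB in 0..1."""
    return np.clip(frame @ BT601.T, 0.0, 1.0)
```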
If you search forums old enough, you will find passive VGA to composite circuits, which kinda work. If I remember correctly, it was possible to get an image, BUT you needed to properly set the resolution and timing on the PC. It was rather tricky. Never actually tried it. However, there are VGA to composite ICs and ready-made active devices which can be used for this…
I had the ones over on Tomi Engdahl’s ePanorama pages working, with the DOS driver that fudged your sync rates. You’d need an older VGA card though, Cirrus or Trident IIRC, and from around ’05 upward a lot of motherboards won’t recognise a video card in their regular PCI slots even if they have them. However, if you step up to GeForce 2/3/4-era AGP cards or ATI’s Xpert@Work Rage card, many of those had TV out with 9x/XP drivers; then TV out went out of fashion again until it reappeared as HDMI.
Coincidentally, I posted about my experience with that solution just one day ago on Hacker News :o
My first 386 PC used a TV connected over RGB (SCART socket) to the VGA card, with the VGATV.COM TSR loaded to reprogram the CRTC for 15 kHz.
Today you can use CRT Emudriver or Soft-15kHz to do this on modern hardware.
You can’t do passive composite from RGB at all. You can’t do passive component from RGB either; you need resistor dividers and op-amps.
Well, I don’t know about that, but it does occur to me now that in component video the three signals are Y, U, and V, not R, G, and B, so I guess that’s not at all the same as VGA.
We used to supply these to clients back in the day. Rock-solid hardware – we never had any returns on Epiphan devices. We had many of them running 24/7, 365, for years at a time.
One thing I would love to have is a device that saves a screenshot of a VGA feed to a microSD card (or similar memory) as a JPG or PNG at the touch of a button, and that can then be plugged into a USB port and enumerate as a USB memory stick to retrieve the images.
It would be perfect for grabbing a screenshot of errors and the like shown on screen when maintaining the couple of thousand servers I look after, rather than plugging in a monitor and trying to take a not-too-blurry photo with my phone.
I sometimes wonder how hard it would be to make something myself, but software is all black magic wizardry to me….
Then you can do GUI frame relay on top of IPoAC… and to close the loop you can relay the mouse coordinates back via semaphore tower.
Could you not set something up where software on the PC takes the screenshot and saves it to a location via a script, or via an event driven from a Raspberry Pi or some such?
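If the VGA feed is already coming in through a capture dongle (or a driver like the one in the article exposing it as a video device), the software side really is only a few lines. A rough sketch, assuming a V4L2/UVC capture device at index 0 and OpenCV installed:

```python
# Grab one frame from a V4L2/UVC capture device and save it as a timestamped
# PNG -- e.g. bound to a button on a Raspberry Pi sitting in front of the server.
import time
import cv2

def snapshot(device_index=0, out_dir="."):
    cap = cv2.VideoCapture(device_index)
    if not cap.isOpened():
        raise RuntimeError("Could not open capture device")
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("Failed to grab a frame")
    path = f"{out_dir}/screenshot-{time.strftime('%Y%m%d-%H%M%S')}.png"
    cv2.imwrite(path, frame)
    return path

if __name__ == "__main__":
    print("Saved", snapshot())
```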
We linked a load of these to Pi 1s back in the day inside video slot machines so the cashbox staff could see credits/wins without leaving the cashbox. We only really got 1 fps out of them and dropped the frame onto a web page on the Pi, but that was more than enough. Since then we’ve found other ways of reading the machines, so I still have a handful lying about somewhere – must dig them out.
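That “drop the frame onto a web page” step can be almost as small as the capture step. A sketch using only Python’s standard library, assuming something else (for instance the snapshot routine above) keeps a latest.jpg up to date – file names and port are illustrative:

```python
# Serve the most recently captured frame on a bare-bones web page,
# roughly the 1 fps "frame on a web page" setup described above.
from http.server import BaseHTTPRequestHandler, HTTPServer

FRAME_PATH = "latest.jpg"  # assumed to be refreshed by a separate capture loop

class FrameHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/frame.jpg":
            with open(FRAME_PATH, "rb") as f:
                data = f.read()
            self.send_response(200)
            self.send_header("Content-Type", "image/jpeg")
            self.end_headers()
            self.wfile.write(data)
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            # Refresh once a second -- matches the ~1 fps the commenter describes.
            self.wfile.write(b'<meta http-equiv="refresh" content="1">'
                             b'<img src="/frame.jpg">')

HTTPServer(("", 8080), FrameHandler).serve_forever()
```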
I wonder if it could be reprogrammed to work as a scope or SDR. $25 for a 3-channel scope with 80 MHz or so of bandwidth would be insanely cheap; likewise, $25 for an SDR with a lot more bandwidth than an RTL-SDR would also be really nice.
Possibly – it looks like it has a Spartan-6 FPGA, some memory, and a 3 × 8-bit @ 170 MHz ADC.
Writing the software (AFAIK to this day not a single usable open-source digital scope frontend exists) would also be insanely expensive (time-consuming). The moment you release the software, all the $25 units would vanish from the secondary market, making the whole exercise pointless. Not to mention you are still lacking an analog front end.
For SDR, this wouldn’t perform any better than the $10 dongles, since the limitation is 8 bits of analog-to-digital conversion. This kind of puts a limit on dynamic range.
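For a sense of scale, the ideal quantization SNR of an N-bit converter is roughly 6.02·N + 1.76 dB, so 8 bits tops out around 50 dB before any real-world noise is even considered. A quick check:

```python
# Ideal quantization SNR for an N-bit ADC: 6.02*N + 1.76 dB.
def ideal_snr_db(bits):
    return 6.02 * bits + 1.76

for bits in (8, 10, 12):
    print(f"{bits}-bit: ~{ideal_snr_db(bits):.1f} dB")
# 8-bit: ~49.9 dB, 10-bit: ~62.0 dB, 12-bit: ~74.0 dB
```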
https://hackaday.com/2019/05/30/glscopeclient-a-permissively-licensed-remote-oscilloscope-utility/
They still seem to be updating drivers for these: https://ssl.epiphan.com/downloads/linux/ – it lists drivers for both Debian with a 4.19 kernel and Ubuntu with a 5.3 kernel.
So it runs fine with an older Debian version inside a VM?
Could the device be used as a general purpose 3-channel ADC?
It’s limited to the ~35–40 MB/s of the Cypress FX2LP USB interface chip.
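Some quick arithmetic on how that compares with the ADC mentioned a few comments up – a sketch using the figures quoted in this thread, not verified specs:

```python
# How far the FX2LP's USB throughput falls short of the ADC's raw output.
# ADC figures are taken from the comment above (3 x 8-bit @ 170 MHz);
# treat them as reported, not verified.
adc_bytes_per_s = 3 * 170e6 * 1   # 3 channels, 1 byte per sample = 510 MB/s
usb_bytes_per_s = 40e6            # optimistic end of the 35-40 MB/s range

print(f"ADC produces ~{adc_bytes_per_s / 1e6:.0f} MB/s, "
      f"USB moves ~{usb_bytes_per_s / 1e6:.0f} MB/s")
print(f"-> only ~1/{adc_bytes_per_s / usb_bytes_per_s:.0f} of the samples "
      f"could be streamed continuously")
```

So continuous full-rate streaming is out; a scope or SDR reuse would need on-board buffering, decimation, or processing in the FPGA before the USB bottleneck.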
I am surprised he didn’t just use an older version of Linux that worked with the driver…
They are currently not what I would call cheap on the world’s favourite auction site: £65 plus postage, and some are selling for more than we paid at trade prices 10 years ago.