3D Printer Streaming Solution Unlocks Webcam Features

While 3D printer hardware has come a long way in the past decade and a half, the real development has been in the software. Open source slicers are constantly improving, and OctoPrint can turn even the most basic of printers into a network-connected powerhouse. But despite all these improvements, there are still certain combinations of hardware that require a bit of manual work.

[Reticulated] wanted an easy way to monitor his prints over streaming video, but didn’t have any of the cameras supported by OctoPrint. Of course, he could just point a cheap network-connected camera at the printer and be done with it, but he was looking for a bit tighter integration than that. In the process, he demonstrates how to unlock some features hidden in inexpensive webcams.

He set about building something that wouldn’t require buying more equipment or overloading the limited hardware responsible for the actual printing. A few of his existing cameras have RTMP support, which allows for a fairly straightforward setup with YouTube Live: MonaServer handles the RTMP feeds from the cameras, and OBS Studio streams the combined result out to YouTube. Using the OctoPrint API, he was able to pull data such as the current extruder temperature and overlay it on the video.
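To give a feel for the overlay side, here is a minimal sketch of pulling the hotend temperature from OctoPrint's REST API in Python and dumping it to a text file that a tool like OBS can display as a text source; the host, API key, and file name are placeholders, and this is just one common way to feed an overlay, not necessarily the exact route [Reticulated] took.

```python
import time

import requests

OCTOPRINT_URL = "http://octopi.local"   # placeholder host
API_KEY = "YOUR_OCTOPRINT_API_KEY"      # placeholder API key

def current_hotend_temp():
    """Ask OctoPrint's /api/printer endpoint for the current tool temperatures."""
    resp = requests.get(
        f"{OCTOPRINT_URL}/api/printer",
        headers={"X-Api-Key": API_KEY},
        timeout=5,
    )
    resp.raise_for_status()
    tool = resp.json()["temperature"]["tool0"]
    return tool["actual"], tool["target"]

while True:
    actual, target = current_hotend_temp()
    # Write to a text file that an OBS text source (or similar) can overlay on the stream
    with open("overlay.txt", "w") as f:
        f.write(f"Hotend: {actual:.1f} °C / {target:.0f} °C")
    time.sleep(5)
```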

One of the other interesting parts of this build is that not all of [Reticulated]’s cameras have built-in RTMP support, but by following this guide he was able to coax the feature out of several that don’t offer it by default. Even beyond 3D printing, this is an excellent guide (and tip) for getting a quick live stream going for whatever reason. For anything more mobile than a working 3D printer, though, you might want to look at a portable streaming setup instead.

MovieCart Plays Videos On The Atari 2600

The original Xbox and PlayStation 2 both let you watch DVD movies in addition to playing games. Few consoles before or since offered much in the way of media playback, least of all the Atari 2600, which was too weedy to even imagine such feats. And yet, as covered by TechEBlog, [Lodef Mode] built a cartridge that lets it play video.

It’s pretty poor quality video, but it is video! The MovieCart, as it is known, is able to play footage at 80×192 resolution, with a color palette limited by the capabilities of the Atari 2600 hardware. It’s not some sneaky video pass-through, either—the Atari really is processing the frames.

To play a video using the MovieCart, you first have to prepare it with a special utility that converts the footage into the right format for the cart. The generated video file is then loaded onto a microSD card, which slots into the MovieCart itself. All that’s left is to put the MovieCart into the Atari’s cartridge slot and boot it up. Sound is present too, in a pleasingly lo-fi quality, and picture brightness and sound volume are controlled with the joystick. You could genuinely watch a movie this way if you really wanted to. I’d put on House of Gucci.
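The cart’s actual file format is handled by the project’s own conversion utility, but as a rough, purely illustrative sketch of the first step in that kind of pipeline, something like the following drives ffmpeg from Python to knock footage down to the cart’s 80×192 frame size; the file names here are placeholders, and the real tool goes on to handle the Atari palette and cartridge packing.

```python
import subprocess
from pathlib import Path

SOURCE = "input.mp4"         # placeholder source video
FRAMES_DIR = Path("frames")  # placeholder output directory
FRAMES_DIR.mkdir(exist_ok=True)

# Downscale to the cart's 80x192 playback resolution and dump individual frames;
# the real conversion utility also reduces the palette and packs its own format.
subprocess.run(
    [
        "ffmpeg",
        "-i", SOURCE,
        "-vf", "scale=80:192",
        str(FRAMES_DIR / "frame_%06d.png"),
    ],
    check=True,
)
```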

Thanks to the prodigious storage available on microSD cards, you can actually play a whole feature-length movie on the hardware this way. You can order a MovieCart of your very own from Tindie, and it even comes with a public domain copy of Night of the Living Dead preloaded on a microSD card.

We don’t see a big market for Atari 2600 movies, but it’s neat to see it done. Somehow it reminds us of the hacked HitClips carts from a while ago. Video after the break.


Unlimited Cloud Storage YouTube Style

[Adam Conway] wanted to store files in the cloud. However, if you haven’t noticed, unlimited free storage is hard to find. We aren’t sure if he wants to use the tool he built seriously, but he decided that if he could encode data in a video format, he could store his files on YouTube. Does it work? It does, and you can find the code on GitHub.

Of course, the efficiency isn’t very good. A 7 kB image, for example, yielded a 9 MB video. If we were going to store files on YouTube, we’d encrypt them, too, which would make the efficiency even worse.

The first attempt was to break the file into pieces and encode them as QR codes. That makes sense, but it didn’t work out: to fit enough data into each frame, the modules (think pixels) of the QR code had to be small, and combined with video compression, the system was unreliable.

Simplicity rules. Each frame is 1920×1080 and uses a black pixel as a one and a white pixel as a zero. In theory, that gives about 259 kB per frame. However, to avoid decoding problems caused by video compression, each real bit occupies a 5×5 block of pixels, which works out to about 10 kB of data per frame.
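As a rough sketch of that scheme rather than [Adam Conway]’s actual code, packing a chunk of a file into one 1920×1080 black-and-white frame might look something like this; numpy and Pillow are assumed purely for convenience, and the file and frame names are placeholders.

```python
import numpy as np
from PIL import Image

WIDTH, HEIGHT, BLOCK = 1920, 1080, 5
COLS, ROWS = WIDTH // BLOCK, HEIGHT // BLOCK   # 384 x 216 blocks
BITS_PER_FRAME = COLS * ROWS                   # 82,944 bits, roughly 10 kB

def encode_frame(data: bytes) -> Image.Image:
    """Pack up to ~10 kB of data into one 1920x1080 black-and-white frame."""
    bits = np.unpackbits(np.frombuffer(data, dtype=np.uint8))
    bits = np.pad(bits, (0, BITS_PER_FRAME - len(bits)))   # zero-pad the last frame
    grid = bits.reshape(ROWS, COLS)
    # A one becomes a black block (0), a zero becomes a white block (255)
    pixels = np.where(grid == 1, 0, 255).astype(np.uint8)
    pixels = np.kron(pixels, np.ones((BLOCK, BLOCK), dtype=np.uint8))
    return Image.fromarray(pixels, mode="L")

# Usage: turn one chunk of a file into one video frame
with open("somefile.bin", "rb") as f:
    chunk = f.read(BITS_PER_FRAME // 8)
encode_frame(chunk).save("frame_000000.png")
```

Zero-padding the last frame, incidentally, is exactly the sort of thing that leaves extra bytes tacked onto the end of a decoded file, as noted below.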

The code isn’t perfect. It can tack extra data onto the end of a file, for example, but that would be easy to fix. The protocol could use error correction and compression. You might even build encryption into it or store more data — old-school cassette-style — using the audio channel. Still, as a proof of concept, it is pretty neat.

This might sound like a new idea, but people way back in the early home computer days could back up data to VCRs. This isn’t even the first time we’ve seen it done with YouTube.

Retrotechtacular: Studio Camera Operation, The BBC Way

If you ever thought that being a television camera operator was a simple job, this BBC training film on studio camera operations will quickly disabuse you of that notion.

The first thing that strikes you upon watching this 1982 gem is just how physical a job it is to stand behind a studio camera. Part of the physicality came from the sheer size of the gear being used. Not only were cameras of that vintage still largely tube-based and therefore huge — the EMI-2001 shown had four Plumbicon image tubes along with tube amplifiers and weighed in at over 100 kg — but the pedestal upon which it sat was a beast as well. All told, a camera rig like that could come in at over 300 kg, and dragging something like that around a studio floor all day under hot lights had to be hard. It was a full-body workout, too; one needed a lot of upper-body strength to move the camera up and down against the hydropneumatic pedestal cylinder, and every day was leg day when you had to overcome all that inertia and get the camera moving to your next mark.

Operating a beast like this was not just about the bull work, though. There was a lot of fine motor control needed too, especially with focus pulling. The video goes into a lot of detail on maintaining a smooth focus while zooming or dollying, and shows just how bad it can look when the operator is inexperienced or not paying attention. Luckily, our hero Allan is killing it, and the results will look familiar to anyone who’s ever watched BBC output from the era, from Doctor Who to I, Claudius. Shows like these all had a distinctive “Beeb-ish” look to them, due in large part to the training their camera operators received with productions like this.

There’s a lot on offer here aside from the mechanical skills of camera operation, of course. Framing and composing shots are emphasized, as are the tricks to making it all look smooth and professional. There are a lot of technical details buried in the video too, particularly about the pedestal and how it works. There are also two follow-up training videos, one that focuses on the camera skills needed to shoot an interview program, and one that adds in the complications that arise when the on-air talent is actually moving. Watch all three and you’ll be well on your way to running a camera for the BBC — at least in 1982.


Designing A Macintosh-to-VGA Adapter With An LM1881

Old-school Macintosh-to-VGA adapter. Just solve for X, set the right DIP switches and you’re golden.

If you’re the happy owner of a vintage Apple system like a 1989 Macintosh IIci, you may know the pain of keeping working monitors around. Unless it’s a genuine Apple-approved CRT with the proprietary DA-15-based video connector, you are going to need at least an adapter studded with DIP switches to connect it to other monitors. Yet as [Steve] recently found out, the Macintosh’s rather selective use of video synchronization signals causes quite a headache when you try to hook up a range of VGA-equipped LCD monitors. A possible solution? Extracting the sync signal using a Texas Instruments LM1881 video sync separator chip.

Much of this trouble comes from the way that these old Apple systems output the analog video signal, which goes far beyond the physical differences of the DA-15 versus the standard DE-15 D-subminiature connector. Whereas the VGA standard defines the RGB signals along with VSYNC and HSYNC signals, the Apple version can generate HSYNC and VSYNC, but also CSYNC (composite sync). Which sync signals are generated depends on the value the system reads on the three sense pins of the DA-15 connector, which serve as a kind of crude monitor ID.

Theoretically this should be easy to adapt to, you might think, but the curveball Apple throws here is that the monitor ID that outputs both VSYNC and HSYNC is limited to a fixed resolution of 640×870, not the desired 640×480. The obvious solution is then to target the one monitor configuration that does give 640×480, and extract the CSYNC (and sync-on-green) signal it outputs so that it can be fudged into a more VGA-like set of sync signals. Incidentally, it seems that [Steve]’s older Dell 2001FP LCD monitor does support sync-on-green and CSYNC, whereas newer LCD monitors no longer list this as a feature, which is why a passive adapter is no longer enough.

Although it’s still a work in progress, so far [Steve] has managed to get an image on a number of these newer LCDs by using the LM1881 to extract CSYNC and derive a VSYNC signal from it, while using the CSYNC itself as a sloppy HSYNC alternative. Other ICs can also generate a proper HSYNC signal from CSYNC, but those cost a bit more than the roughly US$3 LM1881.

Proper Video, From An ESP32

Back in the day, a miniature television, probably on a wristwatch, was the stuff of science fiction. Now it’s something that can be done with a commodity microcontroller, as [Atomic14] shows us with the ESP32-TV, which plays both video and sound. Even with modern silicon it’s still somewhat pushing the envelope.

As he explains in the video below the break, not all formats are simple enough to be decoded on the fly by a microcontroller. But he finds AVI files to be within its capabilities, and those can be created with a bit of ffmpeg wizardry. The board is a fairly standard ESP32 device with an I2C bus, and the video stream isn’t too fast for this meager interface. You’ll maybe recognize the Muppets clip, but it’s possible that the early-80s BBC comedy staple The Young Ones might have passed you by if you’re not British.
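He doesn’t spell out the exact invocation here, but as a hedged example of the sort of ffmpeg wizardry involved, something along these lines (driven from Python for consistency with the other sketches) produces a small MJPEG-in-AVI file with simple mono PCM audio; the resolution, frame rate, quality, and sample rate are guesses sized for a small microcontroller-driven display rather than figures from the project.

```python
import subprocess

SOURCE = "input.mp4"   # placeholder source video
OUTPUT = "movie.avi"   # placeholder output file

subprocess.run(
    [
        "ffmpeg",
        "-i", SOURCE,
        "-vf", "scale=280:240,fps=15",  # small frame size and modest frame rate (assumed)
        "-c:v", "mjpeg",                # MJPEG frames are simple to decode one at a time
        "-q:v", "8",                    # mid-range JPEG quality
        "-c:a", "pcm_u8",               # plain 8-bit PCM audio
        "-ar", "16000",                 # low sample rate (assumed)
        "-ac", "1",                     # mono
        OUTPUT,
    ],
    check=True,
)
```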

We think this code is likely to be of use in quite a few projects, and it would be great to see it further refined. Small video players for not a lot of money can never be a bad thing.

Previous ESP32 video projects which have appeared on these pages have been more likely to involve driving a display directly.


Streaming Video From An ESP32

The ESP32, while first thought to be little more than a way of adding wireless capabilities to other microcontrollers, has quickly replaced many of them with its ability to be programmed as its own platform rather than simply an accessory. This also paved the way for accessories of its own, such as various sensors and even a camera. This guide goes over taking the input from the camera and streaming it out over the network to multiple browsers.

On the server side of things, the ESP32 and its attached camera are set up with MQTT, a lightweight communications protocol that uses a publish/subscribe model to send information. The ESP32 is configured only to publish its images, not to subscribe to any other topics. On the client side, the browser runs a JavaScript program that gathers these images and stitches them together into a video.
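The project’s client is a browser running JavaScript, but to make the publish/subscribe flow concrete, here is a rough Python equivalent of a subscriber using paho-mqtt; the broker address and topic name are placeholders rather than whatever the project actually uses, and each message is assumed to carry one complete JPEG frame.

```python
import paho.mqtt.client as mqtt

BROKER = "broker.local"      # placeholder MQTT broker address
TOPIC = "esp32cam/image"     # placeholder topic the camera publishes to

frame_count = 0

def on_connect(client, userdata, flags, rc):
    # paho-mqtt 1.x style callbacks assumed; 2.x also wants a CallbackAPIVersion argument
    client.subscribe(TOPIC)

def on_message(client, userdata, msg):
    # Each payload is assumed to be one complete JPEG frame from the camera
    global frame_count
    with open(f"frame_{frame_count:06d}.jpg", "wb") as f:
        f.write(msg.payload)
    frame_count += 1

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883)
client.loop_forever()
```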

This can be quite a bit of data to send out over the ESP32’s compact hardware, so there are some tips and tricks for getting more out of these little devices, including using an external antenna for a better Wi-Fi signal, or skipping wireless entirely in favor of Ethernet. As far as getting a lot out of a tiny microcontroller goes, though, leveraging MQTT really helps the ESP32 punch above its weight. These chips have come a long way since they were first introduced; they’re powerful enough to act as 8-bit gaming consoles too.

Thanks to [Surfskidude] for the tip!