Scanimate Analog Video Synths Produced Oceans Of Motion Graphics

Why doesn’t this kind of stuff ever happen to us? One lucky day back in high school, [Dave Sieg] stumbled upon a room full of new equipment and a guy standing there scratching his head. [Dave]’s curiosity about this fledgling television studio was rewarded when that guy asked [Dave] if he wanted to help set it up. From that point on, [Dave] had the video bug. The rest is analog television history.

Today, [Dave] is the proud owner and maintainer of two Scanimate machines — the first R&D prototype, and the last of only eight ever produced. The Scanimate is essentially an analog synthesizer for video signals, and these machines made it possible to move words and pictures around a screen far more easily than ever before. Almost any animated logo or graphic seen on TV from the mid-1970s to the mid-80s was likely made with one of these huge machines, and we would jump quite high at the chance to fiddle with one.

Analog television signals are continuously variable, and much as with an analog music synthesizer, any changes imposed on the signal are immediately visible. In the first video below, [Dave] introduces the Scanimate and plays around with the Viceland logo a bit.

Stick around for the second and third videos, where he superimposes the Scanimate’s output onto the video he’s making, all the while twiddling knobs to add oscillators and thoroughly explaining what’s going on. If you’ve ever played around with Lissajous patterns on an oscilloscope, you’ll really have a feel for what’s happening here. In the fourth video, [Dave] dives deeper and dissects the analog circuits that make up this fantastic piece of equipment.
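
If you’ve never played with Lissajous patterns, a few lines of Python will give you the flavor without hauling out a scope. This is just a sketch of the underlying math — two sine waves driving the X and Y axes — not anything from the Scanimate itself:

```python
import numpy as np
import matplotlib.pyplot as plt

# A Lissajous figure: two sine waves at different frequencies driving
# the X and Y axes, like feeding two oscillators into a scope's
# horizontal and vertical inputs.
t = np.linspace(0, 2 * np.pi, 2000)
fx, fy = 3, 4         # the frequency ratio determines the pattern
phase = np.pi / 2     # a phase offset skews and rotates the figure

x = np.sin(fx * t + phase)
y = np.sin(fy * t)

plt.plot(x, y, linewidth=0.8)
plt.gca().set_aspect("equal")
plt.title("Lissajous pattern, 3:4 frequency ratio")
plt.show()
```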

Here’s another way to play with scan lines: delay the output to some of them and you have a simple scrambler.

Continue reading “Scanimate Analog Video Synths Produced Oceans Of Motion Graphics”

Real Time Object Detection For $59

There was a time when building a machine that could identify objects in a camera feed was difficult, even without trying to do it in real time. But now, you can do it with a Jetson Nano board for under $60. How well does it work? Watch [Murtaza’s] video below and see what you think.

The first few minutes of the video piqued our interest, and a good thing, too, because those 50 lines of code are stretched across a 50-plus minute video! It is worth watching, though, because there’s a lot of good information about how to apply this technique in your own projects.
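
We won’t reproduce [Murtaza’s] exact code here, but the general shape of real-time detection with OpenCV’s DNN module looks something like the sketch below. The MobileNet-SSD file names are placeholders you’d swap for your own pre-trained model:

```python
import cv2

# Placeholder model files -- substitute your own pre-trained network.
net = cv2.dnn.readNetFromCaffe("MobileNetSSD_deploy.prototxt",
                               "MobileNetSSD_deploy.caffemodel")

cap = cv2.VideoCapture(0)  # default camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]
    # Resize to the network's 300x300 input and normalize.
    blob = cv2.dnn.blobFromImage(cv2.resize(frame, (300, 300)),
                                 0.007843, (300, 300), 127.5)
    net.setInput(blob)
    detections = net.forward()
    for i in range(detections.shape[2]):
        confidence = detections[0, 0, i, 2]
        if confidence > 0.5:
            # Box coordinates come back normalized to [0, 1].
            box = detections[0, 0, i, 3:7] * [w, h, w, h]
            x1, y1, x2, y2 = (int(v) for v in box)
            cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
    cv2.imshow("detections", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```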

Continue reading “Real Time Object Detection For $59”

[Image: a sample of automatically generated comics]

Read Your Movies As Automatically Generated Comic Books

A research paper from Dalian University of Technology in China and City University of Hong Kong (direct PDF link) outlines a system that automatically generates comic books from videos. But how can an algorithm boil a video down to still images that appropriately reflect the gravity of each scene? This impressive feat is accomplished by saving two still frames per second, then segmenting those frames into scenes through region-of-interest analysis and importance ranking.
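
The paper doesn’t ship code, but that first stage, grabbing two frames per second, is easy to sketch with OpenCV. This is our own illustration, not the authors’ implementation:

```python
import cv2

def sample_frames(path, per_second=2):
    """Grab `per_second` evenly spaced frames from each second of video."""
    cap = cv2.VideoCapture(path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30  # fall back if FPS is unknown
    step = max(1, int(round(fps / per_second)))
    frames, index = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % step == 0:
            frames.append(frame)
        index += 1
    cap.release()
    return frames
```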

[Image: movie-to-comic-book pipeline diagram]

For its next trick, speech for each scene is processed by combining subtitle information with the audio track of the video. The audio is analyzed for emotion to determine the appropriate speech bubble type and size of the subtitle text. Frames are even analyzed to establish which person is speaking for proper placement of the bubbles. It can then create layouts of the keyframes, determining panel sizes for each page based on the region-of-interest analysis.
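
As a toy illustration of that last idea, imagine mapping an emotion label and a loudness score to a bubble shape and a font size. The function below is purely hypothetical and far simpler than the paper’s actual model:

```python
# Hypothetical mapping from emotion and loudness to bubble style and
# text size -- a stand-in for the paper's much richer audio analysis.
BUBBLE_STYLES = {
    "angry":   "jagged",
    "excited": "burst",
    "sad":     "wavy",
    "neutral": "oval",
}

def choose_bubble(emotion: str, loudness: float):
    """Return (bubble_shape, font_scale) for a line of dialogue."""
    shape = BUBBLE_STYLES.get(emotion, "oval")
    # Louder speech gets bigger text, clamped to a sane range.
    font_scale = min(2.0, max(0.8, 0.8 + loudness))
    return shape, font_scale
```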

The process is completed by stylizing the keyframes with flat color through quantization, for that classic cel shading look, and then populating the layouts with each frame and word balloon.
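
That flat-color effect is essentially color quantization. Here’s a minimal sketch using OpenCV’s k-means — again our own illustration with a placeholder filename, not the paper’s code:

```python
import cv2
import numpy as np

def quantize(image, k=8):
    """Flatten an image's palette to k colors for a cel-shaded look."""
    pixels = image.reshape(-1, 3).astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 10, 1.0)
    _, labels, centers = cv2.kmeans(pixels, k, None, criteria,
                                    3, cv2.KMEANS_PP_CENTERS)
    # Replace each pixel with the center of its cluster.
    flat = centers[labels.flatten()].astype(np.uint8)
    return flat.reshape(image.shape)

frame = cv2.imread("keyframe.png")  # placeholder: any extracted keyframe
cv2.imwrite("keyframe_comic.png", quantize(frame))
```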

The team conducted a study with 40 users, pitting their results against previous techniques that require more human intervention, and besting them in every measure. Like any great superhero, the team still sees room for improvement: in the future, they would like to improve the accuracy of keyframe selection, and they propose using a neural network to do so.

Thanks to [Qes] for the tip!

Reverse Engineering The Weather Channel’s Magic

For American readers of a certain age, Local on the 8s likely holds a special spot in your heart. The program, once a staple of The Weather Channel, provided viewers with a textual, and eventually graphical, depiction of their local forecast set to some of the greatest smooth jazz ever heard outside of an elevator. In the days before smartphones, or even regular Internet access for that matter, these broadcasts were a critical part of planning your day from the 1980s through the early 2000s.

Until recently, the technical details behind these iconic weather reports were largely unknown, but thanks to the Herculean efforts of [techknight], the fascinating engineering that went into the WeatherSTAR 4000 — the machine that pumped out current conditions and Shakin’ The Shack from CATV distribution centers all over the US for decades — is now being documented and preserved. The process of reversing the hardware and software has actually been going on for the last couple of years, but all those juicy details are now finally going to be available on the project’s Hackaday.io page.

It all started around Christmas of 2018, when an eBay alert [techknight] had configured for the WeatherSTAR 4000 finally fired off. His offer was accepted, and soon he had the physical manifestation of Local on the 8s in his own hands. He’d reasoned that getting the Motorola MC68010 machine working would be like poking around in a retrocomputer, but it didn’t take long for him to realize he’d gotten himself into a much larger project than he could ever have imagined.

Continue reading “Reverse Engineering The Weather Channel’s Magic”

Volumetric OLED Display Shows Bladerunner Vibe, Curious Screen Tech

[Sean Hodgins] is out with his latest video and it’s a piece of art in itself. Beyond a traditional project show and tell, he’s spun together a cyberpunk vibe to premiere the volumetric display he built from an OLED stackup. Update: He’s also documented the build.

The trick of a volumetric display is the ability to position pixels in a third dimension. Here [Sean] delivered that ability with a stack-up of ten screens to add a depth element, which is not such an easy trick. These small OLED displays are all over the place, but they share a common element: a dark background over which the pixels appear. [Sean] has gotten his hands on some transparent OLED panels, and with some Duck-Duck-Go-Fu we think it’s probably a Crystalfontz 128×56 display. Why is it we don’t see more of these? Anyone know if it’s possible to remove the backing from other OLED displays to get the same effect? Let us know in the comments.

The rest of the build is fairly straightforward, with a Feather M4 board driving the ten screens via SPI and an MPU-6050 IMU for motion input. The form factor lends the aesthetic of an augmented-reality device, and the production approach for the video puts this in a Bladerunner or Johnny Mnemonic universe. Kudos for expanding the awesome of the build with an implied backstory!
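
To get a feel for how a stacked display like this gets addressed, here’s a library-free Python sketch of the voxel-to-panel mapping. The panel count and resolution are guesses based on the video, not [Sean]’s actual firmware:

```python
# The Z coordinate of a voxel simply picks which panel in the stack
# gets the pixel; X and Y address that panel's own framebuffer.
PANELS, WIDTH, HEIGHT = 10, 128, 56

# One 2D framebuffer (a set of lit pixels) per panel in the stack.
framebuffers = [set() for _ in range(PANELS)]

def set_voxel(x, y, z, on=True):
    """Light or clear the voxel at (x, y, z) in the display stack."""
    if not (0 <= x < WIDTH and 0 <= y < HEIGHT and 0 <= z < PANELS):
        return
    if on:
        framebuffers[z].add((x, y))
    else:
        framebuffers[z].discard((x, y))

# Example: a diagonal line running front-to-back through the stack.
for z in range(PANELS):
    set_voxel(z * 12, z * 5, z)
```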

If you can’t find your own transparent displays, spinning things are a popular trend in this area. We just saw one last week that spun an LED matrix to form a cylindrical display. Another favorite of ours is a volumetric display that spins a helix-shaped projection screen.

Free To Good Home: FPGA Supercharged Audio/Video Synthesizer

Audio and video synthesizers have been around for decades, and are pretty much only limited by one’s willingness to spend money on them. That is, unless you can develop your own FPGA-supercharged synthesizer to really get a leg up on consumer-grade components. Of course, as [Julian] found out in this four-year project, you tend to pay for it anyway in the time spent working on it.

[Julian] has actually decided to stop working on the project and open-source it to anyone who wants to carry it forward. He has already finished the PCB layout on a gargantuan eight-layer board, done all of the routing and parts selection, and really only needed to finish testing to complete the project. It’s powered by a Xilinx Zynq and packed with features, too: HDMI, DDR3 RAM, USB, a handful of sensors, and an Arduino Uno-style header to make interfacing and programming a breeze.

While we’re sympathetic to setting aside a project we’ve worked so hard on, with most of the work done on this one it should be pretty easy for anyone interested in carrying the torch to pick it up and adapt it. If you’d rather whet your appetite on something with fewer PCB layers, though, we’ve seen some interesting (but slightly simpler) video synthesizers made out of other unique hardware as well.

 

Bitbanged DVI On A Raspberry Pi RP2040 Microcontroller

When we first saw the Raspberry Pi Pico and its RP2040 microcontroller last month, it was obvious that to be more than just yet another ARM chip it needed something special, and that appeared to be present in the form of its onboard PIO peripherals. We were eagerly waiting to see how the community might use them to push the RP2040’s capabilities beyond their advertised limits. Now [Luke Wren] provides us with an example, as he pushes an RP2040 to produce a DVI signal suitable for driving an HDMI monitor.

It shouldn’t be a surprise that the chip can be overclocked, but it’s impressive to find that it can reach the 252 MHz necessary to generate the DVI timing. With appropriate terminations, it proved possible for the GPIO lines to mimic the differential signalling required by the spec. A PCB with the RP2040 and an HDMI socket was created, also providing a couple of PMOD connectors for expansion. All code and software can be found in a GitHub repository.
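
If you’re wondering where 252 MHz comes from, the arithmetic fits in a few lines of Python: the 640×480 mode actually clocks out an 800×525 grid including blanking intervals, and each TMDS lane carries 10 bits per pixel, so the bit clock is ten times the pixel clock:

```python
# DVI 640x480@60: the full raster is 800x525 including blanking,
# and TMDS encodes each 8-bit value as a 10-bit symbol per lane.
h_total, v_total, refresh = 800, 525, 60

pixel_clock = h_total * v_total * refresh  # 25,200,000 Hz
tmds_bit_clock = pixel_clock * 10          # 252,000,000 Hz

print(f"pixel clock:    {pixel_clock / 1e6:.1f} MHz")
print(f"TMDS bit clock: {tmds_bit_clock / 1e6:.1f} MHz")
```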

The result is a usable DVI output which, though a relatively low-resolution 640×480 pixels at 60 Hz, is still a major advance over the usual composite video provided by microcontroller projects. With composite support on monitors becoming a legacy item, it’s a welcome sight to see an accessible path to an HDMI or DVI output without using an FPGA.

Thanks [BaldPower] for the tip.