Unraveling The Secrets Of Apple’s Mysterious Fisheye Format

Apple has developed a proprietary — even mysterious — “fisheye” projection format used for their immersive videos, such as those played back by the Apple Vision Pro. What’s the mystery? The fact that they stream their immersive content in this format but have provided no elaboration, no details, and no method for anyone else to produce or play it back. It’s completely undocumented, and Apple’s silence is deafening when it comes to requests for, well, anything to do with it whatsoever.

Those details are probably forthcoming eventually, but [Mike Swanson] isn’t content to wait. He’s done his own digging into the format, and while he hasn’t figured it out completely, he has learned quite a bit and written it all up in a blog post. Apple’s immersive videos have a lot in common with VR180-type videos, but under the hood there is more going on. Apple’s stream is DRM-protected, but there’s an unencrypted intro clip with the logo that is streamed in the clear, and that’s what [Mike] has been focusing on.

Most “fisheye” formats are mapped onto square frames in a way similar to what’s seen here, but this is not what Apple is doing.

[Mike] has been able to determine that the format definitely differs from existing fisheye formats recorded by immersive cameras. First of all, the content is rotated 45 degrees. This spreads the horizon of the video across the diagonal, maximizing the number of pixels available in that direction (a trick that calls to mind the tilted heads in home video recorders, which sweep diagonal tracks across the tape that are longer than its physical width). Doing this also spreads the center-vertical axis of the content across the other diagonal, with the same effect.

There’s more to it than just a 45-degree rotation, however. The rest most closely resembles radial stretching, a form of disc-to-square mapping. It’s close, but [Mike] can’t quite find a complete match for exactly what Apple is doing. We’ll probably all learn more soon, but for now Apple isn’t saying much.
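To make that concrete, here’s a minimal Python sketch of the two ingredients [Mike] has identified: a 45-degree rotation plus a radial disc-to-square stretch. This is a guess at the general shape of the pipeline, not Apple’s actual math, and the function names are ours:

```python
import numpy as np

def rotate45(x: float, y: float) -> tuple[float, float]:
    """Rotate a point by 45 degrees so the horizon lands on the diagonal."""
    c = np.sqrt(0.5)
    return c * (x - y), c * (x + y)

def disc_to_square(x: float, y: float) -> tuple[float, float]:
    """Radial stretching: push a point on the unit disc out to the unit square."""
    r = np.hypot(x, y)
    m = max(abs(x), abs(y))
    if m == 0.0:
        return 0.0, 0.0
    # Stretch along the ray from the origin until the disc edge meets the square edge.
    return x * r / m, y * r / m

# A point halfway out along the horizon ends up halfway along the square's diagonal.
print(disc_to_square(*rotate45(0.5, 0.0)))  # -> (0.5, 0.5), approximately
```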

Formats like VR180 and Apple’s immersive video display stereoscopic footage that allows a user to look around a scene naturally. But really delivering a deeper sense of presence and depth takes light fields.

Analyzing The Code From The Terminator’s HUD

The T-800, also known as the Terminator, was like some kind of non-giving-up robot guy. The robot assassin viewed the world through a tinted display with lines of code scrolling by all the while. It was cinematic shorthand to tell the audience they were looking through the eyes of a machine. Now, a YouTuber called [Open Source] has analyzed that code.

The video highlights some interesting finds concerning the graphics seen in the T-800’s vision. They appear to match the output of various code listings and articles in Nibble Magazine, specifically its September 1984 issue. One example spotted was a compass rose, spawned from an Apple BASIC listing: a simple quiz to help teach children to understand the compass. Another graphic appears to be cribbed from the MacPaint Patterns section of the same issue.

The weird thing is that the original film came out in October 1984, just a month after that issue would have hit the newsstands. It suggests that someone involved with the movie either had a hand in Nibble Magazine or saw an early copy, or that the examples in the magazine were themselves rehashed from some earlier source.

Code that regularly flickers on the left of the T-800’s vision is just 6502 machine code, apparently a random hex dump from an Apple II’s memory. At other times, 6502 assembly code appears on screen with various programmer comments still intact. There’s even some code cribbed from the Apple II DOS 3.3 RAM disk driver.

It’s neat to see someone actually track down the background of these classic graphics. Hacking and computers are usually portrayed in a fairly unrealistic way in movies, and it’s no different in The Terminator (1984). Still, that doesn’t mean the movies aren’t fun!


Bye Bye Green Screen, Hello Monochromatic Screen

It’s not uncommon in 2024 to have some form of green background cloth for easy background effects when in a Zoom call or similar. This is a technology TV and film studios have used for decades, and it’s responsible for many of the visual effects we see every day on our screens. But it’s not perfect — its use precludes wearing anything green, and it’s very bad at anything transparent.

The 1960s Disney filmmakers seemingly had no problem with this, as anyone who has seen Mary Poppins will tell you, so how did they manage to overlay actors with diaphanous accessories over animation? The answer lies in an innovative process which has largely faded from view, and which [Corridor Crew] have rebuilt.

Green screen, or chroma key to give the effect its real name, relies on the background using a colour not present in the main subject of the shot. This colour can then be detected electronically or in software, and a switch made between the shot and an inserted background. It’s good at picking out clean edges between the green background and the subject, but poor at transparency such as a veil or a bottle of water. The Disney effect instead used a background illuminated with monochromatic sodium light behind a subject illuminated with white light, allowing both a background and a foreground image to be filmed using two cameras and a dichroic beam splitter. The background image, with its black silhouette of the subject, could then be used as a photographic stencil when overlaying a new background image.
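As a point of comparison, the software version of chroma key is only a few lines. Here’s a minimal sketch (the file names and the green-dominance threshold are our own choices; a real keyer would also handle spill suppression and soft edges), which makes the hard-switch nature of the effect, and why it fails on transparency, easy to see:

```python
import numpy as np
from PIL import Image

# Foreground shot on green, plus a replacement background of the same size.
fg = np.asarray(Image.open("subject_on_green.png").convert("RGB"), dtype=float)
bg = np.asarray(Image.open("new_background.png").convert("RGB"), dtype=float)

r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
# Call a pixel "background" when green clearly dominates the other channels.
mask = (g > 1.3 * r) & (g > 1.3 * b)

# A hard binary switch between shot and inserted background: exactly why
# a veil or a bottle of water comes out wrong.
out = np.where(mask[..., None], bg, fg)
Image.fromarray(out.astype(np.uint8)).save("composited.png")
```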

Sadly, even Disney found it very difficult to make more than a few of the dichroic prisms, so the much cheaper green screen won the day. But in the video below the break, they manage to replicate the process with a standard beam splitter and a pair of filters, successfully filming a colourful clown wearing a veil, and one of the presenters waving their hair around while drinking from a bottle of water. It may not find its way back into blockbuster films just yet, but it’s definitely impressive to see in action.


Streaming Deck Removes Need For Dedicated Hardware

Streaming content online has never been more popular than it is now; from YouTube to Twitch, there are all kinds of creators with interesting streams across a wide spectrum of interests. With that gold rush comes plenty of people selling figurative shovels as well: audio mixing gear, high-quality webcams, and dedicated devices for controlling all of this technology. Often these controllers take the form of a tablet-like device, but [Lenochxd] thinks that any tablet ought to be able to perform the task without dedicated, often proprietary, hardware.

The solution offered here is called WebDeck, an application built with Flask that turns essentially any device with a browser into a stream control surface. Of course, it helps to have a touch screen as well, but the world’s abundance of tablets and smartphones makes this a non-issue. With the software running on the host computer, the streamer can control various aspects of that computer remotely by scanning a QR code, which opens a browser window with all of the controls accessible inside. It supports VLC, OBS Studio, and Spotify as well, which covers plenty of streaming needs.
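The core idea is simple enough to sketch. Here’s a toy Flask app in the same spirit (our own illustration, not WebDeck’s actual code; the placeholder actions stand in for real calls to OBS, VLC, or Spotify APIs):

```python
from flask import Flask

app = Flask(__name__)

# Placeholder actions; a real deck would wire these to OBS, VLC, or Spotify.
ACTIONS = {
    "mute": lambda: print("toggling microphone"),
    "next-scene": lambda: print("switching OBS scene"),
}

@app.route("/")
def deck():
    # One big button per action, friendly to a phone or tablet touch screen.
    buttons = "".join(
        f'<a href="/do/{name}"><button>{name}</button></a>' for name in ACTIONS
    )
    return f"<h1>Deck</h1>{buttons}"

@app.route("/do/<name>")
def do(name):
    if name in ACTIONS:
        ACTIONS[name]()
    return deck()

if __name__ == "__main__":
    # Bind to all interfaces so any device on the LAN can load the page.
    app.run(host="0.0.0.0", port=5000)
```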

Currently the host software only runs on Windows, but [Lenochxd] hopes to have MacOS and Linux versions available soon. We’re always in favor of any device that uses existing technology and also avoids proprietary hardware and software. Hopefully that’s a recipe to avoid planned obsolescence and unnecessary production. If you prefer a version with a little bit of tactile feedback, though, we’ve seen other decks which add physical buttons for quick control of the stream.

Unlimited Cloud Storage YouTube Style

[Adam Conway] wanted to store files in the cloud. However, if you haven’t noticed, unlimited free storage is hard to find. We aren’t sure if he wants to use the tool he built seriously, but he decided that if he could encode data in a video format, he could store his files on YouTube. Does it work? It does, and you can find the code on GitHub.

Of course, the efficiency isn’t very good. A 7 kB image, for example, yielded a 9 MB video. If we were going to store files on YouTube, we’d encrypt them, too, making it even worse.

The first attempt was to break the file into pieces and encode them as QR codes. Makes sense, but it didn’t work out. To get enough data into each frame, the modules (think pixels) in the QR code were small. Combined with video compression, the system was unreliable.

Simplicity rules. Each frame is 1920×1080 and uses a black pixel as a one and a white pixel as a zero. In theory, this gives about 259 kB per frame. However, to avoid decoding problems caused by video compression, each real bit occupies a 5×5 pixel block, which works out to about 10 kB of data per frame.
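Here’s a minimal Python sketch of the encoder as described (our own reconstruction of the scheme, not [Adam Conway]’s actual code):

```python
import numpy as np

W, H, B = 1920, 1080, 5               # frame dimensions and block size
BITS_PER_FRAME = (W // B) * (H // B)  # 384 * 216 = 82,944 bits, about 10 kB

def encode_frame(data: bytes) -> np.ndarray:
    """Pack up to ~10 kB of data into one 1920x1080 black/white frame."""
    bits = np.unpackbits(np.frombuffer(data, dtype=np.uint8))
    bits = np.pad(bits, (0, BITS_PER_FRAME - len(bits)))  # zero-pad the tail
    grid = bits.reshape(H // B, W // B)
    # One: black (0). Zero: white (255). Then grow each bit into a 5x5 block
    # so it survives video compression.
    small = np.where(grid == 1, 0, 255).astype(np.uint8)
    return np.kron(small, np.ones((B, B), dtype=np.uint8))

frame = encode_frame(b"hello, YouTube drive")
assert frame.shape == (1080, 1920)
```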

The code isn’t perfect. It can add things to the end of a file, for example, but that would be easy to fix. The protocol could use error correction and compression. You might even build encryption into it or store more data — old school cassette-style — using the audio channel. Still, as a proof of concept, it is pretty neat.

This might sound like a new idea, but people way back in the early home computer days could back up data to VCRs. This isn’t even the first time we’ve seen it done with YouTube.

Digital Master Tapes Seek Deck

As a nerdy kid in the ’90s, I spent a fair bit of time watching the computer-themed cartoon Reboot. In the course of making a documentary about the show, [Jacob Weldon] and [Raquel Lin] have uncovered its original digital master tapes.

This is certainly exciting news for fans of the show, but there’s a bit of a wrinkle: these digital masters are all on D-1 digital cassette tapes, and the studio no longer has a player for the format. The dynamic duo are on the hunt for a Bosch BTS-D1 so they can recapture some of this video for their own film, while also heavily hinting to the studio that a new box set from the masters would be well-received.

As the first CGI TV series, Reboot has a special place in the evolution of entertainment, and while it was a technical marvel for its time, it was solid enough to last for four seasons and win numerous awards before meeting a cliffhanger ending. If you’re an expert in D-1 or have a deck to lend or sell, be sure to email the creators.

Feeling nostalgic for the electromechanical era? Why not check out some hidden lyrics on Digital Compact Cassettes (DCC) or encoding video to Digital Audio Tapes (DAT)?

[via Notebookcheck]

Video Feedback Machine Creates Analog Fractals

One of the first things everyone does when they get a video camera is to point it at the screen displaying its own image, creating video feedback. It’s a fascinating process in which the delay from image capture to display establishes a feedback loop that amplifies image noise into fractal patterns. This sculpture, modestly called The God Machine II, takes it to the next level, though.

We covered the first version of this machine in a previous post, but creator [Dave Blair] has done a huge amount of work on the device since then, allowing him to tweak and customize its output. His new version is quite remarkable, letting him create intricate fractals that writhe and change like living things.

The God Machine II is a sophisticated build with three cameras, five HD monitors, three Roland video switchers, two viewing monitors, two sheets of beam splitter glass, and a video input. This setup means it can take an external video input, capture it, and use it as the seed for video feedback, then tweak the evolution of the resulting fractal image as it is repeatedly fed back into itself. The system can also control the monitors’ settings, which further changes the feedback as it evolves. [Blair] refers to this as “trapping the images.”
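You can get a feel for why feedback behaves this way with a purely software toy: repeatedly rotate, zoom, and blend a frame back into itself, and noise blooms into structure. This little simulation is our own analogue of the loop, not anything from [Blair]’s rig:

```python
import numpy as np
from scipy.ndimage import rotate, zoom

def feedback_step(img: np.ndarray, angle: float = 2.0, scale: float = 1.02) -> np.ndarray:
    """One trip around the loop: the camera sees a rotated, zoomed copy of the screen."""
    out = rotate(img, angle, reshape=False, mode="reflect")
    out = zoom(out, scale)
    # Crop back to the original size, the way a camera crops the monitor.
    h, w = img.shape
    top = (out.shape[0] - h) // 2
    left = (out.shape[1] - w) // 2
    out = out[top:top + h, left:left + w]
    # Blend a little of the original signal back in on each pass.
    return np.clip(0.98 * out + 0.02 * img, 0.0, 1.0)

frame = np.random.rand(256, 256)  # start from pure noise
for _ in range(100):
    frame = feedback_step(frame)
```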
