[Victor Frost] has a deep voice and a fancy top-of-the-line camera. While one would assume this to be a more than generous situation for life to put a person in, it’s got its own set of problems. Mainly that his fantastic fancy camera uses the most modern version of the popular h.264 encoding scheme: h.265. Gasp!
While that too seems like a pro, unfortunately h.265 doesn’t play as nicely with his editing software. The solution seems easy: just transcode it and get on your way. However, when you start talking about transcoding 4K video from a top-of-the-line source while retaining the quality… well, it can bring a processor to its knees. Since he’d rather be playing Overwatch than transcoding video on his main computer, he decided to offload and automate the drudgery to his spare.
That’s how the Ingest-a-Tron 9000 came into play. It uses a lot of open source software and, yes, Windows batch files to take the files off his camera, process them on one computer, and dump them onto another. Now he can game (or edit) while he waits. For those of us who are estranged from Linux thanks to our favorite software, it’s good to know that there are still ways to automate away the pain. Video after the break.
Maybe add a brief introductory note about why [Victor Frost] is using the Samsung NX500 camera instead of a less fantastic source that provides h.264 files?
Because it’s what he had, I’m going to assume
I got invested into the Samsung NX line, starting with the NX300, due to the price point to quality ratio at the time and bought several great NX mount lenses over the years. When my NX300 broke, I got an NX500. Sadly, they are no longer making NX cameras.
When the NX500 breaks, I’ll probably switch systems.
Interesting, I guess, but he made a 20+ minute video on copying a file and then running a command on it. The whole second file that he uses to run the script from another computer is sorta the long way around.
He should have just run transcode.bat as a one-minute scheduled task, and started it with: IF NOT EXIST "D:\cameravideopath\" exit
Then he wouldn’t even need to run the file, it would just run it whenever the camera is plugged in, extra lazy with fewer things to go wrong.
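The scheduled-task idea above could look something like the batch file below. This is a sketch, not anything from the video: the paths are hypothetical (only `D:\cameravideopath\` comes from the comment), and the lock file is my own addition to keep overlapping one-minute task runs from starting a second transcode.

```batch
@echo off
REM watch.bat - run every minute from Task Scheduler (hypothetical paths)

REM Bail out immediately if the camera's drive isn't mounted yet.
IF NOT EXIST "D:\cameravideopath\" exit /b

REM Lock file so a long transcode isn't started twice by overlapping runs.
IF EXIST "C:\ingest\transcode.lock" exit /b
type nul > "C:\ingest\transcode.lock"

call "C:\ingest\transcode.bat"

del "C:\ingest\transcode.lock"
```

With that in place, the task just fires every minute and exits instantly until the camera actually shows up.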
I thought about that, but I’ve tried automating stuff with Windows scheduled tasks in the past and they’ve proven to be less than reliable.
I assume that transcoding from h.265 to h.264 takes much less time than going the other way, especially if you set things to just direct copy the audio and to allow the file sizes to increase a lot.
h.265 is quite amazing. It can pack a “half hour” sitcom episode into less than 200 megabytes at 720p resolution. Getting the same quality with h.264 takes nearly 300 megabytes.
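For reference, the kind of h.265-to-h.264 conversion being discussed can be done with an ffmpeg one-liner along these lines. This is a sketch, not [Victor]’s actual command; the filenames are placeholders and the CRF value is my own choice, trading file size for quality as the comment above suggests.

```batch
REM Transcode HEVC to H.264 for editing; copy the audio stream untouched.
REM -crf 18 is close to visually lossless; lower values mean bigger files.
ffmpeg -i "input.mp4" -c:v libx264 -preset slow -crf 18 -c:a copy "output.mp4"
```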
What’s a good CPU for h.265 encoding? I’m using a dual core Phenom II 555 and with that codec it’s like an old Pentium 3 encoding to DVD resolution MPEG2. At best 7 frames per second encoding, usually down around 3.
You’ll need: core i5/core i7 or equivalent, 8GB or 16GB fast memory, 3 x SSD (1 for OS+software, 1 for source, and 1 for destination). My encoding tends to go the other way – MP4 to JPEG2000 for Digital Cinema Packages. I ran a comparison between some packages to see which worked better. First, on my editing machine (core i7, 8GB ram, 3 x spinning rust HDD), Win 7, Premiere Pro CS6. Then, running ffmpeg and OpenDCP under Debian as a guest on the same machine, then, as a curiosity, ffmpeg and OpenDCP Windows versions.
1. ffmpeg under Debian as a guest under Windows 7 runs faster (finishes sooner) than the native ffmpeg for Windows on the same machine
2. Compiling all the tools for Linux OpenDCP is a one-way trip to dependency hell. I gave up after 2 days.
3. Hybrid workflow works well – linux ffmpeg initially, then OpenDCP for Windows
4. Premiere Pro gets the final result faster than any other solution, because it can output to JPEG2000 directly, which bypasses the initial stage where ffmpeg is used. Even if you’ve never used PPro to edit a video, the import/export tools are fantastic.
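The hybrid workflow in (3) roughly amounts to using ffmpeg to explode the source into an image sequence plus a WAV that OpenDCP can then convert and package. A sketch under my own assumptions (paths, pixel format, and audio codec are not from the comment):

```batch
REM Step 1 (ffmpeg, Linux or Windows): dump frames as TIFFs and audio as WAV.
REM %%06d is batch-file escaping for ffmpeg's %06d sequence pattern.
ffmpeg -i "source.mp4" -pix_fmt rgb24 "frames\%%06d.tif"
ffmpeg -i "source.mp4" -vn -c:a pcm_s24le "audio.wav"
REM Step 2: point OpenDCP at the TIFF sequence for JPEG2000 conversion
REM and DCP packaging.
```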
Ahhhhh. Batch files.
PowerShell is available on Windows 7 and up and seems pretty comparable to bash and other interpreters on Linux. It even has a lot of syntax in common with bash, and would probably have made all this a lot easier with its many ‘cmdlets’, which enable the same sort of chaining of filter programs as on Linux. (On a side note, PowerShell is open source and *can* be installed on Linux, though it’s a pain in the butt and won’t let you run any old batch files you have lying around.)
That said, it’s nice to see this, since writing complicated batch files is kinda my secret hobby and I derive a smug satisfaction in making the clunky “language” do things it really wasn’t designed to do… (current interest: object-oriented style of batch programming, for other projects)
Question: the files are transferred over a gigabit direct connection between machines, but if I just set up the direct connection, will it automatically be used, being the fastest route between machines? (Or is it a result of how the destination is specified, being on that new separate cabled network at a specific address?)
Nah, I had to force windows to use it by setting up a manual route in the hosts file.
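For anyone wanting to do the same, pinning a hostname to the direct link’s address in `C:\Windows\System32\drivers\etc\hosts` looks something like this. The address and name below are made up for illustration; the idea is just that any UNC path using that name resolves to the direct-link NIC rather than the regular LAN.

```
# hosts entry: name the editing machine's direct-link interface
10.0.0.2    editbox-direct
```

Then the batch script copies to something like `\\editbox-direct\share\` instead of the machine’s normal network name.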
It’s worth noting that larger media companies usually have pipeline devs writing essentially more elaborate and streamlined versions of this and other software to avoid tedious manual file wrangling. Batch scripting like this is a godsend for anyone having to handle transcoding and organization of large amounts of footage.