Multiple concurrent post-process files

Posts: 1
Joined: October 29th, 2018, 11:14 am

Multiple concurrent post-process files

Post by ecpunk » October 29th, 2018, 11:33 am

Is it possible to post-process multiple files simultaneously? I use SABPostprocess (sickbeard mp4 automator).

My server has 28 physical cores and 512G of memory with 10 flash disks, and my internet connection is "fast".

Here are the problems and bottlenecks I would like to resolve:
- Unrar comes nowhere near saturating the disks
- ffmpeg inherently does not scale well across many threads

Basically, I end up with a huge queue of files waiting to be processed one at a time, with server utilization below 10%. In manual testing (CPU is the first resource to fill), I can process 4 files at about 200 fps each in the best case; more typically it is closer to 150 fps, at which rate I can run 5 files simultaneously.

I would like to be able to select how many downloaded files are post-processed (scripts run) concurrently, instead of the strictly serial manner in which it works now. Is this possible, or would this need to be a feature request?


Posts: 3289
Joined: April 30th, 2015, 7:35 am
Location: Switzerland

Re: Multiple concurrent post-process files

Post by safihre » October 30th, 2018, 5:28 am

This is not possible with SAB, but NZBGet supports what you want.

For SAB you could maybe run the ffmpeg process in a separate thread (or background process), so all the PP-script does is fire off ffmpeg and then return immediately. The downside is that you won't have feedback on the output.
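The fire-and-forget approach above can be sketched as a minimal Python PP-script. This is only an illustration, not part of sickbeard mp4 automator: the ffmpeg flags, the file extensions, and the assumption that SABnzbd passes the completed job's directory as the first script argument are all simplifications you would adapt to your own setup.

```python
#!/usr/bin/env python3
# Sketch of a fire-and-forget post-processing script: launch ffmpeg
# detached so this script can return to SABnzbd immediately instead of
# blocking until the transcode finishes. Hypothetical example, not the
# actual mp4-automator code.
import os
import subprocess
import sys


def launch_detached(cmd, log_path):
    """Start cmd in the background and return its Popen handle.

    The caller does NOT wait on it, so the PP-script exits right away.
    Since we give up interactive feedback, output goes to a log file.
    """
    log = open(log_path, "w")
    return subprocess.Popen(
        cmd,
        stdout=log,
        stderr=subprocess.STDOUT,
        start_new_session=True,  # detach from SABnzbd's process group
    )


def main(job_dir):
    for name in os.listdir(job_dir):
        if name.endswith((".mkv", ".avi")):
            src = os.path.join(job_dir, name)
            dst = os.path.splitext(src)[0] + ".mp4"
            # Hypothetical ffmpeg invocation; real transcode flags differ.
            launch_detached(
                ["ffmpeg", "-i", src, "-c:v", "libx264", dst],
                dst + ".log",
            )


if __name__ == "__main__" and len(sys.argv) > 1:
    main(sys.argv[1])  # assumes arg 1 is the job's final folder
    sys.exit(0)        # report success right away; ffmpeg keeps running
```

Note that because the script exits with success before the transcode is done, SABnzbd will mark the job complete even if ffmpeg later fails, so checking the log files becomes your responsibility.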
