Multiple concurrent post-process files

Posted: October 29th, 2018, 11:33 am
by ecpunk
Is it possible to post-process multiple files simultaneously? I use SABPostprocess (sickbeard mp4 automator).

My server has 28 physical cores and 512G of memory with 10 flash disks, and my internet connection is "fast".

Here are the problems and bottlenecks I would like to resolve:
- Unrar comes nowhere near the disks' potential throughput
- ffmpeg does not scale well across many threads for a single file

Basically, I end up with a huge queue of files waiting to be processed one by one, while server resource utilization sits below 10%. I have tested manually: CPU is the first resource to fill, and I can process 4 files at about 200 fps in the best case. More typically each file runs closer to 150 fps, in which case I can process 5 files simultaneously.

I would like to choose how many downloaded files have their post-process scripts run concurrently, instead of the serial manner in which it works now. Is this possible, or would this need to be a feature request?
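To illustrate what I'm after, here is a minimal, hypothetical sketch of the behavior I want: a worker pool that caps concurrent post-process jobs at my measured sweet spot of 5. The `post_process` function is a placeholder for the real work (unrar + SABPostprocess), not actual SAB code.

```python
import concurrent.futures

# Hypothetical sketch: cap how many post-process jobs run at once.
# MAX_JOBS = 5 matches my measured sweet spot for this server.
MAX_JOBS = 5

def post_process(path):
    # Placeholder for the real work (unrar + ffmpeg via SABPostprocess);
    # here it simply returns the path so the sketch is runnable.
    return path

# A stand-in for the queue of completed downloads waiting to be processed.
queue = ["episode%02d.mkv" % i for i in range(1, 13)]

# Run at most MAX_JOBS post-process jobs concurrently.
with concurrent.futures.ThreadPoolExecutor(max_workers=MAX_JOBS) as pool:
    done = list(pool.map(post_process, queue))

print("processed %d files" % len(done))
```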

Thanks!

Re: Multiple concurrent post-process files

Posted: October 30th, 2018, 5:28 am
by safihre
This is not possible with SAB, but NZBGet supports what you want.

For SAB you could maybe run the ffmpeg step in a separate thread or process, so all the PP script does is fire off ffmpeg and then return immediately. The downside is that you won't get feedback on its output.
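A minimal sketch of that fire-and-forget approach, assuming a POSIX-like system and Python 3: the PP script detaches the child process and returns at once. The command shown is an illustrative stand-in, not a real SABPostprocess invocation; in practice it would be the ffmpeg command line.

```python
import subprocess
import sys

def fire_and_forget(cmd):
    """Launch cmd detached so the PP script can return immediately.

    start_new_session=True puts the child in its own session, so it
    keeps running after the script exits. stdout/stderr are discarded,
    which is the caveat above: you get no feedback on ffmpeg's output.
    """
    return subprocess.Popen(
        cmd,
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
        start_new_session=True,
    )

# Illustrative stand-in for an ffmpeg invocation; the real command
# would look something like ["ffmpeg", "-i", src, dst].
proc = fire_and_forget([sys.executable, "-c", "print('converting')"])
print("PP script returns immediately; child pid:", proc.pid)
```

One design note: because the script exits before the conversion finishes, failures have to be logged by the background job itself (e.g. to a file) rather than reported back to SAB.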