
Re: Decoder failure: Out of memory

Posted: April 28th, 2019, 10:24 am
by MiG
Of course, sometimes people repost stuff months later :)

If this is causing it, I'll have to write a rotate script so anything older than, say, a year gets moved to a subdir or something. I'll wait and see first though.
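Something along these lines would probably do the trick (just a rough sketch; the backup path below is a placeholder, not my actual setup):

```python
# Rough sketch of the rotate idea: move backed-up NZBs older than ~1 year
# into an "old" subdirectory. BACKUP_DIR is a placeholder path.
import os
import shutil
import time

BACKUP_DIR = "/path/to/sabnzbd/nzb_backup_dir"  # placeholder
OLD_DIR = os.path.join(BACKUP_DIR, "old")
MAX_AGE = 365 * 24 * 3600  # one year, in seconds

os.makedirs(OLD_DIR, exist_ok=True)
now = time.time()
for name in os.listdir(BACKUP_DIR):
    path = os.path.join(BACKUP_DIR, name)
    # Skip directories (including the "old" subdir itself)
    if not os.path.isfile(path):
        continue
    if now - os.path.getmtime(path) > MAX_AGE:
        shutil.move(path, os.path.join(OLD_DIR, name))
```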

Re: Decoder failure: Out of memory

Posted: April 28th, 2019, 12:37 pm
by safihre
You can disable this behavior by going to Config > Specials and unchecking backup_for_duplicates.
But then it will only use History, so don't empty it if you want it to keep working half a year later ;)
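(For anyone who prefers not to click through the UI: a minimal sketch of toggling that option via SABnzbd's HTTP API with mode=set_config. The host, port, API key and the "misc" section name are assumptions here; adjust them for your own setup.)

```python
# Hedged sketch: flip backup_for_duplicates off through the SABnzbd API.
# URL, API key and the config section are assumptions for illustration.
import urllib.parse
import urllib.request

SABNZBD_URL = "http://localhost:8080/sabnzbd/api"  # placeholder
API_KEY = "your-api-key"                           # placeholder

params = urllib.parse.urlencode({
    "mode": "set_config",
    "section": "misc",                 # assumed location of the key
    "keyword": "backup_for_duplicates",
    "value": "0",                      # 0 = unchecked
    "apikey": API_KEY,
    "output": "json",
})
with urllib.request.urlopen(f"{SABNZBD_URL}?{params}") as resp:
    print(resp.read().decode())
```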

Re: Decoder failure: Out of memory

Posted: April 28th, 2019, 3:09 pm
by MiG
As mentioned I'd prefer it to check for duplicates, I'll take a shorter list to check against over none at all (so if this is the problem, I'll write a script for it) :)

For troubleshooting purposes, does unchecking that setting prevent the backup dir from being parsed at all, or does it only stop comparing the queue items to this dir (but still loads everything)?

Re: Decoder failure: Out of memory

Posted: April 29th, 2019, 12:54 am
by safihre
The duplicate file-based check only checks whether the filename exists, which should not be very heavy... I first thought it might compute some MD5, but it doesn't.
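Roughly speaking (an illustration only, not SABnzbd's actual code), the difference is between a single directory lookup and hashing every backed-up file:

```python
# Illustration of why a name-based duplicate check is cheap compared to a
# hash-based one. Not SABnzbd's real implementation.
import hashlib
import os

def is_duplicate_by_name(backup_dir: str, nzb_name: str) -> bool:
    # Cheap: one filesystem lookup, no file contents are read.
    return os.path.exists(os.path.join(backup_dir, nzb_name))

def md5_of_file(path: str) -> str:
    # The expensive alternative: reads every byte of the file.
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()
```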

Re: Decoder failure: Out of memory

Posted: May 1st, 2019, 3:34 am
by MiG
The nzb backup dir was probably coincidental: sometimes it works for a few jobs, sometimes it locks up straight away again... I noticed a different pattern though:

[Screenshot: queue items of different sizes, all stuck at 88-90%]

They're all stuck at 88-90%, despite different sizes. Is there a specific process that kicks in at that moment?

These are the queue options that are enabled right now:
  • Only Get Articles for Top of Queue (just enabled this one, same result)
  • Check before download
  • Abort jobs that cannot be completed
  • Detect Duplicate Downloads: Pause
  • Allow proper releases
And post-processing:
  • Download all par2 files
  • Enable SFV-based checks
  • Post-Process Only Verified Jobs
  • Enable recursive unpacking
  • Ignore any folders inside archives
  • Ignore Samples
  • Use tags from indexer
  • Cleanup List: srr, nzb
  • History Retention: Keep all jobs
I just turned off Direct Unpack, still stuck.

Hardware-wise, I've ruled out memory issues with memtest86 and did an HDD surface scan as well.

Re: Decoder failure: Out of memory

Posted: May 1st, 2019, 6:47 am
by MiG
It gets even more puzzling... At least two of the 89% downloads in that screenshot are in the download dir: one fully downloaded and not in need of repair, the other just missing an sfv file that took a split second to restore. When I resume these downloads in SAB, the restored one jumps to 95%, and then everything I start (including the restored one) eventually produces the 'out of memory' error again.

I did notice a lot of page faults (1.9M!) in process explorer, even though I restarted SAB a few times this morning:

[Screenshot: Process Explorer showing ~1.9M page faults for the SABnzbd process]

I've sampled a few other running processes, and they only display a small fraction of these. As mentioned, memtest86 ran a full pass without errors, and so far anything that gets stuck in SAB I can download without any issues with NZBget on the very same machine.

Anything else I can test / try?

Re: Decoder failure: Out of memory

Posted: May 2nd, 2019, 1:37 am
by safihre
No special process at 90%; it could be something specific in the files that triggers the segfaults, although I'd expect a crash, not for it to keep running.
It's a bit hard to diagnose when there are many jobs. If we can reproduce it with just 1 job in the queue it would be easier to dig through the logs.

Is there any improvement if you disable Check before download?

Re: Decoder failure: Out of memory

Posted: May 29th, 2019, 6:52 am
by arhyarion
I have been experiencing the same issue that MiG first reported. This began recently - within the past month for me. I'm running macOS High Sierra on a Mac Mini with 8 GB RAM. Software version 2.3.9. A memory test said everything was ok.

The decoder failure issue does not happen with every download. Some downloads complete successfully, and others get the decoder failure. I haven't noticed a pattern yet. On 2019-04-17 MiG posted a log capture. The error on mine is almost identical; the only obvious difference is that the extension is "decoder.pyc" on mine instead of "decoder.pyo". The issue will happen with only one download in the queue. Usually it will get to 90-95% complete and then present the "Decoder failure: out of memory" error.

Re: Decoder failure: Out of memory

Posted: June 2nd, 2019, 8:46 am
by airbay
Interesting

Re: Decoder failure: Out of memory

Posted: June 3rd, 2019, 1:38 am
by safihre
@arhyarion What do you have set for Article cache?

Re: Decoder failure: Out of memory

Posted: June 20th, 2019, 6:20 am
by loopdemack
I can say with 100% certainty that this is a bug. I even changed servers: I ordered a new server from Hetzner and also switched from Debian to Ubuntu 18 in order to change everything. I changed it because some admin told someone "it's your memory, it's bad", even though I tested all of my memory with 7-day test runs and didn't get a single error. Anyway, I changed hardware and ordered a new server with fresh hardware. I can't believe that SABnzbd needs more than 32 GB or 64 GB of RAM? I even switched from an i7 to a pure Xeon.

And I can tell you with 100% certainty that "Decoder failure: Out of memory" is a bug. It started not long ago, somewhere in 2019 or at the end of 2018. Maybe it's related to some NZB files, which would mean either an incompatibility with some NZB creator, or bad NZB files that work fine in other NZB downloaders, or simply a straight bug in SABnzbd's par2/repair process or in fetching extra blocks; it looks like something in the par2 part of SABnzbd. In my case I never used the Check before download option.

But I would like to ask you to please fix this bug. It's very nasty because it stops the whole download process.

Re: Decoder failure: Out of memory

Posted: June 20th, 2019, 7:35 am
by safihre
The only thing I can imagine is that somebody somehow created NZB files that have huge article sizes listed inside the NZB.
Do you have an NZB for which this happened?

Otherwise there is just nothing I can do. I use the lowest C-call possible (malloc) to ask the system for X bytes of memory and if it doesn't give it to us we have a problem. That code hasn't changed in years.
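If anyone wants to check that theory on a suspect NZB, a quick sketch like this lists the largest per-segment sizes declared in the file (plain ElementTree, nothing SABnzbd-specific; normal segments are typically in the hundreds of kilobytes, so anything wildly larger would stand out):

```python
# List the largest <segment bytes="..."> values declared in an NZB file.
import sys
import xml.etree.ElementTree as ET

def largest_segments(nzb_path: str, top: int = 10):
    tree = ET.parse(nzb_path)
    sizes = []
    for elem in tree.iter():
        # Match <segment> elements regardless of the NZB XML namespace
        if elem.tag.endswith("segment"):
            sizes.append(int(elem.get("bytes", "0")))
    return sorted(sizes, reverse=True)[:top]

if __name__ == "__main__":
    for size in largest_segments(sys.argv[1]):
        print(f"{size:>12,} bytes")
```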

Re: Decoder failure: Out of memory

Posted: June 20th, 2019, 7:48 am
by loopdemack
What's crazy is that when I tried that same NZB on the old server it downloaded successfully, and I also tried the same NZB in alt.bin on Windows and it downloaded ok. On the new server it errored four times and downloaded ok twice. It seems that when the queue is empty it downloads ok, and when there are a few things in the queue it can get the error.

Re: Decoder failure: Out of memory

Posted: June 20th, 2019, 9:26 am
by loopdemack
I was asked to remove the NZB; I can send it by PM if needed.

Stupid MediaFire banned the file, here it is on Mega

Re: Decoder failure: Out of memory

Posted: June 20th, 2019, 10:01 am
by sander
I downloaded that .NZB with SABnzbd ... and no problem at all: a nice *.mp4 in the resulting directory, and no problems in the GUI nor the log.