
Re: Decoder failure: Out of memory

Posted: June 20th, 2019, 12:40 pm
by Puzzled
I notice it says segment bytes="-21730" for one of the articles, but I don't think that's used in the decoder. It might depend on which usenet server is being used.

Re: Decoder failure: Out of memory

Posted: June 20th, 2019, 1:37 pm
by loopdemack
sander wrote: June 20th, 2019, 10:01 am I downloaded that .NZB with SABnzbd ... and no problem at all: a nice *mp4 in the resulting directory, no problems in GUI nor LOG
Yes, as I said, these are special circumstances; it's not that easy.

Re: Decoder failure: Out of memory

Posted: June 20th, 2019, 4:36 pm
by Puzzled
safihre: Maybe you could add a sanity check on the amount of RAM it tries to malloc, and give an error that doesn't pause the downloading if the value is clearly wrong (negative or > 15 MB)? I don't know if that's the issue, but at least you'd find out.
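A sanity check along the lines suggested above could look like this. This is a minimal Python sketch, not SABnzbd's actual code; the function name is illustrative, and the 15 MB ceiling is just the value proposed in the post:

```python
# Minimal sketch of the suggested pre-allocation sanity check.
# Hypothetical helper, not SABnzbd's actual code; the 15 MB
# ceiling is the value suggested in the post above.
MAX_ARTICLE_BYTES = 15 * 1024 * 1024  # no sane Usenet article exceeds ~15 MB

def check_article_size(size):
    """Return True if the advertised article size is plausible."""
    return 0 < size <= MAX_ARTICLE_BYTES
```

With a check like this, a bogus size such as -21730 would be rejected with a per-article error instead of pausing the whole download.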

Re: Decoder failure: Out of memory

Posted: June 21st, 2019, 12:27 am
by sander
Puzzled wrote: June 20th, 2019, 12:40 pm I notice it says segment bytes="-21730" for one of the articles, but I don't think that's used in the decoder. It might depend on which usenet server is being used.
Ah ... good catch. I can now reproduce with an NZB in which all "segment bytes" values are set to a negative number, for example 'segment bytes="-728914"'

Crafted NZB here: https://raw.githubusercontent.com/sande ... estnzb.nzb

Code: Select all

2019-06-21 07:22:37,230::WARNING::[decoder:139] Decoder failure: Out of memory
2019-06-21 07:22:37,232::DEBUG::[nzbstuff:293] Finishing import on reftestnzb 100MB auto 2151de41 2019-06-15 [01/19] - "mytestpost.part01.rar" yEnc (1/15) 10485760
2019-06-21 07:22:37,233::INFO::[decoder:141] Decoder-Queue: 0, Cache: 0, 0, 1003487232
2019-06-21 07:22:37,235::DEBUG::[__init__:924] [sabnzbd.nzbstuff.finish_import] Loading data for SABnzbd_nzf_XX_hp5 from /home/sander/Downloads/incomplete/test-negative-segment-bytes-reftestnzb/__ADMIN__/SABnzbd_nzf_XX_hp5
2019-06-21 07:22:37,235::INFO::[decoder:142] Traceback: 
Traceback (most recent call last):
  File "/usr/share/sabnzbdplus/sabnzbd/decoder.py", line 124, in run
    data = self.decode(article, lines, raw_data)
  File "/usr/share/sabnzbdplus/sabnzbd/decoder.py", line 222, in decode
    decoded_data, output_filename, crc, crc_expected, crc_correct = sabyenc.decode_usenet_chunks(raw_data, article.bytes)
MemoryError
2019-06-21 07:22:37,239::INFO::[downloader:279] Pausing
And SAB says "Decoder failure: Out of memory"

Furthermore, the SAB GUI says it has downloaded "0 MB / -113 MB" (so: negative), so SAB itself (not just sabyenc) also reads & uses those negative values.
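The negative total is what you would expect if the job size is simply the sum of the `bytes` attributes of all segments in the NZB. A rough Python illustration (not SABnzbd's parser; the segment count here is hypothetical, only the sign behaviour matters):

```python
# Illustrative sketch: a job total computed by summing segment sizes
# straight from the NZB. If every segment carries a negative
# bytes="..." value, the total goes negative too, which is what
# the GUI then displays. The segment count is hypothetical.
segment_bytes = [-728914] * 162

total = sum(segment_bytes)          # -118084068
total_mb = total / (1024 * 1024)    # about -112.6, displayed as a negative MB figure
```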

Re: Decoder failure: Out of memory

Posted: June 21st, 2019, 12:47 am
by sander

Re: Decoder failure: Out of memory

Posted: June 21st, 2019, 1:38 am
by safihre
Very strange.
We do check for negative numbers.. Why doesn't that work? :O

https://github.com/sabnzbd/sabyenc/blob ... enc.c#L564

Re: Decoder failure: Out of memory

Posted: June 21st, 2019, 3:02 am
by Puzzled
Are they using a sabyenc version compiled before this commit? https://github.com/sabnzbd/sabyenc/comm ... 3bbbb0e04b

Re: Decoder failure: Out of memory

Posted: June 21st, 2019, 8:56 am
by sander
Safihre and I found out the cause in the C code of sabyenc. and I've sent a Pull Request https://github.com/sabnzbd/sabyenc/pull/15
After that PR, SAB will no longer give an OoM, and the download will succeed.
If you run from source and know how to compile and install sabyenc (on Linux just one command, or three commands including the git commands), you can use it now. Otherwise wait for a new SAB release.

The real root cause is of course a wrong NZB: a negative article size in the NZB is impossible and illegal. So SABnzbd should give a warning for that, and/or refuse the NZB completely.

Re: Decoder failure: Out of memory

Posted: June 21st, 2019, 10:49 am
by loopdemack
sander wrote: June 21st, 2019, 8:56 am Safihre and I found out the cause in the C code of sabyenc. and I've sent a Pull Request https://github.com/sabnzbd/sabyenc/pull/15
After that PR, SAB will no longer give an OoM, and the download will succeed.
If you run from source and know how to compile and install sabyenc (on Linux just one command, or three commands including the git commands), you can use it now. Otherwise wait for a new SAB release.

The real root cause is of course a wrong NZB: a negative article size in the NZB is impossible and illegal. So SABnzbd should give a warning for that, and/or refuse the NZB completely.
Could you speculate which tool they used to create those crazy NZB files? I blame myself for not backing up the erroneous NZB files that caused the out-of-memory bug, but I stumbled on almost 450 of them on nzbs.org, which is gone now. I will try to ask the poster which tool he used to create such NZBs; we should warn indexing sites.

Re: Decoder failure: Out of memory

Posted: June 22nd, 2019, 1:13 am
by safihre
My fault, guys, I just made a plain old coding bug... Thanks for reporting and for insisting that it was really a bug and not your system! You were right!

Re: Decoder failure: Out of memory

Posted: June 22nd, 2019, 5:24 am
by safihre
I released a new version of SABYenc, 3.3.6.
How do you use SABnzbd, on which platform? I can create a new release for you.

Re: Decoder failure: Out of memory

Posted: June 22nd, 2019, 5:46 am
by loopdemack
I'm using Ubuntu.

Re: Decoder failure: Out of memory

Posted: June 22nd, 2019, 6:09 am
by sander
loopdemack wrote: June 22nd, 2019, 5:46 am I'm using Ubuntu.
The commands to get the fixed sabyenc on your Ubuntu:

Code: Select all

cd
git clone https://github.com/sabnzbd/sabyenc.git
cd sabyenc
git checkout sabyenc-python2 
sudo python setup.py install
Done

Re: Decoder failure: Out of memory

Posted: June 23rd, 2019, 6:01 am
by safihre
(if all the prerequisites are installed, info here how to get those: https://sabnzbd.org/wiki/installation/sabyenc.html )

Re: Decoder failure: Out of memory

Posted: June 23rd, 2019, 2:55 pm
by sander
It looks like @jcfp has already updated his PPA for python-sabyenc with the newest, fixed sabyenc. See https://launchpad.net/~jcfp/+archive/ubuntu/sab-addons ... version 3.3.6-0ubuntu1~jcfp1

Cool! Thank you, @jcfp

So ... Ubuntu users ... just run sudo apt-get update && sudo apt-get upgrade, and your SABnzbd on Ubuntu is protected against the out-of-memory bug.