Wait Fetch Issue

whatwhynd
Newbie
Posts: 1
Joined: May 15th, 2019, 10:40 pm

Wait Fetch Issue

Post by whatwhynd » May 15th, 2019, 10:56 pm

Hi all, new to the forums. I did search and read through quite a few topics, but none seem similar to the issue I'm having.

Running sab on my QNAP as a qpkg, and trying to get Mylar working, but everything Mylar finds on nzbgeek and sends to sab gets stuck in a WAIT/FETCH loop. Everything runs as a qpkg on the QNAP, the indexers are manually added to Mylar, and a "comics" category is set up in both sab and Mylar. (In sab it's just a category with no folders assigned, but the same is true of my other categories and those haven't caused any issues; whether that's because Mylar and Sonarr/Radarr handle things differently, I don't know.)

I've checked all my settings, API keys, folders, etc on NZBgeek, Mylar, and sab multiple times and everything *seems* to be in order, however I'm fairly new to all this so I may have overlooked something.

I can add the nzb files from geek manually without issue, and the files are definitely showing in my Grab History on geek too. Mylar sends the files to sab as soon as I click "Retry", it's just the Wait/Fetch thing I can't seem to figure out.

I've paid for the indexer and provider, I have plenty of space on the QNAP.

Finally, other downloads from Sonarr and Radarr via nzbgeek are working perfectly fine and not giving this issue.

I also ask you to please forgive me in advance for my lack of knowledge on how to do certain things you may ask of me! I'm not proficient, or even familiar, with things such as scripts, Python, terminal commands, SSH, etc. Basically I'm a complete noob when it comes to this stuff, so grateful for GUIs lol.

Apologies for the essay!

safihre
Administrator
Posts: 3304
Joined: April 30th, 2015, 7:35 am
Location: Switzerland

Re: Wait Fetch Issue

Post by safihre » May 16th, 2019, 1:16 am

The difference between Mylar and Radarr/Sonarr is that Mylar will send SABnzbd the URL to download while Radarr/Sonarr actually download the NZB themselves and then send it to us as a file.
So it just seems as if your SABnzbd can't reach NZBGeek. If you switch logging to +Debug in the Status and Interface settings window, you can check the logs (Show Logging button) after it happens again; they might show why it can't fetch the file.
Could be an SSL issue or some other blockade.
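The two hand-off styles can be sketched against SABnzbd's HTTP API (a rough illustration only; the host, port, API key and category below are placeholders, not values from this thread):

```python
# Sketch of the two ways an app can hand SABnzbd a download job.
from urllib.parse import urlencode

SAB_API = "http://localhost:8080/api"   # placeholder host/port
APIKEY = "your-sab-api-key"             # placeholder key

def addurl_request(nzb_url, category):
    """Mylar's style: hand SABnzbd the indexer URL (mode=addurl).
    SABnzbd then has to reach the indexer itself, which is where a
    WAIT/FETCH loop can start if that fetch keeps failing."""
    return SAB_API + "?" + urlencode({
        "mode": "addurl", "name": nzb_url,
        "cat": category, "apikey": APIKEY, "output": "json",
    })

def addfile_request(category):
    """Radarr/Sonarr's style: download the NZB themselves, then POST it
    to this endpoint as a multipart upload (mode=addfile), so SABnzbd
    never has to contact the indexer at all."""
    return SAB_API + "?" + urlencode({
        "mode": "addfile", "cat": category,
        "apikey": APIKEY, "output": "json",
    })
```

In the addurl case, a failure talking to the indexer shows up inside SABnzbd as the Fetching/Retry loop; in the addfile case, the same failure would surface in Radarr/Sonarr instead.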

ChuckyNorris
Newbie
Posts: 3
Joined: June 1st, 2019, 5:49 am

Re: Wait Fetch Issue

Post by ChuckyNorris » June 1st, 2019, 6:12 am

Hi

I'm having a similar problem:
All NZB URLs sent from Sickchill to SABnzbd for the usenet-crawler.### indexer are stuck on fetch with a wait time of 24 hours; URLs for other indexers are working fine.

Log file without debug just shows (for example):
Fetching www.usenet-crawler.###/getnzb/<HASH>a567fa3.nzb&i=343390&r=<HASH>53c799a
Grabbing URL www.usenet-crawler.###/getnzb/<HASH>a567fa3.nzb&i=343390&r=<HASH>53c799a
Retry URL www.usenet-crawler.###/getnzb/<HASH>a567fa3.nzb&i=343390&r=<HASH>53c799a

I've put debugging on now so will see what that comes up with later/tomorrow.

Chuck

safihre
Administrator
Posts: 3304
Joined: April 30th, 2015, 7:35 am
Location: Switzerland

Re: Wait Fetch Issue

Post by safihre » June 3rd, 2019, 1:37 am

Maybe you can send me an example URL at [email protected]?
It might just be that your API-limit has expired. In that case the indexer can send us a special header to tell SABnzbd to wait for X hours before trying again.
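One standard way an HTTP server signals "come back later" is the Retry-After response header; whether this particular indexer uses exactly that header is an assumption in the sketch below, but the 24-hour wait reported above would correspond to a value of 86400 seconds:

```python
def wait_seconds(headers):
    """Read a numeric Retry-After value from a dict of response headers.
    Returns 0 when the header is absent or not a plain number (the
    HTTP-date form of Retry-After is ignored in this sketch)."""
    value = headers.get("Retry-After")
    return int(value) if value and value.isdigit() else 0

# A 24-hour hold-off would arrive as {"Retry-After": "86400"},
# i.e. 86400 seconds = 24 hours.
```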

ChuckyNorris
Newbie
Posts: 3
Joined: June 1st, 2019, 5:49 am

Re: Wait Fetch Issue

Post by ChuckyNorris » June 11th, 2019, 7:53 am

safihre wrote:
June 3rd, 2019, 1:37 am
It might just be that your API-limit has expired. In that case the indexer can send us a special header to tell SABnzbd to wait for X hours before trying again.
Debug log confirmed this. I contacted Usenet Crawler to find out if there is a new 1 download per day limit for free users but haven't got a response. Maybe someone else can confirm/test?

sander
Release Testers
Posts: 6738
Joined: January 22nd, 2008, 2:22 pm

Re: Wait Fetch Issue

Post by sander » June 11th, 2019, 9:26 am

You can check this yourself: open the RSS XML in your web browser (like Chrome) and check what it says.
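For anyone checking by hand, newznab-style indexers typically answer with an `<error>` element instead of a feed when something is wrong. A small sketch of reading that reply (the error code and text below are illustrative, not taken from Usenet Crawler):

```python
import xml.etree.ElementTree as ET

def indexer_error(xml_text):
    """Return (code, description) if a newznab-style indexer answered
    with an <error> element instead of a feed, else None."""
    root = ET.fromstring(xml_text)
    if root.tag == "error":
        return root.get("code"), root.get("description")
    return None

# Illustrative rate-limit reply (exact code/text vary per indexer):
sample = '<error code="500" description="Request limit reached"/>'
```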

ChuckyNorris
Newbie
Posts: 3
Joined: June 1st, 2019, 5:49 am

Re: Wait Fetch Issue

Post by ChuckyNorris » June 20th, 2019, 7:16 am

Sorry for the late reply... I forgot.
I use Sickchill, not RSS...
Still having this issue with UC; I guess they changed their rules to 1 download per day without telling anyone... just weird that no one else is complaining about it.
