Download does not start with SABconnect++

Get help with all aspects of SABnzbd
Forum rules
Help us help you:
  • Are you using the latest stable version of SABnzbd? See the Downloads page.
  • Tell us what system you run SABnzbd on.
  • Adhere to the forum rules.
  • Do you experience problems during downloading?
    Check your connection in the Status and Interface settings window.
    Use Test Server in Config > Servers.
    We will probably ask you to do a test using only basic settings.
  • Do you experience problems during repair or unpacking?
    Enable +Debug logging in the Status and Interface settings window and share the relevant parts of the log here using [ code ] sections.
Ellea
Newbie
Posts: 6
Joined: March 4th, 2016, 9:40 am

Download does not start with SABconnect++

Post by Ellea »

Hello,

I use SABnzbd with Chrome and the extension SABconnect++, but it does not work:

When I send the NZB via SABconnect++ to SABnzbd, I have to wait for the download to start, and SABnzbd shows: "Patientez 45s. Essai de récupération du NZB depuis https://www.binsearch.info/?action=nzb&402809786=1". Sometimes, after a long time, the download starts; sometimes it does not.

Translated, that is: "Wait 45s. Trying to fetch the NZB from https://www.binsearch.info/?action=nzb&402809786=1"

Is it a bad configuration?

Or is SABconnect++ not working correctly?

Thanks for your help :)
sander
Release Testers
Posts: 8811
Joined: January 22nd, 2008, 2:22 pm

Re: Download does not start with SABconnect++

Post by sander »

"Essai de récupération du NZB depuis %s" is "Trying to fetch NZB from %s"

The cause and real reason: binsearch.info is not reachable or not answering. SABnzbd will try again, and as soon as Binsearch delivers the NZB, SABnzbd will start downloading.
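
Roughly, the URL grabber keeps retrying with a growing delay, something like this sketch (not the real SABnzbd code; the 60-second steps match the retry intervals visible in the logs further down):

Code:

import time
import urllib2

def fetch_nzb(url, max_tries=10):
    # Keep asking the indexer; wait 60 s, then 120 s, then 180 s, ... between attempts.
    for attempt in range(1, max_tries + 1):
        try:
            return urllib2.urlopen(url).read()
        except Exception as err:
            delay = 60 * attempt
            print 'No usable response (%s), retry after %d sec' % (err, delay)
            time.sleep(delay)
    return None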
Ellea
Newbie
Posts: 6
Joined: March 4th, 2016, 9:40 am

Re: Download does not start with SABconnect++

Post by Ellea »

Hello Sander,

Thank you for your answer. However, when I put "https://www.binsearch.info/?action=nzb&402809786=1" in Chrome, I get the NZB directly, so binsearch.info is reachable and answering...
safihre
Administrator
Posts: 5339
Joined: April 30th, 2015, 7:35 am

Re: Download does not start with SABconnect++

Post by safihre »

It could be that Binsearch is blocking requests that don't come from the website; they may only allow it when the right cookie is set (just a guess).

I would actually suggest you try an indexer like NZBGeek; it will give you much more joy than Binsearch! You can find all the hidden downloads through it.
If you like our support, check our special newsserver deal or donate at: https://sabnzbd.org/donate
sander
Release Testers
Posts: 8811
Joined: January 22nd, 2008, 2:22 pm

Re: Download does not start with SABconnect++

Post by sander »

I put that Binsearch URL into my SAB (running on an old Ubuntu), and SAB indeed does not fetch the NZB. The sabnzbd.log is interesting:

Code:

2017-12-11 13:57:15,721::DEBUG::[urlgrabber:131] Error "<urlopen error [errno 1] _ssl.c:510: error:14077410:ssl routines:ssl23_get_server_hello:sslv3 alert handshake failure>" trying to get the url https://www.binsearch.info/?action=nzb&402809786=1
Full log:

Code:

2017-12-11 13:57:15,647::INFO::[__init__:550] Fetching https://www.binsearch.info/?action=nzb&402809786=1
2017-12-11 13:57:15,647::DEBUG::[nzbqueue:298] Creating placeholder NZO
2017-12-11 13:57:15,648::INFO::[nzbqueue:277] Saving queue
2017-12-11 13:57:15,648::DEBUG::[__init__:877] [sabnzbd.nzbqueue.save] Saving data for SABnzbd_nzo_blWAxh in /home/sander/.sabnzbd/admin/future
2017-12-11 13:57:15,649::DEBUG::[__init__:948] [sabnzbd.nzbqueue.save] Saving data for queue10.sab in /home/sander/.sabnzbd/admin/queue10.sab
2017-12-11 13:57:15,650::INFO::[urlgrabber:124] Grabbing URL https://www.binsearch.info/?action=nzb&402809786=1
2017-12-11 13:57:15,721::DEBUG::[urlgrabber:131] Error "<urlopen error [errno 1] _ssl.c:510: error:14077410:ssl routines:ssl23_get_server_hello:sslv3 alert handshake failure>" trying to get the url https://www.binsearch.info/?action=nzb&402809786=1
2017-12-11 13:57:15,721::DEBUG::[urlgrabber:325] No usable response from indexer, retry after 60 sec
2017-12-11 13:57:15,721::INFO::[urlgrabber:203] Retry URL https://www.binsearch.info/?action=nzb&402809786=1
2017-12-11 13:58:15,786::INFO::[urlgrabber:124] Grabbing URL https://www.binsearch.info/?action=nzb&402809786=1
2017-12-11 13:58:15,804::DEBUG::[urlgrabber:131] Error "<urlopen error [errno 1] _ssl.c:510: error:14077410:ssl routines:ssl23_get_server_hello:sslv3 alert handshake failure>" trying to get the url https://www.binsearch.info/?action=nzb&402809786=1
2017-12-11 13:58:15,804::DEBUG::[urlgrabber:325] No usable response from indexer, retry after 120 sec
2017-12-11 13:58:15,805::INFO::[urlgrabber:203] Retry URL https://www.binsearch.info/?action=nzb&402809786=1
2017-12-11 14:00:15,936::INFO::[urlgrabber:124] Grabbing URL https://www.binsearch.info/?action=nzb&402809786=1
2017-12-11 14:00:15,950::DEBUG::[urlgrabber:131] Error "<urlopen error [errno 1] _ssl.c:510: error:14077410:ssl routines:ssl23_get_server_hello:sslv3 alert handshake failure>" trying to get the url https://www.binsearch.info/?action=nzb&402809786=1
2017-12-11 14:00:15,950::DEBUG::[urlgrabber:325] No usable response from indexer, retry after 180 sec
2017-12-11 14:00:15,950::INFO::[urlgrabber:203] Retry URL https://www.binsearch.info/?action=nzb&402809786=1
sander
Release Testers
Posts: 8811
Joined: January 22nd, 2008, 2:22 pm

Re: Download does not start with SABconnect++

Post by sander »

CLI testing on the old Ubuntu (14.04):

Code:

sander@haring:~$ python -c "import urllib2; f = urllib2.urlopen('https://www.binsearch.info/?action=nzb&402809786=1'); print f.read()[:100] "
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/usr/lib/python2.7/urllib2.py", line 127, in urlopen
    return _opener.open(url, data, timeout)
  File "/usr/lib/python2.7/urllib2.py", line 404, in open
    response = self._open(req, data)
  File "/usr/lib/python2.7/urllib2.py", line 422, in _open
    '_open', req)
  File "/usr/lib/python2.7/urllib2.py", line 382, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.7/urllib2.py", line 1222, in https_open
    return self.do_open(httplib.HTTPSConnection, req)
  File "/usr/lib/python2.7/urllib2.py", line 1184, in do_open
    raise URLError(err)
urllib2.URLError: <urlopen error [Errno 1] _ssl.c:510: error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure>
So: reproducible with an old Ubuntu 14.04.
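
(That sslv3 alert handshake failure is typical of a Python build without client-side SNI support; Python 2.7 only gained that in 2.7.9, and Ubuntu 14.04 ships 2.7.6.) For reference, a quick way to see which Python and OpenSSL a machine has, assuming a Python 2 install like the ones above:

Code:

python -c "import sys, ssl; print sys.version; print ssl.OPENSSL_VERSION"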
sander
Release Testers
Posts: 8811
Joined: January 22nd, 2008, 2:22 pm

Re: Download does not start with SABconnect++

Post by sander »

On Ubuntu 17.10:
- SAB can download
- the CLI gives a 403 Forbidden. So ... binsearch is blocking the plain Python CLI ... >:(

Code:

sander@sammie-1710:~$ python -c "import urllib2; f = urllib2.urlopen('https://www.binsearch.info/?action=nzb&402809786=1'); print f.read()[:100] "
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/usr/lib/python2.7/urllib2.py", line 154, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib/python2.7/urllib2.py", line 435, in open
    response = meth(req, response)
  File "/usr/lib/python2.7/urllib2.py", line 548, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib/python2.7/urllib2.py", line 473, in error
    return self._call_chain(*args)
  File "/usr/lib/python2.7/urllib2.py", line 407, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.7/urllib2.py", line 556, in http_error_default
    raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
urllib2.HTTPError: HTTP Error 403: Forbidden
sander@sammie-1710:~$
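A plausible reason for the 403: the bare one-liner goes out with urllib2's default User-Agent ('Python-urllib/2.7'), and Binsearch seems to reject that; adding a different User-Agent, as in the next post, avoids it. To see the default header urllib2 sends (just an illustration, not from the original posts):

Code:

python -c "import urllib2; print urllib2.build_opener().addheaders"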
safihre
Administrator
Posts: 5339
Joined: April 30th, 2015, 7:35 am

Re: Download does not start with SABconnect++

Post by safihre »

It works on Windows... Maybe it is OpenSSL/OS specific?
You do indeed have to add headers:

Code:

import urllib2

# Use a real User-Agent instead of urllib2's default 'Python-urllib/x.y',
# which Binsearch appears to reject with a 403.
req = urllib2.Request('https://www.binsearch.info/?action=nzb&402809786=1')
req.add_header('User-Agent', 'SABnzbd')
print urllib2.urlopen(req).read()[:100]
If you like our support, check our special newsserver deal or donate at: https://sabnzbd.org/donate
sander
Release Testers
Posts: 8811
Joined: January 22nd, 2008, 2:22 pm

Re: Download does not start with SABconnect++

Post by sander »

@Ellea:

On which device / Operating system is your SABnzbd running?
In SABnzbd, if you turn on DEBUG, can you inspect sabnzbd.log and post the relevant lines here (the lines containing 'binsearch')?
sander
Release Testers
Posts: 8811
Joined: January 22nd, 2008, 2:22 pm

Re: Download does not start with SABconnect++

Post by sander »

Some more testing with safihre's code:

Works well on Ubuntu 17.10:

Code:

sander@sammie-1710:~$ python -c "import urllib2; req = urllib2.Request('https://www.binsearch.info/?action=nzb&402809786=1'); req.add_header('User-Agent', 'SABnzbd'); print urllib2.urlopen(req).read()[:200] "
<?xml version="1.0" encoding="iso-8859-1" ?>
<!DOCTYPE nzb PUBLIC "-//newzBin//DTD NZB 1.0//EN" "http://www.newzbin.com/DTD/nzb/nzb-1.0.dtd">
<!-- NZB Generated by Binsearch.info -->
<nzb xmlns="ht
sander@sammie-1710:~$ 

SSL handshake error on the old Ubuntu 14.04 with its old Python 2.7.6:

Code:

sander@haring:~$ python -c "import urllib2; req = urllib2.Request('https://www.binsearch.info/?action=nzb&402809786=1'); req.add_header('User-Agent', 'SABnzbd'); print urllib2.urlopen(req).read()[:200] "
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/usr/lib/python2.7/urllib2.py", line 127, in urlopen
    return _opener.open(url, data, timeout)
  File "/usr/lib/python2.7/urllib2.py", line 404, in open
    response = self._open(req, data)
  File "/usr/lib/python2.7/urllib2.py", line 422, in _open
    '_open', req)
  File "/usr/lib/python2.7/urllib2.py", line 382, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.7/urllib2.py", line 1222, in https_open
    return self.do_open(httplib.HTTPSConnection, req)
  File "/usr/lib/python2.7/urllib2.py", line 1184, in do_open
    raise URLError(err)
urllib2.URLError: <urlopen error [Errno 1] _ssl.c:510: error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure>
sander@haring:~$
OK on the same old Ubuntu 14.04 with a newer Python 2.7.11:

Code:

sander@haring:~$ python2711 -c "import urllib2; req = urllib2.Request('https://www.binsearch.info/?action=nzb&402809786=1'); req.add_header('User-Agent', 'SABnzbd'); print urllib2.urlopen(req).read()[:200] "
<?xml version="1.0" encoding="iso-8859-1" ?>
<!DOCTYPE nzb PUBLIC "-//newzBin//DTD NZB 1.0//EN" "http://www.newzbin.com/DTD/nzb/nzb-1.0.dtd">
<!-- NZB Generated by Binsearch.info -->
<nzb xmlns="ht
sander@haring:~$ 
sander
Release Testers
Posts: 8811
Joined: January 22nd, 2008, 2:22 pm

Re: Download does not start with SABconnect++

Post by sander »

@safihre:

This could be useful in SAB as a checker/warning: ssl.HAS_SNI. See the results:

Code:

sander@haring:~$ python -c "import ssl; print ssl.HAS_SNI"
Traceback (most recent call last):
  File "<string>", line 1, in <module>
AttributeError: 'module' object has no attribute 'HAS_SNI'

Code:

sander@haring:~$
sander@haring:~$ python2711 -c "import ssl; print ssl.HAS_SNI"
True

Code:

sander@nanopineo2:~$ python -c "import ssl; print ssl.HAS_SNI"
True
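
A defensive version of such a check that won't crash on old Pythons, just as a sketch (not actual SABnzbd code):

Code:

import logging
import ssl

# getattr() avoids the AttributeError seen above on old builds (e.g. Python 2.7.6)
# where ssl.HAS_SNI does not exist yet; those builds cannot send SNI either.
if not getattr(ssl, 'HAS_SNI', False):
    logging.warning('This Python has no SNI support; fetching NZBs from some '
                    'HTTPS indexers may fail')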
Ellea
Newbie
Posts: 6
Joined: March 4th, 2016, 9:40 am

Re: Download does not start with SABconnect++

Post by Ellea »

Hello,

On which device / Operating system is your SABnzbd running?
Club Linux Ubuntu


In SABnzbd, if you turn on DEBUG, can you inspect sabnzbd.log and post the relevant lines here (the lines containing 'binsearch')?

Code:

2017-12-11 21:15:45,915::DEBUG::[interface:420] API-call from XX [Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/XX Safari/537.36] {'output': 'json', 'apikey': 'XX', 'limit': '5', 'mode': 'queue'}
2017-12-11 21:15:45,916::INFO::[__init__:499] Fetching https://www.binsearch.info/?action=nzb&402809786=1
2017-12-11 21:15:45,928::INFO::[nzbqueue:211] Saving queue
2017-12-11 21:15:45,929::DEBUG::[__init__:829] Saving data for SABnzbd_nzo_NKf2Yf in /home/sabnzbdplus/.sabnzbd/admin/future
2017-12-11 21:15:45,931::INFO::[__init__:904] Saving data for queue9.sab in /home/sabnzbdplus/.sabnzbd/admin/queue9.sab
2017-12-11 21:15:45,933::DEBUG::[notifier:112] Sending registration to localhost:23053
2017-12-11 21:15:45,934::DEBUG::[notifier:219] To : localhost:23053 <<class 'gntp.GNTPRegister'>>
2017-12-11 21:15:45,935::DEBUG::[growler:171] Cannot register with Growl [Errno 111] Connection refused
2017-12-11 21:15:46,436::INFO::[urlgrabber:116] Grabbing URL https://www.binsearch.info/?action=nzb&402809786=1
2017-12-11 21:15:46,666::DEBUG::[urlgrabber:356] No response from indexer, retry after 60 sec
2017-12-11 21:15:46,667::INFO::[urlgrabber:179] Retry URL https://www.binsearch.info/?action=nzb&402809786=1
2017-12-11 21:15:47,219::DEBUG::[interface:420] API-call from XX [Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/XX Safari/537.36] {'output': 'json', 'apikey': 'XX', 'limit': '5', 'mode': 'queue'}
2017-12-11 21:15:47,223::DEBUG::[interface:420] API-call from XX [Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/XX Safari/537.36] {'output': 'json', 'apikey': 'XX', 'limit': '10', 'mode': 'history'}
I have replaced the IP and API key with XX.
sander
Release Testers
Posts: 8811
Joined: January 22nd, 2008, 2:22 pm

Re: Download does not start with SABconnect++

Post by sander »

"Club Linux Ubuntu"? Never heard of. Can't find it via Google.


2017-12-11 21:15:46,436::INFO::[urlgrabber:116] Grabbing URL https://www.binsearch.info/?action=nzb&402809786=1
2017-12-11 21:15:46,666::DEBUG::[urlgrabber:356] No response from indexer, retry after 60 sec

Ah: "No response from indexer" ... that is NOT the problem I can reproduce on my old Ubuntu.

Questions:
1) What is the Python version on that Ubuntu machine? Check via SABnzbd -> Config.
2) Is there a firewall on the Linux system?
3) What do you get if you run the one-liner on your Linux, i.e.:

Code:

python -c "import urllib2; req = urllib2.Request('https://www.binsearch.info/?action=nzb&402809786=1'); req.add_header('User-Agent', 'SABnzbd'); print urllib2.urlopen(req).read()[:200] "