Articles falsely labeled missing due to slow propagation
Posted: October 22nd, 2014, 3:00 pm
Lately I have been noticing more and more releases where the nzb appears on newznab providers long before all of the articles have propagated to the major usenet servers. I believe some uploaders post to more obscure usenet servers that propagate to others very slowly. In some cases, full propagation can take well over an hour. This has been the case for many episodes lately, affecting astraweb, blocknews and xsusenet (the ones I tested).
The issue is that if an article is not on the usenet server yet, sab marks it as missing, which is the expected behavior. However, the download then fails at the end because too many articles are missing. But if I simply delete the nzb, re-add it and let it re-download an hour later, all the articles are there and it completes successfully.
My suggested solution is one of two options: 1) sab does not mark articles permanently missing if the nzb is relatively new, and retries the missing articles at the end; or, less intrusively, 2) after the download fails due to missing articles, when the user clicks retry, sab actually looks for the missing articles one more time. I guarantee it will find them unless the nzb was malformed.
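To illustrate option 1, here is a minimal sketch of the idea: an article that is not found is only marked permanently missing once the nzb is old enough that slow propagation can no longer explain its absence. All names here (`try_article`, `PROPAGATION_WINDOW`) are invented for illustration and are not actual SABnzbd internals.

```python
# Sketch of option 1: treat a not-found article as "retry later"
# while the nzb is still young, and only as permanently missing
# once a propagation window has passed. Hypothetical logic only.

PROPAGATION_WINDOW = 2 * 3600  # assumed window (seconds) for full propagation


def try_article(server_has_article, nzb_posted_at, now):
    """Classify one article fetch attempt: 'ok', 'retry_later', or 'missing'."""
    if server_has_article:
        return "ok"
    # Article not found: if the nzb is young, slow propagation may be
    # the cause, so defer the article instead of failing the download.
    if now - nzb_posted_at < PROPAGATION_WINDOW:
        return "retry_later"
    # The nzb is old enough that the article is genuinely gone.
    return "missing"
```

Articles classified as `retry_later` would be re-queued at the end of the download, which matches the manual workaround of re-adding the nzb an hour later.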
I think the second option would be a good compromise for the devs, as it is not too intrusive, and although it requires manual interaction, it can easily be done with a simple button press in the web interface; many apps that can control sab can also send a retry request.
Another solution (a band-aid, really) would be to delay the download by an hour or two to make sure all of the articles have propagated. But I don't think that is a good solution, because takedown requests are more common these days, so it is advantageous to download the articles as soon as possible before they can be taken down.
Thanks