Advanced Duplicate Processing

zymurgist
Newbie
Posts: 4
Joined: June 18th, 2015, 7:22 am

Advanced Duplicate Processing

Post by zymurgist »

I like the improvement made to detect duplicate episodes in series, but I have some ideas on how to take the duplicate detection further.

1. If "Pause on duplicates" is selected, resume the duplicate if the primary fails. With more and more failures due to missing articles these days, being a little smarter here would be huge.
2. If all duplicates fail for the same NZB name (this is more complex), would it be possible to use articles from multiple dupe NZBs to reconstruct the destination file?

The first one seems pretty straightforward, but the second would be complex and powerful. There are more and more reposts these days due to articles being taken down by Usenet providers. It would be awesome to be able to reconstruct the file from multiple reposts of the same NZB.

Thank you!
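
In the meantime, #1 can be approximated from outside SABnzbd with a small watcher script against its JSON API. This is only a rough sketch: it assumes the standard mode=history and mode=queue calls plus per-item resume by nzo_id, the slot field names may differ between versions, and HOST/APIKEY are placeholders for your own setup.

Code: Select all

# Rough sketch: resume a paused duplicate whenever a job with the
# same name shows up as Failed in the history. Assumes SABnzbd's
# JSON API (mode=history, mode=queue, per-item resume by nzo_id);
# field names may differ between versions.
import json
import time
import urllib.parse
import urllib.request

HOST = "http://localhost:8080/sabnzbd/api"  # placeholder: your install
APIKEY = "your-api-key"                     # placeholder: Config -> General

def api(params):
    params.update({"output": "json", "apikey": APIKEY})
    with urllib.request.urlopen(HOST + "?" + urllib.parse.urlencode(params)) as resp:
        return json.load(resp)

def resume_duplicates_of_failures():
    # Names of history entries that failed (e.g. missing articles)
    failed = {slot["name"]
              for slot in api({"mode": "history"})["history"]["slots"]
              if slot["status"] == "Failed"}
    # Resume any paused queue item whose name matches a failed job
    for slot in api({"mode": "queue"})["queue"]["slots"]:
        if slot["status"] == "Paused" and slot["filename"] in failed:
            api({"mode": "queue", "name": "resume", "value": slot["nzo_id"]})

while True:
    resume_duplicates_of_failures()
    time.sleep(300)  # poll every five minutes

Running the same function from cron would work just as well as the polling loop.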
ALbino
Full Member
Posts: 214
Joined: October 23rd, 2014, 12:28 am

Re: Advanced Duplicate Processing

Post by ALbino »

I do the second one manually sometimes. If it's broken across multiple uploads/cross-posts, then I'll copy them into a temp directory, keeping the largest RAR from each set. Often, by doing that, you can get enough to repair.
shypike
Administrator
Posts: 19774
Joined: January 18th, 2008, 12:49 pm

Re: Advanced Duplicate Processing

Post by shypike »

zymurgist wrote: 2. If all duplicates fail for the same NZB name (this is more complex), would it be possible to use articles from multiple dupe NZBs to reconstruct the destination file?
#2 is not worth the effort.
If the NZBs refer to different uploads then it's unlikely that the combined parts verify.
Maybe sometimes it works, but there's no certainty at all.
Potentially wasting even more bandwidth.

#1 is interesting, but I think you'll be better off with a front-end like SickBeard, which does this already.
zymurgist
Newbie
Posts: 4
Joined: June 18th, 2015, 7:22 am

Re: Advanced Duplicate Processing

Post by zymurgist »

ALbino wrote: I do the second one manually sometimes. If it's broken across multiple uploads/cross-posts, then I'll copy them into a temp directory, keeping the largest RAR from each set. Often, by doing that, you can get enough to repair.
I was hoping it was doable in some way. Manual is fine since it's not something that happens often. Care to provide a step-by-step how-to? Thanks!
zymurgist
Newbie
Posts: 4
Joined: June 18th, 2015, 7:22 am

Re: Advanced Duplicate Processing

Post by zymurgist »

shypike wrote:
zymurgist wrote: 2. If all duplicates fail for the same NZB name (this is more complex), would it be possible to use articles from multiple dupe NZBs to reconstruct the destination file?
#2 is not worth the effort.
If the NZBs refer to different uploads then it's unlikely that the combined parts verify.
Maybe sometimes it works, but there's no certainty at all.
Potentially wasting even more bandwidth.

#1 is interesting, but I think you'll be better off with a front-end like SickBeard, which does this already.
Thanks. I'll have to take a look at SickBeard. I'm not familiar with it.
ALbino
Full Member
Posts: 214
Joined: October 23rd, 2014, 12:28 am

Re: Advanced Duplicate Processing

Post by ALbino »

zymurgist wrote:
ALbino wrote: I do the second one manually sometimes. If it's broken across multiple uploads/cross-posts, then I'll copy them into a temp directory, keeping the largest RAR from each set. Often, by doing that, you can get enough to repair.
I was hoping it was doable in some way. Manual is fine since it's not something that happens often. Care to provide a step-by-step how-to? Thanks!
Well, there's not much of a step-by-step, but essentially you just 1) download them both individually, 2) put one of them in a temp folder, then 3) copy the second one into the temp folder as well. Either a file will copy successfully because it was missing altogether, or you'll be prompted to replace the existing file. If the one you're copying is larger, say yes; if it's smaller, say no. Eventually you end up with the best possible RARs and PARs from each separate posting. This "trick" has really only been useful a handful of times, but it does work on occasion if the uploader is using the same RAR and PAR sets.

A similarly good way to go about it is to assume the NZB from the indexer is bad and just download the specific broken RARs/PARs from Binsearch/NZBClub or whatever.
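
The merge-keeping-largest step from the how-to above is easy to script. Here's a hypothetical sketch, with post1/post2/temp as placeholder directories for the two extracted downloads and the temp folder; it keeps the larger copy on every name collision and then hands the combined set to the par2 command-line tool (which must be installed) for repair.

Code: Select all

# Hypothetical sketch of the merge trick above. post1/post2/temp are
# placeholder directories; par2 (par2cmdline) must be installed.
import shutil
import subprocess
from pathlib import Path

def merge_keep_largest(src: Path, dest: Path):
    dest.mkdir(parents=True, exist_ok=True)
    for f in src.iterdir():
        target = dest / f.name
        # Copy when missing; replace only if this copy is larger
        if not target.exists() or f.stat().st_size > target.stat().st_size:
            shutil.copy2(f, target)

temp = Path("temp")
merge_keep_largest(Path("post1"), temp)  # first posting
merge_keep_largest(Path("post2"), temp)  # repost / cross-post

# Let par2 verify and repair the combined set
main_par2 = next(temp.glob("*.par2"), None)
if main_par2:
    subprocess.run(["par2", "repair", str(main_par2)])

The only thing that matters is comparing sizes before overwriting, exactly like answering yes/no to the replace prompts by hand.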