Repair failed, not enough repair blocks but works fine in quickpar?
Posted: April 7th, 2010, 1:13 pm
Version: (Ex: 0.5.0 Final)
OS: ClearOS
Install-type: python source
Skin (if applicable): Smpl-white
Firewall Software: None
Are you using IPV6? no
Is the issue reproducible? yes
Hello
I've installed SABnzbd on a ClearOS installation using this guide:
http://www.clearfoundation.com/componen ... w/id,6380/
It works pretty well so far. However, after a download finishes, SABnzbd first verifies it and then starts repairing it, but somewhere along the way it claims there are not enough repair blocks and places the files in my 'completed' folder.
Now the thing is, if I copy the entire directory to my Windows PC and run QuickPar from there, it finds plenty of repair blocks and repairs the set just fine.
The .rar files are usually archives in the neighbourhood of 25-50 GB per RAR set, and I'm running this on a dual-core Atom, which means it takes around 12-18 hours to repair a full RAR set of that size.
My Usenet provider is Giganews.
I was wondering whether my CPU simply isn't fast enough to repair archives of that magnitude, or whether I need to do something else.
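In case it helps with diagnosing this, I can also run the repair by hand on the ClearOS box and compare the result with QuickPar. A minimal sketch, assuming the par2 binary that SABnzbd calls is on my PATH, and using a hypothetical path for the set:

    cd /path/to/completed/jobfolder
    par2 repair archive.par2

If that also reports not enough repair blocks, the problem would presumably be in the par2 binary or the files on disk rather than in SABnzbd itself.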
Thanks