Quote Originally Posted by Speederlander
Not sure. Several people have pointed it out on the quickpar forum so the developer(s) know about it. I find it's a minor limitation really, because I use quickpar with recovery on huge files mostly when I am going to archive them or do secondary back-ups. Besides, quickpar seems to be the only game in town for windows with this level of functionality. Plus it's free. So as long as I have a work-around that gets me data security I'm good.

My process for large files is:
1. Create consistency check data with quickpar on the original file.
2. Create a rar'd version in the secondary back-up location.
3. Add consistency and recovery information with quickpar to the rar'd files.

I can verify the original is good, verify the back-up is good, and recover the back-up if it becomes damaged. I can then verify the consistency of the reconstituted file with the original consistency check data back on the source.
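Roughly that three-step process can also be scripted; here's a sketch driving par2cmdline (the command-line counterpart of QuickPar, which itself is GUI-only) from Python. The file names, paths, the 10% redundancy figure and the rar invocation are placeholders I made up, not anything from the post above:

import subprocess

SOURCE = "bigfile.iso"                 # hypothetical original file
ARCHIVE = "D:/backup/bigfile.rar"      # hypothetical secondary back-up location

# 1. Consistency-check / recovery data for the original file.
subprocess.run(["par2", "create", "-r10", SOURCE + ".par2", SOURCE], check=True)

# 2. Rar'd copy in the back-up location.
subprocess.run(["rar", "a", ARCHIVE, SOURCE], check=True)

# 3. Consistency-check / recovery data for the rar'd copy.
subprocess.run(["par2", "create", "-r10", ARCHIVE + ".par2", ARCHIVE], check=True)

# Later: verify either side, and repair the back-up if it gets damaged.
subprocess.run(["par2", "verify", SOURCE + ".par2"])
subprocess.run(["par2", "repair", ARCHIVE + ".par2"])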
I do all my backups in batches, so QuickPar is almost useless for me. PhPar does the same thing more conveniently and is faster.
I don't use it yet, and I still haven't decided whether to start or to wait for FreeArc - it supports ECC, but it's probably a year or so away from being really stable.

http://sourceforge.net/docman/displa...group_id=30568
It seems par2 uses 128 bits for practically everything, so the file spec shouldn't be what limits anything.

BTW I see that QuickPar does it, but IMO it's incorrect to say "100% coverage" when your parity equals the data size. It's really 50% (actually slightly less), because errors can land in the parity too, and damaged parity blocks can't be used to repair the main data.
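To put rough numbers on it (the block counts below are made up, and this ignores PAR2's own packet overhead):

# Toy numbers: "parity = data size", which QuickPar labels 100% coverage.
DATA_BLOCKS = 1000
RECOVERY_BLOCKS = 1000

def recoverable(damaged_data, damaged_recovery):
    # PAR2 needs one *intact* recovery block per damaged/missing data block.
    intact_recovery = RECOVERY_BLOCKS - damaged_recovery
    return damaged_data <= intact_recovery

print(recoverable(1000, 0))   # True:  all data blocks lost, parity untouched
print(recoverable(600, 500))  # False: only 55% of all stored blocks damaged, yet unrecoverable
print(recoverable(500, 500))  # True:  damage up to half of the whole set is always repairable

So the guaranteed coverage, measured against everything you actually store, tops out at about half.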