[QUOTE=CollegiateMafia;92095]About 3/4 of what I'm doing in the range I just reserved is marked as duplicate.
Is this usual?[/QUOTE] Yes, it is. These are factors for k-n pairs that the client finds for free (without spending extra CPU time), and since it is aware that there are already known factors for these k-n pairs, it marks them as duplicates. It should have been like this with your first range already. There are out-of-range factors, too; again, these come for free. If you can, send all three files (fact.txt, factexcl.txt and factrange.txt) to [email]psp@ldausch.de[/email] -- it is better to have too much than too little. H.
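Just to illustrate the split being described, here is a minimal sketch of how a sieve client might sort its factors into the three files. The "p | k*2^n+1" line format, the function name, and the classification logic are assumptions for illustration, not the actual client's code:

```python
# Hypothetical sketch: route each factor line to fact.txt, factexcl.txt,
# or factrange.txt. Assumes NewPGen-style lines "p | k*2^n+1" and that
# known_pairs holds the (k, n) pairs that already have a known factor.

N_MAX = 50_000_000  # assumed upper bound of the sieved n-range (50M)

def classify(line, known_pairs, n_max=N_MAX):
    """Return the name of the file a factor line belongs in."""
    p_str, expr = line.split("|")
    k = int(expr.split("*")[0])
    n = int(expr.split("^")[1].split("+")[0])
    if n > n_max:
        return "factrange.txt"   # n beyond the sieved range: kept for the future
    if (k, n) in known_pairs:
        return "factexcl.txt"    # duplicate: this pair is already eliminated
    return "fact.txt"            # genuinely new factor
```

For example, a factor for a pair with n above 50M would land in factrange.txt, while a factor for a pair that already has one would be excluded as a duplicate.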
Small correction: I only use fact.txt and factrange.txt.
The factrange files are only there in case we want to sieve higher than 50M in the far future, so there is no real need for them at the moment. Cheers, Lars
I thought factexcl could be used for gap-detection purposes.
But perhaps the factor density is still so high that there is no need. H.
@ hhh,
yes and yes. What happens is: if there is an unusually large gap in the data, Joe or I can go back and look at the user's submitted factexcl.txt file. If factors are found in factexcl.txt over that portion, it's more than likely there was no error and the gap is real. It's a lot of storage to archive and perhaps not worth it; in the end it's Lars's call. At one point we were considering having only one factor file: fact.txt.
[QUOTE=hhh;92122]I thought factexcl could be used for gap-detection purposes.
But perhaps the factor density is still so high that there is no need. H.[/QUOTE] Since it did not occur to me to store the other factors for gap detection when I designed the database, that data is only stored in the filesystem for archiving, and I have no way to analyse it. When we moved from the 20M upper bound to the 50M upper bound, I took all the range files I had and imported them to remove lots of factors very quickly. If I had thought it through before building the DB, another design would have been better, but now it is too late to redo everything. Lars
Lars, I think your system is just fine as it is, and I wouldn't change a thing.
I'm not sure how much factexcl.txt has helped us in the past. Generally the gaps are so small that we simply redo them. In the past we have also found missed factors through P-1 where gaps were not expected, but that was quite some time ago, and hopefully all of those problems have been solved with Joe's new client, etc. I wouldn't worry too much about gaps; they are pretty easy to find using fact.txt alone, especially with PSP, and even more so with the combined dat, since the factor density is very high, as HHH pointed out.
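A minimal sketch of the gap check being described: sort the submitted factors by p and flag any stretch of the sieved p-range with no reported factor at all. The line format, function name, and threshold are assumptions for illustration; given the high factor density mentioned here, a genuine hole would show up as an outsized gap.

```python
# Hypothetical sketch: find suspicious empty stretches in a submitted
# fact.txt. Assumes NewPGen-style lines "p | k*2^n+1" and a caller-chosen
# max_gap threshold tuned to the expected factor density.

def find_gaps(lines, p_min, p_max, max_gap):
    """Yield (start, end) intervals within [p_min, p_max] that are longer
    than max_gap and contain no reported factor."""
    ps = sorted(int(line.split("|")[0]) for line in lines)
    prev = p_min
    for p in ps + [p_max]:
        if p - prev > max_gap:
            yield (prev, p)
        prev = p
```

With a dense factor stream, any interval this flags is worth cross-checking against the user's factexcl.txt, as described earlier in the thread.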