#67
Sep 2004
533₁₀ Posts
I found that if I set d=5 I get about 2% faster sieving. Is it OK to change the d value, or can this lead to faulty results? Should I leave it at the default?
P.S. There should be an edit button that lasts for more than a few minutes.

Last fiddled with by Joshua2 on 2006-03-11 at 06:03
#68
Nov 2003
2·1,811 Posts
You can keep using your best d; no false results should occur. If you want, you can check the found factors: convert each line of the .del file (using emacs, for example) as follows.
Original line: 23004155641343 | 2995125705*2^932764-1
Converted to:  (2995125705*2^932764-1)%23004155641343

Then feed the converted .del file to pfgw. All results should be "Zero".
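If emacs isn't handy, the same conversion is easy to script, and each factor can be verified directly with modular exponentiation at the same time. A minimal Python sketch, assuming every .del line has the `p | k*2^n-1` shape shown above (the file names are hypothetical):

```python
# convert_del.py -- a sketch only; assumes the .del format shown above,
# i.e. lines of the form "p | k*2^n-1". File names are hypothetical.

with open("riesel.del") as src, open("check.txt", "w") as dst:
    for line in src:
        line = line.strip()
        if not line:
            continue
        p_str, expr = (part.strip() for part in line.split("|"))

        # The pfgw input line, exactly as described above: (k*2^n-1)%p
        dst.write(f"({expr})%{p_str}\n")

        # Independent check without pfgw: k*2^n-1 must be 0 mod p.
        p = int(p_str)
        k_str, n_str = expr.split("*2^")
        n = int(n_str.rsplit("-", 1)[0])   # strip the trailing "-1"
        if (int(k_str) * pow(2, n, p) - 1) % p != 0:
            print("factor does NOT divide:", line)
```

The modular check makes the script self-contained, though running pfgw over the converted file as described above is still a good independent confirmation.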
#69
Sep 2004
13×41 Posts
OK, I used d=5 but did not check my results with pfgw, assuming you do that anyway. I finished that section, taking it to 2500.
#70
Sep 2004
533₁₀ Posts
I have sent you all the del files now, through 25T. Some of the ones in this batch I had sent to you before they were finished, in case you ran out of sieved files, but all is complete now. Can you post the command line to enter for sieving higher n's, like the next step you mentioned?
#71
Nov 2003
111000100110₂ Posts
All right, I'll check the files now. Duplicate factors (if any) are handled automatically by the merge program, so that's not a problem at all.
We are moving to the n=1-2M range! Please use the command above, with -p=50-100bn. I already marked in the table that you are working on it and will update the line later. I'll sieve to 50bn because the del file will be large.
#72
Sep 2004
13×41 Posts
Taking n 100-200 from p 50b-200b. I found that d=3 was fastest on one comp and d=4 on another. Is it OK that the hash overflows are many and go into the first, second, and third columns? At what column is it unsafe? Also, often about a third of what is on the screen is "! Erk2H". Does the frequency of these matter? A missed factor would take quite a bit of LLR time.
#73
Nov 2003
7046₈ Posts
All right. Please use the same del file name for your whole range; the fewer files the better. As for the Erk2H messages, just ignore them, along with the particulars about hash slots used; Phil, who wrote ksieve, says they don't matter. I'll do a verification for small p's, comparing the output to NewPGen. But for such a large k, ksieve is faster, which is why we are using it.
BTW, my guess is that the numerous Erk2H messages are somehow related to the large range of n and/or the large n_max; there were fewer messages for n<1M.
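For the cross-check itself, one simple approach is to diff the survivors reported by the two sieves. A sketch only: the one-candidate-per-line files below are assumptions, not the actual ksieve or NewPGen output formats, each of which would need a small format-specific parser.

```python
# compare_sieves.py -- diff the survivors of two sieving runs.
# A sketch under assumed formats: each file lists one surviving n per line.
# Real ksieve / NewPGen output would need a format-specific parser.

def load(path):
    with open(path) as f:
        return {int(line) for line in f if line.strip()}

ksieve = load("ksieve_survivors.txt")      # hypothetical file name
newpgen = load("newpgen_survivors.txt")    # hypothetical file name

# An n removed by one sieve but kept by the other points to a missed
# (or bogus) factor; those candidates deserve a manual pfgw check.
print("only ksieve kept :", sorted(ksieve - newpgen))
print("only NewPGen kept:", sorted(newpgen - ksieve))
```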
#74
Sep 2004
13·41 Posts
It's a bit hard to use the same del file, since I'm using three different processors, but I'll try to keep the number of files to a minimum. I've gotten hash overflows into the fourth column now, but from what you said it seems it wouldn't matter even if they made it into the last column. Sieving in progress...
#75
Jun 2004
2·53 Posts
Sieving 200bn-205bn.
#76
Nov 2003
2×1,811 Posts
Templus, Joshua is sieving to 300bn but the page is not updated. If you want to do it, can you please try 300-310bn? You can do 10bn quickly. Thank you.
#77
Sep 2004
13×41 Posts
Completed sieving to 1 T.
Last fiddled with by Kosmaj on 2006-05-11 at 23:36