mersenneforum.org

mersenneforum.org (https://www.mersenneforum.org/index.php)
-   Conjectures 'R Us (https://www.mersenneforum.org/forumdisplay.php?f=81)
-   -   Riesel base 3 reservations/statuses/primes (https://www.mersenneforum.org/showthread.php?t=11151)

rogue 2019-01-30 18:15

I found and fixed a [URL="https://www.mersenneforum.org/showthread.php?p=507179#post507179"]bug in sr2sieve[/URL]. Based upon some comparisons with sieving these ranges of R3, I strongly recommend using sr2sieve with -x instead of srsieve to sieve these ranges, if you can. What would be nice is to find out which k can't be done and split them off so that srsieve is used for those and sr2sieve for the others. Alternatively maybe sr2sieve can be modified to handle larger k. That is much easier said than done.

rogue 2019-01-30 20:32

[QUOTE=rogue;507180]I found and fixed a [URL="https://www.mersenneforum.org/showthread.php?p=507179#post507179"]bug in sr2sieve[/URL]. Based upon some comparisons with sieving these ranges of R3, I strongly recommend using sr2sieve with -x instead of srsieve to sieve these ranges, if you can. What would be nice is to find out which k can't be done and split them off so that srsieve is used for those and sr2sieve for the others. Alternatively maybe sr2sieve can be modified to handle larger k. That is much easier said than done.[/QUOTE]

You cannot use sr2sieve for k >= 2^32. Unfortunately it doesn't make such a check. I have posted an updated sr2sieve in the other thread that will stop sr2sieve immediately if k >= 2^32. I will look into the effort of removing that restriction.

gd_barnes 2019-01-30 20:58

Curtis had determined that sr2sieve with the -x switch only works for k<2^31 (not k<2^32) despite what the help file says. See post 940.

It is why he keeps a separate sieve going for k<2.147G that he feeds to BOINC from time to time.

Any new versions of sr(x)sieve make me nervous based on past experiences here. I would like a very detailed parallel test plan created ahead of time for any new version of sr2sieve that can handle k>2^31. Sieving should be done to at least P=1G with the old version of srsieve vs. the new version of sr2sieve with an exact file compare of the two sieve files. This detailed testing should be done before any public release. Included in the file should be k<2^31, k=2^31 thru 2^32, and k=2^32 thru 2^64 (or whatever the upper sieving k-limit of the new sr2sieve would be). All k-sizes should be tested with any new version of sr(x)sieve.
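The exact-compare step of a parallel test plan like the one requested above can be sketched as follows. This is a hypothetical helper, assuming each siever's output has a single header line followed by one term per line (the real ABCD/ABC layouts differ, so the parsing would need adjusting):

```python
def terms(sieve_lines):
    """Terms kept by a siever: skip the one-line header, ignore blanks."""
    return set(l.strip() for l in sieve_lines[1:] if l.strip())

def compare_sieves(old_lines, new_lines):
    """Compare old-siever vs. new-siever output.

    Returns (wrongly_removed, missed_factors): terms the new siever
    dropped without a valid factor, and terms it should have removed.
    An exact match is ([], []).
    """
    old, new = terms(old_lines), terms(new_lines)
    return sorted(old - new), sorted(new - old)
```

Comparing term sets rather than raw files makes the check insensitive to line ordering, which matters if the two sievers emit terms in different orders.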

rogue 2019-01-30 21:40

[QUOTE=gd_barnes;507206]Curtis had determined that sr2sieve with the -x switch only works for k<2^31 (not k<2^32) despite what the help file says. See post 940.

It is why he keeps a separate sieve going for k<2.147G that he feeds to BOINC from time to time.

Any new versions of sr(x)sieve make me nervous based on past experiences here. I would like a very detailed parallel test plan created ahead of time for any new version of sr2sieve that can handle k>2^31. Sieving should be done to at least P=1G with the old version of srsieve vs. the new version of sr2sieve with an exact file compare of the two sieve files. This detailed testing should be done before any public release. Included in the file should be k<2^31, k=2^31 thru 2^32, and k=2^32 thru 2^64 (or whatever the upper sieving k-limit of the new sr2sieve would be). All k-sizes should be tested with any new version of sr(x)sieve.[/QUOTE]

I think I know why Curtis ran into the issue with k between 2^31 and 2^32. There were two conditions where a variable could overflow and it would build invalid Legendre tables.

The current version of sr2sieve (posted earlier today) will terminate sr2sieve if k >= 2^32. That is just a stop-gap method so that users don't try using it for large k as it won't produce valid factors. It does not have a check regarding that overflow mentioned above.
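As an illustration of this failure mode (not sr2sieve's actual code), a product that needs 64 bits silently wraps when held in a 32-bit variable, so any table index derived from it is wrong:

```python
def mul32(a, b):
    """Multiply as a C uint32_t would: the product wraps modulo 2^32."""
    return (a * b) & 0xFFFFFFFF

k = 3_000_000_000           # a k between 2^31 and 2^32
exact   = k * 2             # Python's arbitrary-precision product
wrapped = mul32(k, 2)       # what a 32-bit variable would hold
# exact is 6_000_000_000, but wrapped is 1_705_032_704; a Legendre-table
# lookup computed from the wrapped value silently points at the wrong entry.
```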

I am working now on allowing sr2sieve to support k up to 2^64. The limits on the Legendre tables still exist, but the checks are more solid. If it terminates due to the inability to build the Legendre tables, it will suggest running again with -x. I am doing some side-by-side comparisons with srsieve for k > 2^32, but I always welcome users who can help verify the software as it is nearly impossible for me to test every feature.

VBCurtis 2019-01-31 01:39

I've used -x to avoid Legendre tables for all my R3 work. So, the bug you found regarding Legendre tables is not what I ran into for 2^31<k<2^32.

If you manage to get sr2sieve working for 2^31 - 2^32, I'd be happy to add that region to my big sieve! However, I believe the restriction is k < 2^32, AND squarefree part of k < 2^31. So, some k's between 2^31 and 2^32 should work, but not all.
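To make that distinction concrete, here is a minimal trial-division sketch (a hypothetical helper, not anything from sr2sieve) of the squarefree part of k, i.e. what remains after removing every squared factor:

```python
def squarefree_part(k):
    """Strip every squared factor from k, leaving its squarefree part."""
    part = k
    d = 2
    while d * d <= part:
        while part % (d * d) == 0:
            part //= d * d
        d += 1
    return part
```

For example, k = 4*1000000007 = 4000000028 lies between 2^31 and 2^32, yet its squarefree part is 1000000007 < 2^31, so under the restriction stated above such a k should still work with Legendre tables.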

rogue 2019-01-31 02:35

[QUOTE=VBCurtis;507227]I've used -x to avoid Legendre tables for all my R3 work. So, the bug you found regarding Legendre tables is not what I ran into for 2^31<k<2^32.

If you manage to get sr2sieve working for 2^31 - 2^32, I'd be happy to add that region to my big sieve! However, I believe the restriction is k < 2^32, AND squarefree part of k < 2^31. So, some k's between 2^31 and 2^32 should work, but not all.[/QUOTE]

There are actually multiple checks for "square free". The ones in the generate_legendre_lookup_table() routine are the ones that can overflow a 32-bit value and thus cause bad results. Those are fixed in sr2sieve 1.9.4.

The limited testing I have done with 1.9.5, which supports k > 2^32, is promising. When k > 2^32, -x is almost always required, unless one happens to have the right k. Compared to srsieve, I ran a range with 6000 k > 55G. sr2sieve (with -x) completed the range in 35 minutes, srsieve in 51 minutes. No factors were missing.

I will be posting 1.9.5 tomorrow.

rogue 2019-01-31 14:48

I decided to use version 2.0.0 instead of 1.9.5. See [URL="https://www.mersenneforum.org/showthread.php?p=507251#post507251"]here[/URL].

Beta testers are welcome to give it a spin.

rogue 2019-02-16 18:58

[QUOTE=rogue;506569]Okay. I'll adjust the reservation to 55G-60G.[/QUOTE]

Sieved with sr2sieve 2.0.0 to about 2e10. All factors found with that version are valid factors. For each 1G range, that leaves a little less than 12 million terms between n=25000 and n=100000.

Using srfile I apply the factors to the ABCD file, but output as ABC. I remove the first line, then sort the file using the sort.exe that comes with cygwin. Sorting this way sorts by ascending k then n instead of ascending n then k. I split that file (by k) into as many files as I need (typically the number of cores on the target computer) and add the ABC line with the number_primes option. By doing this I can get a very accurate estimate of the number of days to test a range once the range has been running for a few days. I will provide such an estimate in a few days.
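The sort-and-split step described above can be sketched like this. It assumes the post-header file has one `k n` pair per line, and deals distinct k's round-robin so each core gets whole k's (the actual splitting scheme may differ; this is just one simple way to do it):

```python
def split_by_k(term_lines, num_parts):
    """Sort terms by ascending k then n, then distribute whole k's
    round-robin into num_parts work files, one per core."""
    terms = sorted((int(k), int(n)) for k, n in (l.split() for l in term_lines))
    distinct_k = sorted(set(k for k, _ in terms))
    part_of = {k: i % num_parts for i, k in enumerate(distinct_k)}
    parts = [[] for _ in range(num_parts)]
    for k, n in terms:
        parts[part_of[k]].append(f"{k} {n}")
    return parts
```

Keeping each k intact within one file is what makes the number_primes option useful: a worker can stop testing a k as soon as it finds a prime for it.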

gd_barnes 2019-02-16 19:37

[QUOTE=rogue;508725]Sieved with sr2sieve 2.0.0 to about 2e10. All factors found with that version are valid factors. For each 1G range that leaves a little less than 12 million terms between n=25000 and n=100000.

Using srfile I apply the factors to the ABCD file, but output as ABC. I remove the first line, then sort the file using the sort.exe that comes with cygwin. Sorting this way sorts by ascending k then n instead of ascending n then k. I split that file (by k) into as many files as I need (typically the number of cores on the target computer) and add the ABC line with the number_primes option. By doing this I can get a very accurate estimate of the number of days to test a range once the range has been running for a few days. I will provide such an estimate in a few days.[/QUOTE]


Previously your reservation was to sieve k=55G-60G. Are you now reserving it for testing?

rogue 2019-02-17 02:55

[QUOTE=gd_barnes;508729]Previously your reservation was to sieve k=55G-60G. Are you now reserving it for testing?[/QUOTE]

Sorry if I wasn't clear. That is correct. I am testing the range now.

rogue 2019-02-24 01:18

[QUOTE=rogue;508725]Sieved with sr2sieve 2.0.0 to about 2e10. All factors found with that version are valid factors. For each 1G range that leaves a little less than 12 million terms between n=25000 and n=100000.

Using srfile I apply the factors to the ABCD file, but output as ABC. I remove the first line, then sort the file using the sort.exe that comes with cygwin. Sorting this way sorts by ascending k then n instead of ascending n then k. I split that file (by k) into as many files as I need (typically the number of cores on the target computer) and add the ABC line with the number_primes option. By doing this I can get a very accurate estimate of the number of days to test a range once the range has been running for a few days. I will provide such an estimate in a few days.[/QUOTE]

I can now estimate when I expect to complete these ranges. I have one older laptop that is doing about 20,000 rows from the ABC file per day. Based upon how I split the sieved range across 5 threads, this will take close to 120 days to complete. The newer computers are able to do nearly 60,000 rows per day. I estimate those to take about 40 days to complete. In fact the faster ones are nearly 20% done with a 1G range. I also realized that I sieved way too deeply. I waited until the removal rate was about one term every four seconds, but I only needed to wait for it to remove about one term every two seconds (as I sieved on a faster machine).
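Reading the quoted rates as per-thread, the estimates above are self-consistent: ~12 million rows per 1G range across 5 threads gives 2.4 million rows per thread.

```python
rows_per_range = 12_000_000   # terms left in a 1G range after sieving
threads = 5

def days_to_finish(rows_per_thread_per_day):
    """Wall time to finish a 1G range if every thread sustains this rate."""
    return rows_per_range / (threads * rows_per_thread_per_day)

# days_to_finish(20_000) -> 120.0  (the older laptop)
# days_to_finish(60_000) -> 40.0   (the newer computers)
```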

