2020-08-26, 06:03   #12
pxp

Thank you. As I had just finished interval #13, I was going to have a run-off between the two versions. So I interrupted my running version, duplicated its output file, removed the +1s to match the new ABC format, placed that file in the folder containing the new xyyxsieve on my other computer, and ran it. Unfortunately it terminated with a "Segmentation fault: 11".
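
For anyone reproducing that conversion step, a throwaway filter along these lines would do it. The file names are hypothetical, and it assumes each candidate line ends in a literal "+1", which is worth checking against the actual old-format file before trusting:

Code:
#include <fstream>
#include <string>

// Strip a trailing "+1" from every line of an old-format sieve file so
// the terms match the new ABC format, which omits the +1.
int main()
{
   std::ifstream in("old_format.txt");   // hypothetical input name
   std::ofstream out("new_format.txt");  // hypothetical output name
   std::string line;

   while (std::getline(in, line))
   {
      if (line.size() >= 2 && line.compare(line.size() - 2, 2, "+1") == 0)
         line.erase(line.size() - 2);

      out << line << '\n';
   }

   return 0;
}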
2020-08-26, 12:44   #13
rogue

Quote:
Originally Posted by pxp
Unfortunately it terminated with a "Segmentation fault: 11".
It is possible that you do not have enough memory. The version of software I provided is not designed to support very large ranges of x. I suggest you continue with the version I posted a few weeks ago.

2020-08-26, 13:53   #14
pxp

After I restarted the interrupted run (using its output file as my new input), the program began with an ETC of September 1. That's quite a change from the previous mid-January 2021. Those early ETC calculations aren't firm, but I wonder if the initial file size somehow skews the ETC guess. As I have a handful of other sieves going, I will interrupt a couple of those to see if I get a similar advance in ETC dates using the size-reduced output files as new inputs.
2020-08-26, 14:37   #15
pxp

Quote:
Originally Posted by pxp
Those early ETC calculations aren't firm, but I wonder if the initial file size somehow skews the ETC guess. As I have a handful of other sieves going, I will interrupt a couple of those to see if I get a similar advance in ETC dates using the size-reduced output files as new inputs.
I am seeing much earlier ETC dates on restarted interruptions. Perhaps a better explanation than file size is the multi-core implementation. As all of my sieves are run as a single process on a 6-core machine, I use -W6. Perhaps the ETC does not take that performance improvement into account.
2020-08-26, 15:16   #16
rogue

Quote:
Originally Posted by pxp
As all of my sieves are run as a single process on a 6-core machine, I use -W6. Perhaps the ETC does not take that performance improvement into account.
The ETC compares when the sieving started to where it currently is, based upon the last prime that has been successfully sieved. Various things impact this calculation, including the type of sieve and the number of threads. Regarding the type of sieve: sieves such as xyyxsieve and gcwsieve start slow, but "p/sec" increases as terms are removed. This means that each "chunk" takes longer for small p than for large p, so the ETC drops as p increases. I could change this, but I haven't thought much about it since so few sieves are impacted by it.
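
In other words, the estimate is a straight linear extrapolation over p. A minimal sketch of that calculation (not mtsieve's actual code; the names and structure are illustrative):

Code:
#include <ctime>

// Linear ETC: elapsed / (pCurrent - pMin) estimates the seconds spent
// per unit of p so far; scale that by the p still remaining. When p/sec
// improves as terms are removed (xyyxsieve, gcwsieve), the early chunks
// are the slowest, so this estimate starts pessimistic and keeps moving
// earlier as p increases.
time_t estimateCompletion(time_t startTime, time_t now,
                          double pMin, double pCurrent, double pMax)
{
   double elapsed = difftime(now, startTime);
   double secondsPerP = elapsed / (pCurrent - pMin);

   return now + (time_t) (secondsPerP * (pMax - pCurrent));
}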

2020-09-17, 17:58   #17
rogue

I have verified all PRPs for x <= 14000. The range for 14000 < x <= 20000 has about 1.5 million terms in it. I will be starting on that soon.
2020-09-25, 16:00   #18
rogue

Quote:
Originally Posted by rogue
I have verified all PRPs for x <= 14000.
Verification done through x <= 15000. I estimate about 7 weeks to finish the double-check for x <= 20000.

I made an interesting observation when looking at primes/PRPs of this form. The last column is the number of primes in the range. Note the relatively even distribution despite the geometric growth of the number of terms in the range (approximately (max x)^2). Is that expected, or is that unusual?

The numbers for x > 15000 have not been verified yet.

Code:
    0 <= x <  1000   87
 1000 <= x <  2000   87
 2000 <= x <  3000   92
 3000 <= x <  4000   80
 4000 <= x <  5000   80
 5000 <= x <  6000   72
 6000 <= x <  7000   69
 7000 <= x <  8000   80
 8000 <= x <  9000   79
 9000 <= x < 10000   61
10000 <= x < 11000   75
11000 <= x < 12000   63
12000 <= x < 13000   70
13000 <= x < 14000   67
14000 <= x < 15000   68
15000 <= x < 16000   66
16000 <= x < 17000   50
17000 <= x < 18000   71
2020-09-25, 18:00   #19
rogue

Here are updated counts based upon PRP searching (barring mistakes in my counting):

Code:
    0 <= x <  1000   87
 1000 <= x <  2000   87
 2000 <= x <  3000   92
 3000 <= x <  4000   80
 4000 <= x <  5000   80
 5000 <= x <  6000   72
 6000 <= x <  7000   69
 7000 <= x <  8000   80
 8000 <= x <  9000   79
 9000 <= x < 10000   61
10000 <= x < 11000   75
11000 <= x < 12000   63
12000 <= x < 13000   70
13000 <= x < 14000   67
14000 <= x < 15000   68
15000 <= x < 16000   66
16000 <= x < 17000   50
17000 <= x < 18000   71
18000 <= x < 19000   72
19000 <= x < 20000   79
20000 <= x < 21000   62
21000 <= x < 22000   79
22000 <= x < 23000   71
23000 <= x < 24000   73
24000 <= x < 25000   56
25000 <= x < 26000   47
26000 <= x < 27000   33
Note that for x > 24000 the distribution changes, but that is because those ranges are not fully tested. I'm not even certain that the entire search space for x < 24000 has been fully tested. Every x < 23000 looks like it has been tested, but I can't speak for x > 23000, as there appear to be some gaps (from my perspective).
2020-09-25, 20:03   #20
pxp

Quote:
Originally Posted by rogue
Every x < 23000 looks like it has been tested, but I can't speak for x > 23000, as there appear to be some gaps (from my perspective).
You are correct. By mid-October I will have finished interval #16 which will guarantee x < 24000. To get to x < 25000 I will need to finish interval #17, which I haven't started yet. My current long-term goal is 150000 decimal digits which will bring this up to x < 33000.
2020-09-26, 03:24   #21
rogue

Quote:
Originally Posted by pxp
My current long-term goal is 150000 decimal digits which will bring this up to x < 33000.
Sieving is really slow for 20000 < x <= 40000. There are over 10M terms remaining at a little over p = 1e9, and it will take me months to sieve deeply enough using 6 cores. The problem is that xyyxsieve needs a lot of memory to sieve the range efficiently (over 10 GB for 6 workers), and the memory accesses get expensive. I tried adding a prefetch, as that should help with speed, but my initial attempts have hurt performance. I'm sure prefetching is the key; I very likely am doing something wrong. With 6 cores I am only testing about 30 p per second. I could possibly gain some speed by looking at x with few y terms and avoiding building a power table for those x in memory. The same could be said for y with few x terms. I'm not certain there are enough x or y with few terms for sieving to benefit from it.
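
For illustration, the kind of software prefetch being attempted looks roughly like this. The table layout is hypothetical, not xyyxsieve's actual data structures; __builtin_prefetch is the GCC/Clang intrinsic, and getting the prefetch distance wrong can easily make things slower, which may be what is happening here:

Code:
#include <cstdint>
#include <vector>

// While working on the power table for index i, issue a prefetch for the
// table at i+1 so its cache misses overlap with the current computation.
// The third argument (locality hint 1) tells the CPU the data will be
// used briefly and need not be kept in cache afterward.
void sieveChunk(const std::vector<const uint64_t *> &powerTables,
                std::vector<uint64_t> &residues, uint64_t p)
{
   size_t count = powerTables.size();

   for (size_t i = 0; i < count; i++)
   {
      if (i + 1 < count)
         __builtin_prefetch(powerTables[i + 1], 0, 1);

      // Stand-in for the real modular arithmetic against prime p.
      residues[i] = powerTables[i][0] % p;
   }
}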

As soon as I complete x <= 20000, I will peel off ranges of y in groups of 1000 from the 10M-term data set. Those are the smaller terms from the big range, and as I pull them out sieving should pick up a little bit of speed.
2020-10-01, 19:21   #22
rogue

Actually there are 68 primes for 15000 <= x < 16000. I miscounted above.

The range for x < 16000 has now been double-checked. No missing primes/PRPs.