mersenneforum.org

mersenneforum.org (https://www.mersenneforum.org/index.php)
-   Computer Science & Computational Number Theory (https://www.mersenneforum.org/forumdisplay.php?f=116)
-   -   Wilson-prime search practicalities (https://www.mersenneforum.org/showthread.php?t=16028)

Jeff Gilchrist 2011-09-09 13:55

[QUOTE=MrRepunit;271260]My guess is that it will be at around 57500 sec.
Reason is that there are only 40M primes in that interval. So apart from other effects the speed should be the same.[/QUOTE]

It is slightly faster. This is on an Opteron with wilsontest.c:

[CODE]Range Interval Time(sec) RAM(MB)
------ -------- --------- -------
71e9-72e9 50000128 57585 2230
72e9-73e9 5000192 111822 543
73e9-74e9 75000064 54651 2758
[/CODE]

Same thing on my Core 2 with wilsontest2.c: a larger interval still improves speed.

[CODE]Range Interval Time(sec)
------ -------- ---------
70e9-71e9 50000128 64988
76e9-77e9 75000064 58059
77e9-78e9 75000064 58325
[/CODE]

I'm almost finished with my range, so I will reserve 80e9 to 100e9 as well.

rajula 2011-09-09 14:03

[QUOTE=R.D. Silverman;271293]Slight nitpick. The use of percentage here is wrong. A percentage represents a fraction with respect to some other value. The expected number (which is an absolute number) of Wilson primes up to N is ~C loglog(N), giving .08C as the number of expected primes in [1e12, 1e13] for some constant C.

I have never seen a strong argument that gives the value of C. Is it 1??[/QUOTE]

"Percentage" is mostly used to represent a fraction, but by itself it simply means "per hundred". I used the value C=1, as was done in [URL="http://www.ams.org/journals/mcom/1997-66-217/S0025-5718-97-00791-6/S0025-5718-97-00791-6.pdf"]Crandall-Dilcher-Pomerance[/URL], accepting their heuristic estimates without thinking about it too much.
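That heuristic is easy to evaluate directly; a minimal sketch in Python (my own illustration, not code from the paper), which reproduces the ~0.08 expected Wilson primes in [1e12, 1e13] for C=1:

```python
import math

def expected_wilson_primes(lo, hi, C=1.0):
    """Heuristic expected number of Wilson primes in (lo, hi],
    assuming the count up to N grows like C*log(log(N))."""
    return C * (math.log(math.log(hi)) - math.log(math.log(lo)))

print(expected_wilson_primes(1e12, 1e13))   # ~0.08
```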

(Thanks rogue for pointing out the wrong exponent.)

R. Gerbicz 2011-09-09 16:36

[QUOTE=fivemack;271284]It will be faster; the interval parameter is used in two places in the software, in one place min(interval length, number of primes) is relevant, and in the other it isn't.
...
in both of which the interval is larger than the number of primes in the range.
...
(this is with wilsontest.c linked against gmp-5; I am rerunning one parameter choice with wilsontest2.c linked against mpir-2.4)[/QUOTE]

Yes, it is probably a little confusing that you still see a speed improvement even when there are (much) fewer than ~interval/2 primes in your range. But interval is used in many places in the algorithm. In the most time-consuming part of the Wilson code (for large ranges), ff(), each chunk multiplies interval primes (up to n), followed by a division. Here a larger interval gives a small speedup, because a division is more expensive than a multiplication (between numbers of the same size): it is better to keep multiplying and finally do a single division between unbalanced numbers.
A larger interval also helped the func() routine in wilsontest; wilsontest2 handles this much better with modproducttree().
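The multiply-then-reduce trick can be shown with a toy chunked factorial (a sketch of the idea only, not the actual ff() code): accumulate interval factors with cheap multiplications, then pay for one division per chunk instead of one per factor.

```python
def factorial_mod(n, m, interval=1024):
    """Compute n! mod m, reducing once per chunk of `interval` factors:
    many cheap multiplications, then a single expensive division
    between unbalanced numbers (huge chunk product vs. small modulus)."""
    r, k = 1, 1
    while k <= n:
        chunk = 1
        for i in range(k, min(k + interval, n + 1)):
            chunk *= i          # multiplications only, no reduction
        r = r * chunk % m       # one division for the whole chunk
        k += interval
    return r

# Wilson's theorem: (p-1)! == -1 (mod p) for a prime p
p = 9973
assert factorial_mod(p - 1, p) == p - 1
```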

R. Gerbicz 2011-09-09 16:47

[QUOTE=Jeff Gilchrist;271294]It is slightly faster. This is on Opteron with wilsontest.c

[CODE]Range Interval Time(sec) RAM(MB)
------ -------- --------- -------
71e9-72e9 50000128 57585 2230
72e9-73e9 5000192 111822 543
73e9-74e9 75000064 54651 2758
[/CODE]

Same thing on my Core 2 with wilsontest2.c: a larger interval still improves speed.

[CODE]Range Interval Time(sec)
------ -------- ---------
70e9-71e9 50000128 64988
76e9-77e9 75000064 58059
77e9-78e9 75000064 58325
[/CODE]

I'm almost finished my range, so I will reserve 80e9 to 100e9 as well.[/QUOTE]

Would it not be faster to check a range of 2e9 (for st>=70e9) with interval=42e6? (3 blocks: one for each type.)

Or a range of 4e9 with an interval of 42e6? (4 blocks: two of p==1 mod 3, one of p==5 mod 12 and one of p==11 mod 12.) If we look only at the number of primes, this is close to optimal, because each block contains approx. 40e6=4e7 primes. What do you think about it? I know that a larger interval helps a little, but a partially filled block of primes is time consuming.
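As a back-of-the-envelope check of those block sizes (a rough x/ln x estimate of my own, not program output): by Dirichlet, primes split evenly among the classes 1, 5, 7, 11 mod 12, and p==1 mod 3 collects two of the four.

```python
import math

def prime_count_estimate(lo, hi):
    """Rough count of primes in [lo, hi] via the x/ln(x) approximation."""
    f = lambda x: x / math.log(x)
    return f(hi) - f(lo)

lo, hi = 70e9, 74e9                # the proposed 4e9 range
total = prime_count_estimate(lo, hi)
per_class = total / 4              # equal shares for 1, 5, 7, 11 mod 12
print(f"p==1 mod 3:   ~{2 * per_class:.2e}")   # classes 1 and 7 mod 12
print(f"p==5 mod 12:  ~{per_class:.2e}")
print(f"p==11 mod 12: ~{per_class:.2e}")
```

With these numbers each mod-12 class comes out near 4e7 primes, matching the ~40e6-per-block figure above.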

Jeff Gilchrist 2011-09-09 17:07

[QUOTE=R. Gerbicz;271315]Would it not be faster to check a range of 2e9 (for st>=70e9) with interval=42e6? (3 blocks: one for each type.)

Or a range of 4e9 with an interval of 42e6? (4 blocks: two of p==1 mod 3, one of p==5 mod 12 and one of p==11 mod 12.) If we look only at the number of primes, this is close to optimal, because each block contains approx. 40e6=4e7 primes. What do you think about it? I know that a larger interval helps a little, but a partially filled block of primes is time consuming.[/QUOTE]

It could be faster; I was just splitting the work so it was easy to manage across my CPUs. I never really thought about finding the optimal range as well as the optimal interval. I will try ranges of 2e9 and 4e9 to see.

For fun I also tried a 1e9 range with interval 100000256 and it came back at 55263 sec (3,249 MB RAM), so it starts to slow down with intervals that high.

Jeff Gilchrist 2011-09-10 11:38

I have now finished the 70e9 to 80e9 range, nothing found.

I haven't finished my next range but I do have some results:
[B]80248324571 -1+46p
80908082573 -1-20p
[/B]

Should we be manually checking these with the slow function, and should they be reported anywhere in particular?

R. Gerbicz 2011-09-10 12:52

[QUOTE=Jeff Gilchrist;271379]
Should we be manually checking these with the slow function, and should they be reported anywhere in particular?[/QUOTE]

Reporting them only here is fine. As far as I can remember, I have checked all the new near-Wilson primes reported here. Just to be sure, I will now check all of these (14 new primes so far).
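For small primes the slow check is immediate; a minimal sketch (not the actual wilsontest code) that recovers c in (p-1)! == -1 + c*p (mod p^2), with the three known Wilson primes 5, 13, 563 giving c = 0:

```python
def near_wilson_c(p):
    """Return the centered c with (p-1)! == -1 + c*p (mod p^2),
    via the direct (slow) factorial product."""
    m = p * p
    f = 1
    for i in range(2, p):
        f = f * i % m
    # Wilson's theorem guarantees (p-1)! + 1 is divisible by p
    c = (f + 1) % m // p
    return c if c <= p // 2 else c - p   # map into (-p/2, p/2]

for p in (5, 13, 563):                   # the known Wilson primes
    assert near_wilson_c(p) == 0
print(near_wilson_c(7))                  # -2, i.e. 6! == -1 - 2*7 (mod 49)
```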

Jeff Gilchrist 2011-09-10 22:35

[QUOTE=R. Gerbicz;271315]Would not be faster to check a range of 2e9 (for st>=70e9) with interval=42e6 ? (3 blocks: one for each type).[/QUOTE]

It is indeed faster: using a 2e9 range and an interval of 42e6, it took 99,053 sec compared to 2 x 54,651 = 109,302 sec.

Now that I'm using larger 2e9 and 4e9 ranges, all my currently reserved ranges are running. I'm now reserving 111e9 to 113e9.

Jeff.

R. Gerbicz 2011-09-11 00:38

[QUOTE=R. Gerbicz;271393]Reporting them only here is fine. As far as I can remember, I have checked all the new near-Wilson primes reported here. Just to be sure, I will now check all of these (14 new primes so far).[/QUOTE]

Checked: all of them are near-Wilson primes.

[QUOTE=Jeff Gilchrist;271425]It is indeed faster: using a 2e9 range and an interval of 42e6, it took 99,053 sec compared to 2 x 54,651 = 109,302 sec.
[/QUOTE]

I thought there would be a larger speedup, but that's still nice.

fivemack 2011-09-11 16:35

Jeff, I'm afraid I've already done 111G to 113G as calibration runs.

Found 112825721339 -1+70p at 12:52 10/9/2011

Also:
102G-110G (interval 8e8) complete, no hits
[code]
Testing p=102000000061...109999999987, p==1 mod 3, time=4414 sec.
Testing p=102000000041...109999999937, p==5 mod 12, time=136766 sec.
Testing p=102000000227...109999999979, p==11 mod 12, time=266342 sec.
Done the st=102000000000-en=110000000000 interval. Time=501789 sec.
493311.22user 8250.93system 139:23:10elapsed 99%CPU (0avgtext+0avgdata 107148784maxresident)k
1540776inputs+22162544outputs (26611major+1927975376minor)pagefaults 0swaps

Maximum memory use 26.743G
[/code]

50G-60G, interval 4e7 complete, no hits
[code]
Testing p=50000000423...53947922087, p==11 mod 12, time=1011 sec.
Testing p=50000000023...51972279967, p==1 mod 3, time=132601 sec.
Testing p=50000000021...53947813781, p==5 mod 12, time=165829 sec.
Testing p=51972280039...53947594627, p==1 mod 3, time=234953 sec.
Testing p=53947594681...55925963257, p==1 mod 3, time=268054 sec.
Testing p=53947813793...57907253993, p==5 mod 12, time=300460 sec.
Testing p=53947922123...57907614407, p==11 mod 12, time=369817 sec.
Testing p=55925963263...57907169977, p==1 mod 3, time=508006 sec.
Testing p=57907170049...59891256703, p==1 mod 3, time=541926 sec.
Testing p=57907254221...59999999861, p==5 mod 12, time=575239 sec.
Testing p=57907614419...59999999999, p==11 mod 12, time=607262 sec.
Testing p=59891256769...59999999959, p==1 mod 3, time=667609 sec.
Done the st=50000000000-en=60000000000 interval. Time=672907 sec.
666107.93user 6265.94system 186:55:07elapsed 99%CPU (0avgtext+0avgdata 14905040maxresident)k
4192inputs+27350896outputs (49major+2469411071minor)pagefaults 0swaps

Maximum memory 3.726G
[/code]

Taking 120G to 144G

Jeff Gilchrist 2011-09-11 16:44

[QUOTE=fivemack;271459]Jeff, I'm afraid I've already done 111G to 113G as calibration runs.[/QUOTE]

OK, no problem. I had just started it, so I killed it now; no time wasted. So you just went up to 113G and stopped?

In that case I will take 113e9 to 120e9 to fill in the gap from your latest reservation.

Jeff.

