[QUOTE=VBCurtis;556443]Nope. But the opposite isn't true, either.[/QUOTE]But then I would say low weight doesn't really matter. Either I test 100000 candidates for a given k and find 10 primes, or I test 10000 and find one. Then I move to the next low-weight k, and in the end I end up with the same number of LLR tests and primes, just spread over more than one k. Is it like that?
|
We have no evidence to the contrary, though some folks around here do think otherwise.
I test a couple low-weight k's as well as some of the highest; I like finding anomalies, but I don't think my choices are more prime-worthy per unit of search effort. |
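For anyone wondering how "weight" is estimated in the first place: a rough proxy is to count how many n in a fixed window survive sieving by small primes. The sketch below illustrates the idea only; it is not the exact Nash-weight definition (which fixes a particular window and sieve depth), and the window and prime limit here are assumptions.

```python
def small_primes(limit):
    """Primes below `limit` via a simple sieve of Eratosthenes."""
    sieve = bytearray([1]) * limit
    sieve[0:2] = b"\x00\x00"
    for i in range(2, int(limit ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i::i] = bytearray(len(sieve[i * i::i]))
    return [i for i in range(limit) if sieve[i]]

def proxy_weight(k, n_lo, n_hi, prime_limit=256):
    """Count n in [n_lo, n_hi] for which k*2^n - 1 has no prime
    factor below prime_limit -- a crude stand-in for Nash weight."""
    primes = small_primes(prime_limit)
    return sum(
        1
        for n in range(n_lo, n_hi + 1)
        if all((k * pow(2, n, p) - 1) % p != 0 for p in primes)
    )

# Sanity check: 509203 is a known Riesel number whose covering set
# {3, 5, 7, 13, 17, 241} divides every candidate, so nothing survives.
assert proxy_weight(509203, 1, 1000) == 0
```

A low-weight k is simply one where this survivor count is far below average, so far fewer LLR tests remain per unit of n-range.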
I looked into why, for low n, only prime values of n show up. Here are some restrictions I found:
[list][*]If [TEX]n \equiv 0 \textrm{ mod } 2[/TEX], the Riesel number is divisible by 3 (n = 2, 4, 6, 8, 10, ...)[*]If [TEX]n \equiv 1 \textrm{ mod } 10[/TEX], the Riesel number is divisible by 11 (n = 11, 21, 31, 41, 51, ...)[*]If [TEX]n \equiv 1 \textrm{ mod } 8[/TEX], the Riesel number is divisible by 17 (n = 9, 17, 25, 33, 41, 49, ...)[*]If [TEX]n \equiv 15 \textrm{ mod } 20[/TEX], the Riesel number is divisible by 41 (n = 15, 35, 55, 75, ...)[*]If [TEX]n \equiv 19 \textrm{ mod } 70[/TEX], the Riesel number is divisible by 71 (n = 19, 89, 159, 229, ...)[/list]There might be more. I am not sure, though, whether this hits composites more often than primes; the 2nd condition removes the primes 11, 31, 41, and 71, so it doesn't really seem like it. |
Haha, I just realized this is probably trivial and occurs for every divisor... yes, I'm not that good at math. :)
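For what it's worth, the congruence conditions above are easy to verify by brute force for this particular k; the n bound of 2000 below is an arbitrary choice.

```python
k = 1281979

# (residue, modulus, divisor) triples taken from the rules above
rules = [
    (0, 2, 3),     # even n         -> divisible by 3
    (1, 10, 11),   # n = 1 mod 10   -> divisible by 11
    (1, 8, 17),    # n = 1 mod 8    -> divisible by 17
    (15, 20, 41),  # n = 15 mod 20  -> divisible by 41
    (19, 70, 71),  # n = 19 mod 70  -> divisible by 71
]

for residue, modulus, divisor in rules:
    for n in range(1, 2000):
        if n % modulus == residue:
            # reduce mod `divisor` so the check stays cheap at any n
            assert (k * pow(2, n, divisor) - 1) % divisor == 0
```

And indeed such rules arise for every prime p not dividing k: since 2^n cycles mod p, the n with k*2^n ≡ 1 (mod p) form a fixed residue class modulo the cycle length, which is why the pattern looked "trivial" above.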
|
Here are the n values for Riesel primes with k = 1281979 and n <= 100000
3 7 43 79 107 157 269 307 373 397 1005 1013 1765 1987 2269 6623 7083 7365 10199 16219 26143 32557 38165 47167 47863 70373 94723 95167 |
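The smallest entries in that list can be spot-checked with a deterministic Miller-Rabin test; the fixed base set below is proven correct for numbers under roughly 3.3*10^24, which covers n up to 43 here (the larger entries would need a probabilistic test or LLR itself).

```python
def is_prime(n):
    """Miller-Rabin; deterministic for n < ~3.3e24 with these bases."""
    if n < 2:
        return False
    bases = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)
    for p in bases:
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in bases:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

k = 1281979
# n = 3, 7, 43 are in the list above; n = 5 is absent, so it should fail
assert is_prime(k * 2**3 - 1)
assert is_prime(k * 2**7 - 1)
assert is_prime(k * 2**43 - 1)
assert not is_prime(k * 2**5 - 1)
```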
The (near-)Woodall [I]k[/I]'s listed in [url]https://www.mersenneforum.org/showpost.php?p=550539&postcount=363[/url], again except for [I]k[/I]=1993191, have been completed to [I]n[/I]=375k. The only prime found was 667071*2^373497-1.
Edit: Another prime, 667071*2^358286-1, was already reported (by me) to the Prime-Wiki in August, so I accidentally left it off here. |
I've completed the remaining RPS 9th and 10th Drive [I]k[/I]'s with missing ranges from [I]n[/I]=300k to 325k. 18 primes were found, which are attached. It'll probably be until the end of 2021 before they're finished to [I]n[/I]=400k, the ultimate goal.
|
update 11/3
k = 50171 is at n = 3.883M; it's on hold for the moment while I do a Carol-Kynea reservation. Please keep it reserved for me.
|
Reserving 8847
|
The (near-)Woodall [I]k[/I]'s listed in [url]https://www.mersenneforum.org/showpost.php?p=550539&postcount=363[/url], this time and in the future [I]including[/I] [I]k[/I]=1993191, have been completed to [I]n[/I]=400k. The primes for [I]k[/I]=1993191 have already been posted to Prime-Wiki and will not be listed here for the sake of brevity. (There are 26 total below [I]n[/I]=400k.) The following primes were found for the other [I]k[/I]'s:[list][*]667071*2^380058-1[*]1183953*2^384787-1[*]665127*2^385516-1[*]1268979*2^387863-1[/list]
|
[QUOTE=bur;556456]But then I would say low weight doesn't really matter. Either I test 100000 candidates for a given k and find 10 primes, or I test 10000 and find one. Then I move to the next low-weight k, and in the end I end up with the same number of LLR tests and primes, just spread over more than one k. Is it like that?[/QUOTE]
Given k * 2^n - 1: the risk you run with low-weight k's is that you never find a prime at all, because as the number of bits of the exponent n grows, the testing time rises steeply (roughly quadratically per test) and there is a limit to what your hardware can process quickly enough. See a low-weight k as a big gamble.

Yet I'm also gambling on one. Currently I'm actively searching k = 32767 (low weight) and k = 89 (medium-to-heavy weight). k = 69 has been temporarily stopped for a simple reason: my hardware has problems crunching above n = 7M bits; the L2 cache simply isn't large enough. Nash weight, of course, is only a quick estimate.

Note that I'm also sieving k = 32767 deeper and deeper. I'm now at the point where, for quite a while yet, sieving deeper is more useful than testing; with k = 32767 that's roughly around n = 5M bits for my hardware. I started out sieving up to 30M bits, so right now I'm sieving [5M, 30M]. Probably that's wishful thinking on my part that I'll ever manage to get that far. Since I'm still removing quite a few exponents at larger sieve depths, I'd expect the odds of a prime to be dramatically lower than for a different k with the same Nash weight where this sieving heuristic doesn't apply. So if I'm lucky I'll find one prime with k = 32767 worth mentioning, yet the odds are huge that there isn't one. If the next k = 32767 prime is located around 25M bits, I'm not sure I'll find it before my hair turns entirely white (and right now it's not even grey). :) For such small odds of finding one prime, would you want to put in all this effort and wait that many years?

On the other hand, for k = 89 I need to do 20k tests between 4M and 5M, yet that will take only a year or so. For 5M to 6M I don't know yet; it will depend on whether I make enough cash to upgrade the hardware here :) Read: whether my 3D printer finally ships after all these years... Yet the odds are quite good that there are one or more primes in [4M, 8M].
The low weight is a total gamble, and I'll probably soon decide to just sieve [5M, 30M] for the year to come and test nothing above 5M for now. A single core of a 2.2 GHz Magny-Cours processor here removes on average one exponent every 3+ hours. We can calculate the break-even point for when to start testing [5M, 6M]. (Well, that's a 2:20 AM estimate typed in live.)

The sieve's BSGS algorithm uses a square-root trick you can find on the wiki, so compare sieving [5M, 30M] against [6M, 30M]: sqrt(24M) / sqrt(25M) = 0.98, meaning sieving is roughly 2% slower if I keep the larger domain [5M, 30M].

Testing time at [4M, 5M] is already 15000 seconds on 2 cores of a 2.5 GHz Xeon L5420; that's 8.3 hours for one core. Assume the sieve removes exponents evenly across [5M, 30M]. Under these circumstances, at what removal rate is testing the break-even?

A first attempt at the break-even point (yeah, at 2:30 AM by now)... Assume the average n over [5M, 6M] is 5.8M. The LLR time at 5.8M might be roughly 8.3 hours * (5.8M / 4.0M)^2 = 17.5 hours. Then 17.5 hours = removal_rate * 0.02 * (25M / 1M), which gives removal_rate = 17.5 / (0.02 * 25) = 17.5 * 2 = 35 hours. So I might be sieving for years to come before I can start testing the low-weight k = 32767. :) |
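The late-night arithmetic above can be reproduced directly. The 8.3-hour baseline, the quadratic LLR scaling, and the 5.8M average are all taken from the post as-is; this is only a sketch of that estimate, not a general sieving model.

```python
import math

base_llr_hours = 8.3   # one-core LLR time at n ~ 4.0M (from the post)
avg_n = 5.8e6          # assumed average n in the [5M, 6M] slice

# LLR cost is taken to scale roughly quadratically with n
llr_hours = base_llr_hours * (avg_n / 4.0e6) ** 2   # ~17.5 h

# BSGS sieving cost grows like sqrt(range width):
# sieving [5M, 30M] (width 25M) vs. [6M, 30M] (width 24M)
slowdown = 1 - math.sqrt(24e6 / 25e6)               # ~2%

# Break-even removal rate: the 1M-wide test slice is 1/25 of the
# 25M-wide sieve range, so testing pays off once the sieve needs
# more than `break_even` hours per removed exponent.
break_even = llr_hours / (slowdown * 25)            # ~35 h
```

At the stated removal rate of one exponent per 3+ hours, the sieve is still an order of magnitude short of the ~35-hour break-even, which is why sieving wins for now.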