mersenneforum.org

mersenneforum.org (https://www.mersenneforum.org/index.php)
-   Math (https://www.mersenneforum.org/forumdisplay.php?f=8)
-   -   How to guess M#48's exponent to < 1% (based on qwiki-leaks) (https://www.mersenneforum.org/showthread.php?t=17742)

ewmayer 2013-02-05 18:26

How to guess M#48's exponent to < 1% (based on qwiki-leaks)
 
In the [url=http://mersenneforum.org/showthread.php?goto=newpost&t=17704]48th Mersenne prime discovered?!![/url] thread I promised to show how one could have ascertained the new prime's exponent to within 1% using only information inadvertently leaked in that thread by "people in the know" plus a bit of math, without any recourse to Primenet server data. OK, so here's how:

In post #38, I let slip that Serge's verify run used a vector length of 3328 Kdoubles, which is larger than the default for the exponent in question. In posts #40-41 Ralf and ATH dig up fossilized data from the days when Mlucas still used hard-coded FFT-length / exponent breakpoints, and ATH correctly concludes that the exponent must be less than the lower limit for 3328 K, which for that earlier version of the code implies p < 58050000. In post #43 I note that the current Mlucas breakpoint for transitioning from 3072 K to 3328 K is p = 58569855, so that is the true upper bound on the new prime's exponent.

[Using the runlengths supported by Mlucas, the other possibility besides 3072 K is 2816 K, but that is unlikely because it implies an exponent < ~54 M, well below the range of most recent first-time tests.]
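To put rough numbers on that bracketed note, here is a small sketch, under the assumption (only an approximation; the true breakpoints also vary with the FFT length) that the maximum average wordsize is roughly constant across nearby runlengths, at the ~18.62 bits/word implied by the 3072 K breakpoint:

```python
# Ballpark upper exponent bound per Mlucas runlength, assuming the max
# average wordsize (~18.62 bits/word, from the known 3072K breakpoint
# p = 58569855) is roughly constant across nearby FFT lengths. This is
# only an approximation -- the real limit shrinks slowly as N grows.
BITS_PER_WORD_MAX = 58569855 / (3072 * 2**10)

for k in (2816, 3072, 3328):
    n = k * 2**10                      # FFT length in doubles
    bound = BITS_PER_WORD_MAX * n
    print(f"{k}K: p < ~{bound / 1e6:.1f}M")
```

The 2816 K bound comes out near 53.7M, consistent with the "< ~54 M" figure above.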

Now here is where things could very quickly have gotten very interesting ... In post #63 Mike gives a millionth-iteration status line from his Mlucas run, with exponent obscured. Mike was so busy being clever in also replacing the true hex residue with a hex string that reads amusingly that he failed to realize that the roundoff error stats in the same printouts also convey useful information, in this case
[i]
MaxErr = 0.218750000
[/i]
It's widely known that for Prime95 and similar code MaxErr ~= 0.4 is the danger threshold, so it makes sense that the FFT length breakpoints occur at roughly that level of ROE. Thus, p = 58569855 (or whatever the closest prime is to that) gives MaxErr ~= 0.4 at 3072 K. What exponent gives MaxErr = 0.21875 at that FFT length?

To figure that out it helps to refer to the paper I referenced in post 43, which is the basis for the breakpoint-computing code snippet there. (The PDF I have at the hogranch.com ftp server is easily found via cursory web search). Specifically, eqn [7] in the paper shows that for a given FFT length, ROE scales as the square of the average input wordsize. sqrt(0.21875/0.4) ~= 0.74, so the new M-prime yields an average input wordsize 0.74 that of p = 58569855 at 3072 K. To estimate p we need to compute the ratio of average *bits* per word which corresponds to that wordsize ratio: 2[sup]b1[/sup]/2[sup]b2[/sup] = 0.74, with b2 = 58569855/(3072*2[sup]10[/sup]) = 18.618... .

Thus the new prime has average bits per word of roughly b1 = lg(0.74*2[sup]b2[/sup]) = 18.1834969... average bits per input word. Multiplying by 3072 K gives p ~= 57200335, which is just a tad over 1% off from the true value, 57885161.
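For the curious, the arithmetic above fits in a few lines of Python. This is just a sketch of the estimate as described in the post (the 0.4 threshold and the eqn-[7] wordsize-squared scaling); the function name is mine, not anything from the Mlucas source:

```python
import math

def estimate_exponent(maxerr_obs, maxerr_ref, p_ref, fft_len):
    """Estimate p from an observed ROE, assuming (per eqn [7] of the
    referenced paper) that ROE scales as the square of the average
    input wordsize at a fixed FFT length."""
    w_ratio = math.sqrt(maxerr_obs / maxerr_ref)  # wordsize ratio, ~0.74 here
    b_ref = p_ref / fft_len                       # avg bits/word at reference, ~18.62
    b_new = b_ref + math.log2(w_ratio)            # lg(w_ratio * 2^b_ref), ~18.18
    return b_new * fft_len

fft_len = 3072 * 2**10                            # 3072 Kdoubles
p_est = estimate_exponent(0.21875, 0.4, 58569855, fft_len)
print(f"p ~= {p_est:.0f}")                        # ~57.2M, a bit over 1% below 57885161
```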

----

But one can further improve on that, since it is not necessary to "take my word for it" that the aforementioned breakpoint-computing function uses a roundoff threshold of 0.4. In fact, anyone who has written FFT code, especially code supporting non-power-of-2 runlengths, will notice that the breakpoint function makes no allowance for the kinds of systematic error-level variations arising from the details of the FFT radices used: for example, radices which factor less smoothly due to the presence of one or more odd primes tend to have more roundoff accumulation, owing to their relatively higher cost in terms of operations-per-input. In this case the actual code is easy to obtain; running a 1000-iteration timing self-test at 3072 K yields this sort of output:
[i]
1000 iterations of M58569809 with FFT length 3145728 = 3072 K
Res64: DC08E05E46E39ECF. AvgMaxErr = 0.216503057. MaxErr = 0.281250000. Program: E3.0x
[/i]
The prime used, 58569809, is trivially different from the above (composite) breakpoint value of 58569855, but simply substituting it and the actual error level observed - which is appreciably less than 0.4 due to the smoothness of 3072 K - we get sqrt(0.21875/0.28125) ~= 0.88, so the new M-prime yields an average input wordsize 0.88 that of p = 58569809 at 3072 K. To estimate p we again compute the ratio of average *bits* per word which corresponds to that wordsize ratio: 2[sup]b1[/sup]/2[sup]b2[/sup] = 0.88..., with b2 = 58569809/(3072*2[sup]10[/sup]) = 18.6188... .

Thus the new prime yields an average of b1 = lg(0.88*2[sup]b2[/sup]) = 18.4375558... bits per input word. Multiplying by 3072 K gives p ~= 57999536, which is less than 0.2% away from the true value, 57885161.
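The refined version of the calculation, sketched the same way, simply substitutes the measured self-test values quoted above (MaxErr = 0.28125 at p = 58569809) in place of the assumed 0.4 threshold:

```python
import math

# Refined estimate: same eqn-[7] scaling (ROE ~ square of avg wordsize),
# but anchored to a measured self-test ROE instead of the 0.4 threshold.
fft_len = 3072 * 2**10                    # 3072 Kdoubles
b_ref = 58569809 / fft_len                # ~18.6188 bits/word at the self-test prime
w_ratio = math.sqrt(0.21875 / 0.28125)    # ~0.88
b_new = b_ref + math.log2(w_ratio)        # ~18.4376 bits/word
p_est = b_new * fft_len
print(f"p ~= {p_est:.0f}")                # within 0.2% of the true 57885161
```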

cheesehead 2013-02-05 21:59

:smile:

Xyzzy 2013-02-05 22:06

[QUOTE]…failed to realize that the roundoff error stats in the same printouts also convey useful information…[/QUOTE]:kitten:

retina 2013-02-06 02:56

Nice analysis, BUT ...

There is a problem that you haven't covered. [b]We can't trust any of the figures that are posted here[/b]. It could be that figures have been deliberately doctored to mislead and generally mess with people's heads. It could be that the round-off figures were doctored and not genuine. It could be that the FFT size was secretly discussed amongst those in-the-know and decided to publicly give a fake value. It could be that pretty much every other indicator was a pure fabrication. Even the M(xxxxxxxx) milestone values could have been fakes deliberately designed to mislead. Without seeing actual figures that one knows have not been doctored it becomes simply a guessing game with which information to trust and which to ignore.

When you are in-the-know then you implicitly know which figures are accurate and thus how to make the computations. For outsiders it becomes a lot more problematic with following false trails and dead-ends that complicate matters considerably.

axn 2013-02-06 04:08

[QUOTE=retina;327971]There is a problem that you haven't covered. [b]We can't trust any of the figures that are posted here[/b]. [/QUOTE]

PWNED

ewmayer 2013-02-06 04:44

Well, as with any interrogation, ya gotta try to figure out which data are more likely genuine. My note about Serge running @3328 K didn't strike a false chord, because it fit with the current GIMPS wavefront, and in itself was not very revealing.

In Mike's note, the res64 was obviously fake, but the two ROE numbers both 'looked real', the larger had the kind of simple bit pattern that ROE data anywhere close to 0.5 do, and the max/avg ratio could easily be checked against those output by an actual build. And since ROE numbers not-really-close-to-a-breakpoint are unrevealing to the unpracticed eye, it didn't need a huge leap of faith to figure those might be unfudged.

Easy? No. But not out of the realm of cleverness of our more eagle-eyed members, IMO.

retina 2013-02-06 08:39

[QUOTE=ewmayer;327990]... the two ROE numbers both 'looked real', ...[/QUOTE]I agree. But that doesn't mean it [i]is[/i] real. You are all clever people and making it 'look real' would not be too difficult a task.

[QUOTE=ewmayer;327990]... the larger had the kind of simple bit pattern that ROE data anywhere close to 0.5 do, and the max/avg ratio could easily be checked against those output by an actual build.[/QUOTE]Yes, and just as easily those numbers could be copy-pasted from a different exponent to make a fake result 'look real'.

[QUOTE=ewmayer;327990]And since ROE numbers not-really-close-to-a-breakpoint are unrevealing to the unpracticed eye, it didn't need a huge leap of faith to figure those might be unfudged.[/QUOTE]Leaps of faith are great and all, but ultimately still faith-based, requiring belief in all the numbers posted by the various parties.

[QUOTE=ewmayer;327990]Easy? No. But not out of the realm of cleverness of our more eagle-eyed members, IMO.[/QUOTE]I actually never bothered to even try this time, because of the potential for misleading posts to make all that effort a pointless exercise, with those in-the-know laughing from the sidelines. It was only when I saw the milestone post that I decided I would give that one a shot, since I already had the relevant data available to count exponents and see which one it was. But even that was faith-based and depended upon the milestone page not being fudged in any way, which was certainly not guaranteed.

Anyhow, the point is that when you [i]know[/i] the numbers are real then it is not too difficult to work out such things. But we minions, from the unwashed masses, had no guarantee that the numbers given were real. Our lord masters, the privileged verifiers with limitless intelligence and clevertudiness, have tricksy unfathomable ways of fooling with our efforts at discovery.

LaurV 2013-02-06 09:15

See? That is why I asked for it! To avoid arguments like this :razz:

(Not that I would have any chance to guess it; my expectation (after the first posts about finding a new prime) was in the 61M area. I was a bit disappointed when my man told me the expo, and didn't really believe it until the official announcement.)

davieddy 2013-02-06 09:27

[QUOTE=ewmayer;327798]In the [URL="http://mersenneforum.org/showthread.php?goto=newpost&t=17704"]48th Mersenne prime discovered?!![/URL] thread I promised to show how one could have ascertained the new prime's exponent to within 1% using only information inadvertently leaked in that thread by "people in the know" plus a bit of math, without any recourse to Primenet server data. OK, so here's how:

< the big snip>

[/QUOTE]
So simple, how could I have missed it?
Must be the Ardbeg.

D

flashjh 2013-02-06 17:21

I knew the exponent this time, for obvious reasons. However, for next time your explanation is very informative.

Personally, I wish I knew more about the math (and I'm working on learning). Really, the challenge of finding the exponent is a lot of fun.

Thanks!

science_man_88 2013-02-06 17:33

[QUOTE=flashjh;328091]I knew the exponent this time, for obvious reasons. However, for next time your explanation is very informative.

Personally, I wish I knew more about the math (and I'm working on learning). Really, the challenge of finding the exponent is a lot of fun.

Thanks![/QUOTE]


Once the milestone thing came up I tried, but failed, to find the correct one. The number of primes between the known exponents, and the number needed to prove that one is the x-th, can be used to roughly figure how many in that range have no need to be double-checked; you then add the difference of these to the number needing to be checked up to the exponent in question, plus a few other things. I still missed it by 20 million when I tried.

