[QUOTE=xilman;262981]No, he uses "unfactored" in a different way from you. "Unfactored" has as one of its meanings "without any known prime factors". Another meaning is "without being completely factored into primes".[/QUOTE]Correct. And that is a common usage by the crowd that is using Prime95 to work on Mersennes. The latter usage would normally be referred to as "not fully factored". (Not in the number theory and academic world, but out in most of this forum.)
[QUOTE=xilman;262981]No, he uses "unfactored" in a different way from you. "Unfactored" has as one of its meanings "without any known prime factors". Another meaning is "without being completely factored into primes".
Paul[/QUOTE] The purpose of trial factoring for GIMPS is to eliminate candidates for LL testing that can be disposed of more quickly by finding a small prime factor. The factor itself is irrelevant because it is highly improbable that the cofactor is prime. Having an objective of finding just one small factor of a number that is too large to be fully factored is a pointless waste of time if the objective is JUST that small factor.
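The trial-factoring step being discussed can be sketched in a few lines (an illustrative sketch only, not Prime95's actual implementation; the `trial_factor` name and `max_k` cutoff are made up for this example). It uses two standard facts: any prime factor q of M_p = 2^p-1, for odd prime p, has the form q = 2kp+1 and satisfies q ≡ ±1 (mod 8).

```python
# Illustrative sketch of GIMPS-style trial factoring (not Prime95's code).
# Any prime factor q of M_p = 2^p - 1, p an odd prime, has the form
# q = 2*k*p + 1 with q == 1 or 7 (mod 8), so only such q need be tried.
def trial_factor(p, max_k=2 * 10**6):
    """Return a factor of 2^p - 1 if one is found for k <= max_k, else None."""
    for k in range(1, max_k + 1):
        q = 2 * k * p + 1
        if q % 8 not in (1, 7):      # factors of M_p are +-1 mod 8
            continue
        if pow(2, p, q) == 1:        # then q divides 2^p - 1
            return q
    return None

print(trial_factor(11))  # 23 (k = 1): 2^11 - 1 = 2047 = 23 * 89
print(trial_factor(67))  # 193707721 (k = 1445580), Cole's 1903 factor
```

Finding 23 kills M11 as an LL candidate immediately, which is exactly the use case described above; the factor itself is then of no further interest.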
[QUOTE=R.D. Silverman;262990]Having an objective of finding just one small factor
of a number that is too large to be fully factored is a pointless waste of time if the objective is JUST that small factor.[/QUOTE] I'd say that's a matter of opinion. :smile: Some might see it as a waste of time to fully factor any number, or to do a GIMPS-style search for prime numbers. You see it as a waste of time to find a small factor of a number that you probably won't be fully factoring. Doesn't mean anyone's necessarily right or wrong, just different opinions.
[QUOTE=Mini-Geek;262994]I'd say that's a matter of opinion. :smile:
[/QUOTE] Agreed. But not all opinions are created equal. :smile: I suggest that you take a poll among mathematicians who work in this domain... Finding a small factor is useful if one is eliminating primality candidates. But it is not very helpful in solving other problems. My opinion about the OPN search is known. But finding one small factor isn't very helpful even there, whereas full factorizations are. Over time, as one gains experience and mathematical maturity, one gets a feel for what is important. Think of it as wine-tasting: the opinion of people with experience, as opposed to dilettantes, makes a difference. For Cunningham numbers (those with prime base), full factorizations are useful because they reveal the structure of finite fields. Knowing said structure has uses in other parts of mathematics, even practical ones such as constructing good pseudo-RNGs.
[QUOTE=R.D. Silverman;262980]Unfactored 2^n-1 for n < 2000 has only 4 exponents??
Clearly you count differently from the way everyone else does.[/QUOTE] That can be rephrased to say "there are 4 'genuine composites'", i.e. composite numbers for which we do not know any prime factor.
[QUOTE=alpertron;263360]That can be rephrased to say "there are 4 'genuine composites'", i.e. composite numbers for which we do not know any prime factor.[/QUOTE]
You can't mean what you wrote here. I expect that most of the factorizations of 2^n-1 for n < 2000 are incomplete; these incomplete factorizations [i]all[/i] leave composite cofactors for which we do not know any prime factor... I have never seen the term "genuine composite" in a book or paper. Can you give a reference???
[QUOTE=R.D. Silverman;263394]You can't mean what you wrote here. I expect that most of the
factorizations of 2^n-1 for n < 2000 are incomplete; these incomplete factorizations [i]all[/i] leave composite cofactors for which we do not know any prime factor... I have never seen the term "genuine composite" in a book or paper. Can you give a reference???[/QUOTE] I've read that in "Prime Numbers" by Richard Crandall and Carl Pomerance. The four numbers are: 2[SUP]1061[/SUP]-1, 2[SUP]1277[/SUP]-1, 2[SUP]1619[/SUP]-1 and 2[SUP]1753[/SUP]-1. The cofactors you mentioned above are not Mersenne numbers themselves, so those do not count.
[QUOTE=alpertron;263396]I've read that in "Prime Numbers" by Richard Crandall and Carl Pomerance.[/QUOTE]
Do they give any reasoning for their use of the term? I think they got it backwards; if any composites deserve to be called "genuine", it should be those with known factors, because we can quite concretely write them as products of other numbers, thereby proving without doubt that they are indeed composites. In contrast, a composite with no known factors is only known to be composite because some primality test reported it as composite, and it may be very difficult to verify the result.
[QUOTE=Random Poster;263455]In contrast, a composite with no known factors is only known to be composite because some primality test reported it as composite, and it may be very difficult to verify the result.[/QUOTE]Proof of compositeness is easy and fast. Verification is simple even without knowing any factors.
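The "easy and fast" proof referred to here is presumably a single Fermat test (the function name below is made up for illustration): by Fermat's little theorem, a prime N with gcd(3, N) = 1 must satisfy 3^(N-1) ≡ 1 (mod N), so any other residue proves N composite without exhibiting a factor.

```python
# One Fermat test proves compositeness (it can never prove primality):
# if N > 3, 3 does not divide N, and 3^(N-1) mod N != 1, then N is
# composite by Fermat's little theorem -- no factor of N is produced.
def fermat_proves_composite(N):
    return pow(3, N - 1, N) != 1

print(fermat_proves_composite(2**11 - 1))  # True: 2047 proven composite
print(fermat_proves_composite(2**13 - 1))  # False: 8191 is prime
```

Verifying such a proof just means re-running one modular exponentiation, which is trivial at this size; whether that stays "simple" at the size of F[sub]24[/sub] is the question raised in the next post.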
[QUOTE=retina;263456][QUOTE=Random Poster;263455][QUOTE=alpertron;263396]I've read that in "Prime Numbers" by Richard Crandall and Carl Pomerance. The four numbers are: 2[SUP]1061[/SUP]-1, 2[SUP]1277[/SUP]-1, 2[SUP]1619[/SUP]-1 and 2[SUP]1753[/SUP]-1. The cofactors you said above are not Mersenne numbers themselves so those do not count.[/QUOTE]
Do they give any reasoning for their use of the term? I think they got it backwards; if any composites deserve to be called "genuine", it should be those with known factors, because we can quite concretely write them as products of other numbers, thereby proving without doubt that they are indeed composites. In contrast, a composite with no known factors is only known to be composite because some primality test reported it as composite, and it may be very difficult to verify the result.[/QUOTE] Proof of compositeness is easy and fast. Verification is simple even without knowing any factors.[/QUOTE] Really? The notion of genuine composites is discussed in PN:ACP in the section on Fermat numbers (§1.3.2, at least in the first edition), and it specifically references F[sub]14[/sub], F[sub]20[/sub], F[sub]22[/sub] and F[sub]24[/sub]. Of course, two of those have since had a factor found. Shall we ask ewmayer and jasonp whether their proof of F[sub]24[/sub]'s compositeness was easy and fast? And even today, over a decade later, can you perform a "simple" verification of this fact? Will your verification be substantially simpler than the original proof? (Speaking of which, Ernst and Jason: is the full F[sub]24[/sub] Pépin residue generated by your proof available anywhere? I know the Selfridge-Hurwitz residues are in the paper, but the full residue would be nice to have, just for, uh, fun or something.) @Random: I'm not sure of the justification for choosing the term genuine to describe such a composite. Perhaps a better word didn't suggest itself. Would you care to put forward an alternative? In any case, the authors talk a bit (in exercise 1.79) about some interesting aspects of this notion of "genuineness". Specifically, given any composite C with no known factors, it follows immediately that 2[sup]C[/sup]-1 is also composite, and chances are good that no factors of [I]it[/I] are known either. So there can't really be any largest genuine composite.
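For reference, the Pépin test being discussed can be sketched as follows (runnable only for tiny n, which is precisely why the F[sub]24[/sub] computation was a major effort): F_n = 2^(2^n)+1 is prime, for n ≥ 1, iff 3^((F_n-1)/2) ≡ -1 (mod F_n), and the final residue of that exponentiation is exactly the object one would want archived.

```python
# Pepin's test for Fermat numbers F_n = 2^(2^n) + 1 (valid for n >= 1):
# F_n is prime iff 3^((F_n - 1)/2) == -1 (mod F_n). The final residue
# is the quantity worth archiving for later cofactor work.
def pepin(n):
    F = 2**(2**n) + 1
    return pow(3, (F - 1) // 2, F) == F - 1   # F - 1 means -1 mod F

print([pepin(n) for n in range(1, 8)])
# F_1..F_4 are prime; F_5..F_7 are composite
```

For F[sub]24[/sub] the modulus has over 16 million bits and the test needs roughly 2^24 squarings of numbers that size, which is why "easy and fast" is relative.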
I may have the F24 residue on a backed-up old hard drive. You can make up random garbage and it should work just as well :)
|
[QUOTE=jyb;263484]Shall we ask ewmayer and jasonp whether their proof of F[sub]24[/sub]'s compositeness was easy and fast?[/QUOTE]It was easier and faster than factoring it. :wink:
|
[QUOTE=jasonp;263491]I may have the F24 residue on a backed-up old hard drive. You can make up random garbage and it should work just as well :)[/QUOTE]
Not really, depending on what one wants to do. If a factor of F[sub]24[/sub] is found, then the full Pépin residue would be needed to use the Suyama method for proving the cofactor composite (which would be a lot faster than a corresponding Fermat test). In fact, the abstract of your paper says: [QUOTE]We deposited a final Pépin residue for possible use by future investigators in the event that a proper factor of F[sub]24[/sub] should be discovered.[/QUOTE] So it would seem that there's a residue bank somewhere and you guys made a deposit. If I want to show that my F[sub]24[/sub] cofactor is composite with minimal fuss, I'll need to make a withdrawal. Please can you direct me to the teller window?* *I don't actually have a cofactor to test, so this is all academic. I'm just wondering where one might obtain the residue.
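The Suyama method mentioned here can be sketched at toy scale with F[sub]5[/sub] (a sketch of the idea, not the F[sub]24[/sub]-sized machinery): write F = f·c with f a known factor. Since (F-1) - (f-1) = f(c-1), a prime c would force 3^(F-1) ≡ 3^(f-1) (mod c); so comparing the banked residue A = 3^(F-1) mod F against the cheap quantity B = 3^(f-1) mod F, modulo c, can prove the cofactor composite without a fresh full-size Fermat test on c.

```python
# Toy-scale sketch of the Suyama cofactor test, using F_5 = 641 * 6700417.
# A is the square of the Pepin residue (the banked quantity); B costs
# only a small exponentiation. (A - B) % c != 0 would prove c composite.
F = 2**32 + 1            # F_5
f = 641                  # Euler's factor of F_5
c = F // f               # cofactor 6700417 (which happens to be prime)

A = pow(3, F - 1, F)     # squared Pepin residue, 3^(F-1) mod F
B = pow(3, f - 1, F)
print((A - B) % c == 0)  # True here: consistent with c being prime
```

The point of the "residue bank" is that A is the expensive part: with the archived Pépin residue it costs one squaring, while recomputing it from scratch for F[sub]24[/sub] would repeat the original months-long run.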
[QUOTE=retina;263494]It was easier and faster than factoring it. :wink:[/QUOTE]
Sure, but is verification any faster than proof?
[QUOTE=jyb;263484]Specifically, given any composite C with no known factors, it follows immediately that 2[sup]C[/sup]-1 is also composite, and chances are good that no factors of [I]it[/I] are known either.[/QUOTE]
Chances are ZERO that no factor is known. If C is even, then 2^C-1 splits as the difference of two squares. If C = ab is odd, then 2^C-1 is known to have the factors 2^a-1 and 2^b-1.
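The algebraic fact behind this reply (which, as the next post points out, sidesteps the "C has no known factors" premise) is easy to check numerically:

```python
# If C = a*b is composite, then 2^a - 1 divides 2^C - 1, via the identity
# 2^(a*b) - 1 = (2^a - 1) * (2^(a*(b-1)) + 2^(a*(b-2)) + ... + 2^a + 1).
C = 1001                         # 7 * 11 * 13, composite
for a in (7, 11, 13):
    assert (2**C - 1) % (2**a - 1) == 0
print("2^7-1, 2^11-1 and 2^13-1 all divide 2^1001-1")
```

The same identity is why only prime exponents are considered in the Mersenne prime search at all.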
[QUOTE=R.D. Silverman;263509]Chances are ZERO that no factor is known. If C is even, then
2^C-1 splits as the difference of two squares. If C = ab is odd, then 2^C-1 is known to have the factors 2^a-1 and 2^b-1.[/QUOTE] Why don't you read my post again (just the part you quoted) and see if you can figure out why this is incorrect?
How much ECM has been done on M1277, M1619, M1753, M2137?
A *huge* amount... someone else will have to say how much that means... to the point where the probable effort to find factors that way exceeds the effort to run NFS on it and get the factors with certainty.
[QUOTE=lorgix;264576]How much ECM has been done on M1277, M1619, M1753, M2137?[/QUOTE]Your answer lies here: [url]http://v5www.mersenne.org/report_ECM/[/url]
These are best left to a big sieving project.
Are you seriously saying that those numbers are ready for SNFS?
That no more ECM should be done? I'd like to know if anyone has completed t65.
[QUOTE=lorgix;264626]Are you seriously saying that those numbers are ready for SNFS?
That no more ECM should be done?[/QUOTE] [QUOTE=R.D. Silverman;262950]After Bruce's work and EPFL's latest pass, the chances of finding a factor from 2^n-1 for n < 1200 is [i]miniscule[/i]. For most of these numbers, ECM does [b]not[/b] have the "best chance" per GHz/day. Instead, with high probability, NFS will succeed with less work. And it is guaranteed. If NFS@Home were to attempt a larger number than M1061, some additional ECM work might be done on that number as a pre-screen. Otherwise, further ECM work on these numbers does not have a lot of value.[/QUOTE] Bob said it.
I would be surprised were even M1277 'ready for SNFS' in a useful way.
The PS3 ECM implementation in [url]http://eprint.iacr.org/2010/338.pdf[/url] stopped at 1200 bits, so I don't think they've done their B1=3e9 magic on M1277. (Their largest reported failure was 142162 curves at 3e9 on M1193.) It looks as if we would need a better siever (larger large primes, in any case) to be able to find relations for M1277 at any sort of usable speed; maxed-out parameters for gnfs-lasieve4I16e get me about one relation a minute, and we need a billion. (Which suggests that 'ready for SNFS' would be about 200 million minutes of ECM, which is a few million curves at PS3 speed... so wait for a better siever first.)
I don't think Bob said 1200 > 1277 though.
I left out M1061 because it's already in sieving. Thank you, fivemack. I'm assuming 85e7~29e8 should be used.
Just finished P-1 to B1=10^12, B2=10^14, reported and got 122GHz-days credit. :smile:
[code] Manual Testing 1061 NF-PM1 2011-11-21 02:02 B1=1000000000000, B2=100000000000000 122.3257 [/code] Yeah, I know about all the efforts, as discussed in a few threads around, like [URL="http://www.mersenneforum.org/showthread.php?t=14822"]here[/URL], and [URL="http://www.mersenneforum.org/showthread.php?t=14236"]here[/URL], and the current thread, and I know about nfs@home sieving it already; in a word, I know the futility of my action. I just had an antediluvian computer that crunched it for two and a half months; phase two had to process 2400 primes and needed just a few MB of memory. So I let it run. I even forgot about it, till I saw the PrimeNet report in the morning, when I was looking for my eventual GPU-2-72 new factors :D; the report is a few days old already... and it took the older bounds 3 orders of magnitude higher.
[QUOTE=LaurV;279785]Just finished P-1 to B1=10^12, B2=10^14, reported and got 122GHz-days credit. :smile:
[code] Manual Testing 1061 NF-PM1 2011-11-21 02:02 B1=1000000000000, B2=100000000000000 122.3257 [/code]Yeah, I know about all the efforts, as discussed in a few threads around, like [URL="http://www.mersenneforum.org/showthread.php?t=14822"]here[/URL], and [URL="http://www.mersenneforum.org/showthread.php?t=14236"]here[/URL], and the current thread, and I know about nfs@home sieving it already; in a word, I know the futility of my action. I just had an antediluvian computer that crunched it for two and a half months; phase two had to process 2400 primes and needed just a few MB of memory. So I let it run. I even forgot about it, till I saw the PrimeNet report in the morning, when I was looking for my eventual GPU-2-72 new factors :D; the report is a few days old already... and it took the older bounds 3 orders of magnitude higher.[/QUOTE] Certainly not an efficient use of time. But there should still have been one chance in several hundred thousand (right?) that a factor would be found, unless P-1 had already been taken that far.
[QUOTE=lorgix;279786]Certainly not an efficient use of time. But there should still have been one chance in several hundred thousand (right?) that a factor would be found, unless P-1 had already been taken that far.[/QUOTE]
Better use of time compared with the case where that computer sits at the bottom of some carton box in my basement (1 GHz proc, only a few megs of RAM). It cannot be used for anything else, except maybe TF, which is now taken over by GPUs. The chances were almost zero. I mean, the old P-1 limits were like B1=4.2E10 and B2=100*B1; I did not have the files, so the work was done from scratch. As I said, I extended the limit by 3 orders of magnitude, but considering the immense amount of ECM done in between (which is in fact a "more advanced version of P-1"), the chances of finding a factor were still almost zero. However, if the "movie" had turned out to have a happy ending of finding a factor... :P (I imagine myself coming here and telling you about my secret formula for finding factors and letting you boil for a while :P)
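For anyone curious what the P-1 runs above actually compute, stage 1 can be sketched in a few lines (a toy sketch with made-up helper names and a made-up toy composite, not Prime95's implementation, which also exploits the 2kp+1 form of Mersenne factors and runs a separate stage 2 with the larger B2 bound):

```python
from math import gcd

def primes_upto(n):
    """All primes <= n, by a simple sieve of Eratosthenes."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, int(n**0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytearray(len(sieve[i * i :: i]))
    return [i for i in range(2, n + 1) if sieve[i]]

def pminus1_stage1(N, B1, a=3):
    """Pollard p-1 stage 1: finds a prime factor q of N whenever q - 1 is
    B1-powersmooth, since then a^E == 1 (mod q) for E = lcm(1..B1) and
    gcd(a^E - 1, N) picks q out."""
    x = a
    for p in primes_upto(B1):
        pk = p
        while pk * p <= B1:     # raise p to its largest power <= B1
            pk *= p
        x = pow(x, pk, N)       # accumulates x = a^E mod N
    g = gcd(x - 1, N)
    return g if 1 < g < N else None

# Toy example: 2113 - 1 = 2^6 * 3 * 11 is 64-powersmooth, while
# 607 - 1 = 2 * 3 * 101 is not, so B1 = 64 isolates the factor 2113.
print(pminus1_stage1(2113 * 607, 64))   # 2113
```

This also shows why pushing B1 to 10^12 has diminishing returns once deep ECM has been done: ECM catches the same smooth-group structure and much more besides.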