[QUOTE=R.D. Silverman;105708]The remaining numbers under 768 bits are:
5,317-, 323- 6,283- 6,284+, 292+ 7,263-, 269-, 271- 7,268+[/QUOTE]
The "current" info in my previous post seems to have been a few months out of date, at least relative to hard copy (or perhaps a cache needed clearing). Reflecting the post-NFSNET base-5 factors, 5,317- is now 7th on the Most Wanted list and 7,263- is 8th. The others are also Wanted, except that 6,283- has been reserved --- that would be NFSNET's next.

All of these have had ecm pretests to 2*t50, and checking for factors by snfs is a lot cheaper than checking ecm pretests on gnfs candidates. (Three of these nine are over 200 digits; the smallest are 161, 168 and 173 digits. Missing a p55 may be tolerable for a difficulty 220-229 snfs, perhaps equivalent to .66*225 = 150-digit gnfs; but we're accumulating a collection of larger gnfs candidates. Of course, the new state of the art is a 53.2% chance of missing a p70, which translates into c. t68; or, as reported elsewhere, over 3*t65 ... that's for .66*312 = c208? The translation seems a bit off; I believe I heard a rough equivalent of 700-bit gnfs, which would be 210.7 digits; close enough ...)

OK, three of these nine give a Much cheaper snfs model for state-of-the-art gnfs difficulty, while even the smaller composites are measurably above the current Smaller-but-Needed size, if gnfs is the quicker choice (over snfs). Admittedly, a rather bizarre reason, and unlikely to sustain anyone through several months of computation. Pending continued progress on getting minimal stats back up, tracking a larger-scale sieving collaboration is probably a more viable objective.

Wonder how many of the c. 800 Cunninghams have snfs difficulty under 300, and how many have already been sufficiently factored for gnfs to be the method of choice. I'm still working on an updated view of the 2- list, n < 1200. -Bruce

PS - If M1061 is back on our list of un-spoken-for numbers, and the M1039 ecm pretest is an even moderately plausible standard, then finishing t60 (another 33000 curves at B1=260M?) would seem to be Very conservative. At difficulty 320, M1061 is nearly twice as hard. Just meeting the 3*t65 standard used for M1039 would be 3*(70,000); over 200K curves with B1=850M. Still seems to me that running t55's on the 2- numbers of difficulty above 230, say with unfactored part above .66*234 = c156, is more likely to find a factor. It depends on whether one's ecm objective is finding factors, versus ecm pretesting near-term sieving candidates, versus ecm trophy hunting (on other people's longer-term sieving candidates, or numbers like Alex's F31, way out of sieving ... err, gnfs-sieving/snfs-sieving range).
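As a sanity check on the arithmetic in the post above, here is a small Python sketch of the two conversions Bruce keeps using: bit length to decimal digits, and the rough 2/3 ratio (the ".66*" in his figures) between snfs difficulty and equivalent gnfs digits. Both are rules of thumb, not exact constants.

```python
import math

# Rough conversions used in the post above.  The 2/3 ratio between snfs
# difficulty and "equivalent" gnfs digits is the usual rule of thumb;
# treat both functions as approximations, not exact laws.
def bits_to_digits(bits):
    """Decimal digits of a number with the given bit length."""
    return bits * math.log10(2)

def gnfs_equiv(snfs_difficulty):
    """Approximate gnfs size with the same cost as an snfs job."""
    return 2 * snfs_difficulty / 3

print(round(bits_to_digits(700), 1))  # 700-bit gnfs ~ 210.7 digits
print(round(gnfs_equiv(225)))         # difficulty-225 snfs ~ 150-digit gnfs
print(round(gnfs_equiv(312)))         # difficulty-312 snfs ~ 208-digit gnfs
```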
Did you sieve the small factors?
[QUOTE=VolMike;109001]Did you sieve the small factors?[/QUOTE]
Huh?
2^772+1 has two small factors: 17 and 43464340002838801.
Thus 2^772+1 should be divided by them, and then the quotient should be tested with the GNFS algorithm. So, does the GNFS implementation you use first eliminate these two small factors by division?
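The preprocessing VolMike describes can be sketched in a few lines of Python: divide out the known factors and look at the size of the remaining cofactor. The 17-digit factor is quoted from the post above and assumed correct here; the small factor 17 is easy to verify directly.

```python
# Sketch of dividing out the known small factors of 2^772+1.
# 43464340002838801 is taken from the thread (assumed correct here);
# 17 divides because 2^8 = 1 (mod 17) and 772 = 4 (mod 8), so
# 2^772 = 2^4 = 16 = -1 (mod 17).
N = 2**772 + 1
known_factors = [17, 43464340002838801]
cofactor = N
for p in known_factors:
    while cofactor % p == 0:
        cofactor //= p

print(len(str(N)))         # 2^772+1 has 233 decimal digits
print(len(str(cofactor)))  # the cofactor is only modestly smaller
```

The point made in the replies below is that this modest shrinkage does not change the method of choice: the number's special form matters more than its size.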
[QUOTE=VolMike;109004]2^772+1 has two small factors: 17 and 43464340002838801.
Thus 2^772+1 should be divided by them, and then the quotient should be tested with the GNFS algorithm. So, does the GNFS implementation you use first eliminate these two small factors by division?[/QUOTE] Your assumptions are wrong. (1) 2^772+1 is not being done via GNFS, but by SNFS. (2) The small prime factors of 2^772+1 are irrelevant.
I see.
Thus SNFS applied to 2^772+1 returns results faster than GNFS applied to the quotient.
[QUOTE=VolMike;109004]2^772+1 has two small factors: 17 and 43464340002838801.
Thus 2^772+1 should be divided by them, and then the quotient should be tested with the GNFS algorithm. So, does the GNFS implementation you use first eliminate these two small factors by division?[/QUOTE] Yes, there are "small factors" that are removed at the appropriate time, whether doing SNFS or GNFS. However, in a case such as this, where the product of the known factors is still only a very small portion of the total number, SNFS will give sufficiently smaller polynomial coefficients that ignoring the known factors will produce relations significantly faster than the best general polynomial which takes them into account. (It looks as if Bob was saying the same thing while I was composing my response.)
[QUOTE]Yes, there are "small factors" that are removed at the appropriate time, whether doing SNFS or GNFS.
However, in a case such as this, where the product of the known factors is still only a very small portion of the total number, SNFS will give sufficiently smaller polynomial coefficients that ignoring the known factors will produce relations significantly faster than the best general polynomial which takes them into account. (It looks as if Bob was saying the same thing while I was composing my response.)[/QUOTE] Thanks for your explanation. As I see it, the factorization is not yet complete?
On 6/14, Paul reported that he was starting the solution of the 5.6M square matrix that he was able to produce. He estimated that that phase would take 19 days. His estimates are usually quite good.
[QUOTE=VolMike;109010]Thanks for your explanation.
As I see it, the factorization is not yet complete?[/QUOTE] The linear algebra is progressing nicely. At the moment, it is roughly 3.38/5.54, or 61%, complete. When it has finished, a few more hours will suffice to find the factors. As Wacky said, the linear algebra started on 14th June. Perform the extrapolation yourself. Paul
Ok. I understand. Approximately 1 week left.