Now what (IV)
109!+1 is proceeding nicely.
10^263-1 may or may not finish its linear algebra before I leave the country, but it'll certainly be done by Easter. What would you be interested in next?

I don't see any very interesting but possible GNFS numbers from the Cunningham tables - most of the C180 to C185 are easier by SNFS. Siever 16e isn't yet really usable, which makes very hard SNFS jobs a bit out of reach.

Possibilities are:
2^877-1 (Mersenne, SNFS, a bit harder than 10^263-1)
2801^79-1 (oddperfect, SNFS, a bit harder than 2^877-1)
EM43 (GNFS, people on this forum have been attacking it on and off for several years, same sort of difficulty as 5^421-1 was)
Something else |
[QUOTE=fivemack;167269]109!+1 is proceeding nicely.
10^263-1 may or may not finish its linear algebra before I leave the country, but it'll certainly be done by Easter. What would you be interested in next? I don't see any very interesting but possible GNFS numbers from the Cunningham tables - most of the C180 to C185 are easier by SNFS. Siever 16e isn't yet really usable, which makes very hard SNFS jobs a bit out of reach. Possibilities are: 2^877-1 (Mersenne, SNFS, a bit harder than 10^263-1) 2801^79-1 (oddperfect, SNFS, a bit harder than 2^877-1) EM43 (GNFS, people on this forum have been attacking it on and off for several years, same sort of difficulty as 5^421-1 was) Something else[/QUOTE] I'd vote for EM43 and would probably dedicate resources to it were it to be done. The interesting thing about this value is that no further work can be done on the sequence until EM43 is factored. The others can't be considered "roadblocks" for their respective projects as there are other Mersenne or Odd-Perfect numbers available to factor. |
[QUOTE=fivemack;167269]109!+1 is proceeding nicely.
10^263-1 may or may not finish its linear algebra before I leave the country, but it'll certainly be done by Easter. What would you be interested in next? I don't see any very interesting but possible GNFS numbers from the Cunningham tables - most of the C180 to C185 are easier by SNFS. Siever 16e isn't yet really usable, which makes very hard SNFS jobs a bit out of reach. Possibilities are: 2^877-1 (Mersenne, SNFS, a bit harder than 10^263-1) 2801^79-1 (oddperfect, SNFS, a bit harder than 2^877-1) EM43 (GNFS, people on this forum have been attacking it on and off for several years, same sort of difficulty as 5^421-1 was) Something else[/QUOTE] 11,233+ or 11,229-. |
The factorization of 10[sup]271[/sup]-1 could help to find more prime [URL="http://www.alpertron.com.ar/googolm.pl"]factors of googolplex-10[/URL].
|
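The googolplex-10 link runs through multiplicative orders: googolplex - 10 = 10 * (10^(10^100 - 1) - 1), so a prime p other than 2 or 5 divides it exactly when the order of 10 mod p divides 10^100 - 1. Primitive prime factors of 10^271-1 have order 271, and 271 itself divides 10^100 - 1, which is why new factors of 10^271-1 are candidates. A quick sketch checking the two order facts (the framing is mine, not taken from the linked page):

```python
# Divisibility behind the googolplex-10 connection (assumed framing):
# googolplex - 10 = 10 * (10**(10**100 - 1) - 1), so a prime p != 2, 5
# divides it iff ord_p(10) divides 10**100 - 1.

def ord10(p):
    """Multiplicative order of 10 modulo a prime p not dividing 10."""
    k, x = 1, 10 % p
    while x != 1:
        x = (x * 10) % p
        k += 1
    return k

# 271 divides 10^5 - 1 = 99999 = 3^2 * 41 * 271, so ord_271(10) = 5,
assert ord10(271) == 5
# and since 5 | 100, 271 itself divides 10^100 - 1.  Hence primitive
# prime factors of 10^271 - 1 (order exactly 271) also divide
# googolplex - 10.
assert pow(10, 100, 271) == 1
```

The same test with `pow(10, 100, p)` decides the question for any other candidate prime p.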
[quote=fivemack;167269]109!+1 is proceeding nicely.
10^263-1 may or may not finish its linear algebra before I leave the country, but it'll certainly be done by Easter. What would you be interested in next? I don't see any very interesting but possible GNFS numbers from the Cunningham tables - most of the C180 to C185 are easier by SNFS. Siever 16e isn't yet really usable, which makes very hard SNFS jobs a bit out of reach. Possibilities are: 2^877-1 (Mersenne, SNFS, a bit harder than 10^263-1) 2801^79-1 (oddperfect, SNFS, a bit harder than 2^877-1) EM43 (GNFS, people on this forum have been attacking it on and off for several years, same sort of difficulty as 5^421-1 was) Something else[/quote] I'd go for 2^877-1 first on the grounds that this is the mersenneforum, and then EM43. This both avoids two GNFSs in a row and allows more time for improvements in msieve's poly selection. |
[QUOTE=R.D. Silverman;167276]11,233+ or 11,229-.[/QUOTE]
Heck, I'll do those two. Should be about 30 days and 25 days of sieving, respectively, on my currently-available resources. I'm surprised they are still uncracked. I'll send off a missive to Wagstaff, and grab one of these. I finished 11,227- a while ago. I thought that 11,229- was already reserved, but a glance at the Cunningham project page says it hasn't. |
[tex]\ 2^{877}-1[/tex]
|
I'd also like to see M877 factored.
(Not sure if I'm able to contribute again...) |
That looks like a reasonable consensus for 2-877. When I get back after Easter, I'll put up a reservations post; until then, please sieve 109!+1 more, so that the matrix doesn't take eight weeks.
|
[QUOTE=fivemack;167540]That looks a reasonable consensus for 2-877. When I get back after Easter, I'll put up a reservations post; until then, please sieve 109!+1 more, so that the matrix doesn't take eight weeks.[/QUOTE]
This one is C178 with difficulty 264. As a number below C190, it ought to have had 7*t50 >> t55 worth of ecm ("smallest 100 Cunninghams" list). I could add another t55 (to make p54/p55's less likely, while not ruling out p59/p60's), if that would be regarded as a worthwhile contribution? -Bruce |
Another t55 would definitely be a worthwhile contribution, thanks very much for the offer.
|
[QUOTE=fivemack;167628]Another t55 would definitely be a worthwhile contribution, thanks very much for the offer.[/QUOTE]
Last of the stragglers are in, second pass is 10,446 curves, B1=260M (default B2). That's past 6*t50, with t55 somewhere between 5.0-5.7 t50s. A full 2t55 with the first pass of 7t50, which (if I recall) meets an 80% chance of finding a p55, for Peter's term of "removing" these (uhm, so, still a 1/5 that we're supposed to take as an acceptable risk). Don't think that we'll see a p52/p53. No promises about p59/p60. -bd |
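The t-level arithmetic above follows the usual Poisson heuristic: if tXX is the expected number of curves to find an XX-digit factor, then n curves miss such a factor with probability about exp(-n/tXX). A minimal sketch (the model is the standard heuristic, not the exact published tables):

```python
import math

def ecm_success_prob(curves_run, t_level_curves):
    """Poisson-model chance that ECM has found a factor of the target
    size, where t_level_curves is the expected-curve count (e.g. t55)
    for that size."""
    return 1.0 - math.exp(-curves_run / t_level_curves)

# A full 2*t55 gives 1 - e^-2, about 86% -- the same ballpark as the
# "80% chance of finding a p55" above, i.e. roughly a 1-in-7 residual
# risk that a p55 survives the pre-testing.
p55_found = ecm_success_prob(2.0, 1.0)
```

The gap between 86% and the quoted 80%/1-in-5 just reflects how loosely t-levels get counted in practice.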
[QUOTE=bdodson;169033]Last of the stragglers are in, second pass is 10,446 curves, B1=260M
(default B2). That's past 6*t50, with t55 somewhere between 5.0-5.7 t50s. A full 2t55 with the first pass of 7t50, which (if I recall) meets an 80% chance of finding a p55, for Peter's term of "removing" these (uhm, so, still a 1/5 that we're supposed to take as an acceptable risk). Don't think that we'll see a p52/p53. No promises about p59/p60. -bd[/QUOTE] Hi Bruce, My general opinion is that unless you want to do some specific ECM pre-tests on an NFS candidate, that Cunningham numbers with difficulty under 250 are probably not worth any further ECM effort. You may want to either help out on the Fibonacci/Lucas numbers of low index (say n < 1500) OR I have a very small number of Homogeneous Cunningham numbers for you to test, if you have the time. See [url]http://www.chiark.greenend.org.uk/ucgi/~twomack/homcun.pl[/url] The following site contains the partial (known) factorizations: [url]http://www.leyland.vispa.com/numth/factorization/anbn/main.htm[/url] The numbers we need NFS pre-tested are: 3,2,457- 3,2,499- 3,2,482+ 3,2,494+ 3,2,496+ 3,2,499+ |
[QUOTE=R.D. Silverman;169202]Hi Bruce,
My general opinion is that unless you want to do some specific ECM pre-tests on an NFS candidate, that Cunningham numbers with difficulty under 250 are probably not worth any further ECM effort. [/QUOTE] If you scroll up a bit you'll see [code] This one is C178 with difficulty 264. [/code] Not difficulty under 250. You might want to check the recent Batalov + Dodson snfs factorization of 2,1618L, with a p51. The numbers in c190-c233 of difficulty below 250 have gotten 4*t50; while the ones above 250 (not as likely for immediate sieving?) had 3*t50. For 1618L, it was on the wrong side of c233, in c234-c250, which has only had 2t50. If you're routinely sieving ones with curve counts so far below t55, you're certainly going to hit some more p51-p54's. I'm currently working on near-term sieving candidates with difficulty 263 and 268. The ones above c250 have hardly had any ecm at all. I don't regard these curves as ecm "factoring", but rather as a precomputation for sieving, ecm "pre-testing", and have a full schedule of such Cunningham candidates. Not many factors at all, but I might hope to occasionally hit an early p59/p60. -Bruce |
[QUOTE=R.D. Silverman;169202]
The numbers we need NFS pre-tested are: 3,2,457- 3,2,499- 3,2,482+ 3,2,494+ 3,2,496+ 3,2,499+[/QUOTE] [STRIKE]I believe that 3,2,494+ has been factored completely...[/STRIKE] I put 7550 curves at 43e6 into the remaining 5 on this list (Batalov is keeping 3,2,494+ warm for us). That's t50. This is not enough, of course. I assume that Paul Leyland's ECM server has also shown these plenty of love at 3e6 and 43e6. I might plow through some more ECM on these 3,2 candidates a few weeks from now. If anyone has suggestions for how many curves would be optimal, I would be glad to hear them. |
[quote=FactorEyes;169248]I believe that 3,2,494+ has been factored completely, but Paul's site is down, so I can only go by the fivemack reservations page at chiark.greenend.org.uk.
I put 7550 curves at 43e6 into the remaining 5 on this list, so that's t50. This is not enough, of course.[/quote] 3,2,494+? Not yet. It's stagnating on my disk. It is next in my queue, after 5,362+ :smile: I am pretty much done with 11+2,199 though. These are my two liabilities. |
[QUOTE=bdodson;169247]If you scroll up a bit you'll see [code]
This one is C178 with difficulty 264. [/code] Not difficulty under 250. You might want to check the recent Batalov + Dodson snfs factorization of 2,1618L, with a p51. The numbers in c190-c233 of difficulty below 250 have gotten 4*t50; while the ones above 250 (not as likely for immediate sieving?) had 3*t50. For 1618L, it was on the wrong side of c233, in c234-c250, which has only had 2t50. If you're routinely sieving ones with curve counts so far below t55, you're certainly going to hit some more p51-p54's. [/QUOTE] I don't regard finding a factor with NFS in the 51 to 55 digit range to be a problem on these smaller numbers. The *expected* time to factor the smaller numbers is much less with NFS than with ECM. |
[QUOTE=R.D. Silverman;169350]I don't regard finding a factor with NFS in the 51 to 55 digit range
to be a problem on these smaller numbers. The *expected* time to factor the smaller numbers is much less with NFS than with ECM.[/QUOTE] The expected time to factor any large number with ECM is meaninglessly enormous, because t(find 70-digit factor by ECM) is so large and p(70-digit factor) is quite big, so it's a matter of picking your early-abort strategy. I have an entirely unjustified rule of 'ECM for about 25% of the time the NFS job will take'; each of the twenty thousand curves would take about an hour, and I'm fairly sure the sieving will take around sixty kilohours.

Bother. I have now convinced myself that 2^877-1 is a GNFS number: 10^263-1 took ~170 megaseconds, 109!+1 looks as if it'll take ~120 megaseconds, and 2^877-1 is only a little bigger in the relevant sense than either of those. I'd better figure out some polynomial-search parameters, or some indisputably SNFS number of less than 255 digits and difficulty around 265. Help? |
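The early-abort rule above is easy to put in numbers; a sketch, where the 25% fraction and the hour-per-curve figure are the post's own rough estimates, not anything standard:

```python
def ecm_budget_curves(nfs_hours, hours_per_curve, fraction=0.25):
    """Early-abort budget: spend about `fraction` of the projected NFS
    time on ECM pre-testing, then stop and sieve."""
    return int(nfs_hours * fraction / hours_per_curve)

# ~60 kilohours of projected sieving at ~1 hour per curve budgets
# 15,000 curves -- the same order as the twenty thousand mentioned.
budget = ecm_budget_curves(60_000, 1.0)
```

The point of the rule is that the budget scales with the NFS estimate, so harder jobs automatically get deeper ECM pre-testing.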
[quote=fivemack;169360]I'd better figure out some polynomial-search parameters, or some indisputably SNFS number of less than 255 digits and difficulty around 265. Help?[/quote]
What about 2801^79-1? (a bigger one this time) [b]fivemack:[/b] it's more than 255 digits long so would need recompiled sievers - any oddperfect-search number will be of length basically equal to its SNFS difficulty, because the roadblocks are of the form 'we know no factors of sigma(a^b)' |
[quote=fivemack;169360]I'd better figure out some polynomial-search parameters, or some indisputably SNFS number of less than 255 digits and difficulty around 265. Help?[/quote]
Just hunting around through the first 5 holes and cross-checking with what is already spoken for, I see 10,268+ at 243 digits and difficulty 269... |
That's the right sort of size and shape - but is anyone interested in it?
|
[QUOTE=fivemack;169360]The expected time to factor any large number with ECM is meaninglessly enormous, because t(find 70-digit factor by ECM) is so large and p(70-digit factor) quite big, so it's a matter of picking your early-abort strategy. I have an entirely unjustified 'ECM for about 25% of the time the NFS job will take', each of the twenty thousand curves done would take about an hour and I'm fairly sure the sieving will take around sixty kilohours.
Bother. I have now convinced myself that 2^877-1 is a GNFS number: 10^263-1 took ~170 megaseconds, 109!+1 looks as if it'll take ~120 megaseconds, and 2^877-1 is only a little bigger in the relevant sense than either of those. I'd better figure out some polynomial-search parameters, or some indisputably SNFS number of less than 255 digits and difficulty around 265. Help?[/QUOTE] My joint paper with Sam Wagstaff, "A Practical Analysis of ECM", gives an exact abort strategy for when to shift from ECM to QS/NFS.

One combines the sample data obtained from ECM failures, perhaps performed at different B1,B2 values, with the known a-priori distribution of factors given by Dickman's function (or any other approximation to the distribution of factors). One uses Bayes' Theorem to derive a posterior distribution and computes the *expected value* of the posterior. If the time to find a prime near the "expected value" via ECM with p=1-1/e exceeds the time it would take NFS, then switch.

This is based upon using the unit-linear loss function combined with minimizing the expected cost to achieve the factorization. The unit-linear loss function simply applies a linear cost to being wrong, under the assumption that computer costs are a simple linear function of the CPU time that is spent. |
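A discretized toy version of that Bayesian update, with everything numeric hypothetical: the 1/d prior is a crude stand-in for Dickman's function, and t_level() is a made-up calibration rather than the paper's tables:

```python
import math

def t_level(d):
    # Hypothetical calibration: expected curve count to find a d-digit
    # factor, growing geometrically with d (stand-in for ECM tables).
    return 100.0 * 2.2 ** (d - 30)

def posterior_expected_digits(curves_failed, d_min=30, d_max=70):
    """Posterior mean size of the smallest factor after ECM failures.

    Prior: P(smallest factor has d digits) ~ 1/d (Dickman-style stand-in).
    Likelihood of the observed failures: exp(-curves/t_level(d)).
    """
    weights = {d: (1.0 / d) * math.exp(-curves_failed / t_level(d))
               for d in range(d_min, d_max + 1)}
    total = sum(weights.values())
    return sum(d * w for d, w in weights.items()) / total

# Failed curves push the posterior toward larger factors; once the ECM
# time to reach the posterior-expected size (at p = 1 - 1/e) exceeds
# the NFS estimate, the rule says switch.
before = posterior_expected_digits(0)
after = posterior_expected_digits(10_000)
```

With no failures the posterior is just the prior; each batch of failed curves shifts mass away from the small sizes ECM would already have found.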
[QUOTE=R.D. Silverman;169368]My joint paper with Sam Wagstaff: "A Practical Analysis of ECM"
gives an exact abort strategy for when to shift from ECM to QS/NFS. [/QUOTE] I was very happy to have gotten a copy from you; and spent many years relying upon the table for effort (in curves) to find a factor of a given size, before the appearance of Peter's thesis, and ecm/fft --- with performance now matched/improved by gmp-ecm.

But there's a mismatch in the assumptions, if I understand correctly. The analysis supposes that the same resources are being applied to both sieving and ecm. In my case, sieving is on x86-64 clusters (Opteron and intel/quadcore), while ecm is being run on a grid of pcs, not suitable for sieving. So -- for me -- it's not a case of deciding when to switch from ecm to sieving; the pcs are always running ecm, the x86-64s always sieving (when they're not being applied to higher purposes in grad education).

As far as I can tell, you're the unique person here that doesn't mind finding p51-p54's by snfs. For me, the optimization is between tedium in running ecm on a number that we don't expect to have a factor in ecm range, vs my co-worker being unhappy with an occasional small factor. -Bruce |
Yes. There is an implicit assumption that the resources are the same
for both methods. |
[QUOTE=bdodson;169404]
As far as I can tell, you're the unique person here that doesn't mind finding p51-p54's by snfs. For me, the optimization is between tedium in running ecm on a number that we don't expect to have a factor in ecm range, vs my co-worker being unhappy with an occasional small factor. -Bruce[/QUOTE] I don't mind. I take the view that :poop: happens. To pay for the pleasure of finding an unusually large factor by ECM, you have to take the disappointment of finding an unusually small one by NFS which had previously been missed by ECM. Paul |
[QUOTE=xilman;169468]
To pay for the pleasure of finding an unusually large factor by ECM, you have to take the disappointment of finding an unusually small one by NFS which had previously been missed by ECM. Paul[/QUOTE] An interesting observation, on an under-explained possible property of ecm --- there seems to be a better chance of finding an unusually large factor by ecm from a collection of numbers that haven't been substantially searched for factors. Of course, there'll be more small factors found from a newer collection, but I'm thinking of early large factors, found well below the number of expected curves. Maybe this is just an impression, on too sparse data.

There aren't many collections from which factors below p55 have already been removed (to 80%, 2t55?; a little below, 2t53.5, say), but that oughtn't to have too seriously depleted factors in [p58,p70], certainly not in [p62,p70], for very unusually large factors.

I am developing some experience running numbers post 4t50; and wondering whether there's any reason to expect (as Paul seems to suggest) that these numbers would be less likely to produce a factor in the top 3 of the year? I did run fewer curves in 2008 than in 2007, perhaps only half as many, but we're getting rather well into 2009 by now. Perhaps I ought to include 2007, with a single p62; except for the three from p60-p61. The number of curves run here in 2006 was way below the number in 2007. More curves in 2008 than 2006. More in 2006 than in 2005; but those were the years with p66, p67.

Perhaps numbers from c234-c320 with factors below p47 removed include many more numbers with no factors below p70? I spent a lot of curves/time there, as well as in parts of c160-c233 with few factors below p52. -Bruce |
[QUOTE=fivemack;169360]
I'd better figure out some polynomial-search parameters, or some indisputably SNFS number of less than 255 digits and difficulty around 265. Help?[/QUOTE] Is 3, 562+ C255 diff 269 too large? It's way under-tested for ecm. There's c243 2^1043 - 1 (SNFS 269.12). Also hard is c241 6^346 + 1 (SNFS 269.24). A number that has been ecm'd is [code] c238 3^551 - 1 (SNFS 262.89) [/code] although Batalov has already looked at parameters (and might reserve it soon for Batalov+Dodson? ... we could find another one). Probably no one has looked at c236 7^338 + 1 (SNFS 263.67), likewise c239 3^563 + 1 (SNFS 268.62). -bd |
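For anyone cross-checking the difficulty figures in these lists: for b^n ± 1 the SNFS difficulty is just log10 of the number the polynomial represents, i.e. n·log10(b) when the full form is used, or less when an algebraic factor lets the quintic/sextic represent a smaller part (the 269.12 quoted for 2^1043-1 corresponds to a sextic in 2^149, since 1043 = 7·149, i.e. log10(2^894)). A quick check:

```python
import math

def snfs_difficulty(b, n):
    """SNFS difficulty of a polynomial representing b^n (+/- 1)."""
    return n * math.log10(b)

# Matches the figures quoted above:
d_3_551 = snfs_difficulty(3, 551)    # ~262.89  (3^551 - 1)
d_6_346 = snfs_difficulty(6, 346)    # ~269.24  (6^346 + 1)
d_2_1043 = snfs_difficulty(2, 894)   # ~269.12  (sextic in 2^149
                                     #           for 2^1043 - 1)
```

The gap between a number's digit count (the cNNN) and its SNFS difficulty is the algebraic/known-factor part that the polynomial still has to carry.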
3,562+
I'll give up anything to the mersenneforum! :smile:
(well, [I]almost[/I] anything) There are plenty of good numbers for everyone! I think 3,562+ c255 is an excellent number. (I was looking at it, too.) It will sieve with any old-fashioned siever (I guess this is one of Tom's concerns). [SIZE=1]3,562+ not to be confused with 5,362+[/SIZE] |
Base 3, my hobby horse! I'd be glad to help sieving this one.
Alex |
[QUOTE=bdodson;169477]Is 3, 562+ C255 diff 269 too large? It's way under-tested for ecm.
There's c243 2^1043 - 1 (SNFS 269.12). Also hard is c241 6^346 + 1 (SNFS 269.24). A number that has been ecm'd is [code] c238 3^551 - 1 (SNFS 262.89) [/code] although Batalov has already looked at parameters (and might reserve it soon for Batalov+Dodson? ... we could find another one). Probably no one has looked at c236 7^338 + 1 (SNFS 263.67), likewise c239 3^563 + 1 (SNFS 268.62). -bd[/QUOTE] 2,923+ ???? |
That's done (Page 110). I fell into the same trap. They did it so fast (Tom+Bruce) that it didn't even register in people's memories.
Here's one that may be interesting (it's a quintic) - 2,979+ c255 (diff.268) |
[QUOTE=Batalov;169563]That's done (Page 110). I fell into the same trap. They did it so fast (Tom+Bruce) that it didn't even register in people's memories.
Here's one that may be interesting (it's a quintic) - 2,979+ c255 (diff.268)[/QUOTE] There are a number of suitable 2,LM candidates; e.g. 2,1762L etc. For some reason, people seem reluctant to work on the 2LM table. |
[QUOTE=akruppa;169533]Base 3, my hobby horse! I'd be glad to help sieving this one.
Alex[/QUOTE] Yes, the very reason we considered it. The 3- list is getting shorter. On 2LM, Batalov + Dodson did 2,1618L, will be starting sieving on 2, 1686L as soon as I'm ready for a break from 12+256 (Womack+Dodson), and have just reserved 2,2086M C268=diff268. -Bruce |
A large part of the reason why I started focussing on the base 3 tables some... hmm, how long's it been... six or seven years ago?... was that the largest remaining composite for 3- and 3+ was smaller than for the other Cunningham bases, so I figured those would have the best chance of clearing them out completely. Plus, it's an odd prime, so some of those factorisations might perhaps help someone who needs the structure of some GF(3^n)*, or maybe advance OPN search a little bit.
Alex |
[QUOTE=akruppa;169605]A large part of the reason why I started focussing on the base 3 tables some... hmm, how long's it been... six or seven years ago?... was that the largest remaining composite for 3- and 3+ was smaller than for the other Cunningham bases, so I figured those would have the best chance of clearing them out completely. Plus, it's an odd prime, so some of those factorisations might perhaps help someone who needs the structure of some GF(3^n)*, or maybe advance OPN search a little bit.
Alex[/QUOTE] The Garo/Rogue tables for 5+ and 7+ still take most of two full pages, but 11- and 11+ are both short. For 3- we'll need snfs 280, twice? Looks like two more in 3+. The lists for 5- and 7- also fit on a single page. Not sure what would trigger an extension; perhaps Paul or Bob know? One notable feature, aside from completely clearing both + and - might be a number of bases + or - with fewer than five "first holes". We're not that far from putting some blank spots on that part of the "champions" page at Sam's site. -Bruce PS - perhaps someone could find snfs difficulties on some of the other short tables, like on Alex's 3-? |
[QUOTE=bdodson;169757] ...
PS - perhaps someone could find snfs difficulties on some of the other short tables, like on Alex's 3-?[/QUOTE] That was quick (3+ and 11-+). There are several other short (one page in the Cunningham subforum) lists. Both 12- and 12+ are short. The others are minus only, base-6 and base-10. Perhaps it's worth emphasizing that this is not a topic for which participation is limited to those with either access to large University machines or top-of-the-line icore-7s. The new Selfridge + Wagstaff wanted lists include a bunch of 3+ numbers (in particular), and typically focus on neglected numbers that are easier relative to numbers that take more extensive resources. Sustained persistence being the primary resource. I've no idea what Cunningham et. al. (back in the 1920's?) were thinking, but perhaps the base-b lists with b composite serve as test cases for seeing whether there's a visible difference, aside from ones already known, with the factorization tables for the prime bases. Base-10 has its own interest (and long focus of attention) from repunit (1111...111's) factorizations. Base-12 is most likely short due to sustained attention from Peter/CWI. Among all bases there's a special interest in numbers with few/small non-algebraic factors (number of digits close to snfs difficulty), and especially b^q-1, b^q+1 for prime exponents, M_p's for example. Thanks are due to Alex and (most recently) Batalov for attention to updating the forum tables. -Bruce |
[QUOTE=bdodson;169477] ... A number that has been ecm'd is
[code] c238 3^551 - 1 (SNFS 262.89) [/code] although Batalov has already looked at parameters (and might reserve it soon for Batalov+Dodson? ... we could find another one). ... -bd[/QUOTE] We took 10, 393+ (and one each from 11- and 12+) as warm-up for 2, 2086M, so 3,551- is no longer on the list of numbers we might consider reserving any time soon --- i.e., open; unencumbered for the forum. Being from c234-c250 it had 2t50 to start. The second (last?) round of ecm finished over the weekend, 10325 curves with B1 = 260M (p60-optimal; default B2), a new 6.5*t50; should be ready to sieve if you like. -Bruce |
[QUOTE=bdodson;170123] ...The second (last?) round of
ecm finished over the weekend, 10325 curves with B1 = 260M (p60-optimal; default B2) a new 6.5*t50; should be ready to sieve if you like. -Bruce[/QUOTE] OK, never mind the curve counts on 3, 551-. For M859 C203 diff 258.58 it's below c233, which is good; then above diff 250 means 3t50, just counting Lehigh curves. Must have had a bunch more from other people as a Mersenne number. -bd |
M859
The convenience detour is here
=> [URL="http://www.mersenneforum.org/showthread.php?t=11761"][COLOR=#810081]http://www.mersenneforum.org/showthread.php?t=11761[/COLOR][/URL] |
Powered by vBulletin® Version 3.8.11
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.