#1 |
"William"
May 2003
New Haven
3×787 Posts
When we began the present stage of the Special Project, there were many complaints about how long each curve took and many proposals for working on the smaller factors instead. It's tricky to figure out the consequences of working on smaller factors because of the structure of the algebraic factors. For the next stage, I've built a Planner Page to help track these relationships. It's reachable from the bottom menu of any page on the ElevenSmooth site, or from the upper-left navigation box on the Factors page.
Clicking the "Standard" button shows the standard approach of running 700 curves on the largest number. Every smaller algebraic factor then gets 700 indirect curves, and the expected number of primes found is 10.8. Suppose instead we skip the largest number but do 250 curves on each of the five numbers on Level 1. Three of the algebraic factors on Level 2 then get only 250 indirect curves, so suppose we also do 150 curves on each of those three. The total work is then equivalent to only 258 curves on the largest number, and the expected number of primes found is 10.4. This seems like a reasonable approach: 258 equivalent curves for 10.4 primes. Factors missed for the larger composites can be found in a later stage with even higher search limits. Post alternative suggestions to this thread.
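The "equivalent curves" bookkeeping above can be sketched in a few lines. This is only an illustration of the accounting idea, on the assumption that one ECM curve's cost grows roughly with the square of the number's digit length; the lengths below are made-up placeholders, not the actual ElevenSmooth composite sizes (the Planner Page has the real numbers).

```python
# Hypothetical sketch: express ECM work on smaller algebraic factors as
# "equivalent curves" on the largest composite, assuming one curve's cost
# scales with the square of the number's digit length.
# All lengths here are placeholders, not the real ElevenSmooth sizes.

largest_len = 100_000  # digits of the largest composite (placeholder)

# (digit length, curves run) for each smaller composite (placeholders)
plan = [
    (60_000, 250),
    (50_000, 250),
    (40_000, 250),
    (30_000, 150),
]

equivalent = sum(curves * (length / largest_len) ** 2
                 for length, curves in plan)
print(f"equivalent curves on the largest number: {equivalent:.0f}")
```

With real lengths and the full five-number Level 1 plan, this kind of tally is what produces the "258 equivalent curves" figure.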
#2

Quote:
The result is 132 equivalent curves with 10.4 expected primes. See if you can find something even better on the Planner Page.
#3
I've been thinking about how far we want to go with the various composites. One possible guideline is to get the smallest composites to a point where they qualify for NFSNET factoring, and to get the other composites to an equivalent level using the ECM Server rule of balancing (length)^2 * (Sum of B1). The NFSNET qualification rule has been "low probability there is a factor under 50 digits." These highly composite Mersenne numbers will probably be best handled using GNFS, which has been said to be effective through about 165 digits. We should be within those guidelines if we plan on working 200-digit factors 25% of the way through the 55-digit level (B1=110M).
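The balancing rule above can be turned into a small calculation. This is a sketch of that rule only: given a reference composite and its accumulated B1 sum, it finds the B1 sum that gives another composite the same (length)^2 * (Sum of B1) score. The function name and the example figures are mine, not from the ECM Server.

```python
# Sketch of the ECM Server balancing rule mentioned above: keep
# (length)^2 * (sum of B1) roughly equal across composites.
# Function name and example numbers are illustrative, not official.

def balanced_b1_sum(ref_len, ref_b1_sum, other_len):
    """B1 sum giving a composite of other_len digits the same
    (length^2 * B1-sum) score as the reference composite."""
    return ref_b1_sum * (ref_len / other_len) ** 2

# e.g. a 200-digit reference worked to an accumulated B1 sum of 110M
target = balanced_b1_sum(200, 110e6, 400)
print(f"target B1 sum for a 400-digit composite: {target:.3e}")
```

Note the direction: a larger composite needs a smaller accumulated B1 sum to reach the same score, which matches putting less per-curve effort into the expensive numbers.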
Prime95 does faster multiplication but must work with the larger exponents. Previous work has suggested that for the ElevenSmooth numbers smaller than 15K digits, it is more efficient to use GMP-ECM on the individual composites instead of Prime95 on the whole Mersenne number. By these guidelines, M(3326400) is finished. For the next stage (30 digits, B1=250K) we should do 166 curves on M(163200) and 465 curves on M(1108800). Then we would move on to the (35 digits, B1=1M) level and work 54 curves on M(831600). That would end the Special Project, and all remaining work would be done in the ECM Server. What do you think of this approach?
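The proposed stage can be laid out in one place. The digit lengths below follow from the exponents (a Mersenne number 2^n − 1 has about n·log10(2) digits); the curve counts and B1 bounds are the ones proposed above. The combined "score" column is my own rough combination of the balancing rule with the curve counts, not an ECM Server formula.

```python
# Sketch of the proposed stage. Digit lengths are derived from the
# exponents; curve counts and B1 values are the ones proposed above.
# The "score" (digits^2 * curves * B1) is an illustrative combination,
# not an official ECM Server metric.
import math

LOG10_2 = math.log10(2)

def digits(exponent):
    """Approximate decimal digit count of the Mersenne number 2^exponent - 1."""
    return int(exponent * LOG10_2) + 1

plan = [
    # (exponent, curves, B1)
    (163200, 166, 250_000),
    (1108800, 465, 250_000),
    (831600, 54, 1_000_000),
]

for exp, curves, b1 in plan:
    score = digits(exp) ** 2 * curves * b1
    print(f"M({exp}): {digits(exp)} digits, score {score:.2e}")
```

Even this rough tally makes it clear that M(1108800), at over 300K digits, dominates the stage's cost, which is why splitting the work with GMP-ECM on the smaller individual composites matters.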