[QUOTE=10metreh;175852]It's very odd to pick on 4788. Maybe it's chance and someone's just being naughty?[/QUOTE]I would bet very heavily on "chance". I don't think there's any reason someone would pick on 4788 in particular.....

[QUOTE=schickel;175853]With what kind of hardware? I've got a dual core I could throw into the mix......Maybe we would gang up on this and do it before Saturday......
I can set up an FTP server on my end and you could upload your relations before you leave......[/QUOTE] 64bit siever running on Core 2 Duo 2GHz laptop (srsly, that's it). But looks like I might be able to pull off the autoposting thingie using LWP::Simple. Your call. 
[QUOTE=schickel;175856]I would bet very heavily on "chance". I don't think there's any reason someone would pick on 4788 in particular.....[/QUOTE]
But the use of Alex's name suggests somebody who reads the forum ==> Not chance 
[QUOTE=axn;175857]64bit siever running on Core 2 Duo 2GHz laptop (srsly, that's it). But looks like I might be able to pull off the autoposting thingie using LWP::Simple.[/QUOTE]OK, I'm all for trying new things. Let's give it a shot.[QUOTE]Your call.[/QUOTE]Go for it. It'll make for an interesting experiment.

[QUOTE=axn;175858]But the use of Alex's name suggest somebody who reads the forum ==> Not chance[/QUOTE]OK, someone who reads the forum, but I was thinking more along the lines that we just happened to be at the front of the queue when the "someone" started meddling, not that the "someone" is picking on us particularly.
How many projects use the DB for serious work on their numbers? I noticed that the responses seemed a little sluggish last night when I was pulling aliquot updates..... 
[QUOTE=axn;175847]No need. Like I said, this one is good enuf. Unless you can find something now that I can finish in half the time (or if some one gives me the perl code requested 2 posts above)[/QUOTE]I called off the search, so go ahead with the 4.62. Good luck!

[QUOTE=schickel;175861]I called off the search, so go ahead with the 4.62. Good luck![/QUOTE]
Ok. I've brought in another PC temporarily to shave off some more time. I might abort the ecm run just to start the sieving early. 
Well, I've got good news and bad news. The good news is, automatic submission works. The bad news is, the speed of the 64bit sieve is not that much better, so it'll take more time than I had estimated. Expect the factors to pop out in 3 days' time.

[quote=axn;175887]Well, I've got good news and bad news. The good news is, automatic submission works. The bad news is, the speed of the 64bit sieve is not that much better , so it'll take more time than I had estimated. Expect the factors to pop out in 3 days time.[/quote]
are you using sievers with 64bit asm? from this thread [URL="http://mersenneforum.org/showthread.php?t=11660"]http://mersenneforum.org/showthread.php?t=11660[/URL] ? 
[QUOTE=henryzz;175904]are you using sievers with 64bit asm? from this thread [URL="http://mersenneforum.org/showthread.php?t=11660"]http://mersenneforum.org/showthread.php?t=11660[/URL] ?[/QUOTE]
Yes. At least, I think so. There were a bunch of links lying around. And it is faster than the regular 32bit version. Just not the 2x faster that I was anticipating. 
[quote=axn;175914]Yes. At least, I think so. There were a bunch of links lying around.
And, it is faster than the regular 32bit version. Just not 2x faster that I was anticipating.[/quote]
might be worth checking if you're not getting even 2x (i don't know what improvement i get, but i would guess at least ~2x). don't waste cpu cycles:smile: 
[QUOTE=axn;175887]Well, I've got good news and bad news. The good news is, automatic submission works. The bad news is, the speed of the 64bit sieve is not that much better , so it'll take more time than I had estimated. Expect the factors to pop out in 3 days time.[/QUOTE]I hope that I can one day say that getting factors of a 130+ digit number in [b]3 days time[/b] is bad news. I just looked at a log for a c130 that I did in 2006. It took me [b]3 weeks[/b] to get my answer back then. (At least it was p64*p67, so it wasn't anywhere close to an ECM miss!)
I could probably do it about 4 times as fast today, so ~5 days...... 
i am still yet to do a 130+ myself
i am perfectly able, though it would take 2-3 weeks (vague guess:smile:) for me because my pc is off a lot 
[QUOTE=jrk;175821]I did 2000 curves @ B1=3000000, B2=5706890290 on the c134.[/QUOTE]
Add to that, 337 curves @ B1=11000000, B2=35133391030 
[QUOTE=axn;175887]The good news is, automatic submission works.[/QUOTE]
Yes it does, apparently. Next line has a c146. 
[QUOTE=jrk;176500]Yes it does, apparently.[/QUOTE]
Apparently! However, it missed the ETA by about 20hrs. I wonder why. I have some sleuthing to do once I get back. 
[quote=jrk;176500]
Next line has a c146.[/quote] I am curious: What is the limit of ggnfs in terms of factoring? Can it go this high? I remember reading that 'major changes' were needed to go above a certain point. 
It depends what you mean by 'ggnfs'.
Using ggnfs sieving and msieve completion (as is being done on this number so far), and with distributed sieving by members of this forum, I've done a c180 and a c177; it takes about ten CPU-years, and about a month on a quad-core to finish the job. I've also done a c158 using only the computers in my study, to demonstrate that one could. Above c140 the parameters in the factLat script weren't terribly good the last time I looked, so you had to do some trial runs to figure out better parameters.

Using just the ggnfs tools, I ran into trouble at the S200 level, which is around 140 digit SNFS; but that was mostly that the script was hitting quadratic-time problems loading in a huge number of processed relations, merging in a few more, saving out the processed relations ... 
[quote=fivemack;176548]I ran into trouble at the S200 level, which is around 140 digit SNFS[/quote]
Surely S200 = 140 digit [B]G[/B]NFS? 
I've done 2000 curves @ B1=3000000, B2=5706890290, and 512 curves @ B1=11000000, B2=35133391030 on the c146.

Probably around 5000 curves at 43M is in order.

[QUOTE=axn;176707]Probably around 5000 curves at 43M is in order.[/QUOTE]
Are you sure? I estimate that will take over 13 CPU days. 
A little more P+/-1 work reported. I've also got two cores running ECM @ 43M.

whats happening with the c146?
there has been no action in the last few days 
[QUOTE=jrk;178560]Factors:
[code]prp49 factor: 3359627097803529119333594314130597504746070467741
prp97 factor: 7795221876206296885438566840237095660684790701904608649645175033444675074244992073402000994845613[/code] Logfile is attached.[/QUOTE] Lines 2411 to 2417 now in database. Line 2418 is stuck on a c133. 
i am currently running 2x100 curves @11M
they should be finished by 4:30PM BST edit: done 
It morphed into 2[SUP]5[/SUP] * 3
Bummer! :sad: 
A poly for the c133:
[code]n: 2683707943146349562594913594368740338601585630194076012340326110184392613944024472013854299982635985994208438381006903892356157576267
# norm 7.175371e-13 alpha -7.032243 e 4.896e-11
skew: 340207.35
c0: 220447390434500974223511645135405
c1: 2904318470965874020216051696
c2: 19002076953846866514443
c3: 70418254980845816
c4: 315214246008
c5: 293760
Y0: 24668965982194073654364148
Y1: 642550403627239
rlim: 10000000
alim: 10000000
lpbr: 28
lpba: 28
mfbr: 56
mfba: 56
rlambda: 2.5
alambda: 2.5
[/code] Shall we make a team sieve of it or does someone want to take it alone? 
[QUOTE=Batalov;178617]It morphed into 2[SUP]5[/SUP] * 3
Bummer! :sad:[/QUOTE]Bad, but not disastrous....at least it's not the driver. The c133 makes it kinda nice, too. 
[QUOTE=jrk;178647]A poly for the c133:
<snippity> Shall we make a team sieve of it or does someone want to take it alone?[/QUOTE]I'll throw in some CPU time if it's a team job. I've got 3 SNFS jobs in my pipeline that I can put off a little.... 
if no one claims it by this evening it will be a team sieve

Run 388 out of 1000:
[code]Using B1=10000000, B2=35132741290, polynomial Dickson(12), sigma=3382755867
Step 1 took 53455ms
********** Factor found in step 1: 88373238977310581080739438230179037711853
Found probable prime factor of 41 digits: 88373238977310581080739438230179037711853
Probable prime cofactor 30367880301811492056466046252344450173362667610608986764253700680964865962194016382618838039 has 92 digits[/code]
(another 1000@1e7 on the other core of that machine finished without result)

Currently running 2x1000@1e7 on the C129 of 2419; results tomorrow evening 
[quote=fivemack;178729]Run 388 out of 1000:
Using B1=10000000, B2=35132741290, polynomial Dickson(12), sigma=3382755867
Step 1 took 53455ms
********** Factor found in step 1: 88373238977310581080739438230179037711853
Found probable prime factor of 41 digits: 88373238977310581080739438230179037711853
Probable prime cofactor 30367880301811492056466046252344450173362667610608986764253700680964865962194016382618838039 has 92 digits
(another 1000@1e7 on the other core of that machine finished without result)
Currently running 2x1000@1e7 on the C129 of 2419; results tomorrow evening[/quote]
thank you 
Run 17 out of 1000:
[code]Using B1=10000000, B2=35132741290, polynomial Dickson(12), sigma=1243555911
Step 1 took 53048ms
Step 2 took 27474ms
********** Factor found in step 2: 3819627744841045633763167711966392293449
Found probable prime factor of 40 digits: 3819627744841045633763167711966392293449
Probable prime cofactor 38660873208570819794500633296917762712062378249423725061586686009144937647689395700561613 has 89 digits[/code]
[quote=fivemack;178735]Run 17 out of 1000:
Using B1=10000000, B2=35132741290, polynomial Dickson(12), sigma=1243555911
Step 1 took 53048ms
Step 2 took 27474ms
********** Factor found in step 2: 3819627744841045633763167711966392293449
Found probable prime factor of 40 digits: 3819627744841045633763167711966392293449
Probable prime cofactor 38660873208570819794500633296917762712062378249423725061586686009144937647689395700561613 has 89 digits[/quote]
also a nice result:smile: 
Line 2420 cofactor:
[code]Using B1=3000000, B2=5706890290, polynomial Dickson(6), sigma=3944710087
Step 1 took 7913ms
Step 2 took 4095ms
********** Factor found in step 2: 7750028045411158946557578572254773307
Found probable prime factor of 37 digits: 7750028045411158946557578572254773307
Probable prime cofactor 109565870207105400323440452413834928006878979932220284688027280315169513356210038361477771 has 90 digits
[/code] 
Line 2421 cofactor:
[code]Using B1=3000000, B2=5706890290, polynomial Dickson(6), sigma=768155566
Step 1 took 7969ms
Step 2 took 4403ms
********** Factor found in step 2: 190431080913976704860818639728300686557
Found probable prime factor of 39 digits: 190431080913976704860818639728300686557
Probable prime cofactor 204522606372200186975999518399003917395671633812465359496470444180956996556509658617978604220713 has 96 digits
[/code] 
Line 2422 is 2^5 * 3 * c157 :ouch2:

[quote=jrk;178749]Line 2422 is 2^5 * 3 * c157 :ouch2:[/quote]
How long would something like that take to GGNFS compared to a c146 (like the last team sieve)? Hopefully ECM will find a big factor and put this within the range of a single user. 
A C158 took me about three weeks, on and off, on four CPUs for the polynomial search, plus 15 million CPU-seconds (about three weeks on eight CPUs) for the sieving, plus about one week on four CPUs for linalg at the start of this year. I started the polynomial search 1 Feb 2009 and finished the factorisation 25 Apr 2009. I may not be the typical single user.

[QUOTE=jrk;178749]Line 2422 is 2^5 * 3 * c157 :ouch2:[/QUOTE]
Believe it or not, that is good news. There is a 50-50 chance of losing this guide and dropping down to 2^4 
I did 2018 curves @ B1=3000000, B2=5706890290
And now starting to do curves @ 43M How much ECM is wanted for a c157? 
[QUOTE=jrk;178772]How much ECM is wanted for a c157?[/QUOTE]
Slightly more than full t50 level would be adequate, I would think. Say, 8000 curves @ 43M? 
do we have anyone capable of the linear algebra for this thing?
i am guessing 2GB of memory isn't enough 
I ran 10 curves with B1=43M, auto B2 (it picked B2=~240G). Reported to Syd's DB. No factors.

[quote=henryzz;178790]do we have anyone capable of the linear algebra for this thing?
i am guessing 2Gb of memory isn't enough[/quote] I can do it, if needed. 
[quote=bsquared;178810]I can do it, if needed.[/quote]
i just didnt want to do all the sieving and have no one to do the postprocessing:smile: 
I will backup Ben, if needed, too.
Rest assured, if it will be sieved, someone will solve. Do lots of ECM first, of course. 
[QUOTE=Batalov;178822]
Do lots of ECM first, of course.[/QUOTE] I have queued a bunch of curves @ B1=11M 
So what is the command to run ecm on this? I have been letting aliqueit handle it up to this point. Might as well learn how to do it by hand and help out at the same time.

[QUOTE=Greebley;178826]So what is the command to run ecm on this? I have been letting aiqueit handle it up to this point. Might as well learn how to do it by hand and help out at the same time.[/QUOTE]
create a file which contains the c157, followed by an endline character (e.g. named alq4788.2422.txt). Then on the command line type: [code]ecm -nn -c <number of curves you want to do> <B1> < alq4788.2422.txt >> outputfile.out[/code] so an example command line might look like this: [code]ecm -nn -c 100 11e6 < alq4788.2422.txt >> alq4788.2422_11e6.out[/code] The -nn option makes ecm run in lowest priority mode (so that it doesn't interfere with other programs like Word or Excel), -c specifies the number of curves you want to run, and B1 is the "B1" bound (see the ECM article in the mersennewiki). For this job, use either 11e6 or 43e6 as the B1 bound. 
[quote=Greebley;178826]So what is the command to run ecm on this? I have been letting aiqueit handle it up to this point. Might as well learn how to do it by hand and help out at the same time.[/quote]
Here's an example command: [code]ecm -c 5 -one 43000000 < c157.txt[/code]
ecm: the name of your ECM executable (the .exe part is optional on Windows, of course)
-c 5: says to run 5 curves. Adjust accordingly.
-one: stops if it finds a factor.
43000000: the B1 value. Adjust accordingly. (Optional after the B1 value is the B2 value.)
"< c157.txt" sends ECM the data in c157.txt, so before you run this command make a c157.txt with the c157 in it (without any "n: " before it or anything).
p.s. at this size and B1, it took me (one core of an Athlon 2.5 GHz dual core) just over 10 minutes for step 1 and just under 4 minutes for step 2, so if you don't see any "step 1 completed" for a few minutes, don't think it froze or anything. :smile: 
[code]echo number | ecm -c count -one b1limit | tee -a ecmout[/code]
where "number" is the number to factor, "count" is how many curves you want to do, and "b1limit" is the size of B1 you want. The -one is so that the program will quit when it finds a factor, even if the cofactor is still composite (since at that point it will be easier to GNFS than to continue ECM). You definitely want to output the results to a file so you won't lose them if the system goes down. I like using tee for this. In this example the output will be appended to a file named ecmout. So if you want to do 100 B1=43M curves on this number you would do [code]echo 2179298864647941593089329843480960377234329034733632473654418628122413680526625181566302348541331686710338443591873700839852521892809407336286970457974236409 | ecm -c 100 -one 43e6 | tee -a ecmout[/code] 
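Since scripted submission came up earlier in the thread, here is a small illustrative Python helper for the other half of the job: scraping a saved GMP-ECM output file (e.g. the ecmout file above) for the factor lines. This is hypothetical glue code, not part of aliqueit or anyone's actual autoposting script; only the two output-line formats, which appear verbatim in posts in this thread, are taken as given:

```python
import re

# GMP-ECM reports hits with lines like (as seen in posts in this thread):
#   Found probable prime factor of 41 digits: 88373...
#   Probable prime cofactor 30367... has 92 digits
FACTOR_RE = re.compile(r"Found probable prime factor of \d+ digits: (\d+)")
COFACTOR_RE = re.compile(r"Probable prime cofactor (\d+) has \d+ digits")

def parse_ecm_output(text):
    """Return (factor, cofactor) as ints, or None if no factor was reported."""
    factor = FACTOR_RE.search(text)
    cofactor = COFACTOR_RE.search(text)
    if factor is None or cofactor is None:
        return None
    return int(factor.group(1)), int(cofactor.group(1))
```

Feeding it the tee'd output file would hand back the factor pair ready for reporting to the DB.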
FYI I just rechecked the sequence from the database, since others have mentioned errors with other sequences. 4788 is right.

[QUOTE=jrk;178843]FYI I just rechecked the sequence from the database, since others have mentioned errors with other sequences. 4788 is right.[/QUOTE]
this should be done with [b]all[/b] sequences in the database, because there are seqs where an index was computed incorrectly, and if such an error is carried forward by the workers, they won't find it; only aliqueit or other checking programs will! see the thread with the error-seqs! 
3000@1e7 complete, no factors

3200 @ 43M completed last night, no factors.

Are the P-1 and P+1 sufficient? The DB lists P-1 with B1=1000000000 (1e9, or 1G) and B2=205702371522480 (just over 2e14, or 205T), and two runs of P+1 with the same parameters. If not, what bounds would be good, about how long will it take, and how much memory will it use in stage 2? I'd be interested in doing more P-1 and/or P+1 if it would be useful and fit in 1GB of available memory.
From a few short tests of whether GMP-ECM errors out or not with the given maxmem and B1, it looks like I can do up to B1=4e9 and it would take between 400 and 450 MB. Anyone know if that's remotely right? 
1800@43e6 set going over the weekend; by Monday either we'll have a factor or should start polynomial search.

I'll do another 3000 or so at 43M as well, this weekend.

I finished 68 curves at the 43 million mark and 22 at the 110 million mark, because I miscounted the zeros. I guess that is why there is the 11e6 notation instead of needing to type 11000000. No factors.

The default B2=13494038602573560 for P-1 with B1=1e10 uses 16GB of memory with default parameters (k=2). With 8GB of memory, the default B2 is 9924641473652116 (just about 1e16, memory estimate 8064MB) with k=6. With only 1GB, you get k=360, which is very inefficient. If you have a 32-bit machine, you might also get a problem finding enough NTT primes.
I think the runs with B1=1e9 I did are probably sufficient, but I'll try P-1 with B1=1e10 anyway, merely because it fits in 8GB so neatly... (Hey, we could start a "Worst reasons for factoring" list!) I won't do more P+1, though, I think it's not worthwhile. Alex 
I have queued 500 curves at B1=43e6 to run over the weekend on the office PC, and I will do a few curves (maybe some 200 to 300) at 110e6 at home on my laptop.

Ok, thanks for the info and the extra P-1 run, Alex. Is there any particular reason why it seems that B1 for P-1 is always 1ex (1e9, 1e10, etc.) while B1s for ECM are all different? (e.g. 5e4, 25e4, 43e6)
[B]10metreh:[/B] Basically it's because the ECM limits correspond to a specific factor size (e.g. 25e4 is 30 digits), but the P-1 limits don't, so the obvious powers of 10 are used. 
No particular reason other than that people consider powers of ten "round" numbers.
For ECM, the set of standard B1 values is derived from an analysis of which B1 produces the smallest expected time for finding a factor of a certain size, and early papers on ECM gave figures like B1=11000 for 20-digit factors, B1=50000 for 25 digits, etc. These values stuck and everyone keeps using them, mostly for the sake of easy bookkeeping. (Montgomery's thesis for example suggests slightly higher values.) Alex 
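The standard B1-per-factor-size convention Alex describes, as used throughout this thread (e.g. 43e6 curves for 50-digit factors), can be captured in a small lookup. This is just an illustrative sketch of common practice; the function name `b1_for_digits` is hypothetical, and curve counts are deliberately omitted since they vary between GMP-ECM versions:

```python
# Conventional B1 per target factor size in digits ("t-level"),
# matching the values used in this thread (11e6, 43e6, 11e7, ...).
STANDARD_B1 = {
    20: 11e3, 25: 5e4, 30: 25e4, 35: 1e6, 40: 3e6,
    45: 11e6, 50: 43e6, 55: 11e7, 60: 26e7, 65: 85e7,
}

def b1_for_digits(d):
    """Round d up to the next tabulated 5-digit level and return its B1 bound."""
    level = min(k for k in STANDARD_B1 if k >= d)
    return STANDARD_B1[level]
```

So a hunt for factors up to 47 digits would use the 50-digit level's B1 of 43e6, which is exactly the bound most of the curves in this thread were run at.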
[quote=akruppa;178919]No particular reason other than that people consider powers of ten "round" numbers.
For ECM, the set of standard B1 values is derived from an analysis which B1 produces the smallest expected time for finding a factor of a certain size, and early papers on ECM gave figures like B1=11000 for 20digit factors, B1=50000 for 25 digits, etc. These values stuck and everyone keeps using them, mostly for the sake of easy bookkeeping. (Montgomery's thesis for example suggests slightly higher values.) Alex[/quote]
i did some experiments with gmp-ecm 6.2.2 a short while ago and the bounds we use are too low. for example, t35 at B1=3M is faster than at B1=1M (i didn't optimise that one). i optimised t25 and t30. i noticed that a B2scale of 0.5 was usually better (it was unless it meant a high k value)
[code]
Size  B1    esttime  curves needed  B2scale
t25   5e4   1.84m    214            1
t25   7e4   1.71m    178            0.5
t30   25e4  17.56m   430            1
t30   40e4  16.89m   311            0.5
[/code]this is the normal and the best i found. in addition to being slightly faster, there is a greater chance of finding bigger factors with higher bounds. i haven't tested much, but the results were better for 64bit linux (using GMP 4.2.4 i think) as well. the tests were done on 32bit windows using a binary compiled with MPIR 1.0. i will post some results files in the morning. t35 needs more than 5 curves to be run at each level to get a reasonable minimum time and i don't have time to parse that much data manually 
@Alex: What is the formula to calculate how many curves are needed for a certain set of B1, B2, size of factor to find?

[quote=Andi47;178974]@Alex: What is the formula to calculate how many curves are needed for a certain set of B1, B2, size of factor to find?[/quote]
Use the -v (verbose) option on GMP-ECM and it prints out those stats (among other things). First it prints the expected # of curves for the factor sizes, then it runs the ECM curve, then it prints the expected time for the factor sizes. I recommend doing just one curve with -v on, because it's so verbose that a large number of curves would needlessly produce a humongous output. 
[QUOTE=MiniGeek;178992]Use the -v (verbose) option on GMP-ECM and it prints out those stats (among other things). First it prints the expected # of curves for the factor sizes, then it runs the ECM curve, then it prints the expected time for the factor sizes. I recommend doing just one curve with -v on, because it's so verbose that a large number of curves would needlessly produce a humongous output.[/QUOTE]
I know about -v. What I want to know is the formula which -v uses. 
[quote=Andi47;178995]I know about -v. What I want to know is the formula which -v uses.[/quote]
Oh. Does this help? [code]static void
print_expcurves (mpz_t B2min, mpz_t effB2, unsigned long dF, unsigned long k,
                 int S, int clear)
{
  double prob;
  int i;
  char sep;

  if (!test_verbose (OUTPUT_VERBOSE))
    return;

  rhoinit (256, 10);
  outputf (OUTPUT_VERBOSE, "Expected number of curves to find a factor "
           "of n digits:\n20\t25\t30\t35\t40\t45\t50\t55\t60\t65\n");
  for (i = 20; i <= 65; i += 5)
    {
      sep = (i < 65) ? '\t' : '\n';
      prob = ecmprob (mpz_get_d (B2min), mpz_get_d (effB2),
                      pow (10., i - .5), (double) dF * dF * k, S);
      if (prob > 1. / 10000000)
        outputf (OUTPUT_VERBOSE, "%.0f%c", floor (1. / prob + .5), sep);
      else if (prob > 0.)
        outputf (OUTPUT_VERBOSE, "%.2g%c", floor (1. / prob + .5), sep);
      else
        outputf (OUTPUT_VERBOSE, "Inf%c", sep);
    }
  if (clear)
    rhoinit (1, 0); /* Free memory of rhotable */
}
[/code]Edit: here's the code for the ecmprob that's called in the last one there. [code]double
ecmprob (double B1, double B2, double N, double nr, int S)
{
  double alpha, beta, stage1, stage2, brsu;

  ASSERT (rhotable != NULL);

  /* What to do if rhotable is not initialised and asserting is not enabled?
     For now, bail out with 0. result. Not really pretty, either */
  if (rhotable == NULL)
    return 0.;

  if (B1 < 2. || N <= 1.)
    return 0.;

  if (N / ECM_EXTRA_SMOOTHNESS <= B1)
    return 1.;

#ifdef TESTDRIVE
  printf ("B1 = %f, B2 = %f, N = %.0f, nr = %f, S = %d\n", B1, B2, N, nr, S);
#endif

  alpha = log (N / ECM_EXTRA_SMOOTHNESS) / log (B1);
  stage1 = dickmanlocal (alpha, N / ECM_EXTRA_SMOOTHNESS);
  stage2 = 0.;
  if (B2 > B1)
    {
      beta = log (B2) / log (B1);
      stage2 = dickmanmu (alpha, beta, N / ECM_EXTRA_SMOOTHNESS);
    }
  brsu = 0.;
  if (S < -1)
    brsu = brsudickson (B1, B2, N / ECM_EXTRA_SMOOTHNESS, nr, -S * 2);
  if (S > 1)
    brsu = brsupower (B1, B2, N / ECM_EXTRA_SMOOTHNESS, nr, S * 2);

#ifdef TESTDRIVE
  printf ("stage 1 : %f, stage 2 : %f, Brent-Suyama : %f\n",
          stage1, stage2, brsu);
#endif

  return (stage1 + stage2 + brsu) > 0. ? (stage1 + stage2 + brsu) : 0.;
}
[/code] 
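The `dickmanlocal` call in that code is ultimately built on the Dickman rho function. As a rough, self-contained illustration (this is emphatically not the GMP-ECM implementation, which layers on the corrections discussed later in the thread), rho can be computed numerically straight from its defining equation ρ'(u) = -ρ(u-1)/u with ρ(u) = 1 on [0, 1]:

```python
def dickman_rho(u, n=4000):
    """Crude trapezoid integration of rho'(t) = -rho(t-1)/t with rho = 1 on [0,1]."""
    if u <= 1:
        return 1.0
    h = u / n
    rho = [1.0] * (n + 1)           # rho[i] approximates rho(i*h)

    def rho_at(t):                  # linear interpolation on the grid
        if t <= 0:
            return 1.0
        j = int(t / h)
        if j >= n:
            return rho[n]
        frac = t / h - j
        return rho[j] * (1 - frac) + rho[j + 1] * frac

    for i in range(1, n + 1):
        x = i * h
        if x <= 1:
            continue                # rho is identically 1 below u = 1
        x0 = max(x - h, 1.0)        # integrate only over the part above 1
        f0 = rho_at(x0 - 1) / x0
        f1 = rho_at(x - 1) / x
        rho[i] = rho[i - 1] - (x - x0) * (f0 + f1) / 2.0
    return rho[n]
```

With α = log(p)/log(B1), 1/ρ(α) then gives a rough stage-1-only estimate of the expected curve count; stage 2 and the Brent-Suyama extension improve on that considerably, which is exactly what the `stage2` and `brsu` terms in the C code account for.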
3200 more curves at 43M completed. no factor.

[QUOTE=fivemack;178890]1800@43e6 set going over the weekend; by Monday either we'll have a factor or should start polynomial search.[/QUOTE]
[QUOTE=bsquared;178998]3200 more curves at 43M completed. no factor.[/QUOTE] I guess it is time to start the poly selection. If Tom does indeed find a factor, we can abort. 
You can use the Dickman function ρ(α) (a.k.a. the [URL="http://en.wikipedia.org/wiki/Dickman-de_Bruijn_function"]Dickman-de Bruijn function[/URL]) to estimate the probability that an integer N has no prime factors > N[SUP]1/α[/SUP].
You can add a correction term, described in Knuth and Trabb Pardo, "Analysis of a Simple Factorization Algorithm," to reduce the error term of ρ(α). That by itself doesn't lead to better estimates for ECM, though, as ρ(α) estimates the average density of smooth numbers in [1, N], not the density of smooth numbers close to N, and that error term is just as big as the one Knuth and Trabb Pardo fix. Fortunately, the latter can be fixed too, and then you end up with reasonably good estimates.

To estimate the probability that a factor is found in stage 2, you can integrate over the stage 2 primes p to sum the probabilities that p | N and N/p is smooth. This is described, for example, in Silverman and Wagstaff, "A Practical Analysis of the Elliptic Curve Factoring Algorithm," but they give the relevant integral incorrectly. It is correct in Brent, "Some Integer Factorization Algorithms Using Elliptic Curves." The standard recommendation, Crandall and Pomerance, "Prime Numbers," is probably the best place to start reading about it. With the Brent-Suyama extension (e.g., with f(x)=x^k or a k-th Dickson polynomial), you also sum over the probability that a prime p > B2 appears as a divisor among the various f(i) - f(j), divides N, and N/p is smooth.

The order of elliptic curves does not behave like a random integer, though; for example, with Brent-Suyama's curve parameterization you know the order of the curve is divisible by 12. But they don't behave like integers that are a multiple of 12, either: for example, the average exponent of 2 in the order is close to 10/3, not 3, the average exponent of 3 is close to 27/16, not 3/2, and average exponents of larger primes are also slightly greater than for "random" integers. All in all the order is as likely smooth as if it were smaller by a factor of about 23.4. Similarly, for the P-1 method, the order p-1 is divisible by a prime power q^k with probability 1/((q-1)*q^(k-1)), which makes the order as likely smooth as if it were smaller by a factor of 3.41. 
For P+1 you can play a few tricks with the starting point to get an order divisible by 4 or 6, or favour some other small primes in the order. Estimating this isn't implemented yet in GMP-ECM.
All in all there's no simple answer. To get accurate estimates, you need to take a lot of different effects into account... it gets a bit messy. If you'd like to get into this (which I recommend, it's a [I]fun[/I] mess!) I can send you the relevant papers and give you a guided tour of the GMP-ECM rho.c code, if you like. Alex 
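Alex's P-1 figure can be sanity-checked empirically: primes p with q^k | p-1 have density 1/φ(q^k) = 1/((q-1)·q^(k-1)), so the expected exponent of q=2 in p-1 is Σ 1/2^(k-1) = 2, versus 1 for a random integer. A quick illustrative Python check over small primes (just a numerical experiment, not a proof):

```python
def primes_up_to(limit):
    """Plain sieve of Eratosthenes."""
    flags = bytearray([1]) * (limit + 1)
    flags[0:2] = b"\x00\x00"
    for i in range(2, int(limit ** 0.5) + 1):
        if flags[i]:
            flags[i * i :: i] = bytearray(len(flags[i * i :: i]))
    return [i for i in range(2, limit + 1) if flags[i]]

def avg_exponent(q, limit):
    """Average exponent of q in p-1, over primes q < p <= limit."""
    total = count = 0
    for p in primes_up_to(limit):
        if p <= q:
            continue
        n, e = p - 1, 0
        while n % q == 0:
            n //= q
            e += 1
        total += e
        count += 1
    return total / count
```

Over primes up to 1e5 this lands close to 2 for q=2 and close to 3/4 for q=3, matching the formula. (The elliptic-curve averages Alex quotes, 10/3 and 27/16, would need actual curve orders and are not reproduced here.)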
Guys, assuming we do end up needing to do a team sieve for this, the FTP server is, as before, open for business. This time uploads will go in the c157relations directory.
Note that I will be out of town from late afternoon (EST) Monday, June 29 to late evening Saturday, July 4 with essentially no internet access. The FTP server doesn't really require any maintenance, so you guys should be OK to use it even while I'm not around. (That's why I figured I'd better set up the directory now, since most likely the team sieve will not begin until after I'm gone.) If, while I'm away, you guys find a factor for the C157 and need to do a team sieve on a different number, I obviously wouldn't be around to make another directory for the next number. It's easy enough to plop a directory on the server: just log in as you normally do for an upload, and execute the command "mkdir [i]name[/i]". Max :smile: 
@Alex: Thanks.
P.S.: aborting ECM @B1=11e7 (I had to restart my laptop anyway), starting msieve poly search. (a5=5k ... 10k) 
i will create a poly selection thread for the c157
edit: done 
i just made a bit of a beginner mistake. i copied andi47's post to the new thread and then discovered that it appeared before mine, so i deleted his post (physically removing it so it didn't look weird to mods)
unfortunately this deleted the thread:redface: i will recreate it 
[quote=henryzz;179015]i just made a bit of a beginer mistake i copied andi47's post to the new thread and then discovered that it appeared before mine so i deleted his post(physically removing it so it dint look weird to mods)
unfortunately this deleted the thread:redface: i will recreate it[/quote] Hey, wait a minute...I didn't know mods could physically remove posts. You sure about that? Because I can only soft delete stuff (in the forums where I have mod privileges, that is, not here). 
[quote=mdettweiler;179016]Hey, wait a minute...I didn't know mods could physically remove posts. You sure about that? Because I can only soft delete stuff (in the forums where I have mod privileges, that is, not here).[/quote]
you have to edit and then there is a delete menu at the top which includes physically delete post this mistake makes me realize how much power i have:smile: 
Xyzzy warned the supermods that physically deleting posts is, in his words, "kinda permanent" and urged us to be cautious.
Alex 
[quote=henryzz;179019]you have to edit and then there is a delete menu at the top which includes physically delete post
this mistake makes me realize how much power i have:smile:[/quote] Huh, I had no idea I could do that all this while. I just tried it on a test post in the NPLB forum and lo and behold, it works! Previously I had only seen the choice to soft delete on the menu; it must have been changed since then. 
i think i will soft delete and then physically delete in future if i have the need to do that
it was only because it was the first post and we would be editing what should be the first post continuously that i permanently deleted it 
I don't like the "physically remove post" thing. It's OK with spammers, but why should it be used for others?

[quote=henryzz;179031]i think i will soft delete and then physically delete in future if i have the need to do that
it was only because it was the first post and we would be editing what should be the first post continuously that i permanently deleted it[/quote] Yes, indeed. BTW, there's a handy little trick that makes it somewhat easier to change who started a thread. What I usually do is find a post from the desired user that predates the current first post in the thread, and copy it to the thread. Then, I edit it so that it contains the text from the original first post, and delete that original post. Presto! The thread is now shown as started by the desired user. :smile: Also, keep in mind that when you delete the only post in a thread (whether soft or physically), it will delete the entire thread. If it's soft deleted, the thread will be shown in italics and be visible only to mods until Xyzzy takes out the "garbage" by purging all the soft deleted posts (or if someone then goes in and physically deletes it). [quote=10metreh;179033]I don't like the "physically remove post" thing. It's OK with spammers, but why should it be used for others?[/quote] Indeed. It's mainly useful for supermods and administrators who are cleaning up old soft deleted posts that are no longer needed. 
[quote=10metreh;179033]I don't like the "physically remove post" thing. It's OK with spammers, but why should it be used for others?[/quote]
it shouldn't most of the time how does a soft deleted post appear? i cant find one 
[quote=henryzz;179036]it shouldn't most of the time
how does a soft deleted post appear? i can't find one[/quote] That's because Xyzzy cleans them up. They gradually build up, then he physically removes them. In fact, I'll show you by posting again here, and deleting that one. 
I finished 1000 curves @ B1=43000000, B2=388112953420, no factors.

finished 1800@43e6 on 4788:2422; no factors

235@43e6 (was running on a windoze C2D over the weekend), no factor.

[QUOTE=bsquared;181826]Just finished, and reported to the [url=http://factorization.ath.cx/search.php?query=&se=1&aq=4788&action=last&fr=&to=]database[/url]:
[CODE]prp62 factor: 74399348298738292963450206114740189729787693944911556416750987
prp95 factor: 29291907986845892835734723953465997517662174446720902693683027129696345656977730249786953578507
elapsed time 62:23:11
[/CODE][/QUOTE] Line 2423 is 2^5 * 3 * c157. ECM is in progress. 
700 3e6 curves.

[quote=jrk;181836]Line 2423 is 2^5 * 3 * c157. ECM is in progress.[/quote]
not another c157:doh!: 
[quote=henryzz;181847]not another c157:doh!:[/quote]
We've been a bit unlucky with big composites; we once got three team sieves in three lines. But we've only had one c157 (so far), and it could have been much worse...

If you're wondering why 130396 is still lounging around at 108 digits, it is mainly because of big composites (well, big relative to 108 digits). Some sequences don't get many, even with a slow driver like 2*3. In only about 100 lines since reaching 100 digits, it's had a c105, a c104, a c102, a c101, two c99s (three if ECM doesn't crack the one it's on now), and two c96s. :sad: Even worse, they're really packing it in: I seem to be getting one every 6 or 7 lines right now. :cry: A stark contrast to 10212, which didn't have the fastest of drivers but didn't reach a c>100 until it was at about 115 digits. I hope there's a gap soon (and a driver escape). :smile:

Anyone have an idea how unlucky that is (or have I just not noticed the ones that hit eight c105s before reaching c110 with a driver)? 
+400 @11e6 curves.

2000 curves at 43M done.

[quote=bsquared;182064]2000 curves at 43M done.[/quote]
In doing so, you "gatesed" the workers (remember that occasion)... ..but these curves are real, so it doesn't matter. :smile: 
[quote=10metreh;182069]In doing so, you "gatesed" the workers (remember that occasion)...
..but these curves are real, so it doesn't matter. :smile:[/quote] I remember someone reporting false curves under that name, yeah :smile:. But does reporting work affect the workers behavior? 