I restrict the stage 2 norm so that my .ms file gets around 100 hits per day, and I run -npr on the entire file. I haven't found that keeping a full CPU core busy during -np1 -nps is optimal, but I'm usually running jobs that fill ~75% of hyperthreads, so a full core isn't usually available anyway. "top" usually lists msieve at 30-40% load during -np1 -nps.
I usually divide the default stage 1 norm by 9 and the stage 2 norm by 30-35. Those settings are close to the optimal production rate of hits saved to the .ms file (larger stage 1 norms produce more hits if stage 2 is set looser, but I don't care about producing hits that I'm not going to -npr, hence the tight stage 2 bound). If the job is below 180 digits, I use -t 2 to make fuller use of the GPU.
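To make the arithmetic above concrete, here is a small sketch (a hypothetical helper, not msieve code, and the default values passed in are made up for illustration) of how the tightened norms relate to the defaults:

```python
# Hypothetical helper, NOT part of msieve: compute tightened stage norms
# by dividing msieve's defaults by the factors described above
# (stage 1 default / 9, stage 2 default / 30-35).
def tightened_norms(stage1_default, stage2_default,
                    stage1_div=9.0, stage2_div=32.0):
    """Return (stage1_norm, stage2_norm) tightened from the defaults."""
    return stage1_default / stage1_div, stage2_default / stage2_div

# Illustrative call with made-up defaults; real defaults depend on the
# input size and the msieve version.
s1, s2 = tightened_norms(2.0e25, 1.0e25)
print(f"stage1_norm={s1:.3g} stage2_norm={s2:.3g}")
```

The point of the tight stage 2 bound is that every hit saved to the .ms file is one you actually intend to root-optimize with -npr.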
[QUOTE=unconnected;472005]I need a poly for C175 from 11040:i10071.
[CODE]n: 6484689970303129020517057103365894793216912303102493240993925500641398178923457820535718262847740804273820043327446358642419268046237801717129218758317768624690555974293740733[/CODE][/QUOTE]
I found only one poly better than 1.6e-13:
[code]
# norm 6.007105e-17 alpha -6.096724 e 1.935e-13 rroots 1
skew: 23869212.60
c0: 215515253399468621655935634084010628842875
c1: 18221026257270678573040535301222710
c2: 810840940444709291461562333
c3: -92370495278835169378
c4: -2376612213148
c5: 147408
Y0: -8485425654983521960650346075265192
Y1: 20565115814604599
[/code]
Just over a day of GPU covered coefficients from 86000 to 625000. I ran -npr on 300 hits better than 2.2e23 (my original stage2norm of 3.2e23 got over 800 hits!). Since this comfortably breaks the record for C175, I am not running a second day of GPU.
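The hit-selection step described above (running -npr only on the best stage 2 hits) can be sketched like this; the one-hit-per-line layout with the norm as the trailing field is an assumption for illustration, not the actual .ms format:

```python
# Sketch: keep only the hits whose stage 2 norm beats a cutoff, so that
# -npr (root optimization) time is spent only on the most promising ones.
# The "norm is the last whitespace-separated field" layout is assumed.
def filter_hits(lines, cutoff=2.2e23):
    """Return the lines whose trailing norm field is below cutoff."""
    keep = []
    for line in lines:
        try:
            norm = float(line.split()[-1])
        except (ValueError, IndexError):
            continue  # skip malformed lines rather than crash
        if norm < cutoff:
            keep.append(line)
    return keep

hits = ["hit_a 3.1e23", "hit_b 1.9e23", "hit_c 2.2e23"]
print(filter_hits(hits))  # only the 1.9e23 hit survives the 2.2e23 cutoff
```

Tightening the cutoff from 3.2e23 to 2.2e23 is what reduced 800+ saved hits to the 300 actually worth root-optimizing.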
Thank you all for polys! Last one seems favourite, but I'll test others as well.
Wow! In light of Curtis's record - I give up.
[CODE]
n: 6484689970303129020517057103365894793216912303102493240993925500641398178923457820535718262847740804273820043327446358642419268046237801717129218758317768624690555974293740733
# norm 4.665729e-017 alpha -8.425021 e 1.593e-013 rroots 5
skew: 28923646.00
c0: -4386970700220487276313859890840413197203565
c1: 2330306998768055234780689215308746616
c2: 174681301195108131696645536261
c3: -16475412756814004420584
c4: -209121776364624
c5: 2537640
Y0: -4802770578237298517499556665133202
Y1: 164874057212673787
[/CODE]
AS 3408 - C144 @step 1672
[code]
203199071456459697917238464502146641404872934772312366864554138143380230225301058598152641199346864903889613842978343346766926580275162084582763 [/code] [url=http://www.mersenneforum.org/showpost.php?p=472237&postcount=412]Per this post[/url], the number is ready for GNFS. Anybody have some spare cycles?
144? Sure, I'll do it tonight via CADO. Should have factors Saturday.
[QUOTE=VBCurtis;472814]144? Sure, I'll do it tonight via CADO. Should have factors Saturday.[/QUOTE]
Just kidding! CADO thinks a C145 is a good size for 3 large primes. Sieving finished at 6pm, matrix will be a day or two.
[QUOTE=VBCurtis;473105]Just kidding! CADO thinks a C145 is a good size for 3 large primes. Sieving finished at 6pm, matrix will be a day or two.[/QUOTE]
Just curious if CADO ever finished this job. |
[QUOTE=swellman;473724]Just curious if CADO ever finished this job.[/QUOTE]
Whoops! Sorry for the oversight. [code]
Factors:
2533400905211008233039858902482299735650548193556157676631651481450022838850807
80208020388125323283373170734151811678592313227255048158773246509
[/code] The log notes an elapsed time of 5 million thread-seconds. I think the parameter choices could be improved (I changed a couple myself, which may well have made things worse!).
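For a rough sense of scale, the 5 million thread-seconds reported by CADO convert to wall-clock time as follows; the 32-thread count here is an assumption for illustration, not what the job actually ran on:

```python
# Convert CADO-NFS's reported thread-seconds to approximate wall-clock
# days, assuming (hypothetically) a fixed number of threads ran the
# whole time with no idle periods.
def wallclock_days(thread_seconds, threads=32):
    """Thread-seconds divided by thread count and seconds per day."""
    return thread_seconds / threads / 86400

print(f"{wallclock_days(5_000_000):.1f} days")  # about 1.8 days at 32 threads
```

In practice sieving parallelizes well across clients while the matrix step does not, so the real wall-clock time skews longer than this idealized figure.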
Thanks for closing the loop. Do you have any plans for continuing this series? If not, someone else may be interested, though not me: no available resources.
Just returned home today to find my best machine powered off, maybe permanently! Some type of short in the power system; it's being evaluated now.
No, a 202-digit sequence is for shared-forum efforts, or RyanP.
I'm still working on tweaking CADO with GNFS inputs in the 120-150 digit range, and likely will be for the next month or so.