mersenneforum.org

mersenneforum.org (https://www.mersenneforum.org/index.php)
-   NFS@Home (https://www.mersenneforum.org/forumdisplay.php?f=98)
-   -   BOINC NFS sieving - NFS@Home (https://www.mersenneforum.org/showthread.php?t=12388)

Mini-Geek 2014-02-23 20:27

1 Attachment(s)
[URL="http://www.factordb.com/index.php?id=1100000000041882572"]GC_6_302[/URL]'s c160 = p68*p93. It took my i5-750 about 240 calendar hours for the 12.9M matrix, but even with -t 4 the CPU usage averaged around 69% (~2.76 cores, so ~662 CPU hours for the LA). All of the logs from my harrowing experience are attached, in case anyone is silly enough to want to pore through them. :smile:
[CODE]prp68 factor: 10922077385007413926959844864921688969621938072657126375168970438021
prp93 factor: 396325903304263476304132664503070294343931044028469032434386038612698624514895837180351478017[/CODE]

Oddly enough, my other NFS@Home post-proc, [URL="http://www.factordb.com/index.php?id=1100000000636572825"]C161_4788_5193, also split as p68*p93[/URL]. I did a double-take, and for a brief moment thought I had somehow rerun the sqrt of that Aliquot number instead of GC_6_302.

fivemack 2014-02-23 20:43

Taking GW_5_337 (eta Friday morning)

fivemack 2014-02-24 09:17

GC_4_391 done
 
[code]
Mon Feb 24 06:38:24 2014 prp65 factor: 14198581372780559361298881324341243632487515607631810135943939711
Mon Feb 24 06:38:24 2014 prp113 factor: 98044219316434010330311264237664197011511274436894672700768724559671562464495191860992245914432914109657107625823
[/code]

9.0M matrix, 91 hours @ i7/2600

Log at [url]http://pastebin.com/mPnrEm54[/url]

fivemack 2014-02-24 09:19

Taking GW_4_391 (ETA Thursday night, but I won't be in to see the result until Monday)

fivemack 2014-02-25 10:41

Taking GC_6_303 to run after GW_4_391 finishes

swellman 2014-02-25 11:49

I'll take W_2_781 if it's still available.

fivemack 2014-02-25 22:03

Reserving F1463, though won't start it until Thursday evening because there are still relations left to come in

RichD 2014-02-27 04:35

I'll take GC_11_227 next.

fivemack 2014-02-27 09:59

Nice split for GW_4_391
 
[code]
Thu Feb 27 07:04:59 2014 prp65 factor: 68223389739926815612370531813805556395561800915997384296702325089
Thu Feb 27 07:04:59 2014 prp74 factor: 11984243818960683602193241167479366973837783703539461935891776675547390097
Thu Feb 27 07:04:59 2014 prp76 factor: 5766445047517722556357101302048874219297585702791949537424059961454002480717
[/code]

8.5M matrix, 65 hours on i7/2600 - I think I had a runaway firefox process on the machine when I did the initial runtime estimate. Log at [url]http://pastebin.com/dncsRJMT[/url]

GC_6_303 ETA is Monday morning.

fivemack 2014-02-27 10:01

Reserving GC_10_236. I'm expecting this one to generate a nasty matrix and take a long time.

swellman 2014-02-27 12:48

I'll take GW_10_237 once it's finished sieving.

fivemack 2014-02-28 09:45

GW_5_337 done
 
[code]
Fri Feb 28 09:44:22 2014 prp59 factor: 28531307759938440787683911916962371941203896877059269187147
Fri Feb 28 09:44:22 2014 prp107 factor: 68141190653242360313887056780662841622160471411865096875596106757939556143056213060965791020793460054367177
[/code]

72.1 hours, 12.6M target_density=96 matrix on six threads on i7/4930

F1463 eta is Tuesday morning

swellman 2014-03-01 01:24

I'll take GC_6_304 next.

RichD 2014-03-01 04:27

GC_11_227 splits as:
[CODE]
prp71 factor: 24013573097415000462985655593434665537916678988110599464127913117714211
prp135 factor: 439286544732716591887311221013349373538132387268354397485024871768431029824624530196541345827372316603652416192358915641775905314464951[/CODE]45 hours (in BL) for an 8.25M matrix built using "-nc target_density=114 -t 4" on i5/2500

pinhodecarlos 2014-03-01 14:57

I would like to try GC_3_496 but I need some info regarding the latest msieve. I've been out of post-processing for almost a year and things have changed since then. I don't have unlimited bandwidth anymore, so I am limited to post-processing 2-3 numbers/month, the msieve version has changed, etc.
What version should I use for a 64-bit Windows machine? And do I need the pthreadGC2.dll and pthreadVC2.dll files in the msieve folder?

Carlos

swellman 2014-03-01 15:59

I've been using the executable and associated file found [url=http://www.mersenneforum.org/showthread.php?t=18725]here[/url].

pinhodecarlos 2014-03-01 18:11

I have to give up GC_3_496 due to heat problems. Anyway, LA has an ETA of 76 hours for 9.8M^2 matrix. Sorry.

swellman 2014-03-01 21:22

[QUOTE=swellman;367765]I'll take W_2_781 if it's still available.[/QUOTE]

[code]
prp60 factor: 854643608564173566055954493461963414327770125674674003206399
prp111 factor: 829470699876789341315231711633049803730963085776901862130515192148745174538740550793880454144054148383946858223
[/code]

RichD 2014-03-01 23:34

I'll take GC_3_496 next.

fivemack 2014-03-03 10:23

GC_6_303 ETA is now Tuesday afternoon; a runaway Firefox meant it made little progress over the weekend.

xilman 2014-03-03 13:35

[QUOTE=fivemack;368203]GC_6_303 ETA is now Tuesday afternoon; a runaway Firefox meant it made little progress over the weekend.[/QUOTE]I ditched FF last year for that very reason. Now running Midori which does pretty much everything I ask of it and is much lighter weight than FF and Chrome.


Paul

fivemack 2014-03-04 13:37

F1463 done
 
[code]
Tue Mar 4 11:48:46 2014 prp78 factor: 724901011535214037845874972824565483006757602317236033423370083435767781234797
Tue Mar 4 11:48:46 2014 prp111 factor: 891413068611899193736220698858807819904586081225032674362266394055114505379699337711097101194312538214688008561
[/code]

About 90 hours for 13.2M matrix (density 112) on six threads of i7/4930

I suppose the very best automation protocol would not have required me to copy-type the 78-digit factor from my phone, but I deliberately don't have efficient access to my compute cluster at home from my desk at work.

Reserving F1881, will set up this evening and edit to insert ETA then

fivemack 2014-03-04 13:41

GC_6_303 done
 
[code]
Tue Mar 4 12:22:44 2014 prp72 factor: 215290914147000790812689007424442844957319292819239150613345580364711241
Tue Mar 4 12:22:44 2014 prp99 factor: 151825369037900078063922325167583391673084915398865559605969446693935136477201824284072243950242799
[/code]

123 hours for 9.7M matrix density=96 on sort of three-to-four-ish cores of i7/2600 - would probably have run in 84 hours on three cores.

ETA for 10C236 is Tuesday morning; 12.8M matrix density=96, 151 hours on three cores anticipated.

RichD 2014-03-05 01:37

GC_3_496
 
GC_3_496 splits as:

[CODE]prp68 factor: 34558630114648526421540494777207045843755870156121410528411719897833
prp135 factor: 890198443460579047649250096723450598716680985003533287186615497352057375239425980651670627348360194020701310297797549559585881599490339[/CODE]74.5 hours with 9.9M matrix (-t 4 -nc target_density=122) on Core-i5

swellman 2014-03-05 17:09

I'll take W_2_785 if it's still available.

swellman 2014-03-05 19:40

GW_10_237 splits as

[code]
prp69 factor: 756141863108859602476539572284242484762490376644444949275721237198289
prp107 factor: 38888738692406720036746755253971151997258249702396129874653552998384179769556796397919700495710035236212677
[/code]

swellman 2014-03-06 14:12

GC_6_304 splits

[code]
prp58 factor: 5882709405064032518923319203772718822420587201926467336713
prp113 factor: 61010787295416647364492088993973724306751430698726748739591374398921574209363900211965301175835286885719584213871
[/code]


ETA for W_2_785 is Monday.

swellman 2014-03-06 22:49

I'll take GC_8_262 next.

RichD 2014-03-07 23:51

I'll take GC_10_237 next.

RichD 2014-03-08 19:51

[QUOTE=RichD;368558]I'll take GC_10_237 next.[/QUOTE]

After numerous filtering attempts, I had to take target_density down to 80 before it would build a matrix. TD=84 failed.
It's running now. ETA 161 hrs.

I'll take GW_8_262 next.

swellman 2014-03-10 10:45

W_2_785 splits

[code]
prp66 factor: 934443226126541129082720461205245650960335939112000770758451298807
prp122 factor: 57101038886265245988877384368413212465076045950587050089975193516427302141879062453060498091400389236679391726581628438349
[/code]

fivemack 2014-03-10 13:21

F1881 done
 
[code]
Mon Mar 10 09:58:07 prp60 factor: 175475990729679681680213343588355640147876283181276859555813
Mon Mar 10 09:58:07 prp119 factor: 40439315954546661942899394819324936304012828371998328106695045285001255720445060467225841360189771914294010443872140941
[/code]

13.7M matrix (density 112); just under 96 hours on 6 threads i7/4930K

fivemack 2014-03-12 10:40

GC_10_236 finally done
 
[code]
Wed Mar 12 09:27:41 2014 prp81 factor: 370514039103297589163947469354351108889478563221749443111016730766104949845145643
Wed Mar 12 09:27:41 2014 prp135 factor: 164605928702288067539647117070647194630732448665473685751092055424564430567289359832727027960508027695182042388897842243799664456354947
[/code]

186 hours for a 12.8M matrix on a moderately busy i7/2600K: log at [url]http://pastebin.com/i8c82anE[/url]

I do wonder whether 31-bit large primes might be sensible for these SNFS-240 numbers; we're having to sieve a long way at rather low yields and rather high duplication rates to get rather large matrices.

fivemack 2014-03-12 10:44

Taking GC_9_248

RichD 2014-03-13 13:08

GW_8_262 splits as:

[CODE]prp60 factor: 904076630073200973071845071612316104114727407128281752180859
prp141 factor: 606625086440938588570886913614578101130135357532683232001986727791754202704750890694263443026373641246753028278422941760796166329290826567827[/CODE]107 hrs to solve an 11.5M matrix using -nc target_density=108 -t 4 on Core-i5.

RichD 2014-03-15 15:29

GC_10_237 splits as:

[CODE]prp91 factor: 1512377072986105793004196812647603257592302047201854599339358341182117713918227729923555823
prp91 factor: 7822173151770781861682077554423204567826632176976141792644026318725502158714722126226603209[/CODE]161 hrs to solve 15.35M matrix using target_density=80 -t 4 on Core-i5.

xilman 2014-03-15 19:02

[QUOTE=RichD;369019]GC_10_237 splits as:

[CODE]prp91 factor: 1512377072986105793004196812647603257592302047201854599339358341182117713918227729923555823
prp91 factor: 7822173151770781861682077554423204567826632176976141792644026318725502158714722126226603209[/CODE]161 hrs to solve 15.35M matrix using target_density=80 -t 4 on Core-i5.[/QUOTE]Brilliant!

Paul

Jarod 2014-03-16 03:45

GW_5_339
 
Hi all, it has been a while since I have done any post-processing, but I would be keen to take GW_5_339. I think it will fit into 12 GB of RAM? If this is not the case, could somebody please advise me. I will start the download of the .DAT file in about 16 hours, Monday 17th around 8:30 a.m. New Zealand daylight saving time. If anybody can suggest a shorter-running job, I would be happy to take it instead of this one.

swellman 2014-03-16 12:05

GC_8_262
 
Just completed this composite after 130+ hours on my i7. Imagine my surprise when I go to report the results in factordb and find the number is already fully factored! Just one of those things I guess - someone helped out over the years and no one noticed. My fault too for not checking status before starting the post processing.

[code]
prp60 factor: 904076630073200973071845071612316104114727407128281752180859
prp141 factor: 606625086440938588570886913614578101130135357532683232001986727791754202704750890694263443026373641246753028278422941760796166329290826567827
[/code]


eta: I will take C176_118_93 next.

xilman 2014-03-16 12:50

[QUOTE=swellman;369098]Just completed this composite after 130+ hours on my i7. Imagine my surprise when I go to report the results in factordb and find the number is already fully factored! Just one of those things I guess - someone helped out over the years and no one noticed. My fault too for not checking status before starting the post processing.

[code]
prp60 factor: 904076630073200973071845071612316104114727407128281752180859
prp141 factor: 606625086440938588570886913614578101130135357532683232001986727791754202704750890694263443026373641246753028278422941760796166329290826567827
[/code]
eta: I will take C176_118_93 next.[/QUOTE]
I always give credit to the first person who informs me of a factorization either directly (as in your mail earlier today) or indirectly (as when I happen across it as in your posting). No-one has informed me of the earlier discovery so, as far as I am concerned, you and NFS@Home and Rob Hooft are the discoverers. Rob has done a vast amount of ECM pre-testing, work which is still on-going, for the NFS@Home candidates and it's only fair that he gets his share of recognition.

Apologies to the earlier person(s) who completed the factorization but if you want your result to be known you need to tell the world about it in a manner which is attributable.


Paul

fivemack 2014-03-16 14:27

Taking F1893

fivemack 2014-03-16 14:31

[QUOTE=Speedy51;369069]Hi all, it has been a while since I have done any post-processing, but I would be keen to take GW_5_339. I think it will fit into 12 GB of RAM? If this is not the case, could somebody please advise me. I will start the download of the .DAT file in about 16 hours, Monday 17th around 8:30 a.m. New Zealand daylight saving time. If anybody can suggest a shorter-running job, I would be happy to take it instead of this one.[/QUOTE]

Yes, that should fit fine in 12G; they're starting to be a bit of a squeeze on a well-used 8G machine (I had one run crash my desktop, which is an 8G iMac with a 2G virtual machine constantly active).

You might find that GW_3_497 is a bit quicker (I say this only because it has rather more relations, and we're on a cusp of matrix size vs relation count); remember '-nc1 target_density=112'.

Jarod 2014-03-16 20:03

[QUOTE=fivemack;369106]Yes, that should fit fine in 12G; they're starting to be a bit of a squeeze on a well-used 8G machine (I had one run crash my desktop, which is an 8G iMac with a 2G virtual machine constantly active).

You might find that GW_3_497 is a bit quicker (I say this only because it has rather more relations, and we're on a cusp of matrix size vs relation count); remember '-nc1 target_density=112'.[/QUOTE]
Thanks fivemack. Can I please get GW_3_497 assigned to me and give back GW_5_339? The reason I want to do this is that it sounds like GW_3_497 will run faster for me. I will start the download of GW_3_497 now.

RichD 2014-03-17 13:00

I see the problem. I did GC_8_262 as GW_8_262.

I can start the REAL GW_8_262 download tomorrow.

[QUOTE=swellman;369098]Just completed this composite after 130+ hours on my i7. Imagine my surprise when I go to report the results in factordb and find the number is already fully factored! Just one of those things I guess - someone helped out over the years and no one noticed. My fault too for not checking status before starting the post processing.

[code]
prp60 factor: 904076630073200973071845071612316104114727407128281752180859
prp141 factor: 606625086440938588570886913614578101130135357532683232001986727791754202704750890694263443026373641246753028278422941760796166329290826567827
[/code][/QUOTE]


[QUOTE=RichD;368880]GW_8_262 splits as:

[CODE]prp60 factor: 904076630073200973071845071612316104114727407128281752180859
prp141 factor: 606625086440938588570886913614578101130135357532683232001986727791754202704750890694263443026373641246753028278422941760796166329290826567827[/CODE][/QUOTE]

swellman 2014-03-17 17:31

[QUOTE=RichD;369190]I see the problem. I did GC_8_262 as GW_8_262.

I can start the REAL GW_8_262 download tomorrow.[/QUOTE]

:max:

Wait - the mystery is solved! Happy ending.:tu:

All is well.

Another topic - seeking advice. When I start post-processing C176_118_93, a 31-bit job, what target_density should I use? Typically I just use default values, but maybe it's worth trying to tighten up the matrix a bit prior to LA? It's a pretty ugly poly with a terrible yield, but the best we could find.

Thanks in advance for any suggestions.

fivemack 2014-03-17 22:29

I tend to use target_density 112, if that doesn't work then 96, if that doesn't work then the default 70. The difference between 112 working and 70 working is often only about 5% of the total relation count.

What I don't quite understand is why I don't get a usable matrix, even with enormous over-sieving, at target densities 128 or over.
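The fallback sequence described above can be sketched as a small shell loop. This is only a hypothetical wrapper: it assumes an msieve binary and the usual relations/factor-base files sit in the current directory, and that msieve exits non-zero when -nc1 fails to build a matrix.

```shell
#!/bin/sh
# Hypothetical sketch of the density fallback: try filtering at
# decreasing target densities until a matrix builds. Assumes msieve
# (with its .fb/.dat files) is in the current directory and that it
# exits non-zero when -nc1 fails to build a matrix.
for td in 112 96 70; do
    if ./msieve -v -nc1 "target_density=$td" -t 4; then
        echo "matrix built at target_density=$td"
        break
    fi
done
```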

jasonp 2014-03-17 23:51

I don't understand it either. I used to think the failure happens because the filtering throws away information at a steady rate during the merge phase, and making the target density too large would make the filtering throw away too much. But then Greg tried a huge filtering run with high target density and nothing thrown away, and it still failed.

My current guess is that either there's a bug somewhere that destroys the matrix during the merge phase, or Gaussian elimination somehow starts to behave strangely when many matrix columns start to look almost identical.

Jarod 2014-03-18 04:51

GW_3_497 help please
 
1 Attachment(s)
Can somebody please advise me what I need to do to continue the LA phase? I get the following lines just before the command prompt exits:
[code]10+ relations: 2553021
heaviest cycle: 28 relations
RelProcTime: 3868
elapsed time 01:04:30
[/code]
Please find the log, start, and resume files attached.
I am using target_density=112 (without the spaces, of course). Thanks for any assistance.

frmky 2014-03-18 05:18

[QUOTE=Speedy51;369256]Can somebody please advise me what I need to do to continue the LA phase.[/QUOTE]
Looks like you've finished -nc1, which runs filtering. Now run with
-nc2 -nc3 -t 4 -v
which will run the linear algebra and square roots. Replace the 4 with however many cores your computer has.

Jarod 2014-03-18 05:46

[QUOTE=frmky;369268]Looks like you've finished -nc1, which runs filtering. Now run with
-nc2 -nc3 -t 4 -v
which will run the linear algebra and square roots. Replace the 4 with however many cores your computer has.[/QUOTE]
Thanks, I restarted with -nc2 and all appears well. A quick question: will I need to run the command again at some point with -nc3?
The LA phase is going to take around 46 hours, so I should have it completed within the week. It is currently using 6 cores; after it has written a checkpoint for the first time, I will see if it will work faster on 3 cores.

VBCurtis 2014-03-18 16:52

[QUOTE=Speedy51;369272]Thanks, I restarted with -nc2 and all appears well. A quick question: will I need to run the command again at some point with -nc3?
The LA phase is going to take around 46 hours, so I should have it completed within the week. It is currently using 6 cores; after it has written a checkpoint for the first time, I will see if it will work faster on 3 cores.[/QUOTE]

Frmky meant for you to run -nc2 -nc3 in the same command. You don't have to run them individually. If you only flagged -nc2, you will have to invoke msieve with -nc3 once the matrix finishes.

If you don't have a reason to run them individually, use -nc next time; that runs all three post-processing phases sequentially.

fivemack 2014-03-18 17:31

GC_9_248 done
 
[code]
Tue Mar 18 17:22:30 2014 prp96 factor: 513771344530953638399207528250780162329591444656481384108792960998927618778379458678020370035651
Tue Mar 18 17:22:30 2014 prp96 factor: 704336210027665042766919139818581008384098536410985081125139978424946084001665660346633709606967
[/code]

144 hours for 11.9M matrix on three cores of i7/2600K. Log at [url]http://pastebin.com/g3UqPrCi[/url]

fivemack 2014-03-18 17:37

I'm afraid my machine won't finish F1893 until after I leave for Mexico; depending on the availability of wifi in Palenque and Merida, the results may not reach the Internet until April 7th or so.

fivemack 2014-03-18 18:59

[QUOTE=VBCurtis;369297]Frmky meant for you to run -nc2 -nc3 in the same command. You don't have to run them individually. If you only flagged -nc2, you will have to invoke msieve with -nc3 once the matrix finishes.

If you don't have a reason to run them individually, use -nc next time; that runs all three post-processing phases sequentially.[/QUOTE]

There is one oddity: you have to do '-v -nc target_density=112 -t 6' ... the target_density parameter must be right after the -nc on the command line, or it will be ignored.
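Side by side, the ordering quirk described above looks like this (a sketch using the flags from this thread; the "ignored" behaviour is as reported in the post, not something the msieve docs promise):

```shell
# target_density is only honoured when it immediately follows -nc:
./msieve -v -nc target_density=112 -t 6   # takes effect
./msieve -v -nc -t 6 target_density=112   # silently ignored (per the post above)
```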

RichD 2014-03-19 00:11

[QUOTE=RichD;369190]I can start the REAL GW_8_262 download tomorrow.[/QUOTE]

I've got GW_8_262 running in LA with ETA 38 hrs + sqrt.

Sorry everyone for the confusion.

swellman 2014-03-19 01:30

[QUOTE=fivemack;369234]I tend to use target_density 112, if that doesn't work then 96, if that doesn't work then the default 70. The difference between 112 working and 70 working is often only about 5% of the total relation count.

What I don't quite understand is why I don't get a usable matrix, even with enormous over-sieving, at target densities 128 or over.[/QUOTE]

Thanks for the guidance. C176_118_93 seems to be heavily over-sieved, or will be. I might try a target density of 128 just to experience the crash if nothing else, falling back to 112, 96, or 70 as needed.

RichD 2014-03-20 14:04

Finally, GW_8_262 splits as:

[CODE]prp79 factor: 6482638054086919779081042827196458878063715247525998104955700869682305142904453
prp100 factor: 1728418981523664944606126436837534164627159260942767750412042635120454352600710802892574502352523633[/CODE]40 hrs to solve 7.65M matrix using target_density=116 -t 4 on Core-i5/2500

RichD 2014-03-22 01:05

I'll take GC_12_220 & GW_12_220 next. This way, if I step on any toes, they will be my own.

The first will download overnight and the next tomorrow night.

swellman 2014-03-22 02:28

I'll take GW_5_339 next. Will download in a day or two.

RichD 2014-03-24 03:17

GC_12_220 splits as:

[CODE]prp78 factor: 909610524772850577836741438743462338922534445770251397645974467015674057170943
prp140 factor: 41457574347649746582472895367625811744895707429503500183751357067562812172200418831105455984042172202448678478792391995083079294709836023713[/CODE]46 hrs to solve an 8.3M matrix using -t 4 target_density=112 on Core-i5/2500

P.S. GW_12_220 is another 1.5 days away.

Jarod 2014-03-24 05:22

1 Attachment(s)
GW_3_497 factors are as follows. Log is attached for anyone who is interested
[Code]prp65 factor: 21305876605290611652106316147014599120854876134210881381658837219
prp160 factor: 2409066606569240538218963475465073927047931021580343551041067518535995930430180632121713521850744535733137117964752998708446554076731810458133699486349914924887[/code]

RichD 2014-03-25 13:45

GW_12_220 splits as:

[CODE]prp96 factor: 773230913565360050939838843652228176446517279002966255024356847094350274854135462248477398594853
prp130 factor: 2632571829126584439795604606143889674364174058332315570845632880917359249538367600102552019189203185714167364509596530876104846447[/CODE]54 hrs to solve an 8.1M matrix using -t 3 target_density=116 on Core-i5.
(One core was dedicated to another project.)

fivemack 2014-03-26 10:06

F1893 done
 
[code]
Wed Mar 26 07:45:08 2014 prp82 factor: 6610596433770852168964846705856632210967139857389928428088061560605774195694064769
Wed Mar 26 07:45:08 2014 prp157 factor: 8874208543886029501502031980197226500991966140044144061451950476364658976731224630424676614715255597690092959776115567546800243547196893286495196754590261617
[/code]

400380 seconds for 14.7M matrix (density 112) on i7/4930 -t6

Sent at local-4am from an iPad in a hotel in San Cristobal de las Casas, Chiapas.mx. I think I'll go back to sleep now.

RichD 2014-03-26 17:04

I'll take GC_3_497 next.

swellman 2014-03-27 11:31

GW_5_339 should be done on Sunday morning.

C176_118_93 is next Wednesday. I attempted to use the target_density=128 parameter but msieve appears to have ignored it and used default values.:unsure: Such is life.

RichD 2014-03-28 19:47

GC_3_497 splits as:

[CODE]prp87 factor: 916183202161314102216658975525600180112761728328787933656381394287611619286788424444083
prp140 factor: 29856528983606753410315510164773604949657583105656160172679484730278242411422926844416180067175244244771749141028804615154074537419261457679[/CODE]40 hrs to solve a 7.8M matrix using -t 4 target_density=120 on Core-i5.

pinhodecarlos 2014-03-28 22:00

Please reserve GW_12_215 for me. I'll start it next week.

RichD 2014-03-29 22:54

I'll take GW_9_249 next.

swellman 2014-03-30 03:40

GW_5_339 splits:

[code]prp78 factor: 174561382997634489020809082736845509891386929515392407891589956307191949013753

prp98 factor: 44544112330958277773084231338925599725011581817314413219376765334168917598785944869083702285968761[/code]

pinhodecarlos 2014-03-30 14:29

Calling all post-processing helpers: 9 jobs available to crunch in the queue.

wombatman 2014-03-30 15:39

I'll take GW_9_248.

RichD 2014-03-30 17:28

I'll take GC_11_228 next but it may be a day or so before I can download it.

ETA for GW_9_249 just under 30 hours.

swellman 2014-03-30 17:59

I'll take GC_7_281 next.

wombatman 2014-03-30 20:49

ETA on GW_9_248 is ~70 hours with 8 threads and density of 115.

pinhodecarlos 2014-03-31 02:15

Where can I download the latest 64-bit Linux msieve client? Are the DLLs the same as in the Windows version? Thank you in advance.

RichD 2014-04-01 01:35

GW_9_249 splits as:

[CODE]prp55 factor: 3341147571762894746616313727026296885197667290370142783
prp161 factor: 54686564269637850634116594505519128973752898935720676602847847250720120659476629305425247855519101443689109180817067055830007789856782006869704861766517032270947[/CODE]40 hrs to solve a 7.8M matrix using -t 4 target_density=120 on Core-i5.

pinhodecarlos 2014-04-01 10:47

Which one is faster, -t 4 or -t 3? Does it depend on the processor? I might have to run my job with -t 3 or even -t 2 due to heat problems.

RichD 2014-04-01 15:52

I'll take GW_8_263 next.

A little under 40 hrs remain on GC_11_228. But that will be in the middle of the night for me, so call it closer to 48 hrs.

-t 4 is faster if you are using a version later than ~SVN 920, which has the improved BL code.

pinhodecarlos 2014-04-02 05:05

LA ETA for GW_12_215 just under 41 hours.

debrouxl 2014-04-02 07:54

GW_7_274 has two relatively large factors:
[code]prp88 factor: 1774336685203560775619635932367619211405050143199091754260921412538532153324600686036593
prp96 factor: 508814448654890845611675210767981560379062102165956391595504821886446270068528880368551236606523[/code]
8.6M matrix, ~53h total post-processing time on a computer which was not necessarily otherwise idle.

I should have the factors of my other reserved number this evening. The matrix is smaller, but the computer has both a slower processor and slower RAM.

debrouxl 2014-04-02 19:12

GW_3_485:
[code]prp67 factor: 1647763113002796428498894808352407866801862446764328113316541883361
prp128 factor: 58409493901454168844078681660634255352530279026127264357580723321139500368541402474023174905797322476984395014447565647779466631[/code]
7.8M matrix, ~72h total post-processing time, again on a non-otherwise idle computer.

swellman 2014-04-02 20:59

C176_118_93 splits as

[code]
prp86 factor: 20935304876961223796740678608143809018535192662955670250058036106636226526232640840319
prp90 factor: 508795715895597611812475803243000629326526511820057707809875041721248257247814281402177831
[/code]

wombatman 2014-04-03 12:57

GW_9_248 splits as:

[CODE]prp64 factor: 1651391854486772280321008396908147949900938268099726348899375813
prp97 factor: 1668674835181278920665129912842910733783116812806987167180617108173973483982086709797947479248799[/CODE]

Total time was 84 hours with 8 threads on an i7-4930K that was doing other work as well. Matrix was 11174751 x 11174976.

RichD 2014-04-03 15:05

A nice split for GC_11_228:

[CODE]prp108 factor: 575613145550424477458097758035133628971890655177858605000332740732088282760128755260459374285937421252983003
prp110 factor: 40673193023861303637375147953873366519598609788554265123214441130960822462638302881173294967321658892712040939[/CODE]41 hrs to solve an 8.1M matrix using -t 4 target_density=116 on Core-i5.
(I had a power outage so the time is a guesstimate.)

pinhodecarlos 2014-04-03 21:31

1 Attachment(s)
GW_12_215 factors.

pinhodecarlos 2014-04-04 00:23

Taking GW_5_331.

pinhodecarlos 2014-04-04 12:02

[QUOTE=pinhodecarlos;370272]Taking GW_5_331.[/QUOTE]

As of now less than 30 hours to finish LA.

pinhodecarlos 2014-04-04 12:13

I will so take GC_3_484.

pinhodecarlos 2014-04-04 21:53

I also would like to reserve L1860 but I need to download it from university due to bandwidth limitation at home. I will do it next Monday. Thomas Womack, please feel free to grab this one in case you want it.

Carlos

fivemack 2014-04-05 12:27

Thanks for the offer, I will take L1860.

RichD 2014-04-05 15:50

GW_8_263 splits as:

[CODE]prp86 factor: 14993859254684218224274401872969607955501408151645864433222624348017612927923056503249
prp121 factor: 2369462562967544539096559646479748604345687278376253236367375137638309356755902044493644433640401391104044534203324463959[/CODE]Power outage, different number of threads, all timing with this one is meaningless.

pinhodecarlos 2014-04-05 18:48

1 Attachment(s)
GW_5_331 factors.

pinhodecarlos 2014-04-05 20:09

LA for GC_3_484 is underway, less than 42 hours.

debrouxl 2014-04-06 07:57

GC_4_384:
[code]prp58 factor: 3218881625104747586861721692946029893778548463404968773333
prp142 factor: 9351509494388464753107850760039001641647044118501633450270478027999632692208759821268900078269374761128790542575051230377536658301505835160259[/code]~5.7M matrix, ~24h total post-processing time.

Carlos, it would be great if you copied the factors into your posts, in addition to the post-processing log :smile:

fivemack 2014-04-07 07:15

L1860 is underway (12.1M matrix), ETA Friday afternoon.

fivemack 2014-04-07 10:47

Taking GW_11_222, ETA Wednesday morning

swellman 2014-04-08 01:53

GC_7_281 splits:

[code]
prp57 factor: 138181263894115920445278190474755009260893492947364839667

prp154 factor: 2604997162532584085283664901161188671935679233174943480745132810933566857272874793511238026373946261452706259220847722189150153070864602404891914365149823
[/code]


All times are UTC.