mersenneforum.org

mersenneforum.org (https://www.mersenneforum.org/index.php)
-   NFS@Home (https://www.mersenneforum.org/forumdisplay.php?f=98)
-   -   BOINC NFS sieving - NFS@Home (https://www.mersenneforum.org/showthread.php?t=12388)

fivemack 2014-01-12 13:15

GC_8_258 splits as

[code]
Sun Jan 12 12:59:22 2014 prp86 factor: 16863706043300063407415729559081338998251361319129409125512005320975971394246015454547
Sun Jan 12 12:59:22 2014 prp107 factor: 23580021108883492406216749588110724009770131277490709950495587517298730650801096281472535638683887876151207
[/code]
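Aside on notation: msieve's `prpNN` labels mean "probable prime of NN digits". A reported split like the one above can be double-checked with a short Miller-Rabin sketch; this is purely illustrative, assuming only the digits quoted in the post:

```python
def is_probable_prime(n):
    """Miller-Rabin to a dozen fixed bases: a strong probable-prime test,
    in the same spirit as msieve's 'prp' designation."""
    if n < 2:
        return False
    small = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)
    if n in small:
        return True
    if any(n % p == 0 for p in small):
        return False
    # write n - 1 = d * 2^s with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in small:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a is a witness: n is composite
    return True

# The two factors reported for GC_8_258 above
p86 = 16863706043300063407415729559081338998251361319129409125512005320975971394246015454547
p107 = 23580021108883492406216749588110724009770131277490709950495587517298730650801096281472535638683887876151207

assert len(str(p86)) == 86 and len(str(p107)) == 107  # digit counts match the prp labels
assert is_probable_prime(p86) and is_probable_prime(p107)
```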

Taking C_2_773 (eta Wednesday morning) and GW_8_258 (eta Monday night)

fivemack 2014-01-13 23:23

GW_8_258 splits as

[code]
Mon Jan 13 23:16:00 2014 prp56 factor: 38506111741215132280143237471272099824121319104322530227
Mon Jan 13 23:16:00 2014 prp134 factor: 57850364937690828634216544175636584460041713004220307171148630445069966054326796817097512277775316557982128480595783499286086859239389
[/code]

swellman 2014-01-13 23:30

GW_9_244 factored

[code]
prp53 factor: 31720964232725149209145478183153094160374114840897209
prp69 factor: 233550844217597221784141102025911494242166005842175760845781288139483
prp96 factor: 153364959725148730311335086245192058017116669327995125254594516073180374730304204149484140250897
[/code]

swellman 2014-01-14 22:55

76661_236 splits as a triple.

[code]
prp75 factor: 863912015575367070720093130101936565686669977459652527985140011128834420029
prp77 factor: 46393316326451840132573875574090116138141207062161502053705995770435583646777
prp86 factor: 19128532303555233973360749690047943719224192545179869356514519602597643621165365643217
[/code]


eta: I'll take GW_3_489 next.

fivemack 2014-01-14 23:21

I'll take F1241 (ed: ETA Sunday morning)

fivemack 2014-01-15 19:23

C_2_773 completed
 
C_2_773 splits as
[code]
Wed Jan 15 11:21:46 2014 prp97 factor: 3117825145483538503510506841967793176977337342336978962235626802764825978125648824153038902342001
Wed Jan 15 11:21:46 2014 prp113 factor: 28327160626610268405492806521366241697988663592163106002782607457978031946900259579146637358187714111265218555209
[/code]

jrk 2014-01-17 00:32

[QUOTE=jrk;363862]I'd like to reserve GW_10_233 please.[/QUOTE]

Done:[code]Sun Jan 12 10:22:21 2014 prp92 factor: 29979308826391241865129986842839370466310499444094132497139708854405051952253039242363774603
Sun Jan 12 10:22:21 2014 prp103 factor: 8373590966613765807211741939276591545098904111162695185147457862850973646066136447553586420579125799491
[/code]

swellman 2014-01-19 02:48

W_2_773 splits

[code]
prp94 factor: 1134993303997894913232886778939153885845511550813440944782829039887103754783461908013911192971
prp130 factor: 7614496197242925149549166355850558978187140253576572120261609622722660396849618440994545531139710966081521754696330903391581949629
[/code]

fivemack 2014-01-19 11:38

F1241 done
 
[code]
Sun Jan 19 10:39:10 2014 prp86 factor: 50319493187701371072386917937259362801430722745483689483011738185898826844412318061869
Sun Jan 19 10:39:10 2014 prp145 factor: 1133075944965914511191840829220062343450529825743206505971641328149601427895128808119878771108089030580488136003895916921440559531897822067365273
[/code]

Taking GW_11_224, eta Tuesday afternoon

RichD 2014-01-19 20:40

I'll take GC_3_488 next.

XYYXF 2014-01-21 17:37

There are two C168's pushed up to 18000 curves at B1 = 110M:
[code]C168_130_71 = 293577856524534308556608110931494014404182621098756377812259533965962071178386204940945650625875365752664844816196696488552291293374296950182835664858833152967071700503
C168_134_94 = 451591044633621500700127843125932943919387601290262860485200418433795934760393784972054631775554879954085888690144804817796633480540639229986141076270921955279078568333[/code]Are there any spare cycles for them? :)

fivemack 2014-01-21 18:24

GW_11_224 done
 
[code]
Tue Jan 21 11:31:26 2014 prp75 factor: 692710618145635187953577836515278205200550617222403948547140138833095928617
Tue Jan 21 11:31:26 2014 prp94 factor: 4332397564279070122859471546174834163899791071765616372958614385171511923875451529268501896949
[/code]

swellman 2014-01-21 20:41

I'll take C_2_774 if it's still available.

RichD 2014-01-22 15:49

GC_3_488 Done
 
GC_3_488 splits as:

[CODE]prp79 factor: 1760698750288796416510983866887402003295141578644502208188475349976711747998541
prp135 factor: 860667887342619118571685213169987193617060626748782005352554503583330204049866617726717922323551976839243911111808359658904090448223329[/CODE]

swellman 2014-01-25 01:18

GW_3_489 factored - another triple!

[code]
prp58 factor: 8309578200234203001562509655997847367287587314361099074693
prp69 factor: 384627514508465375500856030883779117286078079168699841920417570739329
prp97 factor: 3724518480516414218978353226777552507164214496551788110946811689789915304406436966051815005842841
[/code]

swellman 2014-01-25 15:10

I'll take GC_3_489.

swellman 2014-01-26 21:59

GC_3_489 eta is Saturday.

C_2_774 is problematic - LA was 73% complete when the job failed. Msieve then automatically restarted, so I couldn't manually continue the job. Sigh.

Eta is next weekend, assuming it doesn't crash again.

fivemack 2014-01-26 22:06

I see we're getting a bit of a backlog.

I'll take 37777_233 (ETA Wednesday morning) and GC_9_245 (ETA also Wednesday morning) now, and F1259 when it's ready.

wombatman 2014-01-28 19:22

I'd like to try C_2_776 to see if my new system is working correctly.

fivemack 2014-01-29 07:28

GC_9_245 done
 
[code]
Wed Jan 29 06:55:01 2014 prp73 factor: 9903098574986974103362081686446747367398730774095742389498142843980704183
Wed Jan 29 06:55:01 2014 prp123 factor: 253750273471733853234752294706294911205280533906726912316473517903401207871913278164748795382321811959696210719517071097849
[/code]

fivemack 2014-01-29 07:29

37777_234 done
 
[code]
Wed Jan 29 07:16:17 2014 prp59 factor: 39805838245128375757250665494950002862394601621476034142741
Wed Jan 29 07:16:17 2014 prp175 factor: 9490511805112205749458401288737739707213702753085950542196355427901121646484863032195955743622165543053640011339272287516341338233202898822893964694843291458272520697457365997
[/code]

Now starting F1259, eta Monday evening (and big enough to be 30% faster on 4930 than 4770)

fivemack 2014-01-29 20:19

Taking GC_8_259, ETA Monday morning

wombatman 2014-01-31 03:04

C_2_776 splits as:

[CODE]prp86 factor: 28525416039922049626770142728533712341961572795502167688680982805040627336789038519213
prp147 factor: 282202079464803697071770654163208355134519627766398296278521727169577703841952282549766335335416002033065363890939367261436243588329451332261049373[/CODE]

swellman 2014-01-31 14:34

I'll take GW_3_490 next.

swellman 2014-02-01 01:21

I'll take 6101_59_minus1 next.

swellman 2014-02-01 10:26

GC_3_489 factors:

[code]
prp77 factor: 44872292278889286231844316745343781976871693878122036131600260104302685851981
prp92 factor: 69834097513209989248090786475663496420475349025947404620235984575194146151746682538704948753
[/code]

pinhodecarlos 2014-02-01 12:29

Just a side note: due to the introduction of the NFS@Home badges and the start of Formula BOINC 2014, NFS@Home has had a good increase in work done. This is the reason you guys are so busy doing non-stop post-processing tasks. Keep the factors coming!

Carlos

swellman 2014-02-01 12:53

C_2_774 splits:

[code]
prp92 factor: 11344379175370261065600199053658618504490042411773612164277668331159210068920104432492113481
prp94 factor: 1378036442776330103757752987023933376565858878272631691504937736041803986311667094867212159189
[/code]

fivemack 2014-02-01 18:36

I have rearranged things so that I can devote the whole i7-4930 to post-processing for the next month or so. Not sure even that's enough to keep up, and I'm off to Mexico at the end of March leaving plenty of time for backlogs to accumulate ...

fivemack 2014-02-02 14:15

Taking GW_4_389, ETA Wednesday afternoon

fivemack 2014-02-03 08:08

GC_8_259 done
 
[code]
Mon Feb 3 02:05:12 2014 prp72 factor: 574753244791239318244082762431899855066971363810176863992506028782567853
Mon Feb 3 02:05:12 2014 prp113 factor: 11890854832929847680716109259261935434392079437878967972534666467678384630388834769723103848925089542884797428171
[/code]

xilman 2014-02-03 11:46

[QUOTE=fivemack;365999][code]
Mon Feb 3 02:05:12 2014 prp72 factor: 574753244791239318244082762431899855066971363810176863992506028782567853
Mon Feb 3 02:05:12 2014 prp113 factor: 11890854832929847680716109259261935434392079437878967972534666467678384630388834769723103848925089542884797428171
[/code][/QUOTE]
Thanks.

Please note that the PSU on the machine holding the database died yesterday. A replacement is on order but updates can't be processed until tomorrow at the very earliest.

The ecmnet server is also down for the duration.

Paul

fivemack 2014-02-03 22:51

Taking GC_3_493

swellman 2014-02-04 00:47

I'll take GC_11_225.

fivemack 2014-02-04 08:03

Fib(1259) done
 
[code]
Tue Feb 4 02:15:53 2014 prp83 factor: 31212026112839519974864329553245367946471277004029144461650689013661011471410970813
Tue Feb 4 02:15:53 2014 prp148 factor: 1602435277896213536888309487641627813610302430033091971163761836285438379650501056269073962983195332703992371097479685139612538638962227664938762737
[/code]

fivemack 2014-02-04 14:10

Taking GW_9_246

xilman 2014-02-05 10:10

[QUOTE=xilman;366006]Please note that the PSU on the machine holding the database died yesterday. A replacement is on order but updates can't be processed until tomorrow at the very earliest.

The ecmnet server is also down for the duration.[/QUOTE]Back up again for 45 minutes now. An update run is in progress.

Paul

swellman 2014-02-05 10:52

[QUOTE=swellman;365758]I'll take GW_3_490 next.[/QUOTE]

[code]
prp84 factor: 138286337120082079963110190483308221778115504788303873001706696556470017430811374653
prp99 factor: 765771602658914004726682836031885877590684761297919715334144492762304051565711498993076076181241323
[/code]

fivemack 2014-02-05 18:12

GW_4_389 done
 
[code]
Wed Feb 5 10:52:42 2014 prp56 factor: 24153556984215729816571883488635461710033123027845513781
Wed Feb 5 10:52:42 2014 prp167 factor: 28928684409009817714261609957814881246364825237594708261157628403489897389990241950309530367992085291342155481460346045534049467588015893216930205619535033942005233703
[/code]

fivemack 2014-02-06 13:58

Taking GW_3_493

swellman 2014-02-06 23:12

6101_59_minus1 splits as a triple.


[code]
prp66 factor: 635796217255867478592888914258401870713582966726842908523325007663
prp74 factor: 32694647481949129411793929329392889655270723900683259948912782475475461679
prp81 factor: 172003007521558283180326414775305504436825517273290300033980009813656015719362967
[/code]

fivemack 2014-02-07 18:03

GW_9_246 splits as
[code]
Fri Feb 7 15:12:59 2014 prp76 factor: 4370727639771118361875173025642612421868648893657762053322435022922567650803
Fri Feb 7 15:12:59 2014 prp129 factor: 208301112507050079086235067571272842538624499546654410164033843873269383669046278448234500002379302259836380662305461858639935627
[/code]

fivemack 2014-02-07 23:39

Taking GC_8_260

RichD 2014-02-08 00:40

I'll take GC_10_235 next.

Mini-Geek 2014-02-08 00:53

Taking GC_6_302.

I've done a post-processing from NFS@Home with an Aliquot sequence number before, so I think I know what I'm doing. :smile: If I'm stepping on toes or otherwise doing something stupid by trying to post-process this number, feel free to stop me.

swellman 2014-02-08 01:39

[QUOTE=pinhodecarlos;365826]Just a side note: due to the introduction of the NFS@Home badges and the start of Formula BOINC 2014, NFS@Home has had a good increase in work done. This is the reason you guys are so busy doing non-stop post-processing tasks. Keep the factors coming!

Carlos[/QUOTE]

You weren't joking! It's like drinking from a fire hose.:shock:

Good stuff though. I'll reserve some more this week.

pinhodecarlos 2014-02-08 01:43

[QUOTE=swellman;366399]You weren't joking! It's like drinking from a fire hose.:shock:

Good stuff though. I'll reserve some more this week.[/QUOTE]

Why would you think I was joking, when I am a big supporter of the project? Right now I can't help due to bandwidth problems, but I do everything possible to bring more people to the NFS@Home sieve. And if Greg increased the points per WU for everyone, say doubling them, more people would come. There are BOINC projects out there whose WUs take the same amount of time as NFS@Home's, but they score precisely double. Just a thought......

swellman 2014-02-08 01:55

It's just a figure of speech. No criticism intended.

Impressive queue!

debrouxl 2014-02-08 09:45

Thanks for taking on post-processing tasks :smile:
I recently gained limited access to a laptop equipped with a Core i7-4700HQ, 8 GB of DDR3-1600 RAM and a GTX 760M, but I don't think I'll be able to do much post-processing on it. The airflow is mediocre to poor anyway: Prime95 TF or a stress test brings the core temperature above 90°C...

I remain no fan of inflating WU credit in an artificial arms race between projects.

Mini-Geek 2014-02-08 12:46

1 Attachment(s)
[QUOTE=Mini-Geek;366396]Taking GC_6_302.

I've done a post-processing from NFS@Home with an Aliquot sequence number before, so I think I know what I'm doing. :smile: If I'm stepping on toes or otherwise doing something stupid by trying to post-process this number, feel free to stop me.[/QUOTE]

Linear algebra on this has begun, but the matrix is larger than I had anticipated. Compared to a recent [URL="http://www.mersenneforum.org/showthread.php?p=362272#post362272"]c161 GNFS[/URL] (log file is attached there) I post-processed, this SNFS has fewer unique relations, but a much larger matrix (log file attached).

The ETA is currently about 320 hours (13.35 days) running on 4 threads of an i5-750. So, is there something wrong here, or is this normal for an SNFS of this size?

My guesses: either SNFS post-processing is harder than GNFS post-processing at equivalent sieving difficulty, or using 30-bit instead of 29-bit large primes makes post-processing harder, or this number is quite undersieved (there were 16,967 bad relations; more than I'd hope for, but not enough to dent the 110M raw relations).

pinhodecarlos 2014-02-08 13:30

I think the issue here is the old processor you are using; it is from the first generation of the i5 series. You should not run any post-processing on it.
Look at this example of L1803, a harder number than yours: using an old version of msieve, without the benefit of the 50% speed-up, it took less than 13 days.

pinhodecarlos 2014-02-08 14:12

Sorry, but I can't attach the log file; I am having issues with my internet connection. I've been trying for the last half hour. I've also lost remote control of all my machines at the lab. Stupid country!

fivemack 2014-02-09 01:27

GC_3_493 done
 
[code]
Sat Feb 8 22:00:43 2014 prp66 factor: 404675335237566634047776648973783207303326616897166891987863934321
Sat Feb 8 22:00:43 2014 prp130 factor: 1696730016839592422643861847045769083116586949070298346176899128778077851774931860994847291623499641743091454357450360253615261427
[/code]

(this is on a 27-inch late-2009 iMac, so a 2.66GHz i5-750; the calculation took three days and two hours)

The problem Mini-Geek is having is that GC_6_302 hasn't really been adequately sieved; GC_3_493 was

[code]
Wed Feb 5 18:29:55 2014 found 23759882 hash collisions in 114929460 relations
Wed Feb 5 18:32:37 2014 found 23918010 duplicates and 92230841 unique relations
Wed Feb 5 19:19:36 2014 weight of 7866793 cycles is about 881162366 (112.01/cycle)
Wed Feb 5 19:40:09 2014 matrix is 7866504 x 7866793 (3440.3 MB) with weight 1007198370 (128.03/col)
Wed Feb 5 19:46:12 2014 matrix is 7861747 x 7861995 (3309.9 MB) with weight 854233213 (108.65/col)
Wed Feb 5 19:46:12 2014 sparse part has weight 789062072 (100.36/col)
Wed Feb 5 19:46:12 2014 using block size 8192 and superblock size 786432 for processor cache size 8192 kB
Sat Feb 8 21:16:56 2014 BLanczosTime: 266001
[/code]

so my 92.2M unique relations allowed a much kinder matrix than Mini-Geek's 84.7M. (Note that I did msieve -nc1 target_density=112 to produce a denser matrix; but that's not enough to make the difference between my 7.8M and Mini-Geek's 13.5M)
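A back-of-envelope view of why the dimension matters so much: block Lanczos does roughly dim/64 iterations, each touching every matrix nonzero, so runtime grows roughly as dim² × density. A Python sketch comparing the two jobs (the quadratic scaling model and the equal-density assumption are mine, not from the logs):

```python
# Rough block-Lanczos cost model: ~dim/64 iterations, each touching every
# nonzero, so runtime ~ dim^2 * density.  Assuming comparable density for
# both matrices (an assumption; GC_6_302's density isn't quoted above).
dim_gc3_493 = 7.8e6    # this matrix: ~74 hours on an i5-750
dim_gc6_302 = 13.5e6   # Mini-Geek's matrix: ETA ~320 hours on an i5-750

predicted_ratio = (dim_gc6_302 / dim_gc3_493) ** 2
observed_ratio = 320 / 74

# predicted ~3.0x slower, observed ~4.3x
print(f"predicted {predicted_ratio:.1f}x slower, observed {observed_ratio:.1f}x")
```

The shortfall against the observed ratio is consistent with the undersieved dataset also producing a denser matrix, not just a bigger one.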

fivemack 2014-02-09 11:11

GC_8_260 done
 
[code]
Sun Feb 9 10:51:44 2014 prp77 factor: 64009584216674889468458818694189952605977464081091106962702687045133335779049
Sun Feb 9 10:51:44 2014 prp145 factor: 8554780841752594133475867349359594755953859533742822341879655839115512478425071586100240422814547272554917378482675533036850453079886122821118929
[/code]

7.4M sparse-weight 101.3 matrix; 24 hours on i7/4930K -t6

The ready-for-post-processing queue is now empty - it looks as if a few more mega-Q were pushed last night to get weights down. Excellent.

Mini-Geek 2014-02-09 13:28

[QUOTE=fivemack;366466]The problem Mini-Geek is having is that GC_6_302 hasn't really been adequately sieved[/QUOTE]

Based on this and some PM discussion, I'm unreserving GC_6_302, at least for the time being. Thanks for the help.

xilman 2014-02-09 15:50

[QUOTE=swellman;366399]You weren't joking! It's like drinking from a fire hose.:shock:

Good stuff though. I'll reserve some more this week.[/QUOTE]It's easier for me but even so I'm having to pay serious attention. I'm upstream of all you guys so the major impact on my queue is ensuring that enough ECM pre-testing is being done. Rob Hooft is doing all the heavy crunching (thanks Rob!) but my part is ensuring that he knows how much to do on what and in which order.

My guess is that pretty much all the GCW numbers with SNFS difficulty below 250 digits and all GNFS under 160 digits will be finished this year.

Thanks to everyone who is helping, whether ECM pre-testing, NFS sieving, NFS post-processing or running ECMNET clients.

Paul

debrouxl 2014-02-09 19:38

GC_6_302 cannot easily be moved back to the sieving stage: either I have to create a new entry in the Web interface (but the Web interface no longer shows what range has already been sieved for that number), or Greg needs to connect to the DB and manually change the status of the number...

swellman 2014-02-09 19:56

I'll post process GC_6_302 if that makes things easier.

While it seems additional sieving would help speed up LA for this number, I can throw it on an i7 and grind it out. Probably faster (net) than all the human intervention Lionel describes.

RichD 2014-02-09 20:12

[QUOTE=swellman;366521]While it seems additional sieving would help speed up LA for this number, I can throw it on an i7 and grind it out. Probably faster (net) than all the human intervention Lionel describes.[/QUOTE]

How about having a few people perform additional sieving and feed Mini-Geek if he still has the big .dat file.

I can throw a couple cores beginning tomorrow if someone can define the parameters.

debrouxl 2014-02-09 20:21

GC_6_302.poly is
[code]n: 4328702185572131219922243105028645058472249163140012398932525775802806127639978382319196735323771365180776850901067568757442949739304109446756885190267242484357
m: 808281277464764060643139600456536293376
deg: 6
c6: 10872
c0: 1
skew: 0.212
type: snfs
rlim: 90000000
alim: 90000000
lpbr: 30
lpba: 30
mfbr: 60
mfba: 60
rlambda: 2.7
alambda: 2.7[/code]It's unlikely that the range sieved by NFS@Home reached 200M, so I'm pretty sure you can sieve from q=200M - but at such high values, the yield will be lower...
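As an aside, a posted .poly can be sanity-checked in a few lines: a valid SNFS sextic must satisfy c6·m^6 + c0 ≡ 0 (mod n), and log10(c6·m^6) gives the usual difficulty figure. A minimal Python sketch using the values above (the 6^50 identity is my own observation, not stated in the post):

```python
from math import log10

# Sanity-checking the GC_6_302.poly values quoted above (my own sketch;
# the difficulty convention log10(c6 * m^6) is the standard one for SNFS).
n = 4328702185572131219922243105028645058472249163140012398932525775802806127639978382319196735323771365180776850901067568757442949739304109446756885190267242484357
m = 808281277464764060643139600456536293376
c6, c0 = 10872, 1

assert m == 6**50                 # the rational root: 302*6^302 + 1 = 10872*(6^50)^6 + 1
assert (c6 * m**6 + c0) % n == 0  # n divides the value the sextic represents

difficulty = log10(c6) + 6 * log10(m)
print(f"SNFS difficulty ~ {difficulty:.1f}")  # ~237.5
```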

BTW, Sean: I haven't queued C176_118_93 to NFS@Home because the leading coefficient produced by snfspoly is large:
[code]c6: 1643032
c0: 74805201[/code]and as a consequence (as occurred for other OddPerfect jobs, for instance), the yield is poor: slightly less than 1 relation per q value from q0=45M to q1=45M+2500, using 31-bit LPs.
If the leading coefficient had been pretty small, a 5th-degree polynomial might have done the job... but the leading coefficient produced by snfspoly for degree 5 is even higher than the one for degree 6, so I didn't even bother running sieving tests.

EDIT: hmm, found your e-mail with a different poly again, this time. I'll try it :smile:

frmky 2014-02-09 20:25

[QUOTE=debrouxl;366519]GC_6_302 cannot easily be moved back to sieving stage: [/QUOTE]
Actually it is easier now. I've added a bit more sieving. :smile:

swellman 2014-02-09 22:20

[QUOTE=debrouxl]
BTW, Sean: I haven't queued C176_118_93 to NFS@Home because the leading coefficient produced by snfspoly is large:
[code]c6: 1643032
c0: 74805201[/code]and as a consequence (as occurred for other OddPerfect jobs, for instance), the yield is poor: slightly less than 1 relation per q value from q0=45M to q1=45M+2500, using 31-bit LPs.
If the leading coefficient had been pretty small, a 5th-degree polynomial might have done the job... but the leading coefficient produced by snfspoly for degree 5 is even higher than the one for degree 6, so I didn't even bother running sieving tests.

EDIT: hmm, found your e-mail with a different poly again, this time. I'll try it :smile:[/QUOTE]

The latest version of Yafu has a nice poly finder for several of the more popular composite forms, including xyyx. It evaluates a lot of polys, selects the top three and runs some test sieving. I could sometimes beat it using hand methods during beta testing, but now it seems pretty solid. Yafu produced the poly given in my message. Hope it gives a better yield than what snfspoly produced.

<break>

Looks like GC_6_302 is still sieving. It's Mini-Geek's if he wants it.

swellman 2014-02-09 22:38

Another data point in the sieving vs. post-processing discussion: GC_11_225 filtered and entered LA with no issues, but it will take 281 hours of runtime on an i7. That seems like a lot for a 30-bit job. (Projected completion date is Feb 18.)

By comparison, 6101_59_minus1 only took about 38 hours in LA on an i7.

FWIW.

fivemack 2014-02-10 15:15

Taking GW_7_278 (eta Monday morning)

fivemack 2014-02-11 09:19

GW_3_493 done
 
[code]
Tue Feb 11 03:30:34 2014 prp63 factor: 474219412211946126250636132859379175735120519441930235382721823
Tue Feb 11 03:30:34 2014 prp74 factor: 54512644954810266463936050260253071511281486423503168096101663043974538979
Tue Feb 11 03:30:34 2014 prp81 factor: 110346700865783842783226509439515387375682318954773407730638286318892359003874047
[/code]

10.4M matrix (target_density 96, because not enough relations to do 112), 102 hours on four threads of i7/2600

RichD 2014-02-12 05:12

GC_10_235 splits as:

[CODE]prp83 factor: 17018236043176407335979492703787844281561871171516165490019071880180094131426675591
prp121 factor: 4054397293024538926351407354713352834805217821458755366711970750544753997448888866960379879416941259547505048002397428701[/CODE]

swellman 2014-02-12 23:42

I'll take GC_11_226.

Mini-Geek 2014-02-13 04:04

[QUOTE=Mini-Geek;366496]Based on this and some PM discussion, I'm unreserving GC_6_302, at least for the time being. Thanks for the help.[/QUOTE]

I'm re-reserving GC_6_302. LA has begun, ETA 203h (down from 320h) :smile:

fivemack 2014-02-14 08:18

Started linear algebra on 3270_687, ETA Wednesday morning

fivemack 2014-02-14 19:02

I'll take 6029_59_minus1 (Sunday evening) and GW_11_226 (Monday morning)

fivemack 2014-02-16 22:39

I'm guessing that C160_3408_1385 post-processing is spoken for by whoever queued it up. Taking GC_5_338 (eta Friday morning)

fivemack 2014-02-16 23:18

6029_59_minus1 done
 
[code]
Sun Feb 16 23:01:47 2014 prp83 factor: 67141807044283229077358236748035008411636484309852244321504451498044521818879229943
Sun Feb 16 23:01:47 2014 prp137 factor: 26748872385867688946765727296484882882801222298831805114374185447343224886085015932367652019388080180191994877838628988551176227876625017
[/code]

swellman 2014-02-17 00:25

[QUOTE=fivemack;367136]I'm guessing that C160_3408_1385 post-processing is spoken for by whoever queued it up. [/QUOTE]

Believe that's Lionel (or Greg) tidying up. As a "read only" user, I know I don't have the ability to queue up a number. Dunno, maybe others can.

Taking C_2_784 next.

(ETAs for my two current factorizations are Monday night and Tuesday afternoon.)

fivemack 2014-02-17 08:11

GW_11_226 done
 
[code]
Mon Feb 17 02:47:04 2014 prp89 factor: 92476653255063785062306678908901259050475298804173872522675855697123488869893253371162057
Mon Feb 17 02:47:04 2014 prp96 factor: 351344399757049146838444544160277988016411556943829896389334726632886332857805999008920695017839
[/code]

8.8M matrix, 50 hours on i7/4770

fivemack 2014-02-17 10:15

GW_7_278 done
 
[code]
Mon Feb 17 09:43:28 2014 prp85 factor: 1598535687254573960761730497664270492821179174923075610363354771607667987767882523633
Mon Feb 17 09:43:28 2014 prp104 factor: 13552244648559094262907882442641502157484163380797839754302854665381739912877976277997246548328256622757
[/code]

11.7M matrix, 142 hours on i7/2600 (log at [url]http://pastebin.com/hLgcVy7Z[/url])

fivemack 2014-02-17 10:18

Taking GW_3_494; ETA Thursday morning

swellman 2014-02-17 22:58

GC_11_225 splits as

[code]
prp57 factor: 117035737516206000301422658636143086870641529583702222247
prp146 factor: 13224271120294906777055213566759272332532501009255338788377364771003595699735034613666924913891744508392976903138521770966663385051064509021025077
[/code]

BudgieJane 2014-02-18 10:21

[QUOTE=swellman;367209]GC_11_225 splits as

[code]
prp57 factor: 117035737516206000301422658636143086870641529583702222247
prp146 factor: 13224271120294906777055213566759272332532501009255338788377364771003595699735034613666924913891744508392976903138521770966663385051064509021025077
[/code][/QUOTE]

Aren't they the factors of GW_11_225?

pinhodecarlos 2014-02-18 11:31

[QUOTE=BudgieJane;367235]Aren't they the factors of GW_11_225?[/QUOTE]

According to [url]http://escatter11.fullerton.edu/nfs/crunching.php[/url], it is correct.

swellman 2014-02-18 11:46

[QUOTE=BudgieJane;367235]Aren't they the factors of GW_11_225?[/QUOTE]

No. What makes you think so?

RichD 2014-02-18 18:24

I'll take C160_3408_1385 next.

swellman 2014-02-18 20:26

[QUOTE=swellman;366800]I'll take GC_11_226.[/QUOTE]

[code]
prp67 factor: 1553921527600287191718979686051836235379378213250456395166935257817
prp99 factor: 641954075373038761835304204196178626395167333003090077344083180514397760392495700368829521814229341
[/code]

BudgieJane 2014-02-18 21:26

[QUOTE=swellman;367239]No. What makes you think so?[/QUOTE]

Something strange happened in my application this morning. When I entered GC_11_225, it returned 225*11^225-1. I don't understand why. It's working correctly now. I'm sorry for not checking better before writing.

fivemack 2014-02-18 22:13

3270_687 done
 
[code]
Tue Feb 18 21:44:08 2014 prp70 factor: 4514010676621572312243238194224226413610657161090911759766610581587907
Tue Feb 18 21:44:08 2014 prp108 factor: 696745277774474878010296222298269346995956048994202312288360817744618546275970282901710521370801719192054041
[/code]

108 hours on i7/4930K -t6 for a 14.6M matrix

swellman 2014-02-19 00:35

[QUOTE=BudgieJane;367262]Something strange happened in my application this morning. When I entered
GC_11_225 it returned me 225*11^225-1. I don't understand why. It's working correctly now. I'm sorry for not checking better before writing.[/QUOTE]

No worries - double checking is always welcome. I was afraid that I'd missed something.

Come, join the post processing fun.:smile:

BudgieJane 2014-02-19 09:38

[QUOTE=swellman;367276]
Come, join the post processing fun.:smile:[/QUOTE]

I would, but I'm too busy with generalised Cunninghams at the moment.

debrouxl 2014-02-19 17:54

Sean: I tried your yafu-generated poly for the XYYXF C176_118_93 number:
[code]skew: 49.22
c0: 14210583768
c6: 1
Y0: -2342388736625917052139104541473924426001
Y1: 11973747886018297523742405394432[/code]
but it doesn't sieve much better than the polys generated by snfspoly: ~1 relation per q value at q0=45M (rlim /2 = alim / 2) using 31-bit LPs.

That number is proving surprisingly hard for an SNFS difficulty of 23x (depending on the polynomial). Even though 14e could probably do it with lengthy sieving, 15e would probably be more efficient.
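To put the ~1 relation per special-q in perspective, a quick bit of arithmetic shows why the sieving would be so lengthy. The raw-relation target below is my own rough rule of thumb for 31-bit large primes, not a figure quoted in this thread:

```python
# Back-of-envelope sieving estimate for C176_118_93 at ~1 relation per q.
# The 400M raw-relation target is an assumed rule of thumb for 31-bit LPs,
# not a number stated in the thread.
relations_needed = 400e6
yield_per_q = 1.0      # ~1 relation per q value, as test-sieved
q0 = 45e6              # starting point of the test-sieved range

q1 = q0 + relations_needed / yield_per_q
print(f"sieve q from {q0/1e6:.0f}M to ~{q1/1e6:.0f}M")  # 45M to ~445M
```

And since yield falls off as q grows, the real range would be wider still, which is why 15e (with its larger sieve region per q) looks more efficient than 14e here.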

frmky 2014-02-19 20:00

[QUOTE=debrouxl;367314]Sean: I tried your yafu-generated poly for the XYYXF C176_118_93 number:
[/QUOTE]
Try these:

[CODE]skew: 0.417
c0: 8649
c6: 1643032
Y0: -2342388736625917052139104541473924426001
Y1: 1412902250550159107801603836542976

skew: 0.909
c0: 8649
c5: 13924
Y0: -175222860263437786894593195184969752945814431201
Y1: 2321443610525929019209484754762878943232[/CODE]
Not sure if either will be better...

debrouxl 2014-02-19 21:09

The sextic of difficulty 242 has basically the same ~1 relation per q value yield at q0=45M, but it's slower than the sextic posted by Sean.
The quintic of difficulty 240 has pretty much the same yield as the others in the same range; it's faster than your sextic but slightly slower than the sextic posted by Sean.
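For reference, those difficulty figures follow directly from the coefficients frmky posted; a quick sketch (the formula log10(|c_d|·|Y0|^d) is the usual SNFS convention, assumed rather than stated in the thread):

```python
from math import log10

def snfs_difficulty(lead_coeff, y0, degree):
    """Standard SNFS difficulty: log10(|c_d| * |Y0|^degree)."""
    return log10(abs(lead_coeff)) + degree * log10(abs(y0))

# Coefficients from the two polynomials frmky posted above
sextic = snfs_difficulty(1643032, -2342388736625917052139104541473924426001, 6)
quintic = snfs_difficulty(13924, -175222860263437786894593195184969752945814431201, 5)

print(f"sextic ~ {sextic:.0f}, quintic ~ {quintic:.0f}")  # sextic ~ 242, quintic ~ 240
```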

swellman 2014-02-20 00:29

[QUOTE=debrouxl;367314]Sean: I tried your yafu-generated poly for the XYYXF C176_118_93 number

[I]snip[/I]

but it doesn't sieve much better than the polys generated by snfspoly: ~1 relation per q value at q0=45M (rlim /2 = alim / 2) using 31-bit LPs.

That number proves surprisingly hard for a SNFS difficulty 23x (depending on the polynomial) number. Even though 14e could probably do it with a lengthy sieving, 15e would probably be more efficient.[/QUOTE]

Yes, my test sieving used 15e. Also, I ran it on the algebraic side FWIW.

Really appreciate you guys taking the time to even look at this composite. I've likely got tunnel vision on it, as I have spent a lot of effort trying to factor it.

RichD 2014-02-20 02:40

C160_3408_1385 splits as:

[CODE]prp68 factor: 17880278619984695184050502790206873556361981045516886688928661227471
prp93 factor: 228502311121984348922806887427415202001523194440641388076613271354581292876973003527160307423[/CODE]

fivemack 2014-02-20 09:07

GW_3_494 done
 
[code]
Wed Feb 19 22:53:55 2014 prp79 factor: 6544139735540280558492681564220189691920039926567620314258950958984293109132781
Wed Feb 19 22:53:55 2014 prp141 factor: 356041230470821300547265003988221353676049541264454212033019140637311164592494814068768166474884500439741810418568259479009834952510802545219
[/code]

Log at [url]http://pastebin.com/wcG0pwQG[/url]

7.4M matrix, 57 hours on i7/2600

fivemack 2014-02-20 09:09

Taking GC_4_391, eta Monday morning

Also, ETA for 2340_723 is Friday morning - I suppose I could have asked for slightly less sieving.

RichD 2014-02-20 22:39

Data Point
 
C160_3408_1385 built a 6.0M matrix using target_density=124 and took just over 22 hours (in BL) on a Core i5 (-t 4).

fivemack 2014-02-21 08:05

2340_723 done
 
[code]
Fri Feb 21 04:01:00 2014 prp68 factor: 33764306304594609841151385700590125608455755423072146929348187732319
Fri Feb 21 04:01:00 2014 prp101 factor: 34171287558191566663884446485221802740049394413628828438507905979286574134540823657330555119715424329
[/code]

19.5 hours on i7/4770 for a 5.9M matrix

fivemack 2014-02-21 08:08

GC_5_338 done
 
[code]
Fri Feb 21 06:55:18 2014 prp56 factor: 34470240407647113273727147928606646498352887484685412491
Fri Feb 21 06:55:18 2014 prp118 factor: 2146601110002558873853257238895849150974507931390059443029499513822227376252498348631767281157367204096404394646303011
[/code]

93 hours on i5/750 -t4 for 8.7M matrix

swellman 2014-02-21 11:38

C_2_784 splits

[code]
prp95 factor: 62989812257204905134330822366504114809484465259945825102840522386263274321822248099151904049177
prp110 factor: 22394703467804797846381410780003403139074909488534959121267242416187245444804045804712301688742956615022179337
[/code]

