1 Attachment(s)
[QUOTE=swellman;426258]Reserving C174_135_52.[/QUOTE]
[code]prp76 factor: 9522934190484078560627227170020897016348276436122156113726254873334429818369
prp98 factor: 77093168327668050465441236212732541916869933675177640726485696867474680362362047477567204019987107[/code] |
C167_3408_1608
Reserving C167_3408_1608 for post processing once it has completed sieving.
|
1 Attachment(s)
C157_933436_12482 completed - 8.7 hours for 4.5M matrix (TD=140).
Log attached.
[CODE]prp58 factor: 2439345585408025978404552561568450595980354627236117171809
prp100 factor: 1695313030822313666898570147433079020814677625796631913519440944463895959728028918973569569615516347[/CODE] |
1 Attachment(s)
[QUOTE=swellman;425470]I'll take 12479_61_minus1. Thanks.[/QUOTE]
[code]prp78 factor: 234581736627631575569286329618828840291560317650597565817944915556589121865799
prp169 factor: 2514940352456535078253297775049475868478850647972423096287631935494731241582412199628332251736389362447793363173510304237608323606337041340927963982237859317419417668599[/code] |
C164_P207_plus_1
1 Attachment(s)
[CODE]p62 factor: 44759574242121922804439718825827331957323307658500917675356489
p102 factor: 403191373767604997806014519426833119591412612947520596022458100343630460153930235270884974763433011337[/CODE] |
1 Attachment(s)
[QUOTE=swellman;426510]Reserving C217_134_64 for post processing.
[/QUOTE]
[code]prp68 factor: 17264579094044243022303536057641054098755549539204142553596274819809
prp150 factor: 216944286532672044297155636572660704424950149794745929331964015358353650959225869704894559221222907803785640353525719915289664087352434519744474367553[/code] |
I'll take P42_7_minus_1 next.
|
C166_P172_plus_1 factored
1 Attachment(s)
[B]C166_P172_plus_1[/B]
[code]prp66 factor: 243043034115974300237504687368898851346557494443467933047871873451
prp101 factor: 18265889924425951009864183423501788076175814592837993999479201649757616664460645536409492731516940381[/code]
Bit of trouble with filtering, but with an older msieve version I was able to get a 9.05M matrix. |
Delay on Euclid-Mullin C198
The current matrix (OK, the matrix from yesterday 9am GMT, since the download takes most of a working day) with 948.17Mrel, 623.19Muniq, 565.47M unique-ideals, is enormous (50.74M), to the point that it doesn't leave me with enough memory on a 32GB workstation to keep using it as a workstation, and it's not going to run sensibly over my vacation. So I've added a bit more sieving and will repeat the filtering and start the linear algebra after Easter. The high duplication rate suggests that, at 198-digit GNFS, we are beyond the edge of sensible 15e jobs - I suppose that was known already.
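The duplication figure quoted above can be sanity-checked with a one-liner (a sketch using the relation counts from this post; what counts as an alarmingly high rate is a rule of thumb, not a documented msieve threshold):

```python
def duplicate_rate(raw_relations: float, unique_relations: float) -> float:
    """Fraction of sieved relations that turned out to be duplicates."""
    return 1.0 - unique_relations / raw_relations

# Figures from the post: 948.17M relations sieved, 623.19M unique.
rate = duplicate_rate(948.17e6, 623.19e6)
print(f"duplicate rate: {rate:.1%}")  # roughly 34.3%
```

A duplicate rate above about a third, as here, is the symptom being described: at 198-digit GNFS the special-q range has been pushed so far that the siever keeps rediscovering the same relations.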
|
C168_P169_plus_1
1 Attachment(s)
[CODE]p84 factor: 200401097410250158352459496850688120594688821053111508459200768844303331264279632291
p84 factor: 547902687439159563737662708450586973656483735247111554013480453710431385480349027371[/CODE] |
Ooooh, nice split.
|
C170_P226_plus_1
Reserving C170_P226_plus_1 for post processing.
|
Taking C202_149_37 and C224_147_40 for postprocessing.
|
C193_146_35
1 Attachment(s)
[CODE]p73 factor: 7327548063132608434181872870518544222864755386045933779282355250804289829
p121 factor: 1050363951982746315741179165768532452671157900974401213827016103594515595275523115907310107047395921745149383517252315159[/CODE] |
C167_3408_1608
1 Attachment(s)
[code]
prp64 factor: 2678481158977753273679547556455011874186982297657962086452195309
prp103 factor: 7197782349139844391411466860696285050352067890796094886329086274040158759635570371935098370649864248327[/code] |
Reserving C189_147_41 for post processing.
|
P42_7_minus_1 factored
1 Attachment(s)
155.5 hours to solve a 14.2M matrix on Core-i5, -t 4, using target_density=106 (TD=112 failed).
[CODE]p92 factor: 18451859743529547149760485001446389586823399007727358015708942546179588615570905282536296807
p105 factor: 521154354861522302288734283430026567621266597956012925964101100004473158171795450383412594744159315810843[/CODE] |
1 Attachment(s)
C202_149_37 completed - 36 hours for 8.44M matrix with TD=130
[CODE]prp86 factor: 48894746053369317332712195445798023853027194181762459053882580933629603176550242014637
prp116 factor: 20615124867659438785212002823018517282237833650001261358550785666328397826471420433224594995968166369229350931784299[/CODE] |
Reserving 5519_59_minus1 and 88887_225 for post processing.
|
1 Attachment(s)
C224_147_40 completed - 43 hours for 9.3M matrix with TD=120
[CODE]prp68 factor: 27954497509537991568719623616188782322183531646871409940041454838197
prp157 factor: 1011215342562907140939910037872844840093534504086727780767567225294419296760143952117953195096382053233130625802104214926352963481682034191114336976528127167[/CODE] |
C233_128_67
1 Attachment(s)
[CODE]p59 factor: 42011736291751946863027162166638524339005202697933984895459
p75 factor: 153176713332615391300842647329243748455432989078571558857128949562511608889
p100 factor: 9435585596449866249134587233162063184366706239238395466335800793355228695561269943640759898225745867[/CODE] |
Reserving 31333_223 for post-processing.
|
Reserving C162_145_107 and 11131_225 for postprocessing.
|
Reserving C225_144_86 for postprocessing.
|
Hi
As I don't have enough memory to store a large matrix and have just one computer for the linear algebra stage, could I store the matrix on the hard disk, load parts of it into memory to solve, and then merge the results? |
[QUOTE=prss;429215]Hi
As I don't have enough memory to store a large matrix and have just one computer for the linear algebra stage, could I store the matrix on the hard disk, load parts of it into memory to solve, and then merge the results?[/QUOTE] How much memory do you have? Most modern machines should have enough memory to handle the postprocessing of a 30-bit job. |
I have a Core-i5 notebook with 6GB of memory, and I want to factor a 220-digit semiprime.
|
This thread is for volunteering and reserving the postprocessing work (filtering, linear algebra, sqrt, GCD, all using msieve) for various factoring efforts listed [url=http://escatter11.fullerton.edu/nfs/crunching.php]here[/url] and [url=http://escatter11.fullerton.edu/nfs/crunching_e.php]here[/url]. If you are attempting to factor a specific 220-digit semiprime of your choosing, this is not the appropriate thread. If you have a means of factoring it with SNFS, start or add to a thread in the factoring subforum. Splitting so large a number with GNFS is beyond all but the most powerful clusters.
If I've misunderstood your intentions, please let me know. |
If your semiprime does not have a special form you will basically need a cluster to do the linear algebra in less than a year. You can't solve that problem in pieces, all the data needs to be in high-speed memory all the time.
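To put numbers on "all the data in high-speed memory", here is a rough estimate of just the sparse part of a block Lanczos matrix (a sketch assuming 4 bytes per nonzero entry; real msieve memory use also includes dense rows, vectors, and working buffers):

```python
def lanczos_matrix_mb(dimension: float, avg_col_weight: float,
                      bytes_per_nonzero: int = 4) -> float:
    """Rough size in MB of the sparse matrix alone."""
    return dimension * avg_col_weight * bytes_per_nonzero / 2**20

# A number in the 220-230 digit GNFS range gives a matrix with tens of
# millions of rows. For example, the C231 discussed later in this thread
# built a 12.44M-dimension matrix at ~98 nonzeros per column:
print(f"{lanczos_matrix_mb(12.44e6, 98):.0f} MB")
```

Even this lower bound runs to several gigabytes, which is why a 6GB notebook cannot hold such a matrix, and why paging pieces in from disk does not work: every Lanczos iteration touches the entire matrix.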
|
C170_P226_plus_1
1 Attachment(s)
[code]
prp82 factor: 5303258990189889511430437164253892679796133232738990641774462289151589284087109171
prp88 factor: 3567385621612041633609685002897196562668614097641201395138179222011019878627683367957233[/code] |
1 Attachment(s)
C162_145_107 completed - 21 hours for 6.66M matrix with TD=140
[CODE]prp79 factor: 2003036979769084365915195150631131038175961331435920440414819893898131595160057
prp84 factor: 158397560600399204930433968281157818394873287876550371319082524405850881378911710181[/CODE] |
1 Attachment(s)
C162_933436_12487 completed - 18.5 hours for 6.1M matrix with TD=140
[CODE]prp65 factor: 22131034012344789387802206414211495789164983960870252586906452363
prp97 factor: 5486043294775208670334787765354449737640262732391650937290280884652611369291810503676276109456149[/CODE] |
Taking C186_2426789_31_plus_3162104763 and 32333_224 for LA.
|
I'll take 11101_228 next.
|
I'll also take 55551_228.
|
William: C152_P171_plus_1 factored as
[code]p67 factor: 2668400554531314975973645152119742976041832327640414634947565752407
p86 factor: 25257255185319797278196990545359279076105045141113879192452842595315390222372208800581[/code] on the first dependency. |
Andrey: C213_150_38 factored as
[code]p86 factor: 24447799794407141096225004772255947844310075180022560417925004506390411275155004526781
p127 factor: 8777202281924228246945443581759161651632699617567248732946108372082416472930847595429198871720649917289203614408285875093862881[/code] on the second dependency. |
William: C152_P155_plus_1 factored as
[code]p51 factor: 930753595950231486904333869636526588952001591191273
p101 factor: 78269095416135400601955386083962682450419851919346667294382555360818726170618969852208256247496736377[/code] on the second dependency. |
No need to add more MQ to C186_2426789_31_plus_3162104763, I've already got the factors.
|
And C152_P198_plus_1 factored as
[code]p64 factor: 4708171329338727608043332481177377813983792533771534179351950177
p88 factor: 4045711185586638365358613074262647606588145382985298731030974319242815167969209281575913[/code] on the second dependency. |
11101_228 factored
1 Attachment(s)
15 hours to solve a 5.15M matrix using Core-i5, -t 4 with target_density=120.
[CODE]p73 factor: 3138731866858964711158809589181877903016066883057647980304758622657579393
p155 factor: 35400000963543204269504058843898777589030165843769860381894774822156997938472890990363669674509213300193440729142719451637572271805547069622875875265424957[/CODE] |
727_91_minus1
1 Attachment(s)
[CODE]p85 factor: 1263433496001781325152093771580888256916047194442409597504125421792685232817632278927
p116 factor: 11639821451398554154115124228438116399698651056291132298769786177235714109159748535494864293004013309028747436361629[/CODE] |
55551_228 factored
1 Attachment(s)
20 hours to solve 5.0M matrix on (not completely idle) Core-i5, -t 4 using target_density=120.
[CODE]p75 factor: 360547932138944530255229596581938545582982530921247150790017849244519636943
p154 factor: 1540864628621586001898994342634739411239415715115404773592795070915450181433523242067260607019927615178468699069633836071372476654229599569149507101788657[/CODE] |
Reserving 149^47+47^149.
|
I'll take C193_117_115 next.
|
31333_223 results
1 Attachment(s)
[code]prp112 factor: 1052772858839127625389208993845564243546872425259431579107345255419890439450300685383709542020768000672970081887
prp114 factor: 297626720429362104780882120427617928824514443402293951099056617318006222994815246646967306819704484029048689146059[/code] |
I'll take [B]95557_229[/B] if it is still available.
|
C185_150_37
1 Attachment(s)
[CODE]p65 factor: 23332130014932256793571407658856046535203722881628000939044865421
p120 factor: 987892448345606557659674382414709599869967572832240664099336195738303534547459043256431285626201623996540613361740411357[/CODE] |
3 Attachment(s)
C186_2426789_31_plus_3162104763: 12.5 hours for 5.3M matrix with TD=130
[CODE]prp80 factor: 46415800359340225096306514439553911478766555975350829288593540858075745503310807
prp107 factor: 10848037184865810316925444593750866307822014016255420184709891919685774571700253185341308000013779942875547[/CODE]
11131_225: 16 hours for 6.1M matrix with TD=120
[CODE]prp88 factor: 7751829167692735534669866359608637543218345959863314579069916310606337688409399860571149
prp137 factor: 14333534538427188535967998137251487252272175991367345730789033589253832856686462421009277199840158878026971897018764391318820453653795719[/CODE]
32333_224: 15.5 hours for 5.9M matrix with TD=120
[CODE]prp77 factor: 10069991644435937822293535207006309563047632198936619298898704039867221060643
prp150 factor: 321085999621447133932638325823112010558832424201880979833470001293252484545206698018538204150428238598779688427767796342643888307244858702838935073831[/CODE]
All logs attached. |
Reserving 10007_246.
|
5519_59_minus1
1 Attachment(s)
[code]
prp57 factor: 317757342105402636122752877673957877794921565773826038131
prp120 factor: 234828927998042970741825676045586682755190500698945559902463774734846020790037600309063710090616261529247698629606465771[/code] |
I'll take 6_316_plus_5_316 and 6_317_plus_5_317 next.
|
C172_145_36
1 Attachment(s)
[CODE]p52 factor: 2902501069552897182242290339255789868845050645696613
p120 factor: 360484708137501878291181858628785113103872978612727836014583485393456934775062160129704855557106500673938242538444858381[/CODE] |
88887_225
1 Attachment(s)
[code]
prp89 factor: 12885184573296242058422944793597586085380960323748957713813136757397827282147984550636983
prp137 factor: 68985343891080676769874038289825978296281213065649419534605018948253721115285817355019069520500716195272122538725728251489807843503317889[/code] |
Reserving GC_12_230 for postprocessing. Thanks.
|
1 Attachment(s)
The C222 from 4283^61-1 is the product of 3 factors:
[code]Mon Mar 28 09:37:34 2016 p56 factor: 35286932462952672872125482693628881102271748845056604051
Mon Mar 28 09:37:34 2016 p71 factor: 74023776056241628262401761743998141577961826782996197047930820899567529
Mon Mar 28 09:37:34 2016 p87 factor: 284900761456595292540926043911643786126274886002455314584029997761238237728406289112089[/code] |
I'll take 4_409_minus_3_409 next but it won't start until late Thursday when C193_117_115 is expected to complete.
My other two LA will complete on Friday & Sunday. |
Taking a quick look at the 14e status page right now, there are 28 numbers queued or in progress. The amount of activity here amazes me, and you are all doing a great job keeping up with it. Thanks!
|
I'll do the linear algebra on C181_3366_2124 (12.7M matrix, ETA Monday morning)
|
It's going to be a little bit, but I'm going to go ahead and reserve the Home Primes 2 (4496) number on the 15e server. Thanks again to Fivemack for queuing it up and NFS@Home for running it. :smile:
|
22229_230
1 Attachment(s)
[CODE]p56 factor: 15512216907486634503749780135668659782668538466196883479
p57 factor: 365560193147104184544040796710514557252803636385099289561
p118 factor: 3918814616735684356338065282581713988933627929089672170899343017427517780106932882673731872515365336477060956898509291[/CODE] |
Reserving C210_126_122 and C213_140_41 for postprocessing.
|
C189_147_41
1 Attachment(s)
[code]
prp66 factor: 194945911018226268909006110181294210010213379221080485073447499139
prp124 factor: 1053889193632012112140446114924794354332602048150477737837170863462825710440704951497762600562891494704662817193475571458233[/code] |
C158_146_114
1 Attachment(s)
[CODE]p72 factor: 958348290789129006119641796196274839876690370543534752321494909456424293
p86 factor: 11930871709743989701062943061725901823290778991951114248102625917350163242220775566389[/CODE] |
6_317_plus_5_317 factored
1 Attachment(s)
82 hours to solve a 10.8M matrix on a (mostly idle) Core-i5, -t 4, using target_density=116. (TD=120 failed)
[CODE]p71 factor: 34810309444160640494995611580944418069999061493120106349856522609225011
p102 factor: 143672063339286487963285646414380927995866976636314486989458158312545706504789967907323773786033042381[/CODE] |
We are having some trouble with C231_117_97.
With ~110M relations it could not build a matrix. We sent it back for an additional 10M relations. Now, with TD=100, it finally works, but the run time is stupid long for a 30-bit job. (They normally are under 200 hours for us.) [CODE]Found 85463350 unique, 33151136 duplicate, and 37 bad relations. Largest dimension used: 341 of 100000 Average dimension used: 260.8 of 100000[/CODE][CODE]Msieve v. 1.53 (SVN 991) Thu Mar 31 17:06:54 2016 random seeds: be975b7c bd173379 factoring 615935275494760690241148635622719657233885549986453551434780202582054649693168405371419014960793961264584034357208278741435936345448933965890974621943655323568976128254098687953770479444789238477246928976443897963480846069056783309 (231 digits) no P-1/P+1/ECM available, skipping commencing number field sieve (231-digit input) R0: -1233030410813767585190839206937281 R1: 56061272466674974994842655144706937633 A0: 912673 A1: 0 A2: 0 A3: 0 A4: 0 A5: 0 A6: 117 skew 1.00, size 3.127e-12, alpha 0.634, combined = 2.694e-13 rroots = 0 commencing relation filtering setting target matrix density to 100.0 estimated available RAM is 16010.6 MB commencing duplicate removal, pass 1 read 10M relations read 20M relations read 30M relations read 40M relations read 50M relations read 60M relations read 70M relations read 80M relations skipped 3 relations with b > 2^32 found 3316451 hash collisions in 86681790 relations commencing duplicate removal, pass 2 found 0 duplicates and 86681790 unique relations memory use: 253.2 MB reading ideals above 57475072 commencing singleton removal, initial pass memory use: 2756.0 MB reading all ideals from disk memory use: 1810.3 MB commencing in-memory singleton removal begin with 86681790 relations and 82441568 unique ideals reduce to 50450715 relations and 41825598 ideals in 15 passes max relations containing the same ideal: 34 reading ideals above 720000 commencing singleton removal, initial pass memory use: 1378.0 MB reading all ideals from disk memory use: 1965.4 MB 
keeping 48392850 ideals with weight <= 200, target excess is 274717 commencing in-memory singleton removal begin with 50450715 relations and 48392850 unique ideals reduce to 50450485 relations and 48392620 ideals in 5 passes max relations containing the same ideal: 200 removing 3109912 relations and 2709912 ideals in 400000 cliques commencing in-memory singleton removal begin with 47340573 relations and 48392620 unique ideals reduce to 47192008 relations and 45532593 ideals in 9 passes max relations containing the same ideal: 198 removing 2333913 relations and 1933913 ideals in 400000 cliques commencing in-memory singleton removal begin with 44858095 relations and 45532593 unique ideals reduce to 44765050 relations and 43504769 ideals in 8 passes max relations containing the same ideal: 192 removing 2092419 relations and 1692419 ideals in 400000 cliques commencing in-memory singleton removal begin with 42672631 relations and 43504769 unique ideals reduce to 42592173 relations and 41731155 ideals in 7 passes max relations containing the same ideal: 188 removing 1967038 relations and 1567038 ideals in 400000 cliques commencing in-memory singleton removal begin with 40625135 relations and 41731155 unique ideals reduce to 40550644 relations and 40088953 ideals in 7 passes max relations containing the same ideal: 182 removing 798573 relations and 655554 ideals in 143019 cliques commencing in-memory singleton removal begin with 39752071 relations and 40088953 unique ideals reduce to 39740273 relations and 39421573 ideals in 6 passes max relations containing the same ideal: 180 relations with 0 large ideals: 9697 relations with 1 large ideals: 2909 relations with 2 large ideals: 16568 relations with 3 large ideals: 166339 relations with 4 large ideals: 951610 relations with 5 large ideals: 3316834 relations with 6 large ideals: 7331625 relations with 7+ large ideals: 27944691 commencing 2-way merge reduce to 27173537 relation sets and 26854837 unique ideals commencing 
full merge memory use: 3209.8 MB found 12464797 cycles, need 12443037 weight of 12443037 cycles is about 1244824832 (100.04/cycle) distribution of cycle lengths: 1 relations: 707090 2 relations: 1057933 3 relations: 1225679 4 relations: 1198013 5 relations: 1143211 6 relations: 1031982 7 relations: 928742 8 relations: 813969 9 relations: 710506 10+ relations: 3625912 heaviest cycle: 28 relations commencing cycle optimization start with 92992197 relations pruned 3460224 relations memory use: 2622.7 MB distribution of cycle lengths: 1 relations: 707090 2 relations: 1087361 3 relations: 1283348 4 relations: 1244783 5 relations: 1190145 6 relations: 1063308 7 relations: 952184 8 relations: 824185 9 relations: 716692 10+ relations: 3373941 heaviest cycle: 28 relations RelProcTime: 4598 commencing linear algebra read 12443037 cycles cycles contain 39550056 unique relations read 39550056 relations using 20 quadratic characters above 4294917295 building initial matrix memory use: 5142.7 MB read 12443037 cycles matrix is 12442860 x 12443037 (5031.4 MB) with weight 1458808159 (117.24/col) sparse part has weight 1182081791 (95.00/col) filtering completed in 2 passes matrix is 12442367 x 12442544 (5031.4 MB) with weight 1458793169 (117.24/col) sparse part has weight 1182078296 (95.00/col) matrix starts at (0, 0) matrix is 12442367 x 12442544 (5031.4 MB) with weight 1458793169 (117.24/col) sparse part has weight 1182078296 (95.00/col) saving the first 48 matrix rows for later matrix includes 64 packed rows matrix is 12442319 x 12442544 (4794.6 MB) with weight 1223961181 (98.37/col) sparse part has weight 1132443772 (91.01/col) using block size 8192 and superblock size 294912 for processor cache size 3072 kB commencing Lanczos iteration memory use: 4068.0 MB linear algebra at 0.0%, ETA 713h26m[/CODE]:help: |
C193_117_115 factored
1 Attachment(s)
168 hours to solve a 14.5M matrix on Core-i5, -t 4 using target_density=116 (TD=120 failed).
[CODE]p75 factor: 447503944204251253385550074563481476341705448852420669334114424242887810871
p119 factor: 15770101990356358948677487843718781326727726740565118277510833866339355513576590308908476278203812784200439394463506731[/CODE] |
1 Attachment(s)
C210_126_122 completed: 16 hours for 6.1M matrix with TD=130
[CODE]prp89 factor: 49146994345612745883708757038591887504370520648724266287709863398825788014109159046460041
prp121 factor: 2416212901279653241174143557946169136042325479353466201582124289843112181438315930353068011155595404385027437320849628401[/CODE] |
95557_229 results
1 Attachment(s)
[B]95557_229[/B]
[code]prp61 factor: 6607097592644589319930558848748488577787903051413743649982417
prp108 factor: 222971844346472267429345690752086532485202974645868795054840615626940963837588639662502561668437223781730323[/code] |
Taking [B]6_317_minus_5_317[/B]
|
Reserving [B]11_236_plus_10_236[/B] for post processing.
|
C162_OP_t500 appears to have been factored last week.
On a different note, I will be out of town for a couple of weeks after completing my two assignments, meaning I will not take on any more work for a while, since it would lead to delayed results reporting. |
[QUOTE=Xyzzy;430481]We are having some trouble with C231_117_97.
With ~110M relations it could not build a matrix. We sent it back for an additional 10M relations. Now, with TD=100, it finally works, but the run time is stupid long for a 30-bit job. (They normally are under 200 hours for us.) [/QUOTE] Is multithreading turned on? |
[QUOTE=jasonp;430636]Is multithreading turned on?[/QUOTE]We assume it is, because we can run on multiple cores.
We sent it back for another 10M relations. :mike: |
[QUOTE=jasonp;430636]Is multithreading turned on?[/QUOTE]
[QUOTE=Xyzzy;430643]We assume it is, because we can run on multiple cores. We sent it back for another 10M relations. :mike:[/QUOTE] Jason is right, you forgot the [c]-t[/c] option. Compare the output from [URL="http://mersenneforum.org/showpost.php?p=430481&postcount=745"]your most recent post[/URL] with this [URL="http://mersenneforum.org/showpost.php?p=425896&postcount=669"]earlier post of yours[/URL]: Recent issue:
[code]saving the first 48 matrix rows for later
matrix includes 64 packed rows
matrix is 12442319 x 12442544 (4794.6 MB) with weight 1223961181 (98.37/col)
sparse part has weight 1132443772 (91.01/col)
using block size 8192 and superblock size 294912 for processor cache size 3072 kB
commencing Lanczos iteration
memory use: 4068.0 MB
linear algebra at 0.0%, ETA 713h26m[/code]
Older non-issue:
[code]Wed Feb 10 14:06:40 2016 saving the first 48 matrix rows for later
Wed Feb 10 14:06:43 2016 matrix includes 64 packed rows
Wed Feb 10 14:06:46 2016 matrix is 20344510 x 20344735 (7917.1 MB) with weight 1985767002 (97.61/col)
Wed Feb 10 14:06:46 2016 sparse part has weight 1871966901 (92.01/col)
Wed Feb 10 14:06:47 2016 using block size 8192 and superblock size 294912 for processor cache size 3072 kB
Wed Feb 10 14:08:43 2016 commencing Lanczos iteration[B] (2 threads)[/B]
Wed Feb 10 14:08:43 2016 memory use: 6798.9 MB
Wed Feb 10 14:14:03 2016 linear algebra at 0.0%, ETA 1160h 1m[/code] |
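For reference, the fix is simply adding the thread-count flag when resuming linear algebra. A small illustrative wrapper (the [c]-nc2[/c], [c]-t[/c], and [c]-v[/c] flags are real msieve options; the helper function itself is hypothetical):

```python
def msieve_la_command(threads: int) -> list[str]:
    """Build an msieve command line that resumes linear algebra (-nc2)
    with multithreading enabled via -t."""
    return ["msieve", "-nc2", "-t", str(threads), "-v"]

print(" ".join(msieve_la_command(4)))  # msieve -nc2 -t 4 -v
```

Without [c]-t[/c], msieve runs the Lanczos iteration single-threaded, which is exactly the difference visible in the two log excerpts above.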
[QUOTE=Dubslow;430647]Jason is right, you forgot the [c]-t[/c] option.[/QUOTE]We use one core for 30-bit jobs. They are usually under 200 hours with one core.
:mike: |
6_316_plus_5_316 factored
1 Attachment(s)
163.5 hours to solve a 12.7M matrix on Core-i7, -t 4 using target_density=112. (TD=116 failed)
[CODE]prp80 factor: 20206957651913041022106918338431611081608753047535017044690679163933015220179833
prp98 factor: 10779363587393581426022753409639415729646552704321518431728759202432038225102184178297087696791833[/CODE] |
I'll run C200_150_148 on seven threads on an eight-core IVB Xeon
(31.7M matrix with target_density=125; ETA ~700 hours, so first week of May) |
C225_144_86
1 Attachment(s)
[code]
prp111 factor: 671342408532585030413375163596357949676226138287753713705683541106134186386986989011053481247060842244138502361
prp114 factor: 493212617024213634262709401296724719952303885031291960809123296301627333886125449694064551485662436628094626913033[/code] |
1 Attachment(s)
C213_140_41 completed: 18 hours for 5.3M matrix with TD=130 (msieve -t 12 on slightly busy Dual Xeon E5-2620).
[CODE]P49 = 3475165686373875903666718871788291265014575235709
P80 = 29232665309692611631824455117308296871373561392983191483135367795849212115521363
P85 = 8559252530563782928742711462092688493947080608035399704981804587293254547082488527013[/CODE] |
C181 from 3366_2124 done
1 Attachment(s)
[code]
Sun Apr 3 22:33:23 2016 p80 factor: 57923143033220305023958487436543995468449985472956082605792711734409076751404781
Sun Apr 3 22:33:23 2016 p101 factor: 26726178394349824283131168583671964894013236832821034682889462638071283465645681907849428046986082467[/code]
117.8 hours on seven cores E5-2650v2 for 12.7M matrix produced with density 130 |
1 Attachment(s)
10007_246 completed: 67 hours for 11.8M matrix with TD=120
[CODE]prp107 factor: 60165844998490249229875772005850193896086858292743579409020446596677606993520932226579989079718232269908449
prp140 factor: 16620725596475760512784333837955342706476327308083513172242469232231446977901981919077023860933002718387640889541806766973201100566876698343[/CODE] |
4_409_minus_3_409 factored
1 Attachment(s)
100+ hours (power outages) to solve an 11.7M matrix on Core-i5, -t 4 using target_density=112 (TD=116 failed).
[CODE]prp89 factor: 37289909580309862111697012978469498997530066231388984708324720746304706221428537956177353
prp107 factor: 88583088619420528636918241313342683394522041267031243728860331525329131767391281619454902547768857806726687[/CODE] |
Reserving C182_127_60.
|
149^47+47^149 Factored
1 Attachment(s)
[code]
prp87 factor: 100934986697067346453928228914157616105430494078090767451399893816540977001896071658117
prp113 factor: 10916635393565195773837132504399009064271490362651309447359337539055573882616884767912921679705368021673854985149[/code] |
Reserving 9_257_minus_8_257 from 14e.
|
Problem with GC_12_230
Having problems getting through the filtering stage with GC_12_230. I've made several attempts to build a matrix, but msieve keeps hanging at the full merge step. I've varied TD and used remdups, but it still hangs.
Are more relations required? Latest attempt (TD=90): [code] Wed Apr 06 13:16:34 2016 Msieve v. 1.52 (SVN unknown) Wed Apr 06 13:16:34 2016 random seeds: 0245285c 049cd774 Wed Apr 06 13:16:34 2016 factoring 164545400782112944741606396046145542779156965440821788928971769946682191203934597422608613820255140277282787351316937551505694842862297783757434863700678144145048497647731929374605144699770396705745172065009899408881421432609394881902623495267 (243 digits) Wed Apr 06 13:16:36 2016 no P-1/P+1/ECM available, skipping Wed Apr 06 13:16:36 2016 commencing number field sieve (243-digit input) Wed Apr 06 13:16:36 2016 R0: -102067469997853225734913580209377959215104 Wed Apr 06 13:16:36 2016 R1: 1 Wed Apr 06 13:16:36 2016 A0: 1 Wed Apr 06 13:16:36 2016 A1: 0 Wed Apr 06 13:16:36 2016 A2: 0 Wed Apr 06 13:16:36 2016 A3: 0 Wed Apr 06 13:16:36 2016 A4: 0 Wed Apr 06 13:16:36 2016 A5: 0 Wed Apr 06 13:16:36 2016 A6: 33120 Wed Apr 06 13:16:36 2016 skew 1.00, size 1.211e-012, alpha 0.367, combined = 1.271e-013 rroots = 0 Wed Apr 06 13:16:36 2016 Wed Apr 06 13:16:36 2016 commencing relation filtering Wed Apr 06 13:16:36 2016 setting target matrix density to 90.0 Wed Apr 06 13:16:36 2016 estimated available RAM is 8098.7 MB Wed Apr 06 13:16:36 2016 commencing duplicate removal, pass 1 Wed Apr 06 13:36:46 2016 error -15 reading relation 101283406 Wed Apr 06 13:36:46 2016 error -15 reading relation 101314729 Wed Apr 06 13:46:25 2016 error -15 reading relation 149886651 Wed Apr 06 13:46:25 2016 error -15 reading relation 149886698 Wed Apr 06 13:46:25 2016 error -11 reading relation 149886712 Wed Apr 06 13:46:25 2016 error -15 reading relation 149886725 Wed Apr 06 13:46:25 2016 error -15 reading relation 149886738 Wed Apr 06 13:47:42 2016 error -11 reading relation 156199277 Wed Apr 06 13:50:05 2016 skipped 1820 relations with b > 2^32 Wed Apr 06 13:50:05 2016 found 6175590 hash collisions in 167008136 relations Wed Apr 06 13:50:31 2016 commencing duplicate removal, pass 2 Wed Apr 06 
13:53:32 2016 found 0 duplicates and 167008136 unique relations
Wed Apr 06 13:53:32 2016 memory use: 506.4 MB
Wed Apr 06 13:53:32 2016 reading ideals above 119865344
Wed Apr 06 13:53:32 2016 commencing singleton removal, initial pass
Wed Apr 06 14:30:08 2016 memory use: 3012.0 MB
Wed Apr 06 14:30:08 2016 reading all ideals from disk
Wed Apr 06 14:30:49 2016 memory use: 3304.0 MB
Wed Apr 06 14:31:00 2016 commencing in-memory singleton removal
Wed Apr 06 14:31:08 2016 begin with 167008136 relations and 156691035 unique ideals
Wed Apr 06 14:32:38 2016 reduce to 90939529 relations and 70932838 ideals in 18 passes
Wed Apr 06 14:32:38 2016 max relations containing the same ideal: 33
Wed Apr 06 14:32:46 2016 reading ideals above 720000
Wed Apr 06 14:32:47 2016 commencing singleton removal, initial pass
Wed Apr 06 14:57:43 2016 memory use: 2756.0 MB
Wed Apr 06 14:57:44 2016 reading all ideals from disk
Wed Apr 06 14:58:24 2016 memory use: 3647.5 MB
Wed Apr 06 14:58:36 2016 keeping 84121527 ideals with weight <= 200, target excess is 475495
Wed Apr 06 14:58:48 2016 commencing in-memory singleton removal
Wed Apr 06 14:58:58 2016 begin with 90939529 relations and 84121527 unique ideals
Wed Apr 06 15:00:13 2016 reduce to 90938600 relations and 84120598 ideals in 8 passes
Wed Apr 06 15:00:13 2016 max relations containing the same ideal: 200
Wed Apr 06 15:01:03 2016 removing 7725842 relations and 6725842 ideals in 1000000 cliques
Wed Apr 06 15:01:06 2016 commencing in-memory singleton removal
Wed Apr 06 15:01:15 2016 begin with 83212758 relations and 84120598 unique ideals
Wed Apr 06 15:02:31 2016 reduce to 82758584 relations and 76933968 ideals in 9 passes
Wed Apr 06 15:02:31 2016 max relations containing the same ideal: 195
Wed Apr 06 15:03:16 2016 removing 5705298 relations and 4705298 ideals in 1000000 cliques
Wed Apr 06 15:03:18 2016 commencing in-memory singleton removal
Wed Apr 06 15:03:26 2016 begin with 77053286 relations and 76933968 unique ideals
Wed Apr 06 15:04:29 2016 reduce to 76770784 relations and 71942649 ideals in 8 passes
Wed Apr 06 15:04:29 2016 max relations containing the same ideal: 188
Wed Apr 06 15:05:11 2016 removing 5064578 relations and 4064578 ideals in 1000000 cliques
Wed Apr 06 15:05:13 2016 commencing in-memory singleton removal
Wed Apr 06 15:05:21 2016 begin with 71706206 relations and 71942649 unique ideals
Wed Apr 06 15:06:18 2016 reduce to 71464068 relations and 67632886 ideals in 8 passes
Wed Apr 06 15:06:18 2016 max relations containing the same ideal: 181
Wed Apr 06 15:06:57 2016 removing 4717753 relations and 3717753 ideals in 1000000 cliques
Wed Apr 06 15:06:59 2016 commencing in-memory singleton removal
Wed Apr 06 15:07:06 2016 begin with 66746315 relations and 67632886 unique ideals
Wed Apr 06 15:07:52 2016 reduce to 66517440 relations and 63683385 ideals in 7 passes
Wed Apr 06 15:07:52 2016 max relations containing the same ideal: 174
Wed Apr 06 15:08:29 2016 removing 4502419 relations and 3502419 ideals in 1000000 cliques
Wed Apr 06 15:08:30 2016 commencing in-memory singleton removal
Wed Apr 06 15:08:37 2016 begin with 62015021 relations and 63683385 unique ideals
Wed Apr 06 15:09:20 2016 reduce to 61790077 relations and 59953026 ideals in 7 passes
Wed Apr 06 15:09:20 2016 max relations containing the same ideal: 164
Wed Apr 06 15:09:53 2016 removing 4357094 relations and 3357094 ideals in 1000000 cliques
Wed Apr 06 15:09:55 2016 commencing in-memory singleton removal
Wed Apr 06 15:10:01 2016 begin with 57432983 relations and 59953026 unique ideals
Wed Apr 06 15:10:41 2016 reduce to 57202643 relations and 56362260 ideals in 7 passes
Wed Apr 06 15:10:41 2016 max relations containing the same ideal: 158
Wed Apr 06 15:11:12 2016 removing 1543031 relations and 1254223 ideals in 288808 cliques
Wed Apr 06 15:11:13 2016 commencing in-memory singleton removal
Wed Apr 06 15:11:18 2016 begin with 55659612 relations and 56362260 unique ideals
Wed Apr 06 15:11:51 2016 reduce to 55631124 relations and 55079403 ideals in 6 passes
Wed Apr 06 15:11:51 2016 max relations containing the same ideal: 153
Wed Apr 06 15:12:31 2016 relations with 0 large ideals: 18095
Wed Apr 06 15:12:31 2016 relations with 1 large ideals: 3305
Wed Apr 06 15:12:31 2016 relations with 2 large ideals: 43146
Wed Apr 06 15:12:31 2016 relations with 3 large ideals: 389203
Wed Apr 06 15:12:31 2016 relations with 4 large ideals: 1970295
Wed Apr 06 15:12:31 2016 relations with 5 large ideals: 6076827
Wed Apr 06 15:12:31 2016 relations with 6 large ideals: 11866012
Wed Apr 06 15:12:31 2016 relations with 7+ large ideals: 35264241
Wed Apr 06 15:12:31 2016 commencing 2-way merge
Wed Apr 06 15:13:15 2016 reduce to 36836318 relation sets and 36284597 unique ideals
Wed Apr 06 15:13:15 2016 commencing full merge
[/code]
Is the number of relation sets too close to the number of unique ideals? |
Reserving C219 from 145^66+66^145
I can't start it for a couple of weeks, but I will run the post processing if no one else wants it.
146^61+61^146 is currently in LA and should be completely factored by April 26. On the 14e queue, I have moved on to 11_236_plus_10_236 for now, temporarily abandoning GC_12_230 pending more relations/advice. Thanks. |
1 Attachment(s)
C182_127_60 completed: 12 hours for 5.3M matrix with TD=120
[CODE]P55 = 2880282454128516243454395533414032352034210941091038359
P63 = 732117081627912082562983709152174944312936941920985725453295251
P65 = 11079140921204638256014648286891848232293645621467937210429406971[/CODE] |
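A cheap sanity check on a result post like this is to confirm that the digit counts match the P-labels and that the factors multiply back to a 182-digit number (a sketch; this checks arithmetic consistency only, not primality):

```python
# Factors reported for C182_127_60 above.
p55 = 2880282454128516243454395533414032352034210941091038359
p63 = 732117081627912082562983709152174944312936941920985725453295251
p65 = 11079140921204638256014648286891848232293645621467937210429406971

# The labels encode the decimal digit counts of each prime factor.
assert len(str(p55)) == 55
assert len(str(p63)) == 63
assert len(str(p65)) == 65

# The product should reconstruct the original 182-digit composite.
assert len(str(p55 * p63 * p65)) == 182
print("factor sizes consistent with a C182")
```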
[QUOTE=swellman;430890]Having problems getting through filtering stage with GC_12_230. Made several attempts to build a matrix but msieve keeps hanging at the full merge step. I've varied TD and used remdups but still keeps hanging.
[/QUOTE] I presume it was a memory issue. I've successfully built the 18M matrix with TD=90. Will try another run with TD=120.
[CODE]Thu Apr 7 13:17:07 2016 commencing full merge
Thu Apr 7 13:36:10 2016 memory use: 4392.1 MB
Thu Apr 7 13:36:16 2016 found 18141937 cycles, need 18009791[/CODE]Msieve v. 1.52 |
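A convenient way to see whether a filtering run got through, without reading the whole log, is to pull out msieve's "found N cycles, need M" summary line and compare the two numbers (a hedged sketch; the regex is keyed to the log format quoted above and may need adjusting for other msieve versions):

```python
import re

def check_cycles(log_text):
    """Return (found, need) from msieve's full-merge summary line,
    or None if the line is absent (e.g. filtering hung or failed)."""
    m = re.search(r"found (\d+) cycles, need (\d+)", log_text)
    if m is None:
        return None
    return int(m.group(1)), int(m.group(2))

# Example using the log line quoted above.
log = "Thu Apr 7 13:36:16 2016 found 18141937 cycles, need 18009791"
found, need = check_cycles(log)
print("matrix can be built" if found >= need else "need more relations")
```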
C180_146_44
1 Attachment(s)
[CODE]p63 factor: 444825782388636199084834457770306183960916829988172895974146117
p117 factor: 506494758042609067071050478734955452768535888560148697072508411038667203382873803050841604249833961312626421460425329[/CODE] |
[QUOTE=unconnected;430944]I presume it was a memory issue. I've successfully built the 18M matrix with TD=90. Will try another run with TD=120.
[CODE]Thu Apr 7 13:17:07 2016 commencing full merge
Thu Apr 7 13:36:10 2016 memory use: 4392.1 MB
Thu Apr 7 13:36:16 2016 found 18141937 cycles, need 18009791[/CODE]Msieve v. 1.52[/QUOTE] Hmmm. I've used this particular machine many times, and have since built another matrix for 11_236_plus_10_236 with no issue. But I can't argue with your success. Unconnected, do you mind finishing the post processing for GC_12_230? I won't have another capable machine free for LA on this number until late April. Thanks for the feedback - it was driving me crazy! |
9_257_minus_8_257 factored
1 Attachment(s)
[code]
p53 factor: 98544716677960792886845691738273274044954894124023191
p169 factor: 9122179221937835100625349289770495411060046756177380621920473458259234731142871007361008418444311107179769742069235457962316099742837170275543758107033388297121465847231
[/code] |
Reserving 9_257_plus_8_257.
|
Reserving 11_236_plus_8_236.
|
C171_142_35 ends in
[code]p83 factor: 11326649840165016249310962716474362205781332104613943225248810702951488902269723587
p89 factor: 11739546567559064250651915743676557891409444337785903694287862243837037617568535730366483[/code] |
Reserving C206_117_106.
|