[QUOTE=VBCurtis;462920]I tried again last night at 136 with sufficient disk space, should have info tonight. The log indicated ~4M excess cycles at 128, which is why I thought 136 might work.[/QUOTE]
TD 128 built a 50.05M matrix with weight of sparse part 5.79G. TD 136 built a 49.2M matrix with weight of sparse part 6.0G. The ETAs match within a few dozen hours (136 is less than 1% higher after 100k dimensions, but it is running now so I left it). ~9000hr. If an expert could weigh in on the chances of transferring the matrix files for a partner to complete this monster a few months from now, I'd like that reassurance. |
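As a rough sanity check on that comparison (this is my own back-of-envelope cost model, not anything msieve reports): block Lanczos runs a number of iterations proportional to the matrix dimension, and each iteration costs roughly one pass over the nonzeros, so total work scales like dimensions × sparse weight. The model is crude (it ignores cache behaviour and the dense rows), so a couple of percent either way is within its noise:

```python
# Crude block-Lanczos cost model: work ~ dimensions * sparse weight.
# Numbers are the two matrices from the post above; treat the result
# as an estimate only -- real timings depend on memory/cache behaviour.
td128 = {"dims": 50.05e6, "weight": 5.79e9}
td136 = {"dims": 49.2e6,  "weight": 6.0e9}

def work(m):
    return m["dims"] * m["weight"]

ratio = work(td136) / work(td128)
print(f"TD136 / TD128 estimated work ratio: {ratio:.3f}")
```

Under this model TD 136 comes out a couple of percent more work despite its smaller dimension, broadly consistent with the near-identical ETAs reported.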
I'm afraid I can't be very cheering: a little while ago I tried running the matrix preparation on one machine and moving to another, and didn't achieve success: see [url]http://mersenneforum.org/showthread.php?t=22176[/url]
('pumpkin' has defective memory, but I'm pretty sure 'butternut' doesn't.) I am planning on getting an i9-7900X early in 2018, but I expect you to have found some way to finish the linear algebra by then. At the moment it's hot enough that the top two computers in my pile of ex-Facebook machines in the shed keep spontaneously turning themselves off, which is unpromising for doing MPI on that cluster. |
Taking C217_143_51. C162_11040_10042 I will leave for someone with a more bijou steamroller.
|
[QUOTE=fivemack;463077]
I am planning on getting an i9-7900X early in 2018, but I expect you to have found some way to finish the linear algebra by then; at the moment it's hot enough that the top two computers in my pile of ex-Facebook machines in the shed keep spontaneously turning themselves off, which is unpromising for doing MPI on that cluster.[/QUOTE] Have you considered the AMD ThreadRipper (16C/32T)? |
[QUOTE=pinhodecarlos;463080]Have you considered the AMD ThreadRipper (16C/32T)?[/QUOTE]
Yes, though I am more interested in the prospect of AVX-512; I expect AMD benchmarks to be available by the end of this year and they will play a part in my decision. |
[QUOTE=VBCurtis;462973]TD 128 built a 50.05M matrix with weight of sparse part 5.79G.
TD 136 built a 49.2M matrix with weight of sparse part 6.0G. The ETAs match within a few dozen hours (136 is less than 1% higher after 100k dimensions, but it is running now so I left it). ~9000hr. If an expert could weigh in on the chances of transferring the matrix files for a partner to complete this monster a few months from now, I'd like that reassurance.[/QUOTE] After discussions with VBCurtis via PM, he is going to abandon this job. I will attempt it on my hardware. The huge ETA combined with the potential difficulties/impossibilities of executing a relay LA on two different rigs were not appealing to either of us. Hoping this factorization will finish by winter! |
13*2^800-1 factored
[code]prp81 factor: 160495860510299089900884881150565381014314892318013981084372922252235562566936671
prp109 factor: 1827332725815904515769354793594180789579187639474678166920356073397044898104863056938970852957550120922053489[/code] Q=15M to 120M produced 385M rels; I downloaded 380M of them, yielding 322M unique. TD 136 failed, but TD 128 produced an 8.8M matrix, which took 67 hrs to solve on 6 threads of a not-idle i7-5820. Log: [url]https://pastebin.com/0i2SZLUv[/url] |
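For anyone curious about the duplication rate implied by those relation counts, the arithmetic is straightforward (numbers taken from the post above; the ~5M relations never downloaded are simply excluded):

```python
# Relation statistics from the 13*2^800-1 job above.
downloaded = 380e6   # relations fetched (of ~385M produced)
unique     = 322e6   # remaining after duplicate removal

dup_rate = (downloaded - unique) / downloaded
print(f"duplicate rate: {dup_rate:.1%}")
```

A duplicate rate in the mid-teens is unremarkable for a sieving range this wide, where adjacent special-q regions overlap.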
C198_149_50 complete
1 Attachment(s)
[code]
Wed Jul 12 02:14:36 2017  p63 factor: 541684712662784107865620586576703470646926294644158189450179227
Wed Jul 12 02:14:36 2017  p135 factor: 656960557069151588584995162383011277660705889083339105189749606002541691335451415407892037924827117775856772433500663369818477002938683
[/code] 122.6 hours for a 15.15M matrix at density 134 (142 didn't work) on 7 threads of an E5-2650v2. Log attached and at [url]https://pastebin.com/Xjwsw2b4[/url] |
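A quick size estimate for that matrix, assuming msieve's target_density is roughly the average number of nonzeros per matrix column (my reading of the parameter; the achieved average can differ somewhat from the target):

```python
# Back-of-envelope weight of the sparse part for the C198 matrix above,
# assuming density ~= average nonzeros per column (an approximation).
dims    = 15.15e6   # matrix dimensions from the post above
density = 134       # target density that worked (142 did not)

nonzeros = dims * density
print(f"approx. nonzeros in sparse part: {nonzeros/1e9:.2f}G")
```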
Reserving C162_11040_10042 (14e). I won't be able to start it until July 27.
|
I'll take C216_143_53 next.
(Should start it when C215_145_52 finishes up on Monday.) |
[QUOTE=RichD;463322]I'll take C216_143_53 next.
(Should start it when C215_145_52 finishes up on Monday.)[/QUOTE] Crap, due to a power outage C215_145_52 won't finish until early Tuesday. I couldn't get to the machine for several hours. Therefore, I can't post the results or start the next job until I get back next weekend. If someone wants to start C216_143_53 earlier, feel free to do so. Else I will start it as soon as I get back. I already have it downloaded but can't run two of these jobs simultaneously. |