#1200
Sep 2008
Kansas
2⁶×53 Posts
GC_10_237 splits as:
Code:
prp91 factor: 1512377072986105793004196812647603257592302047201854599339358341182117713918227729923555823
prp91 factor: 7822173151770781861682077554423204567826632176976141792644026318725502158714722126226603209
#1201
Bamboozled!
May 2003
Down not across
2×5,393 Posts
Quote:
Paul
#1202
AKA Speedy51
Oct 2012
New Zealand
343₈ Posts
Hi all, it has been a while since I have done any post-processing. I would be keen to take GW_5_339. I think it will fit into 12 GB of RAM? If this is not the case, could somebody please advise me. I will start the download of the .DAT file in about 16 or so hours, Monday the 17th around 8:30 a.m. New Zealand daylight saving time. If anybody can suggest a shorter-running job I would be happy to take it instead of this one.
#1203
Jun 2012
2²×773 Posts
Just completed this composite after 130+ hours on my i7. Imagine my surprise when I went to report the results in factordb and found the number was already fully factored! Just one of those things, I guess: someone helped out over the years and no one noticed. My fault too for not checking its status before starting the post-processing.
Code:
prp60 factor: 904076630073200973071845071612316104114727407128281752180859
prp141 factor: 606625086440938588570886913614578101130135357532683232001986727791754202704750890694263443026373641246753028278422941760796166329290826567827

eta: I will take C176_118_93 next.

Last fiddled with by swellman on 2014-03-16 at 12:06
#1204
Bamboozled!
May 2003
Down not across
25042₈ Posts
Quote:
Apologies to the earlier person(s) who completed the factorization, but if you want your result to be known you need to tell the world about it in a manner which is attributable.

Paul
#1205
(loop (#_fork))
Feb 2006
Cambridge, England
2³·11·73 Posts
Taking F1893
#1206
(loop (#_fork))
Feb 2006
Cambridge, England
14430₈ Posts
Quote:
You might find that GW_3_497 is a bit quicker (I say this only because it has rather more relations, and we're on the cusp of matrix size vs relation count); remember '-nc1 target_density=112'.
#1207
AKA Speedy51
Oct 2012
New Zealand
227 Posts
Quote:
#1208
Sep 2008
Kansas
3392₁₀ Posts
I see the problem. I did GC_8_262 as GW_8_262.
I can start the REAL GW_8_262 download tomorrow. Quote:
#1209
Jun 2012
2²·773 Posts
Quote:
Wait - the mystery is solved! Happy ending. All is well.

Another topic - seeking advice. When I start post-processing C176_118_93, a 31-bit job, what target_density should I use? Typically I just use default values, but maybe it's worth trying to tighten up the matrix a bit prior to LA? It's a pretty ugly poly with a terrible yield, but the best we could find.

Thanks in advance for any suggestions.
#1210
(loop (#_fork))
Feb 2006
Cambridge, England
2³·11·73 Posts
I tend to use target_density 112, if that doesn't work then 96, if that doesn't work then the default 70. The difference between 112 working and 70 working is often only about 5% of the total relation count.
What I don't quite understand is why I don't get a usable matrix, even with enormous over-sieving, at target densities of 128 or over.
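That fallback strategy (try 112, then 96, then the default 70) can be sketched as a small wrapper script. This is only an illustration: the thread itself gives just the '-nc1 target_density=112' invocation; the assumption that msieve is on the PATH, reads its usual msieve.dat in the current directory, and exits nonzero when filtering fails to build a matrix is mine, so check the msieve log rather than relying on the exit status alone.

```shell
#!/bin/sh
# Sketch: run msieve filtering (-nc1) at decreasing target densities,
# stopping at the first density that yields a matrix.
# Assumptions (not from the thread): msieve is on PATH, the relation
# data is in ./msieve.dat, and a failed filtering run exits nonzero.

for td in 112 96 70; do
    echo "Trying filtering with target_density=$td"
    if msieve -v -nc1 "target_density=$td"; then
        echo "Matrix built at target_density=$td"
        break
    fi
    echo "target_density=$td did not produce a matrix; trying lower"
done
```

As noted above, the practical payoff is small relation savings: the gap between 112 working and 70 working is often only about 5% of the total relation count, so the loop is mostly about getting a smaller matrix for LA rather than sieving less.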
Similar Threads

| Thread | Thread Starter | Forum | Replies | Last Post |
| Boinc Statistics for NFS@Home borked ? | thomasn | NFS@Home | 1 | 2013-10-02 15:31 |
| BOINC NFS sieving - RSALS | debrouxl | NFS@Home | 621 | 2012-12-14 23:44 |
| BOINC? | masser | Sierpinski/Riesel Base 5 | 1 | 2009-02-09 01:10 |
| BOINC? | KEP | Twin Prime Search | 212 | 2007-04-25 10:29 |
| BOINC | bebarce | Software | 3 | 2005-12-15 18:35 |