[QUOTE=Mathew;323839]I would like to reserve 11411_108[/QUOTE]
Complete:
[CODE]prp65 factor: 47900800801223886139201610745670452318691497498742261311799912721
prp151 factor: 1364475667022941371781835892014764310532189565871850947023815893714797869685893030409033728507638580307710172399652041261314342813476975784077453819623[/CODE]
I'm just doing the filtering on 3617523089023m19 for the third time, this time with target_density=100.
If it fails again, I will superstitiously run 'sort -t: -u -k1,1' on the .dat file to present msieve with something devoid of duplicates (30% of the relations in the file are duplicates). I'm surprised that target_density=130 and target_density=150 both gave the same numbers in the error message 'found 79713 cycles, need 6387567'.
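For what it's worth, that sort invocation keys on everything before the first ':', i.e. the a,b pair that identifies a relation, so multiple copies of the same relation collapse to one. A minimal sketch on a hypothetical three-line relations file (the file name and contents are made up for illustration):

```shell
# Hypothetical toy relations file (real .dat files are tens of GB).
cat > rels.dat <<'EOF'
123,45:aa:bb
123,45:aa:bb
678,90:cc:dd
EOF

# Deduplicate on the first ':'-separated field (the a,b pair).
sort -t: -u -k1,1 rels.dat > rels.uniq.dat
wc -l < rels.uniq.dat   # two lines remain
```

Keying on field 1 treats relations with the same a,b pair as duplicates even if the trailing factor lists differ, which should be the behaviour wanted here, since a relation is identified by its a,b pair.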
target_density=100 worked fine; the matrix is relatively large despite what looked like significant over-sieving:
[code]Thu Jan 10 08:34:24 2013  matrix is 14183848 x 14184025 (5698.1 MB) with weight 1659511550 (117.00/col)[/code]
Results expected Wednesday evening.
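As a quick sanity check, the '117.00/col' figure msieve reports is just the total matrix weight divided by the number of columns:

```shell
# 1659511550 nonzeros over 14184025 columns, rounded to two decimals
awk 'BEGIN { printf "%.2f\n", 1659511550 / 14184025 }'   # prints 117.00
```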
From 853_83, I observe (50% seriously, 50% not) that there seems to be some fraction of way-too-smart users who submit garbage (or the same relations again and again?) in huge blocks and get free credit. Note: the blocks of meaningless relations are contiguous. It might be useful to write a lightweight poly-specific validator on the BOINC server side. The more users get "smart", the more redundant (or simply void) the relation sets will become.
I know it is unlikely that there's a ghost writer out there who would spoof the binary (or hack the BOINC client and manager to disregard checksums). But it seems possible for a user to [SPOILER]deleted, as Hollywood movies delete the actual recipe for a bomb[/SPOILER]... I will PM my thoughts about a validator. In any case, I am sure this is not yet a serious concern.
[I]HAI [/I]
[I]CAN HAS STDIO? [/I] [I]VISIBLE "CAN HAS L1112?" [/I] [I]KTHXBYE [/I] [I]_____________[/I] What a completely different picture! ...a 7.2M matrix and ETA 45 hours, not ~300 hours like the recent undersieved monsters.
I'll take L1117; I just started (1203 GMT) the nine-hour download.
ETA: Friday evening (matrix is 9629022 x 9629206 (2841.2 MB) with weight 826269592).
3617523089023^19-1 factored
[code]
Wed Jan 16 12:53:33 2013  prp84 factor: 557258425413806445667309985154135217521361150595029596359152854319857835915823200759
Wed Jan 16 12:53:33 2013  prp143 factor: 20199795244583513021981780712113866444867173343763106622339890390300388073800190109878675467564995471684114306237330166755975882748293540763207
[/code]
About twelve hours to download 19.2GB of .dat.gz; I had to do it twice because a network glitch in the middle screwed up the resume. Three filtering attempts at eight hours each (trying densities 150, 130 and 100). 144 real-time hours on 24 CPUs for the linear algebra on the 14183848 x 14184025 (5698.1 MB) matrix with weight 1659511550 (117.00/col). Eight sqrt jobs in parallel at 6h15m each; one of them got the factor, so I killed the rest.
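The workflow above can be sketched as shell commands. This is a hedged sketch: the URL and filenames below are placeholders, and the flags are the standard msieve stage switches as commonly documented (-nc1 filtering, -nc2 linear algebra, -nc3 square root, -t thread count); check your msieve build's readme before relying on them.

```shell
# Resumable download: wget -c continues a partial file after a glitch
# (the URL below is a placeholder, not the real relations archive).
wget -c https://example.org/relations.dat.gz
gunzip -k relations.dat.gz

# msieve post-processing stages, assuming msieve.fb and msieve.dat
# are set up in the working directory:
msieve -v -nc1 "target_density=100"   # filtering
msieve -v -t 24 -nc2                  # linear algebra on 24 threads
msieve -v -nc3                        # square root
```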
[URL="http://factordb.com/index.php?query=lucas%281112%29"]Lucas(1112)[/URL] cofactor = p72 . p101
L1117 done
[code]
Fri Jan 18 21:02:56 2013  prp103 factor: 5265270109006540364117493740151322185266234296158288170514998448071532498087344398659233986762427993159
Fri Jan 18 21:02:56 2013  prp123 factor: 747535627193126498586684840308259038720410703671885928156444877971766437582129851274770198459817795701357498766825830490429
[/code]
Seven hours to download 13.9GB; 14 minutes to decompress it; 275 minutes to filter; 2420 minutes on 24 CPUs to run the 9629022 x 9629206 (2841.2 MB) matrix with weight 826269592 (85.81/col); 99 minutes per square root.
[QUOTE=fivemack;325162][code]
Fri Jan 18 21:02:56 2013  prp103 factor: 5265270109006540364117493740151322185266234296158288170514998448071532498087344398659233986762427993159
Fri Jan 18 21:02:56 2013  prp123 factor: 747535627193126498586684840308259038720410703671885928156444877971766437582129851274770198459817795701357498766825830490429
[/code]Seven hours to download 13.9GB; 14 minutes to decompress it; 275 minutes to filter; 2420 minutes on 24 CPUs to run the 9629022 x 9629206 (2841.2 MB) matrix with weight 826269592 (85.81/col); 99 minutes per square root.[/QUOTE]
Only missing the kWh spent. Good work!
lasieved
More lasieve WUs needed.
All times are UTC. The time now is 23:01.
Powered by vBulletin® Version 3.8.11
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.