mersenneforum.org

mersenneforum.org (https://www.mersenneforum.org/index.php)
-   NFS@Home (https://www.mersenneforum.org/forumdisplay.php?f=98)
-   -   BOINC NFS sieving - NFS@Home (https://www.mersenneforum.org/showthread.php?t=12388)

wombatman 2013-10-28 18:32

Here's the result:

[CODE]Found 31809992 unique, 1617420 duplicate (4.8% of total), and 181942 bad relations.
Largest dimension used: 147 of 150
Average dimension used: 97.1 of 150
Terminating program at Mon Oct 28 13:31:32 2013[/CODE]

Just confirmed that the remdups binary works properly on an older, smaller relations file (17M or so relations). It found exactly the same number of duplicates that msieve did.
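For reference, the core of what remdups does can be sketched in a few lines of Python (a simplified illustration only; the real remdups.c uses a bounded hash table so memory use stays fixed, which is why it takes a size argument):

```python
# Simplified sketch of relation deduplication (illustrative; the real
# remdups.c bounds memory with a fixed-size hash table).
def dedup_relations(lines):
    """Return (unique_lines, unique_count, dupe_count), keyed on the
    leading 'a,b' coordinate pair of each relation line."""
    seen = set()
    unique = dupes = 0
    out = []
    for line in lines:
        if line.startswith('#') or ':' not in line:
            continue  # skip comments and obviously malformed lines
        key = line.split(':', 1)[0]  # 'a,b' uniquely identifies a relation
        if key in seen:
            dupes += 1
        else:
            seen.add(key)
            unique += 1
            out.append(line)
    return out, unique, dupes
```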

VictordeHolland 2013-10-28 19:34

1 Attachment(s)
I got my relations to work with:
[CODE]remdups 600 GC_12_206.dat msieve.dat[/CODE]I copied remdups.c from the link into the cross-platform IDE Code::Blocks and pressed compile. Code::Blocks used mingw32-gcc with the parameter "-march=core2" for compiling, I believe, so unfortunately it's a 32-bit build. "800" therefore gave me an out-of-memory message, but "600" worked just fine. I attached the file; hope it runs for others too. Linear algebra is now at 5.8%.

wombatman 2013-10-28 22:53

I tried Victor's exe and a rebuilt exe using both "-m64" and "-march=corei7". All of them end at the same point with the same number of dupes and unique relations.

VictordeHolland 2013-10-29 00:56

31.8M unique relations out of 120M+ relations indeed seems incorrect.
Wombatman, are you able to construct a matrix and proceed with the LA, or are you stuck at the relations filtering?

wombatman 2013-10-29 01:22

Stuck on the filtering as far as I can tell unless there's a way to skip that part.

frmky 2013-10-29 01:39

Worked for me. Odd...

Uploading msieve.dat.gz now. Should be ready to download in a half hour or so.

wombatman 2013-10-29 03:16

Very strange. Maybe a Windows/Linux line ending issue or something? Don't know why it would pop up only now.
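If CRLF line endings are a suspect, one quick check is to count them directly in the raw bytes (the filename here is a stand-in for whichever relations file is being examined):

```python
# Quick check for Windows (CRLF) vs Unix (LF) line endings in a data file.
def count_line_endings(path):
    with open(path, 'rb') as f:
        data = f.read()
    crlf = data.count(b'\r\n')
    lf_only = data.count(b'\n') - crlf  # LF not preceded by CR
    return crlf, lf_only
```

A file that mixes the two, or reports nonzero CRLF counts on Linux, would point at exactly this kind of cross-platform issue.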

Batalov 2013-10-29 06:00

1 Attachment(s)
One doesn't simply [STRIKE]walk into Mordor[/STRIKE] debug Windows.

xilman 2013-10-29 09:22

Ben & Victor:

Could I ask you to email me your NFS@Home results in future please? Somewhat more trouble for you, but much more reliable and much less hassle for me than the alternatives. I think I managed to scrape your factors from the NFS@Home site and the factordb, but I can't be certain that I got all the details correct. Nor are the reports made particularly promptly.

Thanks!

The GCW web pages will be updated later today, all being well.

Paul

VictordeHolland 2013-10-29 10:09

Paul,
Could you send me your email address by private message, so I can send you my NFS@Home factors in the future? Do you want the factor lines or also the log?
Victor.

pinhodecarlos 2013-10-29 10:14

[QUOTE=VictordeHolland;357823]Paul,
Could you send me your email address by private message, so I can send you my NFS@Home factors in the future? Do you want the factor lines or also the log?
Victor.[/QUOTE]

I sent you a PM with his email address.

xilman 2013-10-29 10:39

[QUOTE=VictordeHolland;357823]Paul,
Could you send me your email address by private message, so I can send you my NFS@Home factors in the future? Do you want the factor lines or also the log?
Victor.[/QUOTE]My address is [email]paul@leyland.vispa.com[/email]

The factor lines are all that's really required for my purposes, but feel free to send more if it makes life easier for you.

Thanks,

Paul

wombatman 2013-10-29 13:30

[QUOTE=xilman;357815]Ben & Victor:

Could I ask you to email me your NFS@Home results in future please? Somewhat more trouble for you, but much more reliable and much less hassle for me than the alternatives. I think I managed to scrape your factors from the NFS@Home site and the factordb, but I can't be certain that I got all the details correct. Nor are the reports made particularly promptly.

Thanks!

The GCW web pages will be updated later today, all being well.

Paul[/QUOTE]

Absolutely!

VictordeHolland 2013-10-30 09:41

GC_12_206
 
[B]GC_12_206[/B]
[CODE]prp55 factor: 1538247757620052691167247035086944723662963700765532481
prp141 factor: 376240631979548270257795515920540293763704849419924847724940616318385472325459725842521019299341175082958473603235323956243335638987621294411[/CODE]

I'll take C168_127_110 once it is ready for post-processing. Should be more challenging with 31 bits.

wombatman 2013-10-30 15:23

GC_7_263
 
GC_7_263 factors as: [CODE]prp62 factor: 34682415971527549868607772647599659324331034424213324127029121
prp157 factor: 9634260255307492668406071713403239894667269926892491218011145215290222341766487589618896566106609445985782843545679706456803075503745482081628806181348954323[/CODE]

swellman 2013-10-31 22:04

C229_125_81
 
[code]
prp65 factor: 10341881593843117210354375650868401752008141537678796458320210001
prp165 factor: 557530277392856791118142372581524726962238586450690945016756441294151616210078532318781611953962512817140570477937969142620825707134001615024871468881947790946303481
[/code]

XYYXF 2013-10-31 22:52

Hardly an ECM miss. Congrats! :)

VictordeHolland 2013-11-04 10:58

C168_127_110
linear algebra (4.1% ETA 177h)

Wow, what a difference in LA processing time between 30bit (17-34h) and 31bit (184h)!

Batalov 2013-11-04 14:28

[QUOTE=VictordeHolland;358360]C168_127_110
linear algebra (4.1% ETA 177h)

Wow, what a difference in LA processing time between 30bit (17-34h) and 31bit (184h)![/QUOTE]
This difference is not due to "30bit vs 31bit".

R.D. Silverman 2013-11-04 17:30

[QUOTE=Batalov;358367]This difference is not due to "30bit vs 31bit".[/QUOTE]

Actually, it is.

Increasing the large prime bound from 30 to 31 bits increases the
size of the final factor base (after filtering and combining large primes) by
quite a bit. This in turn makes the matrix quite a bit bigger.
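A rough way to see the effect: by the prime number theorem the number of primes below x is about x/ln x, so raising the large prime bound from 2^30 to 2^31 nearly doubles the pool of candidate large primes (a back-of-envelope estimate of the prime pool only, not of the filtered matrix size):

```python
import math

def approx_prime_count(x):
    """Prime number theorem estimate: pi(x) ~ x / ln(x)."""
    return x / math.log(x)

# Ratio of large-prime pool sizes at 31-bit vs 30-bit bounds.
ratio = approx_prime_count(2**31) / approx_prime_count(2**30)
# Analytically this is 2 * 30/31, i.e. about 1.94: nearly twice as many
# large primes become available.
```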

Batalov 2013-11-04 18:25

I stand by my remark.

In it, "[I]this difference[/I]" referred to the 5- or even 10-fold larger running time. See the original post: "Wow, what a difference, you go from 30bit to 31bit and the running time is 10 times longer!" This is not the case. Tom Womack ran some factorizations twice with similar, one-bit-different LPBs and found minimal differences in running time (and matrix sizes). [I]With a simple condition[/I]: both have to be [I]comparably[/I] well-sieved (relative to the estimated minimum of relations, which will be roughly 2x more, but the sieving can be 2x faster -- if all is done right and the conditions are not grossly contrived, like sieving with totally inappropriate LPBs).

Here's the real reason for 10-fold larger running time. One project was likely well-sieved and the other barely enough sieved (only enough for the proverbial "cusp of filtering convergence"). The 30-bit project may have also been much simpler, too; that would be another reason.

TL;DR version: if you would take a 30-bit project with SNFS-difficulty of 249, and then another project with SNFS-difficulty of 251 for which you would select 31-bit LPBs (or even the same project, once again*), _[I]and[/I]_ sieve them both comparably, the wall clock time will not be very different (but apparently somewhat larger for the more difficult project. [I]Somewhat[/I] larger, [U]not[/U] 10 times larger).

_________
*but your name has to be Tom for that.

R.D. Silverman 2013-11-04 18:38

[QUOTE=Batalov;358391]I stand by my remark.

In it, "[I]this difference[/I]" referred to the 5 or even 10-fold larger running time. See the original post. "Wow, what a difference, you go from 30bit to 31bit and the running time is 10 times longer!"
[/QUOTE]

Ah. I missed that (i.e. the part about "10-fold"). I would normally expect a
matrix that is 20 to 25% larger in going from 30 to 31 bits.

Batalov 2013-11-04 18:49

Right. Tom could probably quantify that. (I haven't dug up his post, but it exists somewhere here on the forum. It was an interesting report. Could have been 6-7 years ago, now.)

VictordeHolland 2013-11-04 21:44

There are quite a few differences between this run and the two runs I did before:
SNFS(225) vs. GNFS(168)
30 bits vs. 31 bits
Relations: 110M+ vs. 220M+
Matrix size: ~1.5GB vs. 3.7GB

So I guess they all contribute to a longer LA phase?

Batalov 2013-11-04 23:02

Yes.

RichD 2013-11-05 16:25

I would like to reverse GW_4_369.

RichD 2013-11-05 19:35

[QUOTE=RichD;358483]I would like to reverse GW_4_369.[/QUOTE]

Oops, I am almost done downloading [B]GC[/B]_4_369.
Please adjust my assignment.

RichD 2013-11-06 18:27

GC_4_369 splits as:
[CODE]prp51 factor: 833216020977010133611098079725963902150797467457397
prp118 factor: 1540723616026788250128258003100454105963168284173198740572019759032558763687534559921149354430092646605553304218219221[/CODE]

wombatman 2013-11-07 19:44

I'll take a whack at C168_130_119.

swellman 2013-11-07 20:21

Reserving GC_5_318
 
I'll grab it when it is ready for download.

Thanks.

wombatman 2013-11-08 05:03

ETA on the C168 is ~108 hours. And that's with the target density of the matrix at 100!

swellman 2013-11-10 04:08

[QUOTE=wombatman;358702]ETA on the C168 is ~108 hours. And that's with the target density of the matrix at 100![/QUOTE]

Impressive performance! How many threads are you running? Is this on an i7?

wombatman 2013-11-10 04:10

8 threads on a Core i7, yes. It uses a little over 5GB of memory!

fivemack 2013-11-10 12:33

[QUOTE=wombatman;358702]ETA on the C168 is ~108 hours. And that's with the target density of the matrix at 100![/QUOTE]

I'm not quite sure whether you're complaining that this is a long time or impressed that this is a short time, because increasing target density usually makes things faster (if you had enough relations to get a matrix built at the higher density).

An NFS@home (so well-oversieved) C178 was 95 hours running msieve-1.52 (SVN945) under 64-bit Linux on my stock-speed i7/4770 with target density 70:

[code]
Sun Sep 22 22:45:16 2013 weight of 13120698 cycles is about 918487032 (70.00/cycle)
Sun Sep 22 22:59:40 2013 matrix is 13120526 x 13120698 (3944.6 MB) with weight 1222719217 (93.19/col)
Sun Sep 22 23:03:39 2013 linear algebra at 0.0%, ETA 103h12m
[/code]

I still had the relations, so refiltering with target density 100, running on same hardware:
[code]
Sun Nov 10 15:44:16 2013 weight of 11580698 cycles is about 1158315404 (100.02/cycle)
Sun Nov 10 17:27:09 2013 matrix is 11580525 x 11580698 (4662.0 MB) with weight 1424440785 (123.00/col)
Sun Nov 10 17:31:19 2013 linear algebra at 0.0%, ETA 84h58m
[/code]

A bit more memory, but a lot less runtime.
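Plugging the figures from the two logs above into a naive block-Lanczos cost model (iterations proportional to matrix dimension, work per iteration proportional to matrix weight, so total work proportional to dimension times weight) actually predicts near-identical runtimes, which suggests the observed gain comes from effects the model ignores, such as cache behaviour and memory bandwidth. A back-of-envelope sketch:

```python
# Naive block-Lanczos cost model: total work ~ dimension * weight.
# Dimensions and weights taken from the two msieve logs quoted above.
def lanczos_work(dimension, weight):
    return dimension * weight

td70 = lanczos_work(13_120_698, 1_222_719_217)   # target density 70 run
td100 = lanczos_work(11_580_698, 1_424_440_785)  # target density 100 run
ratio = td100 / td70
# ratio is about 1.03, yet the ETA dropped from ~103h to ~85h, so the
# speedup is not in raw operation count but in how the work is laid out.
```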

swellman 2013-11-10 14:10

Several of us are finding 31 bit NFS@Home GNFS factoring jobs of this size take 180-200 hours to complete LA. Wombatman has [url=http://www.mersenneforum.org/showthread.php?t=18725]compiled a version of Msieve[/url] that seems to be considerably faster.

Admittedly not an extensive study but a promising start.

I plan to run some 30 bit SNFS NFS@Home jobs using wombatman's executable. These sized jobs have been taking 25-30 hours to complete LA in the past. Hoping to see a substantial improvement.

wombatman 2013-11-10 17:10

I was mostly expressing surprise at the large jump from 30-bit to 31-bit. The 30-bit one I ran before was completed in ~1 day. I'll be interested to see how much of a difference the target density makes on your system. Thanks for checking!

[QUOTE=fivemack;358921]I'm not quite sure whether you're complaining that this is a long time or impressed that this is a short time, because increasing target density usually makes things faster (if you had enough relations to get a matrix built at the higher density).

An NFS@home (so well-oversieved) C178 was 95 hours running msieve-1.52 (SVN945) under 64-bit Linux on my stock-speed i7/4770 with target density 70:

[code]
Sun Sep 22 22:45:16 2013 weight of 13120698 cycles is about 918487032 (70.00/cycle)
Sun Sep 22 22:59:40 2013 matrix is 13120526 x 13120698 (3944.6 MB) with weight 1222719217 (93.19/col)
[/code]

I've still got the relations so am trying refiltering at target_density=100, and will update this post in probably six hours when the filtering is done.[/QUOTE]

swellman 2013-11-11 12:27

Reserving GW_4_369
 
If it is still unclaimed.

GC_5_318 is currently in LA, should finish tonight.

Thanks.

VictordeHolland 2013-11-11 21:09

Reserving GW_5_318
 
C168_127_110 is almost ready, now in the square root phase.
Taking GW_5_318 next.

wombatman 2013-11-11 21:14

Just an update--C168_130_119 is 50% done on the linear algebra. It would be going faster, but I use my laptop when I'm at work, so it doesn't get to churn through as much...

Still, only a few days to go.

VictordeHolland 2013-11-11 22:14

C168_127_110 factors
 
[B]C168_127_110[/B]
[CODE]prp65 factor: 37792525645009215672640814114905288002695134234191638876505642649
prp104 factor: 13726515009404365290893989999446372257366451228738644541888215061872548011642399835314991189895341089667[/CODE]

swellman 2013-11-12 11:05

GC_5_318
 
[code]prp90 factor: 184763462141961298781835300446977817122734239690607168398305798472772898852528335132566597
prp123 factor: 808169483508868374406392455841213056787193069554340217585858124624893209801050341692036365491419721857836553682367085563193
[/code]

swellman 2013-11-13 02:41

[QUOTE=swellman;358925]
I plan to run some 30 bit SNFS NFS@Home jobs using wombatman's executable. These sized jobs have been taking 25-30 hours to complete LA in the past. Hoping to see a substantial improvement.[/QUOTE]

Data point: LA of GW_4_369 had an ETA of 15 hours with 8 threads on my i7. It will finish in a few hours.

swellman 2013-11-13 11:08

GW_4_369 factors

[code]prp60 factor: 177403006881130113063302674594810228750950806165895483532597
prp123 factor: 589203883570342218983575340055137303532176542861815435328583489006750174017203999175917086613246958570131196834769562261677
[/code]

VictordeHolland 2013-11-13 11:39

RichD completed G[B]C[/B]_4_369 about a week ago, but it is still displayed on the lasieved page as being post-processed:
[URL]http://mersenneforum.org/showpost.php?p=358574&postcount=998[/URL]

VictordeHolland 2013-11-14 15:54

GW_5_318 factors
 
Took me about an hour to get the post-processing started. The original files had the entire number (a C225) in them. Msieve would find 127 as a factor by 15-digit trial factoring, and would then search for a C223 in the .fb and .poly files, which still had the C225 in them. Eventually I realized this, checked the factordb, and found that two P16s were also known, so only a C192 remained:
[B]
GW_5_318[/B]
[CODE]prp49 factor: 4261455268632083287436707603791222620930237118863
prp60 factor: 450063175620763356730945681744698257366095535261401256629363
prp84 factor: 432087545417727621877325644242496506135441396528813761437485005011126996638176541757[/CODE]
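The fix Victor describes amounts to dividing all known factors out of n before writing the .fb/.poly files. A minimal sketch (the numbers in the test are small stand-ins; in the GW_5_318 case n was the C225 and the known factors were 127 and the two P16s from factordb):

```python
# Sketch: reduce n by already-known factors before handing it to msieve,
# so the .fb/.poly files describe the actual remaining cofactor.
def remaining_cofactor(n, known_factors):
    for f in known_factors:
        assert n % f == 0, f"{f} does not divide n"
        n //= f
    return n
```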

wombatman 2013-11-15 15:48

C168_130_119 (finally!) factors as:

[CODE]prp76 factor: 9094435764101492192625106166940437077708174382794083374653111205520090825523
prp92 factor: 68086751212664789039528265600039302233534265572638233049445891533014221617435036175941766901[/CODE]

That was a slog!

Edit: FactorDB seems to be down right now, but I'll add these factors when it returns.

swellman 2013-11-18 16:19

I'll take GC_11_234 if it is still available.

swellman 2013-11-19 15:07

GC_11_234 is currently in LA.

ETA 140 hours.

swellman 2013-11-22 21:33

Reserving GC_11_233
 
I'll take it if still available.

swellman 2013-11-26 01:02

GC_11_234 factors
 
[CODE]prp85 factor: 7229682174820941873563372695487534725675448372192856897563207983721936793375770327599

prp113 factor: 65086357586886271288591582353510914906890714130483592560109914639159864703876136112452835963483488138858618058513[/CODE]


Factordb is down, so I could not report these factors.

swellman 2013-11-26 23:28

GC_11_233 has successfully entered LA.

ETA is 160 hours from time of this posting.

fivemack 2013-11-26 23:36

Reserving L1364
 
I'll do L1364 - want to see how the new i7-4930 does on linalg

fivemack 2013-11-27 14:22

L1364.dat.gz, upon decompression, contains quite a lot (several tens of thousands) of corrupted lines such as

[code]
111,7D209C2D5D1B7E7,2B151D834BAC35,305,7DF56DA12864B276B7B9B177,,885185C9,41170,EB92F7016B118131193570D6:1F47E723E291,37416D49,08A2B10A38ADDF,4FD5,95EE4B,9FA11305,7EB:1F591119,7117967,3AD711A7A38D648AA18-,2C06869562A7118:5E22501,1F43D00FB11A91D6B93907ADDB0EB5116DC99F771305,7EB:17054CB6EAB6B109159,15F78242CA25C63,18951AD32
[/code]

I'm sure the run will work despite this, but I'm a little curious as to how they got there.

This is not unprecedented, though usually there are fewer than tens of thousands of odd lines; it's led me to decompress the files before use rather than trusting msieve-with-libz to handle weird corruption perfectly.
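One way to guard against such corruption (my own sketch, assuming the usual lasieve relation format of a signed decimal a, a decimal b, and exactly two colon-separated lists of hexadecimal primes) is to screen lines with a strict pattern before filtering:

```python
import re

# Assumed lasieve/GGNFS relation line format:
#   a,b:hexprime,hexprime,...:hexprime,hexprime,...
RELATION_RE = re.compile(r'^-?\d+,\d+:[0-9A-Fa-f,]*:[0-9A-Fa-f,]*$')

def is_well_formed(line):
    """True if the line looks like a syntactically valid relation."""
    return bool(RELATION_RE.match(line.rstrip('\n')))
```

The corrupted lines quoted above (hex digits in the a,b field, extra colons, stray characters) all fail this check, while ordinary relations pass.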

ETA Sunday evening British time

RichD 2013-11-28 17:46

I'll take GC_12_223 next.

frmky 2013-11-29 05:46

[QUOTE=fivemack;360437]
I'm sure the run will work despite this, but I'm a little curious as to how they got there.

This is not unprecedented, though usually there are fewer than tens of thousands of odd lines; it's led me to decompress the files before use rather than trusting msieve-with-libz to handle weird corruption perfectly.[/QUOTE]
I usually do that as well. I no longer ask how some of the really weird stuff I've seen ended up in a BOINC result file...

RichD 2013-11-29 14:36

[QUOTE=RichD;360550]I'll take GC_12_223 next.[/QUOTE]

Just over a week. (171 hrs in LA). Hopefully next Friday.

fivemack 2013-12-01 19:44

L1364 done
 
[code]
Wed Nov 27 17:00:49 2013 matrix is 14215705 x 14215931 (5482.5 MB) with weight 1383940454 (97.35/col)
Wed Nov 27 17:00:49 2013 sparse part has weight 1295040980 (91.10/col)
Wed Nov 27 17:00:49 2013 using block size 8192 and superblock size 1179648 for processor cache size 12288 kB
Wed Nov 27 17:01:44 2013 commencing Lanczos iteration (6 threads)
Wed Nov 27 17:01:44 2013 memory use: 4679.6 MB
Wed Nov 27 17:02:22 2013 linear algebra at 0.0%, ETA 96h14m
Wed Nov 27 17:02:35 2013 checkpointing every 150000 dimensions
Sun Dec 1 17:09:31 2013 lanczos halted after 224812 iterations (dim = 14215702)
Sun Dec 1 17:41:15 2013 sqrtTime: 1891
Sun Dec 1 17:41:15 2013 prp93 factor: 268443246189049230127802830006228447047242512240190276186342184329021347724416601317119124087
Sun Dec 1 17:41:15 2013 prp159 factor: 231986812133163427239926954963988885761031781501478291209819616540855223852371089666846275271165863544023640572638351643361032428293227430884088709885365622023
[/code]

swellman 2013-12-04 02:02

GC_11_233 Factored
 
[code]prp68 factor: 35895779805368510262905302270945097533219424132313331736357875647227
prp154 factor: 2303868939859485226460351426842753409461161492806388716815220790142889684263829960396973181570701590884575430874126541526455519954436020342757782025929431[/code]

RichD 2013-12-05 00:43

[QUOTE=RichD;360616]Just over a week. (171 hrs in LA). Hopefully next Friday.[/QUOTE]

Minor mishap.
ETA now Sunday.

RichD 2013-12-08 19:08

Another mishap.
ETA is late Tuesday.
(I'll grab another one during the day.)

RichD 2013-12-10 16:23

I'll take GC_10_244 next.

RichD 2013-12-11 14:32

[QUOTE=RichD;360550]I'll take GC_12_223 next.[/QUOTE]

Splits as:
[CODE]prp86 factor: 24427213677866781415932492497514159497858972027647183874118658276835430435756667904063
prp100 factor: 4776137140925209925993192841640820987743784052397768741065151839113384289392438227758208944146294503
[/CODE]

xilman 2013-12-11 17:32

[QUOTE=RichD;361792]Splits as:
[CODE]prp86 factor: 24427213677866781415932492497514159497858972027647183874118658276835430435756667904063
prp100 factor: 4776137140925209925993192841640820987743784052397768741065151839113384289392438227758208944146294503
[/CODE][/QUOTE]Thanks Rich. Spotted it this time.

Paul

RichD 2013-12-13 19:40

[QUOTE=RichD;361677]I'll take GC_10_244 next.[/QUOTE]

A little over 5 days.
(ETA 128 hrs in LA.)

frmky 2013-12-14 01:37

The NFS@Home server has been upgraded significantly. Hopefully downloads will now be faster. Let me know if you discover any problems. Thanks!

swellman 2013-12-16 17:46

I'll take GC_10_243.

swellman 2013-12-18 03:09

[QUOTE=swellman;362205]I'll take GC_10_243.[/QUOTE]

LA started successfully. Should finish on Dec 27.

RichD 2013-12-19 03:23

GC_10_244
 
[QUOTE=RichD;361677]I'll take GC_10_244 next.[/QUOTE]

A three-way split:
[CODE]prp66 factor: 201761418575897363022970867994347221000379477899431800887073841771
prp69 factor: 292992981110644956801849884032870803325680869451258493705857928397237
prp97 factor: 4118062016894712294049710502267115496355080551429534490813175700967096807388471319719209210779617[/CODE]

RichD 2013-12-25 16:14

I'll take GC_10_242 next.

swellman 2013-12-25 20:46

I'll take 78883_239 next.

fivemack 2013-12-25 21:46

I'll take 48881_239: ETA Sunday evening

xilman 2013-12-26 12:58

[QUOTE=RichD;362440]A three-way split:
[CODE]prp66 factor: 201761418575897363022970867994347221000379477899431800887073841771
prp69 factor: 292992981110644956801849884032870803325680869451258493705857928397237
prp97 factor: 4118062016894712294049710502267115496355080551429534490813175700967096807388471319719209210779617[/CODE][/QUOTE]Thanks Rich.

According to my records, NFS@Home has three more GCW numbers outstanding: 10,242+ and 10,243+ are in LA and 10,245+ is still to start.


Paul

RichD 2013-12-26 14:45

[QUOTE=RichD;362878]I'll take GC_10_242 next.[/QUOTE]

Hopefully, factors on Monday.
ETA for LA - 97 hrs.

RichD 2013-12-26 15:00

[QUOTE=xilman;362939]... 10,245+ is still to start.[/QUOTE]

frmky came in through the back door and completed it when no one was looking. Check FDB.

xilman 2013-12-26 16:21

[QUOTE=RichD;362944]frmky came in through the back door and completed it when no one was looking. Check FDB.[/QUOTE]Oops!

You're quite right. He even mailed me the results which I processed some days ago. Unfortunately, I forgot to remove the reservation. Done now and will be uploaded shortly.

Paul :paul:

fivemack 2013-12-29 19:39

[QUOTE=fivemack;362891]I'll take 48881_239: ETA Sunday evening[/QUOTE]

[code]
prp106 factor: 4861530159150855932756567808583693311807781768790839007582926124074336347744534502907013252089369878582019
prp133 factor: 2717912422767138077837853341987868772408193085438772729186952050115202533119475146819380172585944335598836123722631096373754398047327
[/code]

fivemack 2013-12-29 19:56

I'll take 92221_237: ETA Friday evening

swellman 2013-12-29 22:45

[QUOTE=swellman;362205]I'll take GC_10_243.[/QUOTE]

[code]
prp92=24643820232009239441764469255577227296630050559475422398136903049264617376534693286381807269
prp141=493548569626046484800909415693178426809423090927136995032470665007586697589335852037474443244805627000780736435870165301724651953532947309583[/code]

RichD 2013-12-29 23:29

I'll take C158_3366_2103 next.

RichD 2013-12-30 15:52

GC_10_242 splits as:

[CODE]prp104 factor: 10734660940469282139734911760800142262203686793758836052680141316683441750603329678722138765602438716753
prp138 factor: 267328298428696239219201673192925932816272172960697272270016571854648490084359664519896880331777761027101980559782014705421725314950944449[/CODE]

xilman 2013-12-30 19:09

[QUOTE=RichD;363270]GC_10_242 splits as:

[CODE]prp104 factor: 10734660940469282139734911760800142262203686793758836052680141316683441750603329678722138765602438716753
prp138 factor: 267328298428696239219201673192925932816272172960697272270016571854648490084359664519896880331777761027101980559782014705421725314950944449[/CODE][/QUOTE]Yay!

IMAO, this is the largest co-factor yet found. Checking the progress files will doubtless show whether my opinion is valid.

Regardless, it's a nice result to sneak in before the end of the year.


Paul

RichD 2013-12-30 23:45

[QUOTE=RichD;363217]I'll take C158_3366_2103 next.[/QUOTE]

... has an awesome split:
[CODE]prp79 factor: 3975756220164876299557800518659436824002469344153151448036616608507116743895159
prp79 factor: 5008025306267014527749323654873770285974781356294243380587847235506414468982861[/CODE]

xilman 2013-12-31 07:35

[QUOTE=RichD;363316]... has an awesome split:
[CODE]prp79 factor: 3975756220164876299557800518659436824002469344153151448036616608507116743895159
prp79 factor: 5008025306267014527749323654873770285974781356294243380587847235506414468982861[/CODE][/QUOTE]Brilliant!

fivemack 2014-01-03 19:02

Taking F1237, eta Tuesday morning

swellman 2014-01-03 22:22

I'll take 76661_236 next.

RichD 2014-01-04 00:16

I'll take GW_3_487 next.
And start on GC_9_244 tomorrow.

swellman 2014-01-04 00:37

I'll also take GW_9_244 once it is ready for LA.

These 30 bit jobs should go quickly.

fivemack 2014-01-04 08:28

92221_237 splits as

[code]
Sat Jan 4 01:49:48 2014 prp106 factor: 1179063628337327539915078591222953890301260027519284628960080842009873310091438297668594171890357435736483
Sat Jan 4 01:49:48 2014 prp133 factor: 7821649316099304269560579640018398710349328715787996309040848207725532575674945993974591479951354962570980590119047288584704872585487
[/code]

fivemack 2014-01-04 18:01

May I take GC_5_333 ? (update: ETA Thursday afternoon)

frmky 2014-01-04 19:00

[QUOTE=fivemack;363783]92221_237 splits as

[code]
Sat Jan 4 01:49:48 2014 prp106 factor: 1179063628337327539915078591222953890301260027519284628960080842009873310091438297668594171890357435736483
Sat Jan 4 01:49:48 2014 prp133 factor: 7821649316099304269560579640018398710349328715787996309040848207725532575674945993974591479951354962570980590119047288584704872585487
[/code][/QUOTE]

That's your second P106 * P133. Keep it up! :smile:

jrk 2014-01-05 14:45

I'd like to reserve GW_10_233 please.

RichD 2014-01-05 19:26

GW_3_487 splits as:

[CODE]prp82 factor: 1170314028012780585726851302597496160000573925664480234237527410843217244780343409
prp131 factor: 18039812487520704301520084118889776324487023780089331527713858058891918849312441408100289033587392884928118983616250864950197287471[/CODE]

swellman 2014-01-05 23:10

[QUOTE=swellman;362886]I'll take 78883_239 next.[/QUOTE]

[code]
prp88 factor: 2546449501135960705315499562987520800025370304870525117117402230389773655271719772357233
prp151 factor: 6074500821302857426690581000554503610708139218667326713996930626410866333773471812395047536239795364305932456874027528398079782194033901234667562490001
[/code]

RichD 2014-01-06 17:49

GC_9_244 splits as:

[CODE]prp63 factor: 324819903042572690877938389549033182515857945680794580212896547
prp113 factor: 59415826705562741535743181250674227809897485523306694691882355145307205722384865631737407933360702672443076239647[/CODE]

fivemack 2014-01-07 08:01

F1237 splits as
[code]
Tue Jan 7 06:00:34 2014 prp61 factor: 5255385479501070778259142561142554147347457218557322783039977
Tue Jan 7 06:00:34 2014 prp128 factor: 21794098937808842372692530918946750232039336772881915706922655481175531618600957094849559912190833307550107894334150954947963817
[/code]

fivemack 2014-01-09 18:09

GC_5_333 splits as
[code]
Thu Jan 9 15:43:29 2014 prp74 factor: 23293492642346552326823306533375568432279161528366769356228709753899578997
Thu Jan 9 15:43:29 2014 prp100 factor: 1225721366571717479164536430500615726567584547341043044491159945559565665385807859588259183318495537
[/code]

Taking GC_8_258; eta Sunday afternoon

swellman 2014-01-09 22:11

76661_236 and GW_9_244 should both finish on Tuesday.

I'll take W_2_773 next if no one wants it.

