mersenneforum.org > Factoring Projects > NFS@Home
Old 2016-01-14, 18:16   #595
fivemack
C221_118_81 now running.

Reserving C219_127_57 and C197_129_53 to run over the weekend
Old 2016-01-14, 20:10   #596
debrouxl
 

I'll attempt to queue the OP numbers posted by William, probably tomorrow.

15e would be more efficient in that range, but the 15e queue is full; 14e can usually handle GNFS tasks of difficulty 165-170, especially with a good poly. SNFS difficulty 250+ with a sextic polynomial on 14e is a stretch.

Old 2016-01-14, 20:48   #597
xilman

Quote:
Originally Posted by debrouxl View Post
I'll attempt to queue the OP numbers posted by William, probably tomorrow.

15e would be more efficient in that range, but the 15e queue is full; 14e can usually handle GNFS tasks of difficulty 165-170, especially with a good poly. SNFS difficulty 250+ with a sextic polynomial on 14e is a stretch.
Thanks. That indicates where I should devote GPU effort over the next month or two.

Paul
Old 2016-01-15, 09:07   #598
Dubslow

Quote:
Originally Posted by Dubslow View Post
Is there a particular reason for the use of gzip for compression? bzip2 has a rather better compression ratio (at a computational cost) while still being more-or-less widely available.
Quote:
Originally Posted by frmky View Post
The results are compressed by the BOINC clients before they are returned to the server. The server just checks that they are valid compressed relations then concatenates them into the file you download.
Furthermore, removing duplicates would also yield massive bandwidth savings. It would perhaps require decompressing first, unless someone extended remdups4 with zlib; in that case it would also become practical to use bzip2.

I'm bringing this up because I have substantially worse internet than I've had in years past; it took on the order of 12 hours to download 22 GiB of just over 400M relations, of which 63M were duplicates (and so a waste of bandwidth). Besides the connection being slower, its total bandwidth consumption is also monitored -- and 22 GiB is not insubstantial. (It definitely didn't help that I messed up and needed to do it *again* -- but that was my fault.)
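For what it's worth, the dedup pass itself is cheap to express. Here is a minimal Python sketch of what a server-side pass might look like, assuming the usual relation format where each line begins with an "a,b" key before the first colon (the filenames and function name are made up; remdups4 itself is C, and for ~400M relations you would want hashed fingerprints rather than a plain set):

```python
import gzip

def dedup_relations(in_path, out_path):
    """Stream a gzip'd relation file, drop duplicates, rewrite as gzip.

    A relation line looks like "a,b:<primes>:<primes>"; the leading
    "a,b" pair identifies the relation, so it serves as the dedup key.
    """
    seen = set()   # for real datasets, store 64-bit hashes to bound memory
    total = dups = 0
    with gzip.open(in_path, "rt") as fin, gzip.open(out_path, "wt") as fout:
        for line in fin:
            if line.startswith("#"):       # comment lines pass through
                fout.write(line)
                continue
            total += 1
            key = line.split(":", 1)[0]
            if key in seen:
                dups += 1
            else:
                seen.add(key)
                fout.write(line)
    return total, dups
```

Since Python's gzip module is zlib-backed, everything stays compressed on disk; doing something like this before concatenation would save both the duplicate bandwidth and a client-side remdups pass.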
Old 2016-01-15, 09:41   #599
fivemack
C221_118_81 done

Code:
Fri Jan 15 07:37:13 2016  p56 factor: 66604751882840716203190547146724002766120346665371044853
Fri Jan 15 07:37:13 2016  p165 factor: 489911686445026569674165955928329541002751620798887655641477577459794398906029747989127094102551808131327740546125837492604682107848229561628575767032878867858107417
12.8 hours for a 5.7M matrix on an E5-2650v2 with -t 7
Attached Files
File Type: txt C221_118_81-log.txt (21.6 KB, 77 views)
Old 2016-01-15, 15:21   #600
Xyzzy
 

Quote:
Originally Posted by Dubslow View Post
I'm bringing this up because I have substantially worse internet than I've had in years past; it took on the order of 12 hours to download 22 GiB of just over 400M relations, of which 63M were duplicates (and so a waste of bandwidth).
It takes us days, but we start downloading relations as soon as they start coming in. At first it is hard to keep up but things balance out nicely towards the end.

Code:
while true; do nice wget --continue --limit-rate=64k --user=rsals_data --password=***** http://escatter11.fullerton.edu/nfs_data/12_226_plus_7_226/12_226_plus_7_226.dat.gz; sleep 3600; done
Old 2016-01-15, 21:01   #601
VictordeHolland
 
1373_79_minus1 results

1373_79_minus1
Code:
p63 factor: 167155760887752250734824423685255209540133209961357160033907273
p126 factor: 525234033640980062974735094976085151972433095270924958325303347977625755049053805905136035990038690127663787469942009602039239
12.4M matrix with TD=110
about 109 h on all 4 cores of a 3770K
Attached Files
File Type: txt 1373_79_minus1.txt (18.5 KB, 76 views)
Old 2016-01-15, 21:12   #602
VictordeHolland
 

Quote:
Originally Posted by Dubslow View Post
Furthermore, removing duplicates would also yield massive bandwidth savings. It would perhaps require decompressing first, unless someone extended remdups4 with zlib; in that case it would also become practical to use bzip2.
Bandwidth is not an issue for me (90 Mbit theoretical; 50-80 Mbit in practice, depending on the time of day). But needless to say, if we can limit the download to uniques then I would support that.

However, in that case you can't check the duplicate ratio, unless the original relation count is stored in a table.
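If the server did record the raw count per dataset, recovering the ratio is one line of arithmetic. A trivial sketch (the function name is made up), using the numbers mentioned earlier in the thread:

```python
def duplicate_ratio(raw_count, unique_count):
    """Fraction of the raw relations that were duplicates."""
    if not 0 < unique_count <= raw_count:
        raise ValueError("counts must satisfy 0 < unique <= raw")
    return (raw_count - unique_count) / raw_count

# e.g. ~400M raw relations of which 63M were duplicates:
# duplicate_ratio(400_000_000, 337_000_000) -> 0.1575
```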


[edit]
I'll be on a skiing holiday from tomorrow till Saturday the 23rd. My machines will be off during that window.

Old 2016-01-15, 21:14   #603
pinhodecarlos
 

Quote:
Originally Posted by xilman View Post

These are the sub-S250 remainders:
226.37 7,265- C168
227.22 7,266- C173
227.27 8,249+ C178
227.27 8,249- C173
227.35 6,289- C160
227.44 2,746- C219
227.58 5,322+ C207
227.74 4,374- C215


Paul
Paul,

Let's add them to the queue to help you complete a project milestone. I can throw two machines at the post-processing.

Carlos

Old 2016-01-15, 21:49   #604
swellman
 

Quote:
Originally Posted by swellman View Post
Reserving 1847_71_minus1.
A nice triple.

Code:
prp54 factor: 751675369654905088678231667221559046734934416341718309
prp60 factor: 860344312225890325626692969152005944727762796494291088905929
prp113 factor: 97939282732986918883657823741022254384311180414447883360700495089044693365385695961569969049390093405927645342427
TD=98 (!), as attempts at 114 and 106 failed to build a matrix.
Attached Files
File Type: log msieve.log (15.5 KB, 66 views)
Old 2016-01-15, 21:50   #605
Dubslow

It shouldn't be hard for the server to track the original relation count before we download only uniques. Thanks for the tip, Mike -- I'll probably use that myself.