mersenneforum.org Linear algebra reservations, progress and results

2017-10-16, 00:44   #2190
Mini-Geek
Account Deleted

"Tim Sorbera"
Aug 2006
San Antonio, TX USA

17·251 Posts

Quote:
 Originally Posted by swellman If not specified, the default target_density (TD) used by msieve is 70. You may want to restart msieve with a TD=116, then 112, etc until msieve successfully enters LA. Or drop TD in increments of 10. It’s up to you but it should be a bit faster than TD=70. That said, a SNFS 263 is never going to be a one week job on a single machine. Good luck!
Thanks! I'll try it in lower increments and see what I can get.
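In case it helps anyone scripting this, the retry-at-decreasing-TD idea can be sketched as a shell loop. The `-nc1 "target_density=..."` form is how msieve takes filtering arguments as far as I recall; here the binary is stubbed out so only the control flow is shown, and the TD values are illustrative:

```shell
# Stub standing in for the real msieve binary, so the loop below is
# runnable as an illustration. Pretend filtering only succeeds at TD=100.
msieve() {
  case "$2" in *=100) return 0 ;; *) return 1 ;; esac
}

# Try descending target densities until filtering succeeds and LA can start.
for td in 116 110 100 90 80 70; do
  if msieve -nc1 "target_density=$td"; then
    echo "filtering succeeded at TD=$td"
    break
  fi
done
```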

2017-10-17, 11:24   #2191
fivemack
(loop (#_fork))

Feb 2006
Cambridge, England

6,323 Posts

Taking L1321, on the basis of "why not bite off more than I can chew in two places at once".
2017-10-17, 12:54   #2192
swellman

Jun 2012

2²·3·241 Posts

C222_143_73 Update

I finally got both data files for C222_143_73 combined, filtered and into LA. The job should be done by 3 Nov.

The split approach certainly works, but it's fairly unwieldy, and the algebraic file really didn't add many unique relations to the pot. Combined, it was 610M raw/351M unique relations. I'll post more details when I report the factors.

ETA: Fib 1301 should be done on 20 Nov. Six weeks to run that job!

Last fiddled with by swellman on 2017-10-17 at 12:56 Reason: Fib 1301 status
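For reference, the combine-and-dedup step described above can be sketched in the shell. This is just an exact-duplicate removal with `sort -u` on toy data (real jobs use hashing tools like remdups instead); the filenames and "relations" are made up:

```shell
# Toy illustration of merging two sieve output files and counting
# raw vs unique relations. Filenames and relation lines are placeholders.
printf '%s\n' '1,2:ab:cd' '3,4:ef:12' '5,6:34:56' > part1.dat
printf '%s\n' '3,4:ef:12' '5,6:34:56' '7,8:9a:bc' > part2.dat

cat part1.dat part2.dat > combined_raw.dat
sort -u combined_raw.dat > combined_unique.dat

raw=$(wc -l < combined_raw.dat)
uniq_ct=$(wc -l < combined_unique.dat)
echo "raw=$raw unique=$uniq_ct"
```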
2017-10-17, 13:11   #2193
swellman

Jun 2012

2²·3·241 Posts

C202_137_75 -14e/33 job

I think there is a problem with this data file. I've downloaded it twice now, with the same results both times: many, many errors in filtering, and eventually msieve crawls to a halt (but doesn't crash) while reading relations, somewhere in the high 470M's.

Remdups doesn't seem to play well with it either, refusing to hash over 100M relations, which it then seems to discard. Remdups also seems to need a DIM of at least 3000, but the highest I can use on my 32 GB machine without an "out of memory" error is DIM=780. Not sure why that is; memory doesn't seem to be a problem, at least according to the Win 10 task manager.

Remdups issues aside, I have always been able to filter data with msieve the old-fashioned way, but not in this case. Any advice?
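For anyone hitting this, one cheap corruption check before handing a file to msieve is to count how many lines actually match the usual GGNFS/msieve text format, `a,b` followed by two colon-separated hex prime lists. A minimal sketch on made-up sample data (the filename and relations are invented):

```shell
# Build a tiny sample file: two well-formed relations plus one junk line
# of the kind that makes filtering choke.
printf '%s\n' '-123456,789:1a2b,3c4d:5e6f' > msieve.dat
printf '%s\n' 'corrupted @@ line with no colons' >> msieve.dat
printf '%s\n' '98765,4321:abc:def0,1234' >> msieve.dat

total=$(wc -l < msieve.dat)
valid=$(egrep -c '^-?[0-9]+,[0-9]+:[0-9a-fA-F,]+:[0-9a-fA-F,]+$' msieve.dat)
echo "total=$total valid=$valid bad=$((total - valid))"
```

A large gap between the raw line count and the valid count points at the data file rather than at msieve.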
2017-10-17, 13:28   #2194
wombatman
I moo ablest echo power!

May 2013

6C7₁₆ Posts

Quote:
 Originally Posted by swellman I think there is a problem with this data file. I’ve downloaded it twice now, with the same results both times. Many, many errors in filtering, and eventually msieve crawls to a halt (but doesn’t crash) during reading relations, somewhere in the high 470M’s. Remdups doesn’t seem to play well with it either, refusing to hash over 100M relations, which it then seems to discard. Remdups also seems to need a DIM of at least 3000 but the highest I can use on my 32 GB machine without an “out of memory” error is DIM=780. Not sure why that is - memory doesn’t seem to be a problem at least according to the Win 10 task manager. Remdups issues aside, I have always been able to filter data with msieve the old-fashioned way, but not in this case. Any advice?
I appear to be having a similar issue with the C181_HP2 file. I get to ~187M relations or so, and then msieve basically just appears to churn: the CPU is occupied as expected, but there is no disk activity at all related to reading in relations. I haven't tried remdups, but I did redownload the relations file and try again, without any success.

Last fiddled with by wombatman on 2017-10-17 at 13:29

2017-10-17, 14:29   #2195
fivemack
(loop (#_fork))

Feb 2006
Cambridge, England

6,323 Posts

I had a similar issue with L1307 - at about 130 million relations in, msieve just hung. I sorted this out by using 'grep' to filter the valid lines out of the file:

Code:
egrep '^[-0-9]+,[0-9]+:[0-9a-fA-F,]+:[0-9a-fA-F,]+$' msieve.dat.dirty > msieve.dat.clean

which does of course require you to have free disc space equal to about the size of the original uncompressed file.

The files produced by nfs@home do have some peculiarities - there seem to be chunks of gzipped data, some relatively large, in the middle of the result of gunzipping the file.

Last fiddled with by fivemack on 2017-10-17 at 14:34

2017-10-17, 16:26   #2196
wombatman
I moo ablest echo power!

May 2013

5·347 Posts

Do I need to decompress the relations file before doing your grep cleaning step?

2017-10-17, 16:32   #2197
fivemack
(loop (#_fork))

Feb 2006
Cambridge, England

6,323 Posts

Quote:
 Originally Posted by wombatman Do I need to decompress the relations file before doing your grep cleaning step?
Yes, egrep doesn't work on compressed data. But if you're on a Unix machine,

Code:
gunzip -c msieve.dat.gz | egrep ... | gzip -c

will filter the compressed relations file on the fly and recompress the results as they come out, which ought to save quite a lot of disc space.

Last fiddled with by fivemack on 2017-10-17 at 16:33

2017-10-17, 23:24   #2198
swellman

Jun 2012

2892₁₀ Posts

Quote:
 Originally Posted by fivemack I had a similar issue with L1307 - at about 130 million relations in, msieve just hung. I sorted this out by using 'grep' to filter valid lines out of the file: egrep '^[-0-9]+,[0-9]+:[0-9a-fA-F,]+:[0-9a-fA-F,]+$' msieve.dat.dirty > msieve.dat.clean which does of course require you to have free disc space equal to about the size of the original uncompressed file. The files produced by nfs@home do have some peculiarities - there seem to be chunks of gzipped data, some relatively large, in the middle of the result of gunzipping the file.
Houston, we have a problem. Acting on Fivemack’s expert advice, I finally got the above procedure to work on my Win 7 box (via Cygwin), and it chewed on the file for less than 10 minutes. The uncompressed “dirty” file was 72 GB; the new clean data file was 2.4 GB. Feeding it into msieve revealed barely 20M relations residing in the data file, and filtering quickly terminated.

Either I need something else in my egrep syntax, or that data file is in bad shape.

Where to go from here?
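One sanity check worth running (a sketch, not a definitive diagnosis): look at a few of the lines the pattern is rejecting, and scan the "decompressed" file for the gzip magic bytes 0x1f 0x8b, which would confirm embedded compressed chunks rather than a problem with the egrep syntax. The filename and sample contents here are invented:

```shell
# Build a tiny stand-in for msieve.dat.dirty: one good relation plus a
# fake embedded-gzip fragment (magic bytes 0x1f 0x8b as octal escapes).
printf '12,34:ab,cd:ef\n' > msieve.dat.dirty
printf '\037\213\010 pretend compressed chunk\n' >> msieve.dat.dirty

# Eyeball a few of the lines the cleaning pattern rejects.
egrep -av '^-?[0-9]+,[0-9]+:[0-9a-fA-F,]+:[0-9a-fA-F,]+$' msieve.dat.dirty | head -5

# Count lines containing the gzip magic bytes; a nonzero count suggests
# embedded gzip members that need decompressing separately.
magic=$(LC_ALL=C grep -ac "$(printf '\037\213')" msieve.dat.dirty)
echo "lines with gzip magic: $magic"
```

If the magic-byte count is large, the 72 GB file is mostly still-compressed data, which would explain why only 2.4 GB of plain-text relations survived the filter.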

2017-10-17, 23:28   #2199
frmky

Jul 2003
So Cal

2²·3³·19 Posts

Quote:
 Originally Posted by swellman Houston, we have a problem. Acting on Fivemack’s expert advice, I finally got the above procedure to work on my Win 7 box (via Cygwin) and it chewed on the file for less than 10 minutes. The uncompressed “dirty” file was 72 GB, the new clean data file was 2.4 GB. Feeding it into msieve revealed barely 20M relations residing in the data file, and filtering quickly terminated. Either I need something else in my egrep syntax, or that data file is in bad shape. Where to go from here?
Looking at it now...

Last fiddled with by frmky on 2017-10-18 at 04:18

2017-10-18, 07:24   #2200
frmky

Jul 2003
So Cal

2²·3³·19 Posts

Quote:
 Originally Posted by wombatman I appear to be having a similar issue with the C181_HP2 file. I get to ~187M or so and then msieve basically just appears to churn where the CPU is occupied as expected but there is no disk activity at all related to reading in relations. I haven't tried remdups, but I did redownload the relations file and try again without any success.

