mersenneforum.org (https://www.mersenneforum.org/index.php)
-   NFS@Home (https://www.mersenneforum.org/forumdisplay.php?f=98)
-   -   Linear algebra reservations, progress and results (https://www.mersenneforum.org/showthread.php?t=20023)

Mini-Geek 2017-10-16 00:44

[QUOTE=swellman;469885]If not specified, the default target_density (TD) used by msieve is 70. You may want to restart msieve with a TD=116, then 112, etc until msieve successfully enters LA. Or drop TD in increments of 10. It’s up to you but it should be a bit faster than TD=70.

That said, a SNFS 263 is never going to be a one week job on a single machine. Good luck![/QUOTE]

Thanks! I'll try lowering it in increments and see what I can get. :smile:
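The restart-with-lower-TD advice above can be sketched as a small script. This is a hedged illustration only: the msieve invocation shown in the comment (`-nc1` with a quoted `target_density=` argument) is how msieve's filtering stage is commonly restarted, but check your own build's usage before relying on it.

```shell
# Sketch: generate a descending schedule of target_density values,
# starting at 116 and dropping in increments of 10 (per the advice above).
# The actual msieve run for each value would look something like:
#   msieve -v -nc1 "target_density=116"
td_schedule() {
  td=$1
  while [ "$td" -ge "$2" ]; do
    echo "$td"
    td=$((td - 10))
  done
}

for td in $(td_schedule 116 76); do
  echo "would run: msieve -v -nc1 \"target_density=$td\""
done
```

One would stop at the first value for which filtering succeeds and msieve enters LA.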

fivemack 2017-10-17 11:24

Taking L1321, on the basis of why not bite off more than I can chew in two places at once

:barbie:

swellman 2017-10-17 12:54

C222_143_73 Update
 
I finally got both data files for C222_143_73 combined, filtered and into LA. Job should be done by 3 Nov. The split approach certainly works but it’s fairly unwieldy. And the algebraic file really didn’t add a lot of unique relations to the pot. Combined it was 610M raw/351M unique relations. I’ll post more details when I report the factors.

ETA: Fib 1301 should be done on 20 Nov. 6 weeks to run that job!

swellman 2017-10-17 13:11

C202_137_75 -14e/33 job
 
I think there is a problem with this data file. I’ve downloaded it twice now, with the same results both times. Many, many errors in filtering, and eventually msieve crawls to a halt (but doesn’t crash) while reading relations, somewhere in the high 470Ms.

Remdups doesn’t seem to play well with it either, refusing to hash over 100M relations. These then seem to be discarded by remdups. Remdups also seems to need a DIM of at least 3000, but the highest I can use on my 32 GB machine without an “out of memory” error is DIM=780. Not sure why that is - memory doesn’t seem to be a problem, at least according to the Win 10 task manager.

Remdups issues aside, I have always been able to filter data with msieve the old fashioned way, but not in this case.

Any advice?

wombatman 2017-10-17 13:28

[QUOTE=swellman;469994]I think there is a problem with this data file. I’ve downloaded it twice now, with the same results both times. Many, many errors in filtering, and eventually msieve crawls to a halt (but doesn’t crash) while reading relations, somewhere in the high 470Ms.

Remdups doesn’t seem to play well with it either, refusing to hash over 100M relations. These then seem to be discarded by remdups. Remdups also seems to need a DIM of at least 3000, but the highest I can use on my 32 GB machine without an “out of memory” error is DIM=780. Not sure why that is - memory doesn’t seem to be a problem, at least according to the Win 10 task manager.

Remdups issues aside, I have always been able to filter data with msieve the old fashioned way, but not in this case.

Any advice?[/QUOTE]

I appear to be having a similar issue with the C181_HP2 file. I get to ~187M relations or so, and then msieve basically just churns: the CPU is occupied as expected, but there is no disk activity related to reading in relations. I haven't tried remdups, but I did redownload the relations file and try again, without any success.

fivemack 2017-10-17 14:29

I had a similar issue with L1307 - at about 130 million relations in, msieve just hung.

I sorted this out by using 'grep' to filter valid lines out of the file

[code]
egrep '^[-0-9]+,[0-9]+:[0-9a-fA-F,]+:[0-9a-fA-F,]+$' msieve.dat.dirty > msieve.dat.clean
[/code]

which does of course require you to have free disc space equal to about the size of the original uncompressed file.

The files produced by nfs@home do have some peculiarities - there seem to be chunks of gzipped data, some relatively large, in the middle of the result of gunzipping the file.
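As a quick sanity check, the pattern in the egrep above accepts lines of the form `a,b:hex:hex` - an integer pair (the first possibly negative), then two colon-separated comma lists of hex values. The sample line below is made up for illustration; it is not from a real relations file.

```shell
# Keep only lines shaped like a relation: "a,b" pair, then two
# colon-separated lists of hex values. Anything else is dropped.
# grep -E is equivalent to the egrep used above.
pattern='^[-0-9]+,[0-9]+:[0-9a-fA-F,]+:[0-9a-fA-F,]+$'
printf '%s\n' '-12345,67:1a2b,3c:4d,5e6f' 'corrupted junk' \
  | grep -E "$pattern"
```

Only the first (well-formed) line passes the filter; the junk line is discarded.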

wombatman 2017-10-17 16:26

Do I need to decompress the relations file before doing your grep cleaning step?

fivemack 2017-10-17 16:32

[QUOTE=wombatman;470007]Do I need to decompress the relations file before doing your grep cleaning step?[/QUOTE]

Yes, egrep doesn't work on compressed data. But if you're on a Unix machine, gunzip -c msieve.dat.gz | egrep ... | gzip -c will filter the compressed relations file on the fly and recompress the results as they come out, which ought to save quite a lot of disc space.
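A self-contained demonstration of that pipeline, run here on a tiny synthetic file (two valid lines and one corrupt one) rather than a real msieve.dat.gz:

```shell
# Build a tiny gzipped sample, filter it on the fly, recompress the result.
# In real use the input would be the downloaded relations file.
printf '%s\n' '-1,2:ab:cd' 'corrupt line' '3,4:e,f:0a' | gzip -c > sample.gz
gunzip -c sample.gz \
  | grep -E '^[-0-9]+,[0-9]+:[0-9a-fA-F,]+:[0-9a-fA-F,]+$' \
  | gzip -c > sample.clean.gz
gunzip -c sample.clean.gz | wc -l    # the two valid lines survive
rm -f sample.gz sample.clean.gz
```

Because everything streams through pipes, at no point does the uncompressed file need to exist on disc in full.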

swellman 2017-10-17 23:24

[QUOTE=fivemack;469999]I had a similar issue with L1307 - at about 130 million relations in, msieve just hung.

I sorted this out by using 'grep' to filter valid lines out of the file

[code]
egrep '^[-0-9]+,[0-9]+:[0-9a-fA-F,]+:[0-9a-fA-F,]+$' msieve.dat.dirty > msieve.dat.clean
[/code]

which does of course require you to have free disc space equal to about the size of the original uncompressed file.

The files produced by nfs@home do have some peculiarities - there seem to be chunks of gzipped data, some relatively large, in the middle of the result of gunzipping the file.[/QUOTE]

Houston, we have a problem. Acting on Fivemack’s expert advice, I finally got the above procedure to work on my Win 7 box (via Cygwin), and it chewed on the file for less than 10 minutes. The uncompressed “dirty” file was 72 GB; the new clean data file was 2.4 GB. Feeding it into msieve revealed barely 20M relations residing in the data file, and filtering quickly terminated. :max:

Either I need something else in my egrep syntax, or that data file is in bad shape.

Where to go from here?

frmky 2017-10-17 23:28

[QUOTE=swellman;470027]Houston, we have a problem. Acting on Fivemack’s expert advice, I finally got the above procedure to work on my Win 7 box (via Cygwin), and it chewed on the file for less than 10 minutes. The uncompressed “dirty” file was 72 GB; the new clean data file was 2.4 GB. Feeding it into msieve revealed barely 20M relations residing in the data file, and filtering quickly terminated. :max:

Either I need something else in my egrep syntax, or that data file is in bad shape.

Where to go from here?[/QUOTE]

Looking at it now...

Edit: now download the "clean" version and try again.

frmky 2017-10-18 07:24

[QUOTE=wombatman;469996]I appear to be having a similar issue with the C181_HP2 file. I get to ~187M relations or so, and then msieve basically just churns: the CPU is occupied as expected, but there is no disk activity related to reading in relations. I haven't tried remdups, but I did redownload the relations file and try again, without any success.[/QUOTE]

Try downloading the "clean" version now on the website.
