mersenneforum.org > Factoring Projects > Msieve
2016-09-29, 19:11   #1
fivemack (Feb 2006, Cambridge, England)

A small point on filtering

The weight target used when doing the clique removal pass is a function of the amount of RAM available; running with the same target density on a 64GB machine and a 32GB machine gave me

64GB
Code:
Sun Sep 25 03:48:18 2016  commencing singleton removal, initial pass
Sun Sep 25 07:48:11 2016  memory use: 11024.0 MB
Sun Sep 25 07:48:15 2016  reading all ideals from disk
Sun Sep 25 07:50:46 2016  memory use: 12693.0 MB
Sun Sep 25 08:02:33 2016  commencing in-memory singleton removal
Sun Sep 25 08:08:32 2016  begin with 742388473 relations and 598583048 unique ideals
Sun Sep 25 09:36:22 2016  reduce to 506671424 relations and 341116535 ideals in 15 passes
Sun Sep 25 09:36:22 2016  max relations containing the same ideal: 37
Sun Sep 25 09:39:00 2016  reading ideals above 720000
Sun Sep 25 09:39:01 2016  commencing singleton removal, initial pass
Sun Sep 25 12:52:37 2016  memory use: 11024.0 MB
Sun Sep 25 12:52:39 2016  reading all ideals from disk
Sun Sep 25 12:59:10 2016  memory use: 22311.2 MB
Sun Sep 25 13:27:01 2016  keeping 414655273 ideals with weight <= 200, target excess is 2608246
Sun Sep 25 13:45:24 2016  commencing in-memory singleton removal
Sun Sep 25 13:48:07 2016  begin with 506671424 relations and 414655273 unique ideals
Sun Sep 25 14:23:57 2016  reduce to 506628374 relations and 414612223 ideals in 10 passes
Sun Sep 25 14:23:57 2016  max relations containing the same ideal: 200
Sun Sep 25 14:39:19 2016  removing 14788264 relations and 12866356 ideals in 2000000 cliques
and on a 32GB machine
Code:
Thu Sep 29 10:09:57 2016  commencing singleton removal, initial pass
Thu Sep 29 11:27:53 2016  memory use: 11024.0 MB
Thu Sep 29 11:27:53 2016  reading all ideals from disk
Thu Sep 29 11:28:17 2016  memory use: 12693.0 MB
Thu Sep 29 11:29:18 2016  commencing in-memory singleton removal
Thu Sep 29 11:30:14 2016  begin with 742388473 relations and 598583048 unique ideals
Thu Sep 29 11:39:51 2016  reduce to 506671424 relations and 341116535 ideals in 15 passes
Thu Sep 29 11:39:51 2016  max relations containing the same ideal: 37
Thu Sep 29 11:40:34 2016  reading ideals above 720000
Thu Sep 29 11:40:35 2016  commencing singleton removal, initial pass
Thu Sep 29 12:54:52 2016  memory use: 11024.0 MB
Thu Sep 29 12:54:52 2016  reading large ideals from disk
Thu Sep 29 12:59:17 2016  keeping 376967749 ideals with weight <= 20, target excess is 40295770
Thu Sep 29 13:06:02 2016  memory use: 14097.3 MB
Thu Sep 29 13:06:02 2016  commencing in-memory singleton removal
This meant the 32GB machine produced a rather larger matrix with td=144 than the 64GB machine had with td=128, which was somewhat confusing, particularly since the 64GB machine had used only 22GB for the reading-all-ideals phase. The reason is that all the parts of msieve that refer to ram_size actually compare against ram_size/2.
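A rough sketch of the behaviour described above (the function name and constants are illustrative, not msieve's actual code; only the halving of the declared RAM and the two cap values match the logs):

```python
# Hypothetical sketch: choose_weight_cap and the constants are
# illustrative names, not msieve's actual code.
DEFAULT_CAP = 200   # default max ideal weight (64 GB log)
REDUCED_CAP = 20    # cap the 32 GB machine fell back to

def choose_weight_cap(ideal_bytes, declared_ram_bytes):
    # the checks compare against ram_size/2, not ram_size
    budget = declared_ram_bytes // 2
    return DEFAULT_CAP if ideal_bytes <= budget else REDUCED_CAP

GB = 1 << 30
ideal_data = 23 * GB  # roughly the 22.3 GB "reading all ideals" phase

print(choose_weight_cap(ideal_data, 64 * GB))  # 200 on the 64 GB machine
print(choose_weight_cap(ideal_data, 32 * GB))  # 20 on the 32 GB machine
```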

Now that I've read the code, I can try to work around this with

Code:
screen /home/nfsworld/msieve-svn/trunk/msieve -v -nc1 "filter_mem_mb=65536 target_density=144"
but the ram_size/2 comparisons are there because later phases want to load two copies of the data, so on a 32G machine with no swap this fails.
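A back-of-the-envelope check of why the override is unsafe (the 22.3 GB figure is from the 64 GB log above; the factor of two is the two-copies assumption just stated, not a measured requirement):

```python
GB = 1 << 30
ideal_data = 22.3 * GB   # large-ideal pass memory use from the 64 GB log
declared   = 64 * GB     # what filter_mem_mb=65536 claims
physical   = 32 * GB     # what the machine actually has

# The ram_size/2 check passes, so the in-memory path is taken ...
print(ideal_data <= declared / 2)   # True
# ... but a later phase holding two copies overflows a swapless 32 GB box:
print(2 * ideal_data > physical)    # True
```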

(Why not just use the 64G machine for everything? Because it's the quad-Opteron I talked about elsewhere, and for the reasons discussed in that thread I didn't want to be using 22GB of its memory while timing linear algebra. Also, the 32G machine has a fast SSD and can do singleton removal in 74 rather than 193 minutes. I'm rerunning the filtering on the 64G machine.)

Last fiddled with by fivemack on 2016-09-30 at 09:26
2016-09-29, 19:22   #2
fivemack (Feb 2006, Cambridge, England)

A more interesting question: where does the default max ideal weight limit of 200 come from? Does it have to fit in a single byte for data-structure reasons? Since I got a large, slow matrix when it got set to 20, should I expect a smaller or at least a faster matrix if I set it to 250?
2016-10-05, 18:33   #3
jasonp (Oct 2004)

Just seeing this. The ram_size/2 figure is there to leave half your memory for non-filtering tasks; nothing in the code actually becomes larger or smaller based on the amount of RAM you declare usable. The figure is just compared against the size of one of the intermediate savefiles generated by the singleton removal, on the assumption that that's how much memory the clique removal and full merge will need.

The 200 figure was chosen to limit the explosion in memory use that comes from including very frequently occurring ideals in the filtering. Filtering would probably ignore those ideals anyway, so they just chew up space. You can get a reasonable but probably-too-light matrix with a target of 40.