#12
"Serge"
Mar 2008
Phi(4,2^7658614+1)/2
3⁶×13 Posts
I have found that BL running time isn't noticeably influenced by added density (I don't have enough data to say anything more definite); I used TARGET_DENSITY 90.0 for my last trick and will probably try 100 next time. It is, however, almost quadratic in the matrix side size. The resulting matrix is denser and smaller, but denser matrices in effect need more RAM, so there's a sweet spot somewhere (it may not be a problem for your servers to go farther than I can).
I wonder what will happen to your matrix if you set TARGET_DENSITY 120.0 or something? You may be able to cheat the filtering into making a reasonably dense matrix, and you will be fine even if you overshoot a bit. Obviously, you may also do well by decreasing the starting magic weight of 200 ("keeping 278950783 ideals with weight <= 200"), but this is not an interesting way out. The interesting part would be to tweak the full merge to work with weight 90. It usually works very well on input that has weight 20, 22 ... 30-40, I guess, but gets confused with too many at 90, right?
#13
Tribal Bullet
Oct 2004
110111010101₂ Posts
Increasing TARGET_DENSITY won't help; merging stopped only because there were no more ideals left to merge, since too many ideals were erroneously thrown away in the course of the merge phase. This wouldn't happen if you commented out the for() loop on lines 510-520 of common/filter/merge.c, but I don't know how bad the memory use will get in that case before merging stops. Alternatively, you can change the '100' in my fix above to something larger, maybe 1000, to delay the burying of dense ideals for longer. The goal is to throw away only ideals that will never be merged, without knowing how long merging is supposed to run. If a filtering run with less excess converged to a 12M matrix and we have an 18M matrix now, then the burying of ideals needs to be delayed about 50% longer.
#14
Jul 2003
So Cal
83A₁₆ Posts
I took out the for loop. Filtering with 579M relations took four weeks, used nearly 28 GB of memory, and produced a matrix that was still a bit larger than that produced using 490M relations. The log is attached.
I also let the original 16.5M matrix run to completion. As expected, it was unsuccessful. Code:
Wed Dec 30 14:05:18 2009  commencing linear algebra
Wed Dec 30 14:05:23 2009  read 16529372 cycles
Wed Dec 30 14:05:38 2009  matrix is 16529313 x 16529372 (2986.7 MB) with weight 901504482 (54.54/col)
Wed Dec 30 14:05:38 2009  sparse part has weight 601110202 (36.37/col)
Wed Dec 30 14:05:39 2009  saving the first 48 matrix rows for later
Wed Dec 30 14:05:52 2009  matrix is 16529265 x 16529372 (2841.6 MB) with weight 623067426 (37.69/col)
Wed Dec 30 14:05:53 2009  sparse part has weight 579619704 (35.07/col)
Wed Dec 30 14:05:53 2009  matrix includes 64 packed rows
Wed Dec 30 14:05:53 2009  using block size 65536 for processor cache size 2048 kB
Wed Dec 30 14:07:13 2009  commencing Lanczos iteration (8 threads)
Wed Dec 30 14:07:13 2009  memory use: 4009.4 MB
Wed Dec 30 14:07:14 2009  restarting at iteration 88533 (dim = 5598557)
Wed Dec 30 14:09:24 2009  linear algebra at 33.9%, ETA 520h 1m
Wed Jan 27 09:05:33 2010  lanczos halted after 261369 iterations (dim = 16528330)
Wed Jan 27 09:07:08 2010  lanczos error: only trivial dependencies found
#15
Tribal Bullet
Oct 2004
3,541 Posts
I wish there was something else we could try; thanks for pouring so much effort into this.
#16
Nov 2003
2²×5×373 Posts
Quote:
...fail as well when the matrix is this sparse.

Anytime the density falls below .001% I have had trouble. Here the density is ~ 37/16.5M ~ .00022%. I have found that more than 10% (or so) over-sieving is detrimental rather than beneficial.
#17
"Serge"
Mar 2008
Phi(4,2^7658614+1)/2
9477₁₀ Posts
Greg, the "max relations containing the same ideal" starts at 200 in all the versions you were using, and after all the clique removal it seems to me that it remains too generous. The last lines of clique removal say
Code:
Wed Jan 6 11:03:09 2010  begin with 135716320 relations and 78806502 unique ideals
Wed Jan 6 11:06:01 2010  reduce to 135716318 relations and 78798091 ideals in 2 passes
Wed Jan 6 11:06:01 2010  max relations containing the same ideal: 90
...the gap between relations and ideals remains too large, and the 2-way and full merges make too sparse a matrix. If you have some more energy to push this test, you may want to change just the 200 initial value to something like 120 or maybe even 80, and then expect it to arrive at a much smaller/denser set (e.g. 40M relations with 30M ideals, which is usual for this size of a task). I don't know if it will work, but it could be one last thing to try before wiping that directory. Only if you have time for this, of course.
#18
Tribal Bullet
Oct 2004
3,541 Posts
Actually there is so much excess in this case that the clique removal ran out of cliques to remove! Rather than move on to 3-way cliques and up, the code just gives up and uses the remaining excess during the merge phase. Starting with excess less than 200 might possibly salvage things; at least the merge phase produced a matrix of reasonable size (thanks for the confirmation that you really do need to throw away heavy columns early, though perhaps I throw them away too early).
It amazes me that having too many relations is something that can't be dealt with in an automated fashion. Even trying to throw the extra ones away automatically doesn't work :)
Similar Threads
| Thread | Thread Starter | Forum | Replies | Last Post |
| matrix needs more columns than rows | ryanp | Msieve | 2 | 2013-05-02 00:02 |
| matrix needs more columns than rows | wreck | Msieve | 7 | 2010-09-07 10:03 |
| Error: "matrix must have more columns than rows" | mdettweiler | Msieve | 10 | 2009-03-17 02:38 |
| BUG: Half-assigned exponent (blank columns) | sylvester | PrimeNet | 2 | 2008-10-28 21:32 |