#1
Jun 2012
11010102 Posts
This is rather irritating... using factmsieve.py, I had just finished sieving a range of 300k Q in about 40 core-hours (C110, using 13e), and due to a foolish "copied a GPU build of msieve to an incompatible system and ran it" issue, the script kept on oversieving without ever trying to build the (probably) decent-sized matrix, turning what should have been a 7.35M-relation set into 10.5M rels. That's a 43% oversieve... but what annoyed me even more was this:
The script ran -nc1 on the relation set, and within a few passes of singleton removal I was left with 111 RELATIONS. Just 111. WHAT? "Filtering wants 1M more relations." I'm not surprised... is this meant to happen, or is this a bug? I've now got to spend three days re-sieving... ugh. Ah well, maybe I should have kept a backup or something.
#2
"Serge"
Mar 2008
Phi(4,2^7658614+1)/2
36×13 Posts
Can you post the filtering log?
The fact that you have "only 111 relations left" suggests that you are trying to use relations that are incompatible with the polynomial:
- either you sieved repeatedly with the same initial q0 (i.e. over and over again),
- or you sieved with a wrong poly,
- or you filtered with a wrong poly (note: msieve uses the .fb file; it doesn't care what is in the .poly file).
In the log we will probably see what really happened.

Last fiddled with by Batalov on 2014-01-03 at 22:52
#3
Jun 2012
2·53 Posts
For some reason the log just isn't there (I'll check File History and see if something deleted it), but factmsieve shouldn't have made a boo-boo that big. I know I used the right poly and sieved different Q values - there were only ~10 "error {-11, -15} reading relation xxxxxx" messages - and the only script/execution of GGNFS was via factmsieve; I had no input into which Q to sieve, so I have no idea how it could have screwed up. I didn't delete any files either, or edit the .resume file that holds the Q values for each thread (running on three threads, BTW).
Any thoughts?

EDIT: Log file - it's 11 passes of singleton removal BTW, bad memory... Code:
Fri Jan 03 16:07:17 2014  Msieve v. 1.52 (SVN 939)
Fri Jan 03 16:07:17 2014  random seeds: f87c9ae0 a4ce8500
Fri Jan 03 16:07:17 2014  factoring 27497402623478819755861287638997142716239668544799922159987902398906410768984442135177909823618476535803680709 (110 digits)
Fri Jan 03 16:07:18 2014  searching for 15-digit factors
Fri Jan 03 16:07:19 2014  commencing number field sieve (110-digit input)
Fri Jan 03 16:07:19 2014  R0: -928246644395884074880
Fri Jan 03 16:07:19 2014  R1: 33104915023
Fri Jan 03 16:07:19 2014  A0: -14528625718358860692124437
Fri Jan 03 16:07:19 2014  A1: 11784324502903670452245
Fri Jan 03 16:07:19 2014  A2: -1210010951786695015
Fri Jan 03 16:07:19 2014  A3: -12950346227801
Fri Jan 03 16:07:19 2014  A4: 4733349044
Fri Jan 03 16:07:19 2014  A5: 39900
Fri Jan 03 16:07:19 2014  skew 18889.61, size 1.749e-010, alpha -6.607, combined = 1.046e-009 rroots = 3
Fri Jan 03 16:07:19 2014
Fri Jan 03 16:07:19 2014  commencing relation filtering
Fri Jan 03 16:07:19 2014  estimated available RAM is 4043.9 MB
Fri Jan 03 16:07:19 2014  commencing duplicate removal, pass 1
<snipped ~20 relation errors, only -11 and -15>
Fri Jan 03 16:08:11 2014  skipped 3 relations with b > 2^32
Fri Jan 03 16:08:11 2014  found 3042496 hash collisions in 10326173 relations
Fri Jan 03 16:08:28 2014  added 52723 free relations
Fri Jan 03 16:08:28 2014  commencing duplicate removal, pass 2
Fri Jan 03 16:09:04 2014  found 5453396 duplicates and 4925500 unique relations
Fri Jan 03 16:09:04 2014  memory use: 82.6 MB
Fri Jan 03 16:09:04 2014  reading ideals above 100000
Fri Jan 03 16:09:04 2014  commencing singleton removal, initial pass
Fri Jan 03 16:10:04 2014  memory use: 172.3 MB
Fri Jan 03 16:10:04 2014  reading all ideals from disk
Fri Jan 03 16:10:04 2014  memory use: 163.8 MB
Fri Jan 03 16:10:05 2014  keeping 6508049 ideals with weight <= 200, target excess is 36558
Fri Jan 03 16:10:05 2014  commencing in-memory singleton removal
Fri Jan 03 16:10:06 2014  begin with 4925500 relations and 6508049 unique ideals
Fri Jan 03 16:10:06 2014  reduce to 111 relations and 0 ideals in 11 passes
Fri Jan 03 16:10:06 2014  max relations containing the same ideal: 0
Fri Jan 03 16:10:06 2014  filtering wants 1000000 more relations
Fri Jan 03 16:10:06 2014  elapsed time 00:02:49

Last fiddled with by f1pokerspeed on 2014-01-03 at 23:03
#4
(loop (#_fork))
Feb 2006
Cambridge, England
191316 Posts
I'm afraid that the script didn't over-sieve, it just sieved the same region twice; so you have 10.5M relations in the file, but actually it's five million relations twice each.
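[A toy illustration of the point above - not msieve's actual implementation. Duplicate removal keys each relation by its (a, b) coordinate pair, so re-sieving the same special-q region just produces the same pairs again and they collapse back to one copy each:]

```python
# Toy model of NFS duplicate removal (NOT msieve's real code, which uses
# hashing over disk-sized datasets): a relation is identified by its
# (a, b) pair, so the same region sieved twice yields no new relations.

def remove_duplicates(relations):
    """Keep the first occurrence of each (a, b) pair, preserving order."""
    seen = set()
    unique = []
    for a, b in relations:
        if (a, b) not in seen:
            seen.add((a, b))
            unique.append((a, b))
    return unique

# Sieving a range twice: 10 "relations" on disk, only 5 distinct.
first_pass = [(a, 2 * a + 1) for a in range(5)]
relations = first_pass + first_pass          # same region sieved again
print(len(relations), len(remove_duplicates(relations)))   # 10 5
```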
#5
Jun 2012
10610 Posts
Even so, how does 5M relations reduce to 111? That seems a bit ridiculous - at that rate it would take absolutely forever to complete the factorization... or am I missing something?
#6
"Serge"
Mar 2008
Phi(4,2^7658614+1)/2
36×13 Posts
That? "reduce to 111 relations"?
That doesn't mean that you only have 111 relations. It only means that you don't have enough for the filtering algorithm to build a working set. Also note that this doesn't mean the other relations were "deleted" -- they are all still there.
#7
Account Deleted
"Tim Sorbera"
Aug 2006
San Antonio, TX USA
17·251 Posts
Quote:
This graph is probably wildly inaccurate, but helps to carry the idea. [graph]
In reality (to the best of my limited knowledge), there is some sort of relationship between relations that must be discovered. Something like the birthday paradox: as the number of relations grows, the number of these relationships grows quickly.

Last fiddled with by Mini-Geek on 2014-01-04 at 00:09
#8
Tribal Bullet
Oct 2004
354110 Posts
Yes, if you allow one algebraic and one rational large prime, then as the dataset grows the number of cycles increases approximately quadratically. The GNFS siever allows 2 or 3 large primes on each side, and the rate of cycle production increases extremely quickly once you have enough relations. The problem is that you get a threshold phenomenon: singleton removal will find the entire dataset almost useless until it is large enough, and then you are suddenly almost finished.
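[The threshold behaviour described above can be seen in a tiny simulation - a toy model, not msieve's filtering code, with made-up sizes (1000 "ideals", 4 ideals per "relation"): below the threshold the singleton cascade eats nearly the whole dataset, above it nearly everything survives.]

```python
# Toy demonstration of the singleton-removal threshold (NOT msieve code).
import random

def singleton_removal(relations):
    """Repeatedly drop any relation containing an ideal that appears in
    only one surviving relation, until no singletons remain."""
    rels = list(relations)
    while True:
        counts = {}
        for rel in rels:
            for ideal in rel:
                counts[ideal] = counts.get(ideal, 0) + 1
        kept = [r for r in rels if all(counts[i] > 1 for i in r)]
        if len(kept) == len(rels):
            return rels
        rels = kept

random.seed(1)
IDEALS = 1000            # toy stand-in for factor base + large primes

def sample(n):
    """n toy relations, each using 4 random ideals."""
    return [frozenset(random.sample(range(IDEALS), 4)) for _ in range(n)]

for n in (500, 1000, 2000, 4000):
    print(n, "relations ->", len(singleton_removal(sample(n))), "survive")
```

With these parameters the average ideal appears 4n/1000 times, so around n = 500 almost every relation gets swept away in the cascade, while by n = 4000 almost all of them survive - the same cliff the 4.9M-relation dataset in the log fell off.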
#9
Jun 2012
2×53 Posts
Thanks for the information everyone! I'll keep sieving and see how long I need to wait until this matrix can get built.
#10
Jun 2012
1528 Posts
So I finally got the matrix built with ~7.4M relations... but then -nc2 fails with a "submatrix is not invertible" error. Now all of my relations are gone. I assume I can recover them... but how?
Do I have to extract spairs.save.gz and rename it to <jobname>.dat? (I'm now using the 32-bit binary of v1.51 provided by Jeff Gilchrist, instead of the 64-bit v1.52 by Brian Gladman - hopefully this will fix the "submatrix is not invertible" error.)

Last fiddled with by f1pokerspeed on 2014-01-09 at 13:39
#11
I moo ablest echo power!
May 2013
29·61 Posts
Yes, you can unzip that file and rename it appropriately, and it will work fine. I've had to do the same thing myself.
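[A minimal sketch of that unzip-and-rename step, using only the Python standard library. The filenames come from the post above; `mynumber` is a placeholder, and your own job's naming may differ:]

```python
# Sketch of recovering the relation file from factmsieve's gzipped backup.
# Assumes the backup is named spairs.save.gz and msieve expects
# <jobname>.dat -- adjust both names for your actual job.
import gzip
import shutil

def restore_relations(backup="spairs.save.gz", jobname="mynumber"):
    """Stream-decompress the backup into <jobname>.dat and return its name."""
    target = jobname + ".dat"
    with gzip.open(backup, "rb") as src, open(target, "wb") as dst:
        shutil.copyfileobj(src, dst)   # streams in chunks, low memory use
    return target

# e.g. restore_relations(backup="spairs.save.gz", jobname="c110")
```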