mersenneforum.org > Factoring Projects > Msieve

2014-01-09, 23:49   #12
f1pokerspeed

And now I just get the "submatrix not invertible" error... dammit. The matrix builds fine. Maybe I'll have to use my 32-bit desktop - a measly Core 2 Duo compared to my laptop's Core i5 - so it might take a few more hours.
2014-01-09, 23:58   #13
Batalov

Quote:
Originally Posted by f1pokerspeed
the submatrix not invertible error...
...is a sign of an unstable computer. It may pass memtest, and it may even pass the P95 torture test, but the msieve LA (linear algebra) stage is stricter still. We used to joke that it would make a good torture test -- but one would also need a portable ("inflatable", cheaply constructible) matrix object to make it a practical stability tool.
2014-01-10, 02:10   #14
swellman

It could also be a bug present in some versions of msieve; I experienced this issue as well.

The solution was to roll back to v1.49 or to try a newer precompiled version. I've been using the newer version of msieve without problems for a few months.

Hope this helps.
2014-02-27, 01:33   #15
WraithX

Hello everyone,

I think I'm running into a similar problem, where msieve says it "wants 1000000 more" relations. I ran remdups4 on the dataset before passing it to msieve. Could someone look at the following output and let me know whether this is a normal-looking run, or whether I might be hitting some problem unknown to me?

I had sieved from 10M-400M by last December, then ran msieve -v -nc and got:
Quote:
Mon Dec 02 14:34:26 2013 commencing number field sieve (210-digit input)
<snip>
Mon Dec 02 14:34:26 2013 A5: 535643820
Mon Dec 02 14:34:26 2013 skew 380780879.24, size 1.056e-020, alpha -9.238, combined = 9.932e-016 rroots = 5
Mon Dec 02 14:34:26 2013
Mon Dec 02 14:34:26 2013 commencing relation filtering
Mon Dec 02 14:34:26 2013 estimated available RAM is 65521.9 MB
Mon Dec 02 14:34:26 2013 commencing duplicate removal, pass 1
Mon Dec 02 15:19:16 2013 found 15941610 hash collisions in 272725019 relations
Mon Dec 02 15:20:02 2013 added 122251 free relations
Mon Dec 02 15:20:02 2013 commencing duplicate removal, pass 2
Mon Dec 02 15:21:38 2013 found 14 duplicates and 272847256 unique relations
Mon Dec 02 15:21:38 2013 memory use: 788.8 MB
Mon Dec 02 15:21:38 2013 reading ideals above 196411392
Mon Dec 02 15:21:38 2013 commencing singleton removal, initial pass
Mon Dec 02 16:13:19 2013 memory use: 11024.0 MB
Mon Dec 02 16:13:20 2013 reading all ideals from disk
Mon Dec 02 16:13:24 2013 memory use: 6603.7 MB
Mon Dec 02 16:13:45 2013 commencing in-memory singleton removal
Mon Dec 02 16:14:06 2013 begin with 272847256 relations and 393539712 unique ideals
Mon Dec 02 16:14:44 2013 reduce to 684005 relations and 80466 ideals in 11 passes
Mon Dec 02 16:14:44 2013 max relations containing the same ideal: 4
Mon Dec 02 16:14:44 2013 reading ideals above 30000
Mon Dec 02 16:14:44 2013 commencing singleton removal, initial pass
Mon Dec 02 16:15:35 2013 memory use: 94.1 MB
Mon Dec 02 16:15:35 2013 reading all ideals from disk
Mon Dec 02 16:15:35 2013 memory use: 37.9 MB
Mon Dec 02 16:15:35 2013 commencing in-memory singleton removal
Mon Dec 02 16:15:35 2013 begin with 715377 relations and 4670109 unique ideals
Mon Dec 02 16:15:35 2013 reduce to 31 relations and 0 ideals in 3 passes
Mon Dec 02 16:15:35 2013 max relations containing the same ideal: 0
Mon Dec 02 16:15:35 2013 filtering wants 1000000 more relations
Mon Dec 02 16:15:35 2013 elapsed time 01:41:11
I have since finished sieving from 10M-600M and ran msieve -v -nc today and got:
Quote:
Wed Feb 26 10:28:45 2014 commencing number field sieve (210-digit input)
<snip>
Wed Feb 26 10:28:45 2014 A5: 535643820
Wed Feb 26 10:28:45 2014 skew 380780879.24, size 1.056e-020, alpha -9.238, combined = 9.932e-016 rroots = 5
Wed Feb 26 10:28:45 2014
Wed Feb 26 10:28:45 2014 commencing relation filtering
Wed Feb 26 10:28:45 2014 estimated available RAM is 65521.9 MB
Wed Feb 26 10:28:45 2014 commencing duplicate removal, pass 1
Wed Feb 26 11:25:45 2014 found 24621471 hash collisions in 342464660 relations
Wed Feb 26 11:26:31 2014 added 122251 free relations
Wed Feb 26 11:26:31 2014 commencing duplicate removal, pass 2
Wed Feb 26 11:29:18 2014 found 22 duplicates and 342586889 unique relations
Wed Feb 26 11:29:18 2014 memory use: 1257.5 MB
Wed Feb 26 11:29:18 2014 reading ideals above 246611968
Wed Feb 26 11:29:18 2014 commencing singleton removal, initial pass
Wed Feb 26 12:34:59 2014 memory use: 11024.0 MB
Wed Feb 26 12:35:00 2014 reading all ideals from disk
Wed Feb 26 12:35:05 2014 memory use: 8032.6 MB
Wed Feb 26 12:35:24 2014 commencing in-memory singleton removal
Wed Feb 26 12:35:44 2014 begin with 342586889 relations and 427878273 unique ideals
Wed Feb 26 12:36:31 2014 reduce to 1432001 relations and 281169 ideals in 16 passes
Wed Feb 26 12:36:31 2014 max relations containing the same ideal: 5
Wed Feb 26 12:36:32 2014 reading ideals above 30000
Wed Feb 26 12:36:32 2014 commencing singleton removal, initial pass
Wed Feb 26 12:37:45 2014 memory use: 188.3 MB
Wed Feb 26 12:37:45 2014 reading all ideals from disk
Wed Feb 26 12:37:45 2014 memory use: 79.6 MB
Wed Feb 26 12:37:46 2014 keeping 8098146 ideals with weight <= 200, target excess is 7315
Wed Feb 26 12:37:46 2014 commencing in-memory singleton removal
Wed Feb 26 12:37:46 2014 begin with 1441342 relations and 8098146 unique ideals
Wed Feb 26 12:37:46 2014 reduce to 32 relations and 0 ideals in 3 passes
Wed Feb 26 12:37:46 2014 max relations containing the same ideal: 0
Wed Feb 26 12:37:46 2014 filtering wants 1000000 more relations
Wed Feb 26 12:37:46 2014 elapsed time 02:09:02
Are there any different command line options I should try, or should I just focus on collecting more relations for now?

Here are the parameters I am using for sieving:
Code:
rlim: 500000000
alim: 500000000
lpbr: 32
lpba: 33
mfbr: 64
mfba: 96
rlambda: 2.7
alambda: 3.7
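For a feel of why a 32/33-bit job needs so many relations: filtering cannot converge until the relation count approaches the number of large prime ideals admitted by the lpbr/lpba bounds. A rough back-of-envelope sketch (my own illustration, using the prime number theorem approximation pi(x) ~ x/ln(x), which slightly undercounts):

```python
from math import log

def approx_prime_count(x):
    """Prime number theorem estimate: pi(x) ~ x / ln(x)."""
    return x / log(x)

lpbr, lpba = 32, 33  # large prime bounds from the job file above
rational = approx_prime_count(2 ** lpbr)   # rational-side large primes
algebraic = approx_prime_count(2 ** lpba)  # algebraic-side large primes
total = rational + algebraic
print(f"~{rational/1e6:.0f}M rational + ~{algebraic/1e6:.0f}M algebraic "
      f"= ~{total/1e6:.0f}M large prime ideals")
```

This lands near 570M, in the same ballpark as the ~500M minimum suggested later in the thread; the exact counts pi(2^32) + pi(2^33) come to just under 600M.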
2014-02-27, 02:04   #16
Batalov

Looks normal. (Never mind the "wants 1000000 more relations"; that is not a concrete number -- sort of like the proverbial "+100500" on some forums.)

Because you are using a 32/33-bit setup, you will probably need a minimum of 500M relations. You will have enough relations when the line "begin with XXX relations and YYY unique ideals" has XXX >~ YYY. (It may converge with XXX somewhat smaller than YYY; conversely, it may fail to converge with XXX somewhat larger than YYY.) Try to find some of frmky's similar logs to see examples of the convergence, e.g. the 10,770M c211 log.
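That "XXX vs YYY" rule of thumb can be checked mechanically against a filtering log. A minimal sketch in Python (my own illustration -- the regex assumes the exact msieve log wording shown in the quotes above, and the threshold is only the heuristic from this thread):

```python
import re

def filtering_status(log_text):
    """Find the last 'begin with XXX relations and YYY unique ideals'
    line in an msieve filtering log and compare relations to ideals."""
    pairs = re.findall(
        r"begin with (\d+) relations and (\d+) unique ideals", log_text)
    if not pairs:
        return None
    relations, ideals = map(int, pairs[-1])
    return {
        "relations": relations,
        "ideals": ideals,
        "ratio": relations / ideals,
        # Heuristic: filtering tends to converge once relations >~ ideals.
        "likely_enough": relations >= ideals,
    }

# Example using the numbers from the second log quoted above:
status = filtering_status(
    "begin with 342586889 relations and 427878273 unique ideals")
print(status["ratio"])          # ~0.80, still short of 1.0
print(status["likely_enough"])  # False
```

In a real run you would feed it the whole msieve.log; here the ratio of about 0.80 matches the log's verdict that more relations are needed.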
2014-02-27, 03:34   #17
WraithX

Thanks for the info, and for the reference to frmky's factorization. I found his log here.

Hopefully at this point I can extrapolate how much further I'll have to sieve to get to 500M relations (that's unique relations, right?). Unfortunately, remdups4 is telling me that about 33% of my relations are duplicates, or else I'd already be over 500M! Oh well, the search continues!

And then hopefully I can get to 500M before I reach Q=1000M (because that's where the sievers stop, right?).
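That extrapolation can be made concrete. A minimal sketch using the numbers stated in this thread (33% duplicates from remdups4, 500M unique-relation target); the assumption that the duplicate fraction stays fixed is mine -- in practice it tends to rise as the sieved Q-range widens, so treat this as a lower bound:

```python
def raw_relations_needed(unique_target, duplicate_fraction):
    """Raw relations to sieve if a fixed fraction are duplicates."""
    return unique_target / (1.0 - duplicate_fraction)

target = 500_000_000   # suggested minimum for 32/33-bit large primes
dup = 0.33             # duplicate rate reported by remdups4
raw = raw_relations_needed(target, dup)
print(f"~{raw/1e6:.0f}M raw relations")  # ~746M
```

So roughly 746M raw relations would be needed at a constant 33% duplicate rate -- comfortably inside the Q=1000M siever limit mentioned above, if yield holds up.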
2014-02-27, 12:38   #18
henryzz

If you reach the max Q of the standard binaries, you can switch to the newer (buggier) ones, assuming you run on Linux.
You will probably want to do a bit of oversieving in order to reduce the size of the matrix.

2014-02-27, 15:36   #19
swellman

Also check out a recent large factorization in the 3+ Cunningham thread. The number was sieved on both the rational and algebraic sides, FWIW.

Good luck with this factorization - it's an impressive project.