Bizarre thing with sieving:
[code]found [U]9207128[/U] relations, need at least 9036311, proceeding with filtering ...
nfs: commencing msieve filtering 182406447322336820309560461032830305181092209591614813564635719276441105985665782458025218176998282272069123864041991524112937073
commencing relation filtering
estimated available RAM is 12019.7 MB
commencing duplicate removal, pass 1
error -15 reading relation 1566313
error -9 reading relation 3239082
found 3354104 hash collisions in 9207127 relations
added 55898 free relations
commencing duplicate removal, pass 2
[U]found 5904462 duplicates and 3358563 unique relations[/U]
memory use: 45.3 MB
reading ideals above 100000
commencing singleton removal, initial pass
memory use: 172.2 MB
reading all ideals from disk
memory use: 122.7 MB
keeping 6591825 ideals with weight <= 200, target excess is 20050
commencing in-memory singleton removal
begin with [U]3358563 relations[/U] and 6591825 unique ideals
reduce to 84 relations and 0 ideals in 5 passes
max relations containing the same ideal: 0
nfs: commencing algebraic side lattice sieving over range: 4054711 - 4094711[/code]
Since when are 2/3rds of the total rel count duplicates? This also means the C129 might take longer than expected.
[QUOTE=Dubslow;298124]Since when are 2/3rds of the total rel count duplicates?[/QUOTE]
This doesn't happen when done right. Have you stopped and restarted, or are there any other details you might share? You need to get B[SUP]2[/SUP]'s help and debug.
[QUOTE=Dubslow;298124]Bizarre thing with sieving:
[code]<reproduced below>[/code] Since when are 2/3rds of the total rel count duplicates? This also means the C129 might take longer than expected.[/QUOTE]
[QUOTE=Batalov;298129]This doesn't happen when done right. Have you stopped and restarted or any other details you may consider sharing? You need to get B[SUP]2[/SUP]'s help and debug.[/QUOTE]
Yes, I had stopped and restarted a few times during poly select (which hadn't previously caused problems) and twice during previous sieving, hence the funny bounds.
[code]found [U]9207128[/U] relations, need at least 9036311, proceeding with filtering ...
nfs: commencing msieve filtering 182406447322336820309560461032830305181092209591614813564635719276441105985665782458025218176998282272069123864041991524112937073
commencing relation filtering
estimated available RAM is 12019.7 MB
commencing duplicate removal, pass 1
[U]error -15 reading relation 1566313
error -9 reading relation 3239082[/U]
found 3354104 hash collisions in 9207127 relations
added 55898 free relations
commencing duplicate removal, pass 2
[U]found 5904462 duplicates and 3358563 unique relations[/U]
memory use: 45.3 MB
reading ideals above 100000
commencing singleton removal, initial pass
memory use: 172.2 MB
reading all ideals from disk
memory use: 122.7 MB
keeping 6591825 ideals with weight <= 200, target excess is 20050
commencing in-memory singleton removal
begin with [U]3358563 relations[/U] and 6591825 unique ideals
reduce to 84 relations and 0 ideals in 5 passes
max relations containing the same ideal: 0
nfs: commencing algebraic side lattice sieving over range: 4054711 - 4094711
total yield: 535430, q=4254739 (0.01050 sec/rel)

[I]<The above is from my previous post, and here's what's happened since. Of course, it's only now that I notice errors both above and below.>[/I]

found [U]9742558 relations, need at least 9036311[/U], proceeding with filtering ...
nfs: commencing msieve filtering 182406447322336820309560461032830305181092209591614813564635719276441105985665782458025218176998282272069123864041991524112937073
commencing relation filtering
estimated available RAM is 12019.7 MB
commencing duplicate removal, pass 1
[U]error -15 reading relation 1566313
error -9 reading relation 3239082[/U]
found 3553246 hash collisions in 9798455 relations
added 392 free relations
commencing duplicate removal, pass 2
[U]found 6339694 duplicates and 3459153 unique relations[/U]
memory use: 53.3 MB
reading ideals above 100000
commencing singleton removal, initial pass
memory use: 172.2 MB
reading all ideals from disk
memory use: 126.4 MB
keeping 6707590 ideals with weight <= 200, target excess is 20329
commencing in-memory singleton removal
begin with 3459153 relations and 6707590 unique ideals
reduce to 84 relations and 0 ideals in 5 passes
max relations containing the same ideal: 0
nfs: commencing algebraic side lattice sieving over range: 4094711 - 4134711
[/code]
It seems that it keeps reproducing the duplicate rels each time.
I did not [I]suggest[/I] stopping and restarting -- this is not "Windows". :rolleyes:
I asked if you had. Ok, you had. (I suspect that if you restart the process and the {equivalent of the .job} file is nonexistent, then sieving could start again from the initial q0 value, and as a result all rels from the 2nd run will be duplicates of the first one until the sievers get to a yet-unplowed range. But I don't know what yafu does. Ask Ben.)

Relations are not some magic smoke - if they are there, they will stay there, and of course if they were redundant, they will stay redundant. yafu (and msieve, which is inside it) doesn't rewrite the relations file with a non-redundant version of it.

The errors -9 and -15 are jammed lines in the file; they are harmless per se, but they will be reported over and over at each filtering; no surprise here.

The number of unique relations in your two passes does seem to be growing, though. If you don't know what to do, do nothing - it may plow through all by itself.
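For what it's worth, the two-pass duplicate removal in those filtering logs works roughly like the following sketch. This is only an illustration of the general technique (cheap hash pass, then exact confirmation), not msieve's actual code, and the relation strings are made up:

```python
# Illustrative two-pass duplicate removal, in the spirit of the log lines
# "found N hash collisions ... pass 2 ... found M duplicates".
# NOT msieve's actual code; relation strings below are made up.
from collections import Counter

def dedup(relations):
    # Pass 1: hash each relation into a bounded table; collisions are only
    # *candidate* duplicates, since distinct relations can share a bucket
    buckets = Counter(hash(r) % (1 << 20) for r in relations)
    hash_collisions = sum(c - 1 for c in buckets.values() if c > 1)
    # Pass 2: exact comparison confirms the true duplicate/unique split
    seen, unique = set(), []
    for r in relations:
        if r not in seen:
            seen.add(r)
            unique.append(r)
    duplicates = len(relations) - len(unique)
    return hash_collisions, duplicates, unique

rels = ["1,2:a,b", "3,4:c,d", "1,2:a,b", "5,6:e,f", "3,4:c,d"]
collisions, dups, uniq = dedup(rels)
print(dups, len(uniq))
```

The point of the second pass is that pass 1 only flags candidates; only the exact comparison can report the real duplicate and unique counts that show up in the log.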
[QUOTE=Batalov;298149]
(I may suspect that if you restart the process and the {equivalent of the .job} file was non existent, then the sieving could start from the initial q0 value and as a result all rels from the 2nd run will be duplications of the first one until the sievers will get to a yet unplowed range. But I don't know what yafu does. Ask Ben.)[/quote]
This wasn't the case.
[code]bill@Gravemind:~/yafu∰∂ yafu "nfs(182406447322336820309560461032830305181092209591614813564635719276441105985665782458025218176998282272069123864041991524112937073)" [U]-R[/U]
>> nfs: checking for job file - number in job file matches input
nfs: checking for data file
nfs: commencing NFS restart
nfs: previous data file found - commencing search for last special-q
line 0 = -4916171347,5713:7310311,2c20d,d2adf,da19d:753557,3874f39,1afb,56371,6181f,8a50f,4cd,3727f7
line 1 = -5103316103,6761:64c7953,172f927,81013,cd477:147a0ab,6afc657,1f75,5273,6b59,126f7,ca29b,3727f7
line 2 = 789282735,1489:3055,8e15,c99d,36a61,fed3d:5bfd9b,68836bb,758b,18c7d,16ff33,3727f7
line 3 = 2007379955,883:17a9fdd,7fccd9,2b049,5963dd:1a6551d,58bdbbf,1525,276b,43a5,10ccd,14bf3,3727f7
nfs: parsing special-q
parsing rat side spq from 789282735,1489:3055,8e15,c99d,36a61,fed3d:5bfd9b,68836bb,758b,18c7d,16ff33,3727f7
found fed3d
parsing alg side spq from 789282735,1489:3055,8e15,c99d,36a61,fed3d:5bfd9b,68836bb,758b,18c7d,16ff33,3727f7
found 3727f7
parsing rat side spq from 2007379955,883:17a9fdd,7fccd9,2b049,5963dd:1a6551d,58bdbbf,1525,276b,43a5,10ccd,14bf3,3727f7
found 5963dd
parsing alg side spq from 2007379955,883:17a9fdd,7fccd9,2b049,5963dd:1a6551d,58bdbbf,1525,276b,43a5,10ccd,14bf3,3727f7
found 3727f7
parsing rat side spq from -4916171347,5713:7310311,2c20d,d2adf,da19d:753557,3874f39,1afb,56371,6181f,8a50f,4cd,3727f7
found da19d
parsing alg side spq from -4916171347,5713:7310311,2c20d,d2adf,da19d:753557,3874f39,1afb,56371,6181f,8a50f,4cd,3727f7
found 3727f7
nfs: commencing gnfs on c129: 182406447322336820309560461032830305181092209591614813564635719276441105985665782458025218176998282272069123864041991524112937073
nfs: found 3239081 relations, continuing job at specialq = 3614711
found 3239081 relations, need at least 9036311, continuing with sieving ...
nfs: commencing algebraic side lattice sieving over range: 3614711 - 3654711
[/code]
The low end of the q range is in the middle of the q range it was working on when I stopped, so I presume it recorded where it stopped and was not doing duplicate work.

[QUOTE=Batalov;298149]Relations are not some magic smoke - if they are there, they will stay there, and of course if they were redundant, they will stay redundant. yafu (and msieve which is inside of it) don't rewrite the relations file with a non-redundant version of it.[/quote]
So what you're saying is that it doesn't remove the duplicates from the "total"? Why shouldn't that be done?

[QUOTE=Batalov;298149]The errors -9 and -15 are jammed lines in the file; they are harmles per se, but they will be reported over and over at each filtering; no surprise here.[/quote]
Okay, I didn't know the errors are persistent/will appear each time. Either way though, they still happened.

[QUOTE=Batalov;298149]The amount of relations in your two passes seems growing, though. If you don't know what to do, do nothing - it may plow through all by itself.[/QUOTE]
Yes, I suppose so. I'll try not to mess with it :razz:

I suppose I should also mention that I had set "qintsize: 300000" in nfs.job, and then later changed it to "qintsize: 200000". Knowing how these things work, this is probably the source of the problem, isn't it?

Edit: Third round of filtering, which picks up directly after the previous post's code block.
[code]nfs: commencing algebraic side lattice sieving over range: 4094711 - 4134711
total yield: 532657, q=4294723 (0.01082 sec/rel)
found 10275215 relations, need at least 9036311, proceeding with filtering ...
nfs: commencing msieve filtering 182406447322336820309560461032830305181092209591614813564635719276441105985665782458025218176998282272069123864041991524112937073
commencing relation filtering
estimated available RAM is 12019.7 MB
commencing duplicate removal, pass 1
error -15 reading relation 1566313
error -9 reading relation 3239082
read 10M relations
found 3778211 hash collisions in 10331504 relations
added 340 free relations
commencing duplicate removal, pass 2
found 6776760 duplicates and 3555084 unique relations
memory use: 53.3 MB
reading ideals above 100000
commencing singleton removal, initial pass
memory use: 188.2 MB
reading all ideals from disk
memory use: 130.0 MB
keeping 6816144 ideals with weight <= 200, target excess is 20618
commencing in-memory singleton removal
begin with 3555084 relations and 6816144 unique ideals
reduce to 84 relations and 0 ideals in 5 passes
max relations containing the same ideal: 0
nfs: commencing algebraic side lattice sieving over range: 4134711 - 4174711[/code]
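If the qintsize change is the culprit, the mechanism could look something like the sketch below: when a resume point is derived from stored range boundaries, changing the interval width between runs can make the new ranges re-cover q values that were already sieved. This is a hypothetical model of the bug for illustration only, not yafu's actual resume logic:

```python
# Hypothetical model of how changing qintsize between runs could make a
# resumed job re-sieve (and thus duplicate) already-covered special-q
# ranges. NOT yafu's actual resume logic; an illustration only.

def q_ranges(q0, qintsize, count):
    """Successive special-q intervals [start, end) of width qintsize."""
    return [(q0 + i * qintsize, q0 + (i + 1) * qintsize) for i in range(count)]

first_run = q_ranges(3_614_711, 300_000, 2)      # qintsize: 300000
# Suppose the restart resumes from the *start* of the last recorded range,
# but now with qintsize: 200000 - the narrower ranges land on different
# boundaries and re-cover q values the first run already sieved.
second_run = q_ranges(first_run[-1][0], 200_000, 3)

already_sieved_up_to = first_run[-1][1]
overlapping = [r for r in second_run if r[0] < already_sieved_up_to]
print(overlapping)
```

Every q in the overlapping intervals yields the same relations again, which would be consistent with the duplicate counts in the logs growing at each filtering pass.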
You will need Ben. -R parsing seems ok-ish if a bit verbosiousiousious.
The parameters could be to blame. If the lims or bits are too low, then you will never get enough relations. (If it were just "run once and then run the same range over again", that would be a lesser weevil - it would have righted itself - but if the parameters are off, then the siever will just tread water.)
Did you start with a certain number of threads and then reduce it afterwards (by modifying factMsieve)?
[QUOTE=Dubslow;298151]
I suppose I should also mention that I had set "qintsize: 300000" in nfs.job,[/QUOTE]
This is the source of the problem... see the yafu bugs thread. If you stop, remove that line, and restart again, you should breeze relatively quickly through to the finish.
LinAlg finally started. ETA 3.5hrs.
[QUOTE=Dubslow;297927]Hence the "thingy"; I guess I really meant that the 17 is persisting across lines [/QUOTE]
Hmm, I guess now I see that isn't really true either. :smile:

C131 i3143 is almost at t30. The C129 was a 61*68-digit split.

Edit: I'll take the C131 to t35 and then I'm outta here. (TF2!)
It is ready for NFS. Don't ask, don't tell.
Powered by vBulletin® Version 3.8.11
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.