This is what I was worried about. It got to ~40 million relations and I end up with this:
[CODE]found 5162005 hash collisions in 40464281 relations
added 23 free relations
commencing duplicate removal, pass 2
found 4490571 duplicates and 35973733 unique relations
memory use: 197.2 MB
reading ideals above 720000
commencing singleton removal, initial pass
memory use: 1378.0 MB
reading all ideals from disk
memory use: 1282.0 MB
keeping 45221428 ideals with weight <= 200, target excess is 191371
commencing in-memory singleton removal
begin with 35973733 relations and 45221428 unique ideals
reduce to 4321 relations and 0 ideals in 17 passes[/CODE]
I assume I need to bump up the LPBA and add a sizable number of relations, yes?
If it reduced your relations down to almost nothing like that, could it be oversieved? Try again with a few thousand lines removed; it may help. I remember seeing a discussion about this in the past. It has never happened to me, though, so I am not sure.

To judge by the last line:
[code]reduce to 4321 relations and 0 ideals in 17 passes[/code]
You need about 30% more relations.

Chris
[QUOTE=wombatman;368306]This is what I was worried about. It got to ~40 million relations and I end up with this[/QUOTE]
With a 29/30 combination, you'd need about 55M unique relations. Right now you have 35M. Add another 20-25M relations before retrying for a matrix. Keep adding 5M relations until you succeed.
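That 55M figure tracks a common rule of thumb: the number of unique relations needed is roughly a constant fraction (often quoted near 0.7) of π(2^lpbr) + π(2^lpba), the count of primes below the rational and algebraic large-prime bounds. A minimal sketch of that estimate; the 0.7 factor and the x/ln x prime-count approximation are folklore assumptions here, not anything msieve itself computes:

```python
from math import log

def prime_count(x: float) -> float:
    """Rough prime-counting estimate: pi(x) ~ x / ln x."""
    return x / log(x)

def estimated_unique_relations(lpbr: int, lpba: int, factor: float = 0.7) -> float:
    """Heuristic target for unique relations given large-prime bounds in bits.

    The 0.7 factor is a community rule of thumb, not an msieve parameter.
    """
    return factor * (prime_count(2.0 ** lpbr) + prime_count(2.0 ** lpba))

# For a 29/30 job this lands in the mid-50-million range,
# consistent with the ~55M unique relations quoted above.
print(estimated_unique_relations(29, 30))
```

Since sieving typically produces 10-15% duplicates, the raw relation target is correspondingly higher than the unique count.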
Thanks everybody. I'll report back with any new results.

Hasn't completed yet, but it looks much better with more relations:
[CODE]reduce to 13949767 relations and 14732971 ideals in 29 passes
max relations containing the same ideal: 96
filtering wants 1000000 more relations[/CODE]
Thanks for the help, everybody.
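When babysitting a run like this, it can be handy to pull the "filtering wants N more relations" figure out of the log automatically so a script knows how much more sieving to queue. A minimal sketch; the pattern matches the log line quoted above, and the helper name is my own:

```python
import re
from typing import Optional

# Matches msieve filtering output such as
# "filtering wants 1000000 more relations"
WANTED_RE = re.compile(r"filtering wants (\d+) more relations")

def relations_still_wanted(log_text: str) -> Optional[int]:
    """Return the most recent 'more relations' count, or None if absent."""
    matches = WANTED_RE.findall(log_text)
    return int(matches[-1]) if matches else None

log = (
    "reduce to 13949767 relations and 14732971 ideals in 29 passes\n"
    "max relations containing the same ideal: 96\n"
    "filtering wants 1000000 more relations\n"
)
print(relations_still_wanted(log))  # 1000000
```

A wrapper script can loop: sieve another block of special-q, rerun filtering, and stop once this returns None.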
[url=http://homepage2.nifty.com/m_kamada/math/graphs.htm]This site[/url] has some good info on parameter selection.
There is a new thread in the Math forum about estimating the time to run GNFS that you might find interesting too. Try out Yafu as well.
Thanks for pointing those out. [STRIKE]A quick question on the Kamada graphs: for determining parameters, do you use the SNFS difficulty or the actual number of digits, at least for initial testing?[/STRIKE] [I]Edit: Never mind, I was being dumb. It looks like it goes with the difficulty, or at least Factmsieve does.[/I]
Also, the number I was working on ended up needing between 52M and 55M relations (I set minrels to 55M and it worked; at 52M, it needed more).
Currently working on an SNFS 206. Now at over 65M relations (54.5M unique) and getting:
[CODE]begin with 14655138 relations and 16282704 unique ideals
reduce to 13932177 relations and 15555212 ideals in 24 passes
max relations containing the same ideal: 184[/CODE]
Does this seem right?
I assume that isn't the first pass of singleton removal. You are close but aren't quite there yet.

Good deal. That was from the "in-memory singleton removal" step, so yes, I believe you're correct.
