2006-02-03, 20:58   #6
xilman

Originally Posted by jocelynl
I'm talking about RAM size: analyzing the 100 million relations of the recent finds takes in excess of 2 GB of RAM. But then again, advances in technology may permit reaching higher limits.
OK, it's making a bit more sense, but I'm still not fully understanding your meaning.

For a start, it's certainly possible to analyze 100M relations in less than 2 GB of RAM, though I concede it can be tricky. I've developed processes that let me do it, as have other workers, but it does seem to require rather more effort than any of the simple approaches. For instance, duplicates and (most) singletons can be removed with a multipass algorithm in relatively modest amounts of memory. Merges can also be done in stages, each stage using a different range of large primes, though that can take quite a bit of time and labour.
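To illustrate the flavour of those two filtering steps, here is a minimal sketch (not the actual process described above, and not any real NFS tool's code). It assumes relations are plain strings from which the large primes can be parsed; the bucketed duplicate pass shows how only a fraction of the data needs to sit in memory at once, and the singleton pass iterates to a fixed point because removing one relation can create new singletons.

```python
import hashlib

def duplicate_pass(relations, n_buckets=4):
    """Multipass duplicate removal: hash each relation into one of
    n_buckets, then dedupe one bucket at a time.  Duplicates of a
    relation always hash to the same bucket, so each pass only needs
    roughly 1/n_buckets of the distinct relations in memory."""
    unique = []
    for b in range(n_buckets):
        seen = set()                      # per-bucket working set
        for rel in relations:
            h = int(hashlib.sha1(rel.encode()).hexdigest(), 16)
            if h % n_buckets != b:
                continue                  # handled in another pass
            if rel not in seen:
                seen.add(rel)
                unique.append(rel)
    return unique

def singleton_pass(relations, primes_of):
    """Drop relations containing a large prime that occurs in only
    one relation; repeat until no further relations are removed."""
    rels = list(relations)
    while True:
        counts = {}
        for rel in rels:
            for p in primes_of(rel):
                counts[p] = counts.get(p, 0) + 1
        kept = [r for r in rels
                if all(counts[p] > 1 for p in primes_of(r))]
        if len(kept) == len(rels):
            return kept
        rels = kept                       # removals may expose new singletons
```

In a real siever's output the relations would stream from disk and the buckets would be temporary files rather than in-memory passes, but the structure is the same: partition, dedupe per partition, then count large-prime occurrences and prune to a fixed point.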

I think what I'm really not understanding is your claim that NFS is running out of steam. There are many, many integers within reach of current implementations of NFS, far too many to factor with a reasonable amount of time and effort. So I guess you may mean that there are relatively few integers of the form 2^n-1 (and possibly 2^n+1) which are still feasible to factor by SNFS. Is that what you mean?
