mersenneforum.org Brent's p-1 - How to deal with memory problems?

 2005-01-11, 12:10 #1
jhillenb

Jan 2005

2 Posts

Brent's p-1 - How to deal with memory problems?

I implemented the p-1 method according to Hans Riesel's "Prime Numbers and Computer Methods for Factorization". First I computed a table of primes and prime powers below some B1. I replaced each prime power p^n with p. I saved this table in a text file. Next I computed a table of primes between B1 and some B2 (according to Riesel, B2 = 20*B1 would be OK). I saved the half differences of those primes in another text file. For p-1 I load those text files into two vectors and can then factorize every N = \prod p_i^{n_i} \cdot r where p_i < B1 and B1 < r < B2, with p_i, r prime.

Everything works OK; it's just the size of those text files that worries me. For B1 = 2e7, B2 = 4e8 they are 10MB resp. 45MB, which is still fine. However, I read in this forum something about B1 = 4e9, B2 = 1e13. THAT made me doubt that my implementation is the right one. Please let me know how to handle such large bounds B1, B2. Saving the primes in binary can't be that much more effective, or can it? Computing the prime tables each time I want to factorize shouldn't be the solution either. Please answer.
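For concreteness, stage 1 as described above (raise a base to the product of prime powers p^k <= B1, then take a gcd) can be sketched as follows. All names are mine and the code is illustrative, not jhillenb's implementation.

```python
from math import gcd

def small_primes(limit):
    """Simple sieve of Eratosthenes returning all primes <= limit."""
    sieve = bytearray([1]) * (limit + 1)
    sieve[0:2] = b"\x00\x00"
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = bytearray(len(sieve[p * p::p]))
    return [i for i, is_p in enumerate(sieve) if is_p]

def p_minus_1_stage1(n, b1, base=2):
    """Return a nontrivial factor of n if stage 1 finds one, else None."""
    a = base
    for p in small_primes(b1):
        # Replace p by the largest power p^k <= B1, as the post describes.
        pk = p
        while pk * p <= b1:
            pk *= p
        a = pow(a, pk, n)
    g = gcd(a - 1, n)
    return g if 1 < g < n else None
```

For example, with n = 101·107 = 10807 and B1 = 25 this finds the factor 101, since 101 - 1 = 2^2·5^2 is 25-smooth while 107 - 1 = 2·53 is not.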
2005-01-11, 16:40   #2
xilman
Bamboozled!

May 2003
Down not across

2761₁₆ Posts

Quote:
 Originally Posted by jhillenb I implemented the p-1 method according to Hans Riesel's "Prime Numbers and Computer Methods for Factorization". ... Computing the primetables each time I want to factorize shouldn't be a solution too.
Why not? It's what most implementors do. In fact, they generally compute the primes as they go, and do not bother storing them in a table.

In fact, for the primes in the range B1<p<B2, we don't usually worry too much about whether the number is prime. Putting in composites doesn't damage the chances of finding a factor, and it's often faster to use a composite than it is to discover that it's not prime and then discard it. The usual approach is to drop multiples of 2 and 3, as they are easy to deal with, and use every remaining odd number. As a simple example, suppose B1=10 and B2=100. Then in the second stage we'd start with 11 (the first odd number larger than 10 and not a multiple of three) and alternately add 2 and then 4. The sequence which results thus runs 11, 13, 17, 19, 23, 25, 29, 31, 35, 37, 41, ..., 83, 85, 91, 95, 97. Some composites creep in, but not too many, and all the primes are present.
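The alternating +2/+4 walk Paul describes can be written as a small generator. The function name is mine, a sketch rather than anything from a real P-1 implementation.

```python
def stage2_candidates(b1, b2):
    """Yield all integers in (b1, b2] that are coprime to 2 and 3."""
    n = b1 + 1
    while n % 6 not in (1, 5):   # advance to the first candidate above b1
        n += 1
    while n <= b2:
        yield n
        # from 5 (mod 6) the next candidate is +2 away, from 1 (mod 6) it is +4
        n += 2 if n % 6 == 5 else 4
```

With B1=10 and B2=40 this reproduces the start of the sequence above: 11, 13, 17, 19, 23, 25, 29, 31, 35, 37.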

Moral: sometimes it's faster to do more work than necessary in one area if it saves you work elsewhere.

Paul

Last fiddled with by xilman on 2005-01-11 at 16:41 Reason: Slight rephrasing for clarity

 2005-01-11, 17:55 #3
jhillenb

Jan 2005

2 Posts

Thanks for your answer! "Why not?" Well, there seem to be large gaps of nonprime numbers, and if I factorize different numbers I can use the tables again and again and traverse them quickly. After all, I thought Hans Riesel and Henri Cohen would know what they write about. I agree that between 10 and 100 the composites that creep in aren't too many, but if you take B1 = 1e9 and B2 = 1e11 and just drop multiples of 2 and 3, for example, there are quite a lot of composites. However, if you say it is still better to work that way, I should believe you.
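jhillenb's density worry can be checked empirically. The sketch below (my code, not from the thread) counts, over a narrow window near a given starting point, what fraction of the numbers coprime to 6 are actually prime; trial division is fast enough for a small window.

```python
def is_prime(n):
    """Trial division using the 6k±1 pattern; fine for n up to ~10^12."""
    if n < 2:
        return False
    for p in (2, 3):
        if n % p == 0:
            return n == p
    d = 5
    while d * d <= n:
        if n % d == 0 or n % (d + 2) == 0:
            return False
        d += 6
    return True

def prime_fraction(start, width):
    """Fraction of integers in [start, start+width) coprime to 6 that are prime."""
    cands = [n for n in range(start, start + width) if n % 6 in (1, 5)]
    return sum(map(is_prime, cands)) / len(cands)
```

Near 10^9 the prime number theorem gives a density of about 3/ln(10^9) ≈ 14.5% among numbers coprime to 6, so roughly six out of seven wheel candidates are indeed composite there. The question, as Paul says, is whether filtering them out costs more than it saves.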
2005-01-11, 19:55   #4
xilman
Bamboozled!

May 2003
Down not across

17·593 Posts

Quote:
 Originally Posted by jhillenb However, if you say it still is better to work that way I should believe you.
No, you should not believe me! You should prove to yourself that it is better, or perhaps not, as the case may be.

You've already learned that storing all the primes is prohibitively expensive when the numbers get too large. You should now see whether it is cost effective to calculate successive batches of primes, or to test each integer in turn for primality, or not to bother. Relatively simple timing experiments will show you where the trade-off lies in each case.
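One way to run the timing experiment Paul suggests: generate the primes in a window [B1, B1 + width) once with a segmented sieve and once by trial division, and compare wall-clock times. The function names and window sizes below are mine, chosen small so the script finishes quickly; scale them up to find the crossover on your own machine.

```python
import time

def primes_up_to(limit):
    sieve = bytearray([1]) * (limit + 1)
    sieve[0:2] = b"\x00\x00"
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = bytearray(len(sieve[p * p::p]))
    return [i for i, f in enumerate(sieve) if f]

def segment_sieve(lo, width):
    """Primes in [lo, lo + width) via a segmented sieve."""
    base = primes_up_to(int((lo + width) ** 0.5) + 1)
    seg = bytearray([1]) * width
    for p in base:
        # first multiple of p in the segment, but never p itself
        start = max(p * p, (lo + p - 1) // p * p)
        seg[start - lo::p] = bytearray(len(seg[start - lo::p]))
    return [lo + i for i, f in enumerate(seg) if f and lo + i > 1]

def trial_division(lo, width):
    """Same primes, found by dividing each candidate by the base primes."""
    base = primes_up_to(int((lo + width) ** 0.5) + 1)
    return [n for n in range(lo, lo + width)
            if n > 1 and all(n % p for p in base if p * p <= n)]

lo, width = 10 ** 6, 10 ** 4
t0 = time.perf_counter(); a = segment_sieve(lo, width); t1 = time.perf_counter()
b = trial_division(lo, width); t2 = time.perf_counter()
assert a == b
print(f"sieve: {t1 - t0:.4f}s  trial division: {t2 - t1:.4f}s")
```

The same harness extends naturally to the third option (not bothering with primality at all, as in post #2): just time the wheel walk over the same window.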

Paul

 2005-01-11, 23:50 #5
Ken_g6

Jan 2005
Caught in a sieve

2·197 Posts

Somebody correct me if I'm wrong, but I believe every prime p used in a sieve removes 1 out of every p remaining numbers. So 2 removes half the numbers, 3 removes 1/3 of those remaining, 5 removes 1/5 of the remaining candidates, etc. As you can see, returns diminish fairly quickly. Excluding multiples of 11 will get you less than a 10% speedup, and that's excluding the cost of removing them. Some people do make a short list of steps to skip that will let you avoid factors of 2, 3, 5, and 7. Here's what I made for a Python factoring program. I should probably mention that it doesn't work very fast. :P

Code:
# First, a list of low primes below 210 that *are* prime.
fll = [ 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47, 53, 59, 61, 67,
        71, 73, 79, 83, 89, 97, 101, 103, 107, 109, 113, 127, 131, 137, 139,
        149, 151, 157, 163, 167, 173, 179, 181, 191, 193, 197, 199 ]

# Now, a sequence of numbers to add to get the next number to test.
# Begin at 209 (mod 210), add each number in the list, then check for divisibility.
flh = [ 2, 10, 2, 4, 2, 4, 6, 2, 6, 4, 2, 4, 6, 6, 2, 6, 4, 2, 6, 4, 6, 8, 4,
        2, 4, 2, 4, 8, 6, 4, 6, 2, 4, 6, 2, 6, 6, 4, 2, 4, 6, 2, 6, 4, 2, 4,
        2, 10 ]
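To show how Ken_g6's two lists fit together (a sketch of mine, not his full program): test the explicit primes in fll first, then walk trial divisors 211, 221, 223, ... by cycling through the gap list flh, so that no multiple of 2, 3, 5, or 7 is ever tried.

```python
from itertools import cycle

# Ken_g6's lists, reproduced from the post above.
fll = [ 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47, 53, 59, 61, 67,
        71, 73, 79, 83, 89, 97, 101, 103, 107, 109, 113, 127, 131, 137, 139,
        149, 151, 157, 163, 167, 173, 179, 181, 191, 193, 197, 199 ]
flh = [ 2, 10, 2, 4, 2, 4, 6, 2, 6, 4, 2, 4, 6, 6, 2, 6, 4, 2, 6, 4, 6, 8, 4,
        2, 4, 2, 4, 8, 6, 4, 6, 2, 4, 6, 2, 6, 6, 4, 2, 4, 6, 2, 6, 4, 2, 4,
        2, 10 ]

def trial_factor(n):
    """Return the smallest prime factor of n > 1 (n itself if n is prime)."""
    for p in [2] + fll:            # 2 plus the listed primes below 210
        if p * p > n:
            return n
        if n % p == 0:
            return p
    d = 209                        # the wheel starts at 209 (mod 210)
    for gap in cycle(flh):         # the gaps sum to 210, so the wheel repeats
        d += gap
        if d * d > n:
            return n
        if n % d == 0:
            return d
```

For instance, trial_factor(211 * 223) returns 211 after only one wheel step past the fll stage.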
