complexity of Pepin's test
Pepin's test is used to test Fermat numbers for primality, but the numbers quickly grow too large to be tested in a reasonable amount of time.
So I'm just curious: what resources (time, memory, etc.) would it take a computer of, say, 3 GHz, to test a number like F33?
[QUOTE=ixfd64]Pepin's test is used to test Fermat numbers for primality, but the numbers quickly grow too large to be tested in a reasonable amount of time.
So I'm just curious, how much resources (time, memory, etc) would it take a computer of, say 3 GHz, to test a number like F33?[/QUOTE] How long does it take for such a machine to square an integer of 2^33 bits, given that it can (for the sake of argument; the precise number will be different) square a 2^6-bit integer in 1 microsecond?

Given the answer to that question, how many microseconds does it take to perform 2^33 such squarings?

Finally, there are close to 2^25 seconds per annum and 2^20 microseconds per second, so there are about 2^45 microseconds per annum. Convert your answer to the second question from microseconds to years.

No, I am not going to give you the answers to the above questions. Working them out for yourself is educational.

Paul
[QUOTE=xilman]How long does it take for such a machine to square an integer of 2^33 bits, given that it can (for the sake of argument; the precise number will be different) square a 2^6-bit integer in 1 microsecond?
Given the answer to that question, how many microseconds does it take to perform 2^33 such squarings? Finally, there are close to 2^25 seconds per annum and 2^20 microseconds per second, so there are about 2^45 microseconds per annum. Convert your answer to the second question from microseconds to years. No, I am not going to give you the answers to the above questions. Working them out for yourself is educational. Paul[/QUOTE] I think your response is a bit unfair. We cannot expect others to know how to do simple arithmetic.
[QUOTE=ixfd64]Pepin's test is used to test Fermat numbers for primality, but the numbers quickly grow too large to be tested in a reasonable amount of time.
So I'm just curious, how much resources (time, memory, etc) would it take a computer of, say 3 GHz, to test a number like F33?[/QUOTE] Nobody has software that can handle a number that big. Note that even if you did, you'd need at least 4 GB of memory just to fit intermediate results.

See the F24 paper of Crandall, Mayer and Papadopoulos for some estimates.

jasonp
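jasonp's 4 GB figure is easy to sanity-check. A minimal sketch; the 16-bits-per-double packing density is my assumption about a typical floating-point FFT, not a statement about any particular program:

```python
bits = 2**33                  # a Pepin residue mod F33 is ~2^33 bits

residue_bytes = bits // 8     # raw residue: 2^30 bytes = 1 GiB

# A floating-point FFT typically packs on the order of 16 input bits
# into each 8-byte double (assumed here), so the transform array
# alone needs:
fft_bytes = (bits // 16) * 8  # 2^32 bytes = 4 GiB

print(residue_bytes // 2**30, "GiB residue,",
      fft_bytes // 2**30, "GiB FFT array")
```

So the transform array alone hits the 4 GB mark, before counting any scratch buffers or a second residue for error checking.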
[QUOTE=jasonp]See the F24 paper of Crandall, Mayer and Papadopoulos for some estimates.[/QUOTE]
I'm always amused by the form of self-citations in scientific papers. :wink: Similar to the use of "we"/"the authors" when there is only one...
I often wondered about the "we" form of scientific papers until I happened to read one where the author kept using "I". It was very hard to read. I think someone speaking to you as a person in the "I" form distracts too much from the subject matter: because the author appears as an actual person in the text, the brain thinks there's a dialogue and kind of switches into social-interaction mode, which makes you lose focus on the hard facts.
My .02€, Alex
I like the "we" because it always struck me as giving the feel that the reader is being invited to participate in a collaborative learning experience.
My current in-development Mlucas v3.0 has Fermat-mod capability and can handle numbers the size of F33 (a single modmul needs on the order of a minute on a fast single-CPU machine), but as has been stated, even if you had a massively parallel machine and a perfectly parallelized FFT, the sheer number of modmuls needed for the Pe'pin test of F33 is overwhelming - you would need to be able to do a mod-F33 squaring every few milliseconds to be able to test F33 in under a year. With a perfectly parallelized FFT (and no such beast exists - it's extremely tough to get decent parallel big-FFT performance with as few as 4 CPUs) you would need several tens of thousands of CPUs to achieve that kind of performance. That kind of hardware is not out of reach (assuming you can get one or more of the world's economic superpowers to give up their global-climate or nuclear-weapons simulations for, oh, just the coming year - y'all weren't really going to *use* that billion-dollar supercomputer for anything, were you?), but as I said there is no software that can wring anywhere close to the needed big-FFT parallelism out of it. Ask me again in ten years, or whenever the needed number of CPUs has dropped to around 1000 or less, and maybe then things will look less daunting. |
[QUOTE=akruppa]I often wondered about the "we" form of scientific papers until I happened to read one where the author kept using "I". It was very hard to read. I think someone speaking to you as a person in the "I" form distracts too much from the subject matter - due to the author appearing as an actual person in the text, the brain thinks there's a dialogue and kinda switches into social interaction mode which makes you lose focus on the hard facts.
[/QUOTE] DJB, in one of his papers, asterisks the first "I" and has a footnote at the bottom saying something along the lines of 'I refuse to continue mechanically replacing "I" by "we"'. I forget which one, though; I can hunt it out if anyone's interested.

I have to differ when it comes to preferences. If I know a paper has only a single author, every time I see "we" I get this "what? are you a nutcase, or royalty?" distraction, which might make me lose focus on the hard facts. As long as the maths is clear (and the steps are gentle) I don't really care about pronouns, or active/passive, or anything.
[QUOTE=fatphil]DJB, in one of his papers asterisks the first "I", and has a footnote at the bottom saying something along the lines of 'I refuse to continue mechanically replacing "I" by "we"'. I forget which one though, I can hunt it out if anyone's interested.
[/QUOTE] I remember that comment - I think we're talking about the same paper. On further thought, I don't think it could have been anyone but Dan. :rolleyes: Alex
I like the advice given to me by George Bergman (a professor at UC Berkeley). He prefers to use "we" when he is involving the reader in something. (Like, "We now divide n by 3.") He uses "I" when he is referring only to himself. (Like, "I conjecture that there are no counter-examples...") Makes sense.
[QUOTE=ixfd64]Pepin's test is used to test Fermat numbers for primality, but the numbers quickly grow too large to be tested in a reasonable amount of time.
So I'm just curious, how much resources (time, memory, etc) would it take a computer of, say 3 GHz, to test a number like F33?[/QUOTE] The benchmarks page ([url]http://www.mersenne.org/bench.htm[/url]) says that it would take a 3800 MHz Pentium-4 about [spoiler]4050[/spoiler] years to do an LL test on a number of that size, and a Pepin test on F33 should be comparable. But a 4 GB FFT size would create some problems.

Interestingly enough, F31, which was considered out of reach before Alex Kruppa found a factor, would take on the order of 250 years according to the benchmarks page, which a multi-processor system could conceivably bring down to a few decades.