Quote:
Originally Posted by gophne
As for the primality test (for Mersenne primes), it is formulaic in nature, so it is like proving the lemma for the lower Mersennes and then extending it to the rest of the set of Mersenne numbers. My problem is that the primality check could well be too unwieldy at very high numbers. I do not have a good understanding of algorithm complexity and computation time concepts.

Think about it in four stages.
1. Define it precisely and unambiguously. Until you do this, you can't even be wrong.
2. Prove that your test holds for all primes (of the appropriate form). Once you do this you have a pseudoprimality test (but its value is still unclear).
3. Prove that your test fails to hold for all composites (of the appropriate form). Once you do this you have a primality test (but possibly a slow one).
4. Prove that your test runs in a certain (small) amount of time as a function of the size of the input and perhaps other characteristics.
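As an illustration of why clearing stage 2 alone gives only a pseudoprimality test, here is a sketch of the classic Fermat test (not gophne's test, just a standard example). By Fermat's little theorem it holds for every prime, yet some composites pass it too, so it satisfies stage 2 but fails stage 3:

```python
# Stage 2 example: the base-2 Fermat test. Every odd prime n satisfies
# 2^(n-1) == 1 (mod n), so the test never rejects a prime -- but some
# composites also pass, which is why this is only a *pseudo*primality test.
def fermat_test(n, base=2):
    if n < 2:
        return False
    return pow(base, n - 1, n) == 1

print(fermat_test(13))   # True: 13 is prime
print(fermat_test(15))   # False: composite, correctly rejected
print(fermat_test(341))  # True, yet 341 = 11 * 31 -- a Fermat pseudoprime
```

The counterexample 341 is exactly the kind of thing stage 3 asks you to rule out.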
There are in-between stages. Before you get to 3, you can run random testing to show that your test fails for all composites tested, or at least a very high percentage of them. Before you get to 4, you can use heuristics or implementations to argue that it will be fast. Really, though, you need to get to at least stage 2 before any of this matters.
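The random-testing stage before 3 can be as simple as comparing a candidate test against trial division over a range and counting the composites that slip through. A minimal sketch, using the base-2 Fermat test as a stand-in for whatever candidate test is being checked:

```python
# In-between stage before 3: empirically compare a candidate test
# against trial division and collect the composites it wrongly accepts.
def is_prime(n):                      # ground truth by trial division
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def candidate_test(n):                # base-2 Fermat test as an example
    return n >= 2 and pow(2, n - 1, n) == 1

false_positives = [n for n in range(3, 10000, 2)
                   if candidate_test(n) and not is_prime(n)]
print(false_positives[:5])  # [341, 561, 645, 1105, 1387]
```

Zero false positives over a large range is evidence, not proof, that stage 3 might be reachable.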
You don't need to get to 4 to be of value. BPSW is at 2, and Miller-Rabin is stuck there permanently. ECPP looks like it will stay at 3 for the foreseeable future.