[QUOTE=Batalov;413779]A rhetorical question if there ever was one.
[SPOILER]Not seeing your messages saves so much of my page space! You really needed to quote the whole message to ask [I]that[/I]?[/SPOILER][/QUOTE] The reason I asked is that they briefly mentioned something in a PM that was supposed to speed it up, at the very least. 
[QUOTE=alpertron;413775]Although the method generates primes as expected, the minuend and subtrahend are almost as big as in the previous version. For more interesting ranges, you should subtract extremely huge numbers and the timing would be worse than trial division.
You could use modular arithmetic in order to speed up your algorithm, but again this will be worse than trial division. Why? The union of the sets B and C must include all prime numbers below n. So you need to perform about n/log(n) multiplications mod a prime greater than n[SUP]2[/SUP], so the number of steps is the same as the worst case of trial division. Notice also that to find a prime p<n[SUP]2[/SUP] you first need to compute a prime larger than n[SUP]2[/SUP] in order to use it as the modulus. So again this is not useful at all.[/QUOTE] I agree, though I don't think it is necessarily worse than trial division in general. There should be a middle range where fewer trials than trial division would yield primes, especially if you write the routine intelligently by keeping the sums as low as possible; for very large primes, however, the sums diverge very rapidly. You don't really need to calculate the base primes either: the theorem admits constructions such as the double factorial of an odd composite integer rather than a product of primes, and the double factorial can be computed with a product-of-sequence formula. To further keep the sum from diverging, you can factor out a large number of known primes. But none of this would do for a billion-plus digits. As I mentioned before, there are better approaches to this. BTW, thank you for being one of the few intelligent posters in this thread. 
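To make the mechanism and its narrow window concrete, here is a minimal sketch (function and variable names are my own, not from the thread): it partitions the primes up to Pn into B and C, forms b and c as plain products (all exponents fixed at 1 for brevity), and keeps every d = ±b ± c with 1 < d < Pn[SUP]2[/SUP]. Any such d is coprime to every prime ≤ Pn, and a composite below Pn[SUP]2[/SUP] would need a prime factor below Pn, so each surviving d is forced to be prime.

```python
# Sketch of the partition idea under discussion: split the primes up to Pn
# into two sets B and C, form b and c as products of the elements, and keep
# every d = |b - c| or b + c with 1 < d < Pn**2.  Such a d has no prime
# factor <= Pn, and any composite below Pn**2 must have one, so d is prime.
from itertools import combinations
from math import prod, isqrt

def is_prime(n):
    if n < 2:
        return False
    return all(n % p for p in range(2, isqrt(n) + 1))

def partition_primes_d(primes):
    """Yield every d = |b - c| and b + c below Pn**2 over all 2-set
    partitions of `primes` (exponents fixed at 1 for brevity)."""
    pn_sq = primes[-1] ** 2
    found = set()
    for r in range(1, len(primes)):
        for B in combinations(primes, r):
            C = [p for p in primes if p not in B]
            b, c = prod(B), prod(C)
            for d in (abs(b - c), b + c):
                if 1 < d < pn_sq:
                    found.add(d)
    return sorted(found)

ds = partition_primes_d([2, 3, 5, 7, 11])   # A, with Pn = 11
assert all(is_prime(d) for d in ds)          # every qualifying d is prime
# ds == [13, 31, 37, 47, 83, 89, 97, 101, 103, 107]
```

Note how few of the ±b ± c combinations actually land below Pn[SUP]2[/SUP]: only the most balanced partitions qualify, which is the "medium range" trade-off mentioned above.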
[QUOTE=Batalov;413778]So, you are saying that for your method to work you need to store somewhere [B]all primes[/B] up to 10[SUP]500000000[/SUP], right?
Good luck with that. This universe is not equipped to store all primes up to 10[SUP]1000[/SUP], is it?[/QUOTE] Please read the red comment in my earlier post. I thought it was distinct enough. Obviously I was wrong. 
[QUOTE=a1call;413782]Obviously I was wrong.[/QUOTE]
Good, we are in agreement then. 
[QUOTE=Batalov;413783]Good, we are in agreement then.[/QUOTE]
There are 3 types of people in the world. There are those who can count, and there are those who can't.:smile: 
[QUOTE=a1call;413781]But all this would not do for a billion plus digits.[/QUOTE]
Understatement of the century. Your method will not scale to 20 digit primes, never mind a billion digits. 
This makes me think of the Maurer or Shawe-Taylor methods for generating proven primes, where we get to double or triple the bits every round. One could do something similar with other BLS75 theorems (Maurer's algorithm can be seen as a case of Theorem 3) or even ECPP: just build up the proof backwards to get larger and larger numbers. It makes nice large proven primes, but I seriously doubt it would scale well to millions of digits.
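As an illustration of that build-upward idea, here is a hedged sketch (the function name is mine) of a single Pocklington step: given a proven odd prime p, search candidates N = 2kp + 1 with 2k < p, so that the proven factor p exceeds √N; if some base a satisfies a^(N−1) ≡ 1 (mod N) and gcd(a^((N−1)/p) − 1, N) = 1, Pocklington's theorem proves N prime. Maurer's algorithm picks k near p to roughly double the bit length each round; for brevity this sketch just takes the smallest k that works.

```python
# One Pocklington step: from a proven prime p, build a larger proven prime.
from math import gcd

def pocklington_step(p):
    """Given an odd proven prime p, return a larger proven prime
    N = 2*k*p + 1 (2*k < p keeps the proven factor p above sqrt(N))."""
    for k in range(1, (p - 1) // 2 + 1):
        N = 2 * k * p + 1
        for a in range(2, 100):
            if pow(a, N - 1, N) != 1:
                break                 # Fermat witness: this N is composite
            if gcd(pow(a, (N - 1) // p, N) - 1, N) == 1:
                return N              # Pocklington condition met: N proven prime
    raise ValueError("no candidate found")

# Grow a chain of proven primes starting from 3.
chain = [3]
for _ in range(5):
    chain.append(pocklington_step(chain[-1]))
# chain == [3, 7, 29, 59, 709, 2837]
```

With the smallest k the growth is irregular (3 → 7 → 29 → 59 → 709 → 2837); choosing k randomly near p, as Maurer does, is what gives the near-doubling per round.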

[QUOTE=a1call;413772][B]Theorem 1:[/B]
For the set [B]A[/B] of [B][I]n[/I][/B] [U]consecutive[/U] prime numbers starting from [B]2[/B] and ending at [B]Pn[/B], let [B]B[/B] ⊂ [B]A[/B] and [B]C = A \ B[/B].

Let the number [B]b[/B] be the product of any integer power greater than [B]0[/B] of all the elements of [B]B[/B].

Let the number [B]c[/B] be the product of any integer power greater than [B]0[/B] of all the elements of [B]C[/B].

Then [B]d = ±b ± c[/B] is a [B][U]prime number[/U][/B],[/QUOTE] False. Here is a pair of simple counterexamples:

Counter #1: A = {2,3}, B = {2}, C = {3}. Let b = 2^1, c = 3^3. Then b+c = 29 is prime, but c-b = 25 = 5^2 is not.

Counter #2: A = {2,3,5,7,11}, B = {2}, C = {3,5,7,11}. Let b = 2^1, c = 3*5*7*11 = 1155. Then c-b = 1153 is prime, but b+c = 1157 = 13*89 is not. 
[QUOTE=ewmayer;413801]False. Here is a pair of simple counterexamples:
Counter #1: A = {2,3}, B = {2}, C = {3}. Let b = 2^1, c = 3^3. Then b+c = 29 is prime, but c-b = 25 is not. Counter #2: A = {2,3,5,7,11}, B = {2}, C = {3,5,7,11}. Let b = 2^1, c = 3*5*7*11 = 1155. Then c-b = 1153 is prime, but b+c = 1157 = 13*89 is not.[/QUOTE] They won't suffice: in both cases d exceeds Pn[SUP]2[/SUP] (9 in the first case, 121 in the second), which is what he calls "Pn2". His theorem is true, but not useful for generating primes; as Serge said, it would need to store primes up to 10^500M. 
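A quick numeric check (plain Python, with my own trial-division helper) confirms why the quoted counterexamples fall outside the theorem's claim: both composite values of d exceed the Pn[SUP]2[/SUP] bound.

```python
# The theorem only claims primality for 1 < d < Pn**2; both quoted
# composite values of d sit above that bound, so they are out of range.
from math import isqrt

def is_prime(n):
    if n < 2:
        return False
    return all(n % p for p in range(2, isqrt(n) + 1))

# Counter #1: A = {2,3}, b = 2, c = 3**3, Pn**2 = 9
b, c, bound = 2, 27, 9
assert not is_prime(c - b)    # 25 = 5*5 is composite ...
assert c - b >= bound         # ... but 25 is not below Pn**2 = 9

# Counter #2: A = {2,3,5,7,11}, b = 2, c = 3*5*7*11, Pn**2 = 121
b, c, bound = 2, 1155, 121
assert not is_prime(b + c)    # 1157 = 13*89 is composite ...
assert b + c >= bound         # ... but 1157 is not below Pn**2 = 121
```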
[QUOTE=ewmayer;413801]False. Here is a pair of simple counterexamples:[/QUOTE]
You've accidentally left out a critical part of the theorem (because of poor typography on the OP's part, mind you): [QUOTE]Then d = ±b±c is a prime number, [COLOR="Red"]∀d: d < Pn[SUP]2[/SUP][/COLOR][/QUOTE] 
[QUOTE=Batalov;413778]So, you are saying that for your method to work you need to store somewhere [B]all primes[/B] up to 10[SUP]500000000[/SUP], right?
Good luck with that. This universe is not equipped to store all primes up to 10[SUP]1000[/SUP], is it?[/QUOTE] BTW, the idea of partitioning a set of primes into two sets and looking at combinations of their products is hardly new. See the paper "Primes at a Glance" by John Selfridge et al. 