#89
|
"Dave"
Sep 2005
UK
2³·347 Posts
Quote:
The header line contains:

number of k's in file
starting n of range
ending n of range

Then follows for each k value the following series of lines:

k=value of k
first n value
second n value
:
:
last n value

Last fiddled with by amphoria on 2007-09-11 at 16:24
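A minimal sketch of reading the file layout described above: a header line, then for each k a `k=` line followed by one n value per line. The function name and the exact header fields (count, starting n, ending n on one whitespace-separated line) are assumptions based on the quoted description, not the program's documented format.

```python
def parse_sieve_file(lines):
    """Return {k: [n1, n2, ...]} from the line-oriented layout above.

    Assumes the first line holds "num_k n_start n_end" and every later
    line is either "k=<value>" or a bare n value for the current k.
    """
    num_k, n_start, n_end = (int(x) for x in lines[0].split()[:3])
    result = {}
    current_k = None
    for line in lines[1:]:
        line = line.strip()
        if not line:
            continue
        if line.startswith("k="):
            current_k = int(line[2:])
            result[current_k] = []
        else:
            result[current_k].append(int(line))
    return result
```

For example, a two-k file with n ranging over 1000..2000 would parse into a dict keyed by k with sorted n lists as values.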
|
|
|
|
|
|
#90
|
May 2007
Kansas; USA
10395₁₀ Posts
Quote:
Curtis has WAY more experience with sieving than I do, especially with 3-20 k's like he says, but I have done some unusual sieves, such as 4,500 k's at once, that it appears he hasn't done, so I'll add my 2 cents.

The "memory-intensiveness" of the Legendre symbol setup in sr2sieve seems to make little difference in the actual sieve speed. If you're sieving large k's (i.e. > 1e7) or a moderate number of 'medium-sized' k's (i.e. 5 to 20 k's from about 1e5 to 1e7), you have the RAM, and you don't mind waiting a little bit at the beginning, sr2sieve works best most of the time. It's almost definitely the # of k's that makes the biggest difference.

I had a large heavy-weight sieve with 12 total k's, 9 of them > 1e8, where the Legendre symbols required nearly 1 GB of RAM simply because the k's were so large, and it took about 3 hours just to create them! (A very unusual and extreme case.) But once they were created, sr2sieve smoked srsieve by about 2.5 or 3 to 1 on speed because it was only 12 k's. I saved off the symbols file after the first run, and on subsequent runs it only took 2-3 minutes to read the file, so it was even better.

But when I tried to use sr2sieve to sieve across 500 k's from k=1 to 1000, it took very little time to create the symbols file (< 15 secs), yet the sieve was FAR slower than srsieve...1/2 to 2/3rds as fast, I think.

Based on my experience, here are my suggestions for sieving software in different situations:

1. 1 to 2 k's of the form k*2^n+/-1: use sr1sieve. It's fast enough that running it twice is usually faster than running anything else once with 2 k's.
2. 3 to about 50 k's: use sr2sieve.
3. About 50 k's or more: use srsieve.
4. Very specific types of searches like SG's or twins: use NewPGen.

The 50 k's is a very rough estimate. If you have anywhere between about 20 and 100 k's you want to sieve, I'd suggest testing both programs to see which is faster. If they're about the same, use srsieve because it doesn't require as many steps and so is a little easier to use.

Once you get used to the srsieve series of programs, you'll rarely go back to NewPGen for standard searches. If you haven't used sr2sieve, it takes a little getting used to; there are a few hoops to jump through on it. Curtis is the best resource for all of its ins and outs.

Gary

Last fiddled with by gd_barnes on 2007-09-11 at 19:16
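Gary's rule of thumb above can be encoded as a toy helper. The thresholds are his rough estimates from this post (and he explicitly says the 50-k cutoff is fuzzy), so the function name and the exact boundaries here are illustrative, not properties of the software:

```python
def suggest_siever(num_k, special=None):
    """Suggest a sieving program for k*2^n+/-1 searches.

    num_k:   how many k values are being sieved together
    special: "SG" or "twin" for Sophie Germain / twin searches
    Thresholds follow the rough guidelines quoted in the thread.
    """
    if special in ("SG", "twin"):
        return "NewPGen"      # specialized search types
    if num_k <= 2:
        return "sr1sieve"     # fast enough to just run once per k
    if num_k <= 50:
        return "sr2sieve"     # Legendre-symbol setup pays off here
    return "srsieve"          # too many k's for the symbol approach
```

In the 20-100 k gray zone, the post's actual advice, benchmarking both programs on your own hardware, still beats any fixed threshold.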
|
|
|
|
|
|
#91
|
May 2007
Kansas; USA
3³×5×7×11 Posts
Quote:
As a general rule, the smaller the k, the less memory the symbols take. Like I alluded to in my last post to Anon, I had a 12-k search that used nearly 1 GB! 9 of the k's were > 1e8.

But the symbols are VERY quirky...one of the k's took 3-5 secs to create its symbols, and another one of nearly the same size took 20-30 mins! Thinking it must be my computer, I tested it again and the same thing happened. I even tested it on another type of computer with the same results.

Probably the biggest problem in your search is that there are too many k's, but the average size of the k makes a big difference also. You could almost definitely create the symbols for 10K < k < 20K, but you may not be able to create them for 100K < k < 110K. From sr2sieve's perspective, k > 100K is rather large. Regardless, even if you were able to create the symbols file, I'm sure you would have found that srsieve was much faster than sr2sieve.

Gary

Last fiddled with by gd_barnes on 2007-09-11 at 19:23
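For readers wondering what sr2sieve is actually precomputing: the Legendre symbol (a/p) is +1 when a is a nonzero quadratic residue mod an odd prime p, -1 when it is not, and 0 when p divides a. A one-line version via Euler's criterion looks like this (this is just the textbook definition, not a claim about sr2sieve's internal table layout):

```python
def legendre(a, p):
    """Legendre symbol (a/p) for an odd prime p, by Euler's criterion:
    a^((p-1)/2) mod p is 1, p-1, or 0, mapping to +1, -1, 0."""
    ls = pow(a, (p - 1) // 2, p)
    return -1 if ls == p - 1 else ls
```

sr2sieve evaluates symbols like this for many (k, p) combinations and caches the results, which is why large k values blow up the setup time and memory even though the per-symbol arithmetic is cheap.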
|
|
|
|
|
|
#92
|
May 2007
Kansas; USA
3³×5×7×11 Posts
Karsten,
I did a Proth prime search on all Riesel primes shown on the summary site for 10K < k < 1M. This now effectively extends the search for twins for Riesel primes shown on our k < 300, 300 < k < 3010, and summary sites to include ALL k < 1M. The list is attached.

A couple of notes:

1. I previously tested all Riesels on our various sites for k < 10K for Proths, including very high ones. There were no additional twins found other than those that you already have from my list in the Twin Prime Search forum.
2. I only searched the primes shown on our site, whose k's have many gaps in their primes. This list is by no means a complete list for all of each k. In many cases, the list in the TPS forum will have more twins for some k's than this list due to the gaps on our site. I wanted to do this because the new attached list has twins for k > 100K / n < 10K and for n > 15K, unlike the other one.

In doing this, I found the highest twin on our site to date: 210885*2^16595+/-1.

Gary
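A twin here means that the Riesel number k*2^n-1 and the Proth number k*2^n+1 are both prime. A sketch of how such a pair might be checked probabilistically is below; real searches use LLR/PFGW for the actual proofs, and this plain Miller-Rabin test is only illustrative (and far too slow for sizes like 210885*2^16595):

```python
import random

def is_probable_prime(n, rounds=20):
    """Miller-Rabin probable-prime test (probabilistic, not a proof)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False           # a witnesses that n is composite
    return True

def is_twin(k, n):
    """True if k*2^n-1 and k*2^n+1 are both probable primes."""
    m = k << n                     # k * 2^n
    return is_probable_prime(m - 1) and is_probable_prime(m + 1)
```

For example, k=3, n=2 gives the pair 11 and 13.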
|
|
|
|
|
#93 |
|
Mar 2006
Germany
2³×3×11² Posts
It's been a long time since the last update; one reason was my holiday. I think next week I can complete an update of all pages, including all the work done for the remaining-k page: filling in Nash weights, marking k's as "low" or "15k", and marking twins and Sophie Germains. All other pages are quite up to date except for small things to do. I have also built a new page with statistics for the 4 data pages, with counts of k's, their primes, twins, SG's, 15k, low, 2145k, 2805k, and Riesel.
I'm entertaining the idea of making a new page with searched k over n-ranges, like the many primes listed in the summary from Broadhurst, Underbakke, Heuer, Kenny, etc., so that the page stays slim when I'm adding more k's. Splitting the other pages also seems the best choice to work easily with statistics. For some data, like the number of k's, primes, or low weights, I have scripts that check the pages and show me the counts; I'll try to do this for twins and SG's too. So please, if you find any errors or some missing low primes, send them to me for inclusion in the data pages.

karsten

Last fiddled with by kar_bon on 2007-09-27 at 10:16
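The Nash weight mentioned above is a standard heuristic for how "heavy" a Riesel k is: sieve a fixed block of 10000 exponents for k*2^n-1 with small primes and count the survivors. The conventional parameters used here (n in 100001..110000, primes below 256) are an assumption based on the usual definition; karsten's scripts may use a different convention:

```python
def small_primes(limit):
    """Primes below limit, by a plain sieve of Eratosthenes."""
    sieve = [True] * limit
    sieve[0] = sieve[1] = False
    for i in range(2, int(limit ** 0.5) + 1):
        if sieve[i]:
            for j in range(i * i, limit, i):
                sieve[j] = False
    return [i for i, is_p in enumerate(sieve) if is_p]

def nash_weight(k, n_lo=100001, n_hi=110000, p_max=256):
    """Count n in [n_lo, n_hi] for which k*2^n - 1 has no factor < p_max."""
    survives = [True] * (n_hi - n_lo + 1)
    for p in small_primes(p_max):
        if k % p == 0:
            continue                      # then k*2^n - 1 = -1 (mod p), never 0
        r = (k * pow(2, n_lo, p)) % p     # k*2^n mod p, updated incrementally
        for i in range(n_hi - n_lo + 1):
            if r == 1:                    # p divides k*2^n - 1
                survives[i] = False
            r = (r * 2) % p
    return sum(survives)
```

A heavy-weight k keeps many candidates after this small sieve (and tends to produce primes often); a low-weight k keeps very few.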
|
|
|
|
|
#94 |
|
Mar 2006
Germany
2³×3×11² Posts
hello,
At last, a new update of all data. New:

- split all data into several pages: 300<k<2000, 2000<k<4000, 4000<k<6000, 6000<k<8000 and 8000<k<10000; for k>10000 an own page for every 10^n, with all k>10^10 in the last one.
- new page with statistics of k's, primes, twins and Cunningham Chains.
- all other pages are up to date (contributors, programs, infos, Woodall).

Kosmaj will upload these in the next days. I hope the next update will not take so much time, but this was hard work with the remaining k's and the statistics.

karsten
|
|
|
|
|
#95
|
May 2007
Kansas; USA
289B₁₆ Posts
Quote:
Gary
|
|
|
|
|
|
#96 |
|
Nov 2003
2·1,811 Posts |
The latest version of data pages is now available on our web site: www.15k.org
Thanks to Karsten for his help. |
|
|
|
|
|
#97 |
|
May 2007
Kansas; USA
10100010011011₂ Posts
Great to see the new cool pages!
Continuing the talk about twins earlier in this thread...I have now posted all twins up to n=20K for k<1M in the Twin Prime Search forum. I also included a link to a new web page that I created listing all of the twins over n=10K for k<1M.

Gary
|
|
|
|
|
#98 |
|
May 2005
2³×7×29 Posts
I have reached n=10000 for the 100001<k<199999 project, and I will continue till n=20000.
|
|
|
|
|
|
#99
|
May 2007
Kansas; USA
3³·5·7·11 Posts
Quote:
I guess at some point we need to do 10K<k<100K up to n=10K.

Gary
|
|
|
|