mersenneforum.org > Prime Search Projects > Prime Sierpinski Project
2005-07-11, 18:39   #23
ltd

Online submission will be a problem, as I don't have the infrastructure for that at the moment and I don't have enough free time to build one. For the ranges sent in, I have an archive of all files that were sent to me, so if someone sends in their factrange file, the data can be used later even when there is no corresponding data in the DB at the moment.

Also, to be honest, the stats have one weak point: only the first factor received for a k/n pair gets points at the moment. This is due to the simple design of my DB, but with the data available for all ranges it is possible to build something different later.

Lars
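To illustrate the scoring rule Lars describes, here is a minimal sketch (not his actual DB code; the user names and factor triples are made up) of crediting only the first factor reported per k/n pair:

```python
def credit_factors(reports):
    """reports: iterable of (user, p, k, n) in arrival order.
    Returns {user: points}, crediting only the first factor per k/n pair."""
    seen = set()
    points = {}
    for user, p, k, n in reports:
        if (k, n) in seen:
            continue          # later factor for an already-credited candidate: no points
        seen.add((k, n))
        points[user] = points.get(user, 0) + 1
    return points

reports = [
    ("alice", 1000003, 22699, 1234567),
    ("bob",   2000003, 22699, 1234567),  # same k/n pair, reported later
    ("bob",   3000017, 67607, 7654321),
]
print(credit_factors(reports))  # {'alice': 1, 'bob': 1}
```

Since the full factrange files are archived, a different rule (e.g. crediting every verified factor) could be recomputed from them later, as Lars notes.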
2005-07-11, 21:27   #24
Joe O

Quote:
Originally Posted by ltd
Also, to be honest, the stats have one weak point: only the first factor received for a k/n pair gets points at the moment. This is due to the simple design of my DB, but with the data available for all ranges it is possible to build something different later.

Lars
I don't consider that a weak point. All you need is one factor for a k/n pair.
2005-07-11, 21:35   #25
Joe O

Quote:
Originally Posted by ltd
I will start with a 1006 to 50 Mil file to see how many lost factors we have.
If there are not that many, I will change to a 20M-50M file for public release.
Lars
You might just want to stay with the 1006<n<50M dat file. It should only be a little slower than a 20M<n<50M file. Just for comparison, we have found that the 991<n<50M dat is only 15% slower than the 1M<n<20M dat, i.e. increasing the n space by 250% only slows us down slightly.
I wonder if the difference between 1006<n<50M and 1006<n<20M will even be meaningful. And you would only have to support one dat.
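Joe O's trade-off can be checked with a back-of-envelope calculation. The 15% penalty is from his post; the baseline speed is a made-up placeholder:

```python
# Assume sieving a narrow 20M<n<50M dat alone runs at speed v, and the
# combined 1006<n<50M dat runs 15% slower (per the 991<n<50M comparison).
v = 400.0            # kp/s for the narrow dat (hypothetical number)
combined = 0.85 * v  # combined dat covers the whole n-range in one pass

# Time to sieve one p-range with the combined dat:
time_combined = 1 / combined
# Covering the same n-space with two dats means sieving the p-range twice:
time_two_dats = 1 / v + 1 / v
print(time_combined < time_two_dats)  # True: one dat wins despite the 15% hit
```

The comparison only holds when both n-subranges still need sieving to the same depth, but that is exactly the situation being discussed.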
2005-07-12, 03:41   #26
hhh

I have got 800 kp/s at SoB right now; if you want me to do a small range, assign me one and I will start immediately. Yours, H.
2005-07-12, 07:18   #27
hhh

I think you should not make two different dat files, for two reasons (or three):
- The speed-up will not be very high (15%?).
- Possible missed factors do not seem very important now, but later it will take a long time to find equivalent factors by sieving (that's why we are still resieving up to 100000G in SoB).
- Having two dats will be confusing, and that may result in new missed factors from ranges sieved with the wrong dat.

MHO. H.
2005-07-12, 08:21   #28
Citrix

There is one dat up to 80T and then the other dat above 80T.

Citrix
2005-07-12, 16:15   #29
ltd

Some progress information about the new dat file.

The file is built, and since this morning I have already sieved around 15G of the 100G target before release.

In the range from 1G to 9.5G I sieved out 185000 factors.
In the 50G range there are still around 1000 factors per G.

The file should be ready for public release on Sunday at the latest.

Oh, by the way: I think we should stay with the 1006 to 50 Mil file I am using at the moment and not switch to a 20 Mil to 50 Mil file for faster catch-up.

Lars
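As a rough sanity check of these yield numbers (this heuristic is not from the thread): for a dat with N surviving candidates, the usual 1/p divisibility heuristic predicts about N / (p ln p) factors per unit of p near the sieve bound p. Inverting it against the observed ~1000 factors per G around 50G gives an order-of-magnitude estimate of the dat's size:

```python
import math

def factors_per_G(N, p):
    """Expected factors per 1G of sieving near prime bound p,
    for N surviving candidates (standard 1/p heuristic)."""
    return N * 1e9 / (p * math.log(p))

# Invert: ~1000 factors/G observed around p = 50G implies roughly
p = 50e9
N_est = 1000 * p * math.log(p) / 1e9
print(round(N_est))  # on the order of a million candidates remaining
```

This is only a heuristic (it ignores the algebraic structure of k*2^n+1 and the depth already sieved), but it is the kind of estimate used to judge when further sieving stops paying off.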
2005-07-12, 16:57   #30
Citrix

Quote:
Originally Posted by ltd

Oh, by the way: I think we should stay with the 1006 to 50 Mil file I am using at the moment and not switch to a 20 Mil to 50 Mil file for faster catch-up.

Lars
Why not?

Citrix
2005-07-12, 17:16   #31
ltd

At least I have already found a lost factor (out of one of my own original ranges :( ).

I cannot give any real numbers for the performance impact yet, as the runtime is not stable due to the number of factors found. For example, at 1G my PC had 202 kp/s and at 9G it has 242 to 248 kp/s. So I will wait until the runtimes have stabilised a little bit before I make further tests.

Lars

P.S.:
The same PC gets around 380 kp/s at 80T with the 300k-20Mil dat.
2005-07-12, 17:31   #32
VJS

I wouldn't release that dat just yet...

You should sieve out to at least 1T before you release. I think Joe and I updated the dat about 10 times before we reached 3T.

We also rechecked the dat with a couple of other programs; I think Joe used sobsieve and I used proth and newpgen. Also, which program did you use for the first 1G?

Luckily, no factors were missed by proth and later found by sobsieve/newpgen, or vice versa. So it's your call, but I'd recheck the first couple of G at least.

Sorry to say this, but keep it to yourself or one other person for now. Once you get a little higher I'll help out with the combined effort for sure. I'll also double-check some stuff.
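One cheap complement to cross-sieving with several programs is verifying that every reported factor actually divides its candidate. This only catches bogus factors, not missed ones (those still require resieving), but it is fast. A minimal sketch, assuming a hypothetical "p k n" per-line file format claiming p divides k*2^n + 1:

```python
def is_valid_factor(p, k, n):
    """True iff p really divides k*2^n + 1, via modular exponentiation."""
    return p > 1 and (k * pow(2, n, p) + 1) % p == 0

def check_fact_file(lines):
    """Yield the lines whose claimed factor does not check out."""
    for line in lines:
        p, k, n = map(int, line.split())
        if not is_valid_factor(p, k, n):
            yield line

print(is_valid_factor(3, 5, 2))   # True:  5*2^2 + 1 = 21 = 3*7
print(is_valid_factor(5, 5, 2))   # False: 21 is not divisible by 5
```

The three-argument `pow` keeps the check fast even for the n values in the tens of millions that this dat covers.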
2005-07-12, 17:56   #33
ltd

@VJS

Sorry, it was a little bit too late for me to consider your very good points.
I had already released the file.

The first 1G was done with newpgen.
I did not make any cross-checks of the new proth_sieve results, as I am very confident in them. I have done a lot of cross-checks on the old ranges using newpgen and sobsieve and never had an error in the proth_sieve results, so it did not even come to my mind to make those cross-checks again for the new ranges.

Lars