mersenneforum.org Polynomial selection for 2,1109+ c225

2022-09-14, 00:13   #12
Batalov
"Serge"
Mar 2008
Phi(4,2^7658614+1)/2
3·7·479 Posts

Just in order to not fork another thread, some targets that could be a fallback from 2,1109+ (if it ends up going SNFS) are, perhaps --
Code:
214  7  419 -   354    0.604
219 11  334 +   347.8  0.629
219  5  523 +   365.5  0.599
222  3  718 +   342.5  0.648
224  3  823 -   392.6  0.57
(224  5  479 +  334.8  0.669)
(225  2 1109 +  333.8  0.674)
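The numbers in this table can be reproduced: assuming the columns are cofactor digits, base, exponent, sign, SNFS difficulty, and the digits/difficulty ratio, the difficulty of b^n±1 is just n·log10(b). A minimal sketch:

```python
import math

def snfs_difficulty(base: int, exponent: int) -> float:
    """SNFS difficulty of base^exponent +/- 1: the decimal
    digit count of the full form, i.e. exponent * log10(base)."""
    return exponent * math.log10(base)

# (cofactor digits, base, exponent, sign) from the table above
targets = [
    (214, 7, 419, "-"),
    (219, 11, 334, "+"),
    (219, 5, 523, "+"),
    (222, 3, 718, "+"),
    (224, 3, 823, "-"),
    (224, 5, 479, "+"),
    (225, 2, 1109, "+"),
]

for digits, b, n, sign in targets:
    diff = snfs_difficulty(b, n)
    print(f"{digits} {b:2d} {n:4d} {sign} {diff:6.1f} {digits / diff:.3f}")
```

The last column (cofactor digits divided by SNFS difficulty) is a rough gauge of how attractive GNFS on the known cofactor is relative to SNFS on the full algebraic form.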
2022-09-15, 09:14   #13
firejuggler
"Vincent"
Apr 2010
Over the rainbow
2²·7·103 Posts

A slightly better (for me at least) poly:
Code:
skew: 755192.453
c0: 44089320441977706345863258655893321664536110560
c1: 268491014998631707851828924312857969747572
c2: 168895151056819271392513808334804436
c3: -1095536629595790473291869141585
c4: -933378718901445252914593
c5: 850800448461467910
c6: -210163753800
Y0: -710135625395382122411596439629784935
Y1: 2600874528869472580495674989
# MurphyE (Bf=6.872e+10,Bg=3.436e+10,area=1.766e+18) = 1.850e-09
2022-09-20, 09:08   #14
firejuggler
"Vincent"
Apr 2010
Over the rainbow
2²·7·103 Posts

Finally, a decent one:
Code:
skew: 724988.351
c0: 5864149138561428958664120082846303367267868400
c1: -64887105680355982672042016633688198554460
c2: -204393359764698159887054341192074888
c3: 184048900225366414544979923507
c4: 428493172390342410123193
c5: -161001307814852802
c6: 26851865040
Y0: -823051945817186779901623462260379129
Y1: 21719286194539104138651493
# MurphyE (Bf=6.872e+10,Bg=3.436e+10,area=1.766e+18) = 2.188e-09

Maybe it can be spun.
2022-09-20, 16:50   #15
EdH

"Ed Hall"
Dec 2009

5,261 Posts

Quote:
 Originally Posted by firejuggler
Finally, a decent one . . . maybe it can be spun.
The spin came back worse.

2022-09-24, 12:53   #16
Gimarel
Apr 2010
2×53 Posts

My current best:
Code:
# norm 3.812685e-16 alpha -11.270627 e 2.186e-16 rroots 4
skew: 27486666.79
c0: -21949786704915546043243424808434624720995532180576
c1: -41058006152954229159929753709281199565102884
c2: -2103952747130896180152288821190124349
c3: -96714753899249454119686484058
c4: -8749213175835406783325
c5: 356178171100692
c6: 604800
Y0: -3069461441978573391108054100447770150
Y1: 4741101382398693934919
# MurphyF (Bf=6.872e+10,Bg=3.436e+10,area=1.766e+18) = 2.394e-09

I think that a 10%-15% better score is possible, but that doesn't seem to be enough to beat the SNFS poly.
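As a quick sanity check on a candidate like this (not the Murphy-E computation that msieve/CADO-NFS actually perform, which integrates over the sieve region and folds in alpha), one can look at the skewed coefficient sizes: with a well-chosen skew, the quantities |c_i|·skew^(i−d/2) come out roughly balanced. A sketch using the coefficients above:

```python
# Coefficients and skew of the degree-6 candidate quoted above.
coeffs = {
    0: -21949786704915546043243424808434624720995532180576,
    1: -41058006152954229159929753709281199565102884,
    2: -2103952747130896180152288821190124349,
    3: -96714753899249454119686484058,
    4: -8749213175835406783325,
    5: 356178171100692,
    6: 604800,
}
skew = 27486666.79
d = 6

# With a good skew, the scaled sizes |c_i| * skew^(i - d/2) are
# roughly balanced (here: all within a few orders of magnitude).
for i, c in sorted(coeffs.items()):
    scaled = abs(c) * skew ** (i - d / 2)
    print(f"c{i}: {scaled:.3e}")
```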
2022-10-01, 15:35   #17
R.D. Silverman

"Bob Silverman"
Nov 2003
North of Boston

7508₁₀ Posts

Quote:
 Originally Posted by Gimarel
My current best:
Code:
# norm 3.812685e-16 alpha -11.270627 e 2.186e-16 rroots 4
skew: 27486666.79
c0: -21949786704915546043243424808434624720995532180576
c1: -41058006152954229159929753709281199565102884
c2: -2103952747130896180152288821190124349
c3: -96714753899249454119686484058
c4: -8749213175835406783325
c5: 356178171100692
c6: 604800
Y0: -3069461441978573391108054100447770150
Y1: 4741101382398693934919
# MurphyF (Bf=6.872e+10,Bg=3.436e+10,area=1.766e+18) = 2.394e-09

I think that a 10%-15% better score is possible, but that doesn't seem to be enough to beat the SNFS poly.

Typically one spends about 5% of the total effort on poly selection. Based on prior discussion it seems that
nothing close to that amount of effort has been spent.

2022-10-01, 17:49   #18
VBCurtis

"Curtis"
Feb 2005
Riverside, CA

3³·11·19 Posts

Quote:
 Originally Posted by R.D. Silverman Typically one spends about 5% of the total effort on poly selection. Based on prior discussion it seems that nothing close to that amount of effort has been spent.
How do you translate "a couple A40-months" from post #4 into a percentage of total effort? I'm serious here: I don't know just how fast an A40 is.

More generally, how does one convert GPU time to a sense of effort when a GPU is 100+ times faster than a CPU for poly select?

I think we (mostly Greg) have spent more than 1% of total effort on poly select, and we're not close to the SNFS score. While it's possible to catch the SNFS score, why spend the effort when beating the score is so doubtful?

If the answer is "because GNFS225 is cool and new, while the SNFS job isn't", I am on board with that.
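For what it's worth, the conversion being asked about here is simple arithmetic once one commits to numbers; the catch is that every input below is an assumption (the 100x speedup echoes the figure above, and the total-effort figure is purely hypothetical):

```python
# All numbers below are illustrative assumptions, not measurements.
gpu_months = 2                # "a couple A40-months" (post #4)
gpu_vs_cpu_core = 100         # assumed speedup of one A40 over one CPU core
total_cpu_core_years = 1500   # hypothetical total effort for a GNFS-225 job

poly_select_core_years = gpu_months * gpu_vs_cpu_core / 12
fraction = poly_select_core_years / total_cpu_core_years
print(f"~{poly_select_core_years:.1f} CPU-core-years on poly select, "
      f"i.e. {fraction:.1%} of the assumed total")
```

Under these made-up inputs, "a couple A40-months" is on the order of 1% of the job, but the answer swings by an order of magnitude with the assumed speedup and total effort, which is exactly the fuzziness discussed above.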

2022-10-01, 20:07   #19
R.D. Silverman

"Bob Silverman"
Nov 2003
North of Boston

2²·1,877 Posts

Quote:
 Originally Posted by VBCurtis How do you translate "a couple A40-months" from post #4 into a percentage of total effort? I'm serious here- I don't know just how fast an A40 is.
Comparing efforts from different architectures for different problems is fuzzy at best. Thus: I don't know the answer either.

Quote:
 More generally, how does one convert GPU time to a sense of effort when a GPU is 100+ times faster than a CPU for poly select?
I don't know the answer to this question either!

Quote:
 I think we (mostly Greg) have spent more than 1% of total effort on poly select, and we're not close to the SNFS score. While it's possible to catch the SNFS score, why spend the effort when beating the score is so doubtful? If the answer is "because GNFS225 is cool and new, while the SNFS job isn't", I am on board with that.
I agree that it probably isn't worth doing right now as an SNFS job; 2,1091+ and 2,1097+ are easier and are the 1st/2nd holes. It might be interesting as an SNFS job if only to see what the limits are for NFS@Home.

And once more to show my ignorance: I don't know whether GNFS can beat SNFS for 2,1109+. My prior guess that it could has been shown to be doubtful.

2022-10-02, 02:30   #20
VBCurtis
"Curtis"
Feb 2005
Riverside, CA
3³·11·19 Posts

I agree that a clear and accurate measure of "enough poly select effort" is difficult. So, rather than aim for a specific amount of effort, forum members who run team poly-select efforts shoot for a particular E-score. We base our target scores on our list of record poly scores for each digit size, and that list provides a rather accurate guess at an achievable score (that is, one on trend with records of nearby or similar sizes).

Charybdis did such an interpolation in post #1, predicting 2e-16 as an easy mark. Beating that by 10% is a reasonable goal, but the SNFS poly sieves about 20% better than our candidate "2e-16"-scoring GNFS poly. So we need a 2.4 or 2.5 just to match, and I think that's as good as we can hope for.

You can find the list of best poly E-scores, organized by degree and input size, here:
https://mersenneforum.org/showpost.p...&postcount=200
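The interpolation described here can be sketched as a log-linear fit of record E-scores against input size; the (digits, score) pairs below are illustrative placeholders, not the actual forum record table:

```python
import math

# Hypothetical (digits, record E-score) pairs -- placeholders only.
records = [(200, 1.6e-15), (210, 7.0e-16), (220, 3.0e-16)]

# Least-squares fit of log10(score) against digit count.
xs = [d for d, _ in records]
ys = [math.log10(e) for _, e in records]
n = len(records)
xbar = sum(xs) / n
ybar = sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
      / sum((x - xbar) ** 2 for x in xs)
intercept = ybar - slope * xbar

# Read the trend line off at 225 digits.
predicted = 10 ** (slope * 225 + intercept)
print(f"predicted achievable E-score at 225 digits: {predicted:.2e}")
```

With real record data in place of the placeholders, this is the kind of extrapolation that yields a target like "2e-16 should be achievable at C225".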
2022-10-02, 08:59   #21
R.D. Silverman

"Bob Silverman"
Nov 2003
North of Boston

2²×1,877 Posts

Quote:
 Originally Posted by VBCurtis
I agree that a clear and accurate measure of "enough poly select effort" is difficult. So, rather than aim for a specific amount of effort, forum members who run team poly-select efforts shoot for a particular E-score. We base our target scores on our list of record poly scores for each digit size, and that list provides a rather accurate guess at an achievable score (that is, one on trend with records of nearby or similar sizes). Charybdis did such an interpolation in post #1, predicting 2e-16 as an easy mark. Beating that by 10% is a reasonable goal, but the SNFS poly sieves about 20% better than our candidate "2e-16"-scoring GNFS poly. So we need a 2.4 or 2.5 just to match, and I think that's as good as we can hope for. You can find the list of best poly E-scores, organized by degree and input size, here: https://mersenneforum.org/showpost.p...&postcount=200
Does anyone have any idea how the e-scores are distributed? I doubt that they are Gaussian. My "intuition"
(which could be very wrong) says that they are more likely to be shaped like a Gamma/ChiSquare distribution (i.e.
with a long tail to the right). Has anyone ever looked at this?

However they are distributed, you should probably aim for at least 2-sigma above the mean. Mean should be a
function of size. Do you have any idea how far above the mean your history of 'record' scores lie?

Does anyone have histograms of past data, rather than just the record values? Did anyone keep the mean
values in addition to the 'records'? It would be an interesting study. It might even be publishable.

Note: When I say "distributed", I mean the distribution of values for each candidate, and not the distribution of the 'record' values as a function of size.

Last fiddled with by R.D. Silverman on 2022-10-02 at 09:06
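Silverman's picture of a record as the best of many draws from a right-skewed distribution can be illustrated by simulation; the Gamma shape and scale below are made-up values, not fitted to any real poly-select data:

```python
import random
import statistics

# Illustrative only: assume per-candidate E-scores follow a
# right-skewed Gamma distribution and ask how far above the
# mean the best of 100,000 candidates lands.
random.seed(1)
shape, scale = 2.0, 1.0   # assumed Gamma parameters
samples = [random.gammavariate(shape, scale) for _ in range(100_000)]

mean = statistics.fmean(samples)
sigma = statistics.stdev(samples)
best = max(samples)
print(f"mean {mean:.2f}, sigma {sigma:.2f}, "
      f"best of {len(samples)} draws sits "
      f"{(best - mean) / sigma:.1f} sigma above the mean")
```

Under these assumptions the record sits far beyond the 2-sigma mark suggested above, which is the expected behavior of a long right tail: the more candidates searched, the further into the tail the record moves.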

2022-10-02, 10:08   #22
swellman
Jun 2012
7412₈ Posts

Kamada's site has 20+ years of data, some displayed graphically, but no e-score histogram. He may have such data in his archives.

There is data in the NFS@Home log files for the various GNFS factorizations over the years, but that would require a bit of data mining. I believe @VBCurtis did some digging into this a few years ago, but the thread eludes me. Greg Childers may have such data as a matter of course (though I doubt it).

Some data was likely never captured, e.g. hardware and software used, processing time, discarded poly-search results that "weren't good enough", and intangibles like user motivation: how much time does one really need to spend on a GNFS-155 search before going with a "good enough" result, versus say a GNFS-210, where a few percent in e-score represents a LOT of sieving?

Polynomial searches have evolved over the last decade: increased use of GPUs, better algorithms, and better search methodologies give better results in less time. Most of the records in the table are recent; virtually all of the old search results have been topped in the last 3 years.

But I love the idea of a study and I hope someone here takes the idea and runs with it.

