#23
Aug 2002
3×7 Posts
Quote:
Oops, you're right. I thought I triple-checked it on a bunch of numbers, but I must have been blind. With 1 it works. Thanks for seeing it!
#24
Aug 2002
3416 Posts
#25
Aug 2002
1516 Posts
Quote:
Probability that TF up to n bits will find a factor, under the assumption that TF has already been done up to n-1 bits. Based on historical data from factors & nofactors - Sept. 08, 2002

[code:1]
Fact    Predicted   Exponent range
bits:   1/n:        00M-79M   10M-15M   15M-20M   30M-35M
 n                  ( =ALL)
 50     2,00%       2,05%     1,99%     1,98%     2,03%
 51     1,96%       2,01%     1,97%     1,95%     1,93%
 52     1,92%       1,97%     1,90%     1,96%     1,96%
 53     1,89%       1,89%     1,89%     1,90%     1,88%
 54     1,85%       1,85%     1,80%     1,81%     1,86%
 55     1,82%       1,80%     1,81%     1,78%     1,84%
 56     1,79%       1,79%     1,77%     1,73%     1,76%
[/code:1]

Looking at this, George's 1/n formula seems to work very well. I guess the standard deviations are normal for the actual sample sizes. For reference, there are around 2800 factors behind the percentages in the 5M samples, and around 44000 per figure in the "ALL" column.

I chose to show up to 56 bits only, just because I then feel pretty safe that we are comparing "apples vs apples" rather than anything incomparable. Nofactors states that every exponent up to 79M is trial factored to at least 56 bits.

An assumption I made about the factors table is that when a factor is shown, it is supposed to be the smallest existing one. Does anyone know if that's right?
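The agreement between the 1/n rule and the observed rates is easy to check in a few lines of Python. A minimal sketch (the "ALL" column percentages are hard-coded from the table above; 1/n is the only formula used):

```python
# Heuristic: the chance that TF from n-1 to n bits finds a factor is ~1/n.
# Observed "ALL" (00M-79M) column from the table above, in percent.
observed_all = {50: 2.05, 51: 2.01, 52: 1.97, 53: 1.89,
                54: 1.85, 55: 1.80, 56: 1.79}

for n, obs in observed_all.items():
    predicted = 100.0 / n  # 1/n expressed as a percentage
    print(f"{n} bits: predicted {predicted:.2f}%, observed {obs:.2f}%, "
          f"diff {obs - predicted:+.2f}%")
```

Every deviation comes out at 0.05 percentage points or less, consistent with sampling noise at these sample sizes.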
#26
P90 years forever!
Aug 2002
Yeehaw, FL
17·487 Posts
#27
Aug 2002
23×52 Posts
Hey Svempasnake, great work! :D Looks like George is fully exonerated. The trend I saw must have been a statistical fluke due to the limited size of my dataset.
I'm not convinced that it was necessary to exclude all numbers above 56 bits. As long as at each level you work only with numbers that have either had a factor found at that level or that have been trial factored through that level without finding a factor, the results for that level should be valid. The only problem is that the sample size gets smaller and smaller as you reach levels that have been less than fully trial-factored over the entire 0-79M range, making the results less statistically significant at each subsequent level.

On second thought, P-1 factoring is also going to corrupt the results at higher levels. If a number has been trial factored to n bits but subsequent P-1 factoring finds a factor at n+3 bits, this new factor will be placed in the factors database with no indication that the range from n+1 to n+3 hasn't been trial factored. So I guess I've just talked myself back around to your assumption that the number in the factors table is indeed the smallest factor.

And as more and more factors are found via P-1 factoring, the validity of this method of data-mining the factors database will become worse and worse. George, is a record kept anywhere of which factors were found using trial factoring versus using P-1?
#28
P90 years forever!
Aug 2002
Yeehaw, FL
201278 Posts
#29
Aug 2002
3·7 Posts
Quote:
Let's have a look at nofactors, which can help us indirectly:

[code:1]
All exponents 8.25M - 13.38M in nofactors (TF goes up to 64 bits)
Factored to bits / Number of exponents
59        13
60        15
61         7
62         9
63      6472
64    113487

All exponents 13.38M - 17.85M in nofactors (TF goes up to 65 bits)
Factored to bits / Number of exponents
57        21
58        59
59       149
60        56
61        30
62         6
64        49
65    102436
[/code:1]

It looks to me as if nofactors purely consists of TF results, and P-1 results aren't updated here. Does anyone know if that's how it works?

If that's the case, we can almost always tell whether a factor was found by TF or P-1, especially in exponent ranges where TF is as 99%-complete as above. Take binarydigits' 4.0e19 factor as an example. First we count its bits carefully ;) and see it's a 66 bit factor. Since we know that TF ends at 65 bits for 16M exponents, this factor must have been found through P-1 and not through TF.

Another example: a 17.0M exponent has a 65 bit factor. Assuming TF went up to 65 bits, this factor was found through TF and not through P-1. Third example: a 12M exponent has a factor of 64 bits. Assuming TF went to 64 bits, this factor was also found through TF.

A table like the one above should help us know how good or bad such assumptions are.
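The bit-counting argument above can be sketched in a few lines of Python. The TF depths per range come from the nofactors counts quoted above; everything else (the function names, treating the range cutoffs as exact) is an assumption for illustration:

```python
def tf_depth(exponent):
    """Assumed trial-factoring depth in bits, from the nofactors data above."""
    if 8_250_000 <= exponent < 13_380_000:
        return 64
    if 13_380_000 <= exponent < 17_850_000:
        return 65
    return None  # outside the ranges quoted above

def likely_source(exponent, factor):
    """A factor wider than the TF depth cannot have been found by TF,
    so it was almost certainly found by P-1."""
    depth = tf_depth(exponent)
    if depth is None:
        return "unknown"
    return "TF" if factor.bit_length() <= depth else "P-1"

# binarydigits' ~4.0e19 factor of a ~16M exponent is 66 bits wide,
# one past the 65-bit TF limit, so it must be a P-1 find:
print(likely_source(16_000_000, 40_000_000_000_000_000_000))  # P-1
```

The same call with a 12M exponent and a 64-bit factor returns "TF", matching the third example in the post.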
#30
Aug 2002
258 Posts
#31
"Nancy"
Aug 2002
Alexandria
2,467 Posts
Quote:
48091790614964383575080990595271931709835968599049593 = 13329478389458153107343 * 3607927422951418869297495045751

I, too, checked factors.zip to see if the composite factor made it there, but the smaller of the two prime factors is correctly listed. A p53 by P-1 would just have been too nice..

Alex
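Python's arbitrary-precision integers make the split above trivial to double-check; a quick sanity check of the numbers as quoted:

```python
# The 53-digit "factor" reported by P-1 and its two prime parts (p23 * p31),
# copied from the post above.
c53 = 48091790614964383575080990595271931709835968599049593
p23 = 13329478389458153107343
p31 = 3607927422951418869297495045751

assert p23 * p31 == c53  # the p53 really is composite
print(len(str(p23)), len(str(p31)), len(str(c53)))  # 23 31 53
```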
#32
Sep 2002
2·131 Posts
#33
Aug 2002
Termonfeckin, IE
276810 Posts
[quote="svempasnake"]
Let's have a look at nofactors, which can help us indirectly:

[code:1]
All exponents 8.25M - 13.38M in nofactors (TF goes up to 64 bits)
Factored to bits / Number of exponents
59        13
60        15
61         7
62         9
63      6472
64    113487

All exponents 13.38M - 17.85M in nofactors (TF goes up to 65 bits)
Factored to bits / Number of exponents
57        21
58        59
59       149
60        56
61        30
62         6
64        49
65    102436
[/code:1]
[/quote]

Ha! It might actually be worth it to do the factoring for at least the exponents which have been factored to 57-63 bits only. The smaller-bit factoring will go through pretty fast, I imagine.

I looked in the status.txt file and most of these exponents have been tested once and are not yet in range to be double-checked. I wonder if the Prime95 client reported incorrectly or if the folks who did the first test turned off trial factoring. Maybe George can tell us:

[code:1]
9995243,60
9996509,60
9996673,60
9996703,60
9996859,60
12026057,60
12195593,60
12376337,60
12376363,60
12722923,60
12750373,60
13019759,60
12718319
12771757
12808949
12821153
12837259
12862439
12865469
12867941
12872239
12876709
12885793
12904979
13182061
[/code:1]

I am wondering if it's worth getting a box to dedicate itself to factoring these exponents. In the 13.38-17.85M range several of the exponents are still in the process of being factored, so touching them at this time is not prudent, but I am definitely going to try to factor the exponents listed above unless anyone can give me a reason not to.

Garo
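Pulling such under-factored exponents out of a nofactors-style dump is a short loop. A sketch, assuming `exponent,bits` lines like the list above (the input format and the helper name are assumptions, not the actual GIMPS file layout):

```python
def under_factored(lines, lo, hi, target_bits):
    """Return (exponent, bits) pairs inside [lo, hi) whose recorded
    trial-factoring depth is below target_bits."""
    found = []
    for line in lines:
        parts = line.strip().split(",")
        if not parts[0].isdigit():
            continue  # skip blank or malformed lines
        exp = int(parts[0])
        bits = int(parts[1]) if len(parts) > 1 else 0
        if lo <= exp < hi and bits < target_bits:
            found.append((exp, bits))
    return found

# Exponents factored only to 60 bits in a range whose target is 64 bits:
sample = ["9995243,60", "12026057,60", "13019759,60", "8300000,64"]
print(under_factored(sample, 8_250_000, 13_380_000, 64))
# [(9995243, 60), (12026057, 60), (13019759, 60)]
```

The fully-factored 8300000,64 entry drops out, leaving exactly the candidates worth re-running.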