mersenneforum.org (https://www.mersenneforum.org/index.php)
-   Information & Answers (https://www.mersenneforum.org/forumdisplay.php?f=38)
-   -   Factoring bit depth? (https://www.mersenneforum.org/showthread.php?t=15913)

davieddy 2011-08-09 21:03

[QUOTE=drh;268778]David/Oliver

I reserve similar to Oliver, a few days, up to 2 weeks' worth at a time, but as long as I'm not traveling and have Internet access, I upload my results every night. Maybe that's a little more often than I need to, but they do get dished out immediately afterward ... just trying to keep things moving a little bit.

Doug[/QUOTE]

[SPOILER]I'm not the Messiah - I'm just a naughty boy![/SPOILER]

Who am I to suggest the precise schedule two prolific TFers should adopt?

Once the TF (72 bit) wavefront is comfortably ahead of LLs being dished out, it won't matter one iota.

BTW the most recent query "Will Prime95 keep going till the assignment
is finished?" has given me a thought:
How many folk out there think that an incomplete LL test is better
than nothing?
Of course saving the residue and iteration number is a suggestion
that has been raised many times before. What is the answer?

Could this partly explain why 80% of LLs are expired/returned?

David

chalsall 2011-08-09 21:25

[QUOTE=davieddy;268788]Like nuclear fusion?

Leave the art of anal articulation to me:smile:[/QUOTE]

Hey, nuclear fusion has been working only 8.3 light minutes away for 5 billion years or so.

We humans can now produce temperatures very close to absolute zero, and also far higher than those at the core of our (or any other) sun.

It is only a matter (no joke intended) of time before we are finally able to sustain a controlled fusion reaction from which we can derive energy.

(As we all know (or should know), fusion reactions have already "successfully" been used here on earth on purpose. But, sadly, they were not used for peaceful purposes....)

davieddy 2011-08-09 22:46

[QUOTE=fivemack;268669]it appears that nobody is interested in discussing the details of the modelling, because you have failed to convince them that it will be interesting and useful. Such is life.
[/QUOTE]

This isn't quite true.
I observed that the frequency of P-1 finds dropped off approximately
exponentially with bit length.
I expressed this quantitatively, and requested a
(preferably simple) theoretical explanation.

David

Christenson 2011-08-09 23:26

[QUOTE=TheJudger;268760]Hi!

David: what do you think is a good turnover time for TF assignments?
No matter of the runtime of current assignments I try to receive work for one week at once.

Oliver[/QUOTE]

Oliver:
Once the automatic connection to the server is working in mfaktc, this will become a far less critical question. The server will be able to do what it does now: hand out assignments in the more "critical" ranges to those users and machines that have completed and returned work quickly in the past.

I hadn't thought about turnaround, but I recently had a 10-day period where I turned in no work (I traveled to San Jose) and probably had 15 days of work queued up. Taking two weeks of work and finishing it over 2-3 weeks is quite normal for the machine I have borrowed at the local university; I don't visit it every day, and can't visit it at all until I get out from under a small flood at home.

E.

If we are getting an 80% incomplete rate on 53M LL assignments, then I do suggest (without offering the requisite resources, of course!!) that the client contact the server after each 10% of the assignment with enough information for a different machine to complete the assignment.

Either that, or start out new Prime95 clients with something smaller that actually completes in a few days, like P-1.
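As a sketch of the 10%-checkpoint idea, here is a hypothetical hand-off format (this is NOT Prime95's actual save-file layout; the field names and JSON encoding are my own invention) carrying just enough state for a different machine to resume an LL test:

```python
import hashlib
import json

# Hypothetical checkpoint format (not Prime95's real one): exponent,
# current iteration i, and the interim residue S(i) mod 2^p - 1,
# plus a checksum so a resuming machine can detect corruption.

def write_checkpoint(path, exponent, iteration, residue):
    """Save interim LL state so another machine could resume the test."""
    blob = {
        "exponent": exponent,
        "iteration": iteration,
        "residue_hex": hex(residue),
    }
    # Checksum over the deterministic serialization of the payload.
    blob["sha256"] = hashlib.sha256(
        json.dumps(blob, sort_keys=True).encode()).hexdigest()
    with open(path, "w") as f:
        json.dump(blob, f)

def read_checkpoint(path):
    """Load and verify a checkpoint; returns (exponent, iteration, residue)."""
    with open(path) as f:
        blob = json.load(f)
    check = blob.pop("sha256")
    expect = hashlib.sha256(
        json.dumps(blob, sort_keys=True).encode()).hexdigest()
    if check != expect:
        raise ValueError("corrupt checkpoint")
    return blob["exponent"], blob["iteration"], int(blob["residue_hex"], 16)
```

Uploading something like this at each 10% mark would let the server reassign a stalled exponent without losing the work already done.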

cyrix 2011-08-10 04:22

first guess
 
Hi!
My English isn't very well, so let us concentrate on the math.

The main problem is determining the probability that P-1 finds a factor, given that trial factoring didn't.

So for a first guess, let us assume there is no second stage in P-1. Then we find a factor P of 2^p-1 iff the k in P=2kp+1 is B1-smooth. (Actually k has to be B1-power-smooth, but let us ignore this small difference.)

The probability that a random k is B1-smooth can be evaluated with Dickman's function delta(ln k / ln B1): with u = ln k / ln B1 we have k^(1/u) = k^(ln B1 / ln k) = B1, and delta(u) gives the probability of k being B1-smooth as k goes to infinity.

The probability that such a P divides 2^p-1 is 1/P, or nearly 1/(2kp). The probability that this P is prime is (by the prime number theorem) about 1/ln(P) = 1/(ln(2)+ln(p)+ln(k)).


Combining these probabilities under the assumption of independence gives the probability that a given P is a prime factor of 2^p-1 and is found by P-1 factoring with bound B1 (=B2):

[TEX]P(k) = \frac{\delta\left(\frac{\ln k}{\ln B_1}\right)}{\left(\ln 2+\ln p+\ln k\right) \cdot 2kp}.[/TEX]

This we have to sum over k, from the smallest k with P=2kp+1>B -- the trial-factoring bound -- to infinity. (There are several simplifications here. First, for large P=2kp+1 the probability of dividing 2^p-1 is not 1/P but 0 once P>2^p-1; however, the terms converge to 0 so fast that the error from summing to infinity is small. Second, more than one factor could be found, so we ought to subtract those probabilities; but they are also very small, so for simplicity we ignore these cases.)

Approximating the sum by an integral, the probability that a factor P of 2^p-1 is found by P-1 factoring with bounds B1=B2, given that trial factoring to bound B failed, is approximately

[TEX]P_p\left(B,B_1\right) \approx \int_{k=\frac{B}{2p}}^{\infty} \frac{\delta\left(\frac{\ln k}{\ln B_1}\right)}{ \left(\ln 2+\ln p+\ln k\right) \cdot 2kp } \text{d} k.[/TEX]


For large k the sum ln(2)+ln(p)+ln(k) is dominated by ln(k), so the integral simplifies to

[TEX]P_p\left(B,B_1\right) \approx \frac{1}{2p} \cdot \int_{k=\frac{B}{2p}}^{\infty} \frac{\delta\left(\frac{\ln k}{\ln B_1}\right)}{k \ln k} \text{d} k.[/TEX]


This can be computed numerically.
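A rough numerical sketch of that last integral (my own construction, not from the thread): Dickman's function is tabulated by Euler-stepping its delay ODE u·delta'(u) = -delta(u-1), and the substitution t = ln k reduces the integral to (1/2p)·∫ delta(t/ln B1)/t dt. The step sizes, truncation point, and example parameters are all illustrative choices.

```python
import math

def dickman_rho_table(u_max, h=1e-3):
    """Tabulate Dickman's function on [0, u_max] with step h by
    Euler-stepping the delay ODE  u * rho'(u) = -rho(u - 1)."""
    n = int(u_max / h) + 1
    rho = [1.0] * n            # rho(u) = 1 for 0 <= u <= 1
    lag = int(round(1.0 / h))  # index offset corresponding to u - 1
    for i in range(lag, n - 1):
        u = i * h
        if u > 1.0:
            rho[i + 1] = rho[i] - h * rho[i - lag] / u
    return rho, h

def p1_success_prob(p, B, B1, u_max=12.0):
    """Approximate the post's integral
        (1/2p) * Int_{B/2p}^{inf} rho(ln k / ln B1) / (k ln k) dk,
    via t = ln k, which turns it into
        (1/2p) * Int_{ln(B/2p)}^{inf} rho(t / ln B1) / t dt.
    Truncated where rho's argument exceeds u_max (rho is tiny there)."""
    rho, h = dickman_rho_table(u_max)
    L = math.log(B1)
    t = math.log(B / (2.0 * p))
    total, dt = 0.0, 0.01
    while t / L < u_max:
        u = t / L
        total += rho[min(int(u / h), len(rho) - 1)] / t * dt
        t += dt
    return total / (2.0 * p)

# Illustrative parameters: exponent near 50M, TF done to 2^72, B1 = 1e6.
prob = p1_success_prob(p=50_000_000, B=2.0**72, B1=1_000_000)
```

Sanity check on the tabulation: delta(2) should come out near 1 - ln 2 ≈ 0.307, the known closed form on [1, 2].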


Am I right this far? Or are the simplifying assumptions too crude?


Cyrix

davieddy 2011-08-10 09:26

[QUOTE=cyrix;268813]Hi!
My English isn't very well, so let us concentrate on the math.
[/QUOTE]

I'm sorry to hear that. Hope it makes a speedy recovery:smile:

I also hope Bob will give you full marks for effort, and hopefully
for proficiency too!

Now does your result explain (simply!) an exponential drop in factors
found by P-1 with bitlevel, "half life" ~8 bits?

David

davieddy 2011-08-10 09:54

[QUOTE=davieddy;268764]PS Does anyone still think a 100M digit prime will be found before
2025?
[/QUOTE]

[QUOTE=chalsall;268785]Yes. I do.

Why? Because the work in quantum computing is making great progress. (Gotta love those q-bits!)
[/QUOTE]

[QUOTE=davieddy;268788]Like nuclear fusion?
[/QUOTE]

Well apparently one managed to factor 15 a few years ago.
Only 99,999,998 digits to go, and QC+Shor will be factoring all the
Mersenne numbers between 332M and 500M bits like s*** off a shovel.
If it finds one it can't factor, it might be prime!

BTW I think my "fusion" analogy might be closer than I had
originally intended:
Difficulty with "Containment"?

David

davieddy 2011-08-10 10:21

Lies, Damned Lies and Statistics:
 
[URL]http://www.mersenne.info/trial_factored_tabular_data/2/50000000/[/URL]

[URL]http://mersenne.org/primenet/[/URL]

We thought 72 bits of TF was a "conservative" target.

How are "we" doing?
Marks out of ten please.

David

Dubslow 2011-08-14 19:12

Mr. (Dr. ?) Silverman, I just want to point out that this thread has long been removed from the mathematics forum, so if you don't care about the actual bit depths, please leave this thread. If not, then please ignore that.

The current answer is that we are going to bit levels deeper than what's listed at /various/math.php, although I recently realized that the list doesn't go beyond exponents of 100M.

My next question then became: what methods did George use to determine the bit depths? He hasn't been here since I asked, but if anyone else knows, please post.

ckdo 2011-08-14 19:55

[QUOTE=Dubslow;269084]My next question then became what are the methods that George used to determine the bit depths? He hasn't been here since I asked that, but if anyone else knows, please post.[/QUOTE]

There's not much math involved in determining the bit depths at present, if any. After TF reached the tabulated depth, the TF wavefront got too far ahead of the LL wavefront, and George decided to TF one bit deeper for exponents between 53M and 70M, or some such. That process has repeated at least once so far, and I expect it to repeat again and again as long as enough people do TF.

Christenson 2011-08-14 23:53

George's original motivation was noting that the TF effort for a given, fixed bit level is proportional to 1/log N. He then solved a minimisation problem: given, say, 10,000 prime Mersenne exponents, how do I eliminate them with the least effort, ON AVERAGE?

The answer is the usual marginal argument: the optimum is reached at the level of effort where, for the next unit of work, it doesn't matter which type of work it goes into. On some single CPU, probably his prototypical 1 GHz-day-per-day CPU, he found that happens at the bit depths he tabulated.
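That break-even logic can be sketched numerically. All constants below are illustrative assumptions, not GIMPS's actual tuning: the standard heuristic that a Mersenne number has a factor in [2^b, 2^(b+1)) with probability about 1/b, a doubling of TF cost per bit level, and a factor saving roughly two LL tests (first test plus double-check).

```python
def optimal_tf_depth(ll_cost, tf_cost_at_60, start_bits=60, max_bits=90):
    """Heuristic stopping rule: keep trial factoring one bit deeper
    while the expected LL time saved, (1/b) * 2 * ll_cost, still
    exceeds that bit level's TF cost, which doubles each level.
    Costs are in arbitrary consistent units (e.g. GHz-days)."""
    tf_cost = tf_cost_at_60            # cost of the 2^60 -> 2^61 level
    for b in range(start_bits, max_bits):
        expected_saving = (1.0 / b) * 2.0 * ll_cost
        if tf_cost > expected_saving:
            return b                   # not worth going from b to b+1
        tf_cost *= 2.0                 # next bit level costs twice as much
    return max_bits

# Illustrative numbers: an LL test ~200 GHz-days, the 2^60 level ~0.01.
depth = optimal_tf_depth(ll_cost=200.0, tf_cost_at_60=0.01)
```

Because TF cost grows geometrically while the saving shrinks only slowly, the optimum moves by just a few bits even when the TF/LL cost ratio changes by an order of magnitude, which is why GPUs shifted the tabulated depths by only a handful of levels.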

Along came GPUs and mfaktc, which TF 10-100 times faster than CPUs; the GPUs have all been crowding into the easy, fast TF work (as opposed to CUDALucas), and you have the present situation. I expect this process to stop after 3-6 more bit levels, particularly as CUDALucas matures, and as TF work becomes as hard as it was before, since each bit level of TF doubles the amount of compute work involved.

At the moment, I have 1 medium-power GT440 card doing TF, (the other is doing Operation Billion Digits exclusively ATM), and it is producing about as many factors as half a dozen machines of various speeds doing P-1. The haul on the 666 exponents ckdo handed me looks to be 7 or 10 factors in the last two weeks; each of these factors found represents an LL-D test that won't need to be done. I'll be done with that the day after tomorrow.

