[QUOTE=Gordon;376133]Are you being serious?
Which part of [SIZE="6"][B][U]"completely factored"[/U][/B][/SIZE] allows for composite factors? you seem to struggle with comprehension of the English language...[/QUOTE] the part under polynomial factoring?: [QUOTE]Some polynomials cannot be factored into the product of two binomials with integer coefficients (such as x^2 + 16), and are referred to as prime. "Factoring completely" means to continue factoring until no further factors can be found.[/QUOTE] note that not all prime polynomials need produce only prime values, so carried over to the non-algebraic setting this could lead to composite factors. 
[QUOTE=alpertron;376144]Gordon, I know that there is an extremely small probability that the big cofactor is not prime and so the Mersenne number is composite (we will need ECPP or a better algorithm to be invented in the future to settle the question), but it is clear that after determining that the cofactor is a pseudoprime, no one would try to continue running TF, P1, ECM, etc. on that Mersenne number.
[/QUOTE] You should have quit while you were behind. Instead, you find it necessary to spew ignorance once again. The cofactor, if it is indeed a pseudoprime, means that it is composite! A pseudoprime is a probable prime that is composite! Either learn this subject or stop prattling. [QUOTE] That's why I said that the number was completely factored. It was not intended to have mathematical rigor. [/QUOTE] But it was stated as an exact statement. And from the words alone no one can determine that it was NOT intended to be rigorous. Learn to say what you mean! 
Yes, you are correct with respect to pseudoprimes and probable primes.

[QUOTE=Gordon;376121]Unless you flip the coin so high that it leaves orbit you get proof in 2 seconds. Bad analogy.[/QUOTE]
Sorry, I wasn't thinking that my friend would just instantly tell me the answer, since in the spirit of the analogy it would make no sense for him to do so. And since you have clarified your thoughts, I don't disagree with you. Which I already said. Not that what I think matters. 
[QUOTE=R.D. Silverman;376127]You seem to fail to understand. Precise definitions are what mathematics is about. This is a mathematical discussion.
I suggest that you learn about the use of quantifiers in mathematics.[/QUOTE] I've been trying to pursue a mathematical side discussion in this vein. We have a definition of completely factored in post 33, and we have many statements that can be summarized by "it's not completely factored if there is the possibility of currently unspecified prime factors." I've been arguing that the summarization is not derivable from the definition. So far all I've received is repeated statements of the summarization. (Well, I've also received an accusation of drug use, but later posts seem to eschew such a tactic.) The issue is whether a factorization that contains a PRP could meet the definition. Not whether it could be proven to meet the definition - clearly not. But if the PRP is in fact, although unbeknownst to us, a prime, then the factorization with the PRP is, unbeknownst to us, a complete factorization according to the definition. If not, then I'd like to understand what clause of the definition is violated. 
Ok, let's come back to the "data" part of the topic category we are in, and let's say that at this point we are highly confident that the number may be prime, and therefore M576551 may be completely factored.
While people were fighting here, I started to test all bases lower than 256 with PFGW (which does not accept a higher base, as I just learned while trying this). Then, using a small powmod() function (Pari is too slow: it takes about 5 hours for a PRP test, in contrast to PFGW, which takes about 7 minutes on a single core; my powmod took about 13 minutes per test), I randomly selected 50 bases between 10 and 60 digits, and I tested the base-b PRP property of the cofactor. It never turned out composite. For example, it is PRP to base 1592653589793389 (a random prime). So either this number is PRP to hundreds of different bases and I was bloody lucky to step on them in my random selection, or it is a Carmichael number, or it is prime. The probability is much larger for the last option (by many orders of magnitude). If you pick a random number of 170k digits and test its probable primality, and it turns out to be PRP, then it is much more likely to be prime than to be pseudoprime. As the numbers get larger, pseudoprimes become much rarer than primes (that is why such PRPs are called "industrial grade" primes). Of course, we cannot say with certitude that the number is prime, but we "conjoketure" that it is, and at this point we are very confident that the conjoketure is true! 
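LaurV's procedure (Fermat PRP tests of the cofactor to many random bases) can be sketched with Python's built-in three-argument pow(); the bit range and base counts below are illustrative stand-ins for the actual 10-to-60-digit bases he used, and the sanity-check numbers at the end are small knowns, not the 170k-digit cofactor:

```python
import random

def is_fermat_prp(n, base):
    """Fermat probable-prime test: n passes to this base if base^(n-1) == 1 (mod n)."""
    return pow(base, n - 1, n) == 1

def random_base_prp(n, tests=50, lo=10**10, hi=10**60):
    """Run Fermat PRP tests to `tests` random bases drawn between lo and hi."""
    for _ in range(tests):
        b = random.randrange(lo, hi) % n
        if b in (0, 1):  # degenerate bases tell us nothing
            continue
        if not is_fermat_prp(n, b):
            return False  # definitely composite
    return True  # PRP to every base tried; still not a proof of primality

# Small sanity checks on known numbers:
print(random_base_prp(2**89 - 1))  # True: 2^89-1 is a Mersenne prime
print(random_base_prp(2**89 + 1))  # False: 2^89+1 is composite and fails quickly
```

The point of the thread discussion, of course, is that a True result here is only evidence, never proof.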
[QUOTE=LaurV;376176][...]I randomly selected 50 bases between 10 and 60 digits, and I tested the bPRP property of the cofactor. It didn't turn out composite. For example, it is PRP to base 1592653589793389 (random prime).[...][/QUOTE]
Does this make the possibility that the cofactor is composite so vanishingly small that it would not occur in similar circumstances in many times the lifetime of the universe? Or is there some interdependency between different bases when testing for probable primeness so that there are some pseudoprimes which routinely pass the test in many bases? (I have no understanding of this field and apologise in advance for my ignorance.) 
If you go "higher" in value, the ratio of "pseudoprimes" (i.e. PRPs which are composite: numbers that pass a PRP test but are still composite) to primes gets smaller and smaller, at exponential speed. If you use multiple bases, it shrinks even faster. See [URL="http://primes.utm.edu/prove/prove2_2.html"]this very good page[/URL] (pages, you need to click "next page" and "previous page" a few times :D) about how many pseudoprimes there are in the wild.
For example, under 100k there are 9592 primes, and only 78 "pure" 2-PSPs (i.e. composite numbers which pass a 2-PRP test). So, if you select a random number under 100k, do a 2-PRP test, and it passes, then the probability of it being composite is 78/(78+9592) ~= 0.008, or 0.8%. There are about [URL="http://primes.utm.edu/howmany.shtml#table"]69k primes[/URL] between 100k and a million, but only 167 numbers in that range are 2-PSPs. So, if you select a random number bigger than 100k and under 1M, do a 2-PRP test, and it passes, then the probability of it being composite is 167/(167+68906) ~= 0.002, or 0.2%. (Remark that the range is 9 times larger, and the numbers are 10 times larger.) Repeating the story for a 100k-digit number gives you a [URL="http://primes.utm.edu/notes/prp_prob.html"]probability[/URL] of [TEX]\frac{1.3}{10^{10584}}[/TEX] for that number to be composite. Carmichael numbers (PRP to all bases coprime to the number) are even rarer, exponentially rarer; for example, there are only 43 Carmichael numbers below one million (including those below 100k). Our "prime" has 170k digits (even bigger, so an even smaller probability) and we used [B][U]many[/U][/B] bases. I would bet my life that the number is not composite. RDS himself said (in the thread related to the "smallest million-digit prime") that any PRP in that range is "almost surely" a prime. Of course, we know that unless someone proves it prime, or factors it, we cannot be certain. But (considering its size and the number of bases tested) the probability of it being composite is somewhere around 1/10[sup]50000[/sup] 
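The small-range counts quoted above are easy to verify by brute force; a quick sketch using trial-division primality, which is perfectly adequate below 100k:

```python
def is_prime(n):
    """Trial-division primality test; adequate at this scale."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

primes = 0
psp2 = 0  # composite n with 2^(n-1) == 1 (mod n): base-2 Fermat pseudoprimes
for n in range(2, 100000):
    if is_prime(n):
        primes += 1
    elif pow(2, n - 1, n) == 1:
        psp2 += 1

print(primes, psp2)  # 9592 primes, 78 base-2 pseudoprimes below 100k
```

This reproduces the 9592 and 78 figures from the post directly.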
Thanks for the description and links, LaurV.
If I understand it correctly then, the existence of Carmichael numbers, which are composite but PRP to all relatively prime bases, and the fact that there are known to be infinitely many of them, is an interdependency between different bases when you test them for PRP. But this still does not limit the vanishing smallness of the possibility of compositeness when a number has passed PRP tests to multiple bases. At a certain point I guess more PRP tests are pointless because you are quickly extraordinarily safe in assuming, albeit not having proven, that the number is either prime or a Carmichael number. And the former is much, much more likely when the number is large. 
If people want to be more sure, someone should do an SPRP test to a few bases.
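For reference, an SPRP (strong probable prime) test to a single base is one round of Miller-Rabin; it is strictly stronger than the plain Fermat test. A minimal sketch:

```python
def is_sprp(n, base):
    """Strong probable-prime test to one base (one Miller-Rabin round)."""
    if n < 3 or n % 2 == 0:
        return n == 2
    # write n - 1 = d * 2^s with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    x = pow(base, d, n)
    if x in (1, n - 1):
        return True
    for _ in range(s - 1):
        x = x * x % n
        if x == n - 1:
            return True
    return False  # composite for certain

print(is_sprp(341, 2))   # False: 341 passes Fermat base 2 but fails the strong test
print(is_sprp(2047, 2))  # True: 2047 = 23*89, the smallest strong pseudoprime base 2
```

As the example shows, the strong test catches some Fermat pseudoprimes (341) but still has its own, rarer, pseudoprimes (2047).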

[QUOTE=LaurV;376176]Ok, let's come back to the "data" part of the topic category we are in, and let's say that at this point we are highly confident that the number may be prime, and therefore M576551 may be completely factored.
[/QUOTE] "May be" is irrelevant. We already know that it "may be". [QUOTE] While people were fighting here, I started to test all bases lower than 256 with PFGW (which does not accept a higher base, as I just learned while trying this). Then, using a small powmod() function (Pari is too slow: it takes about 5 hours for a PRP test, in contrast to PFGW, which takes about 7 minutes on a single core; my powmod took about 13 minutes per test), I randomly selected 50 bases between 10 and 60 digits, [/QUOTE] <Bob turns on deliberate rant mode> I realize of course that the people who regularly pollute this forum with nonsense and pointless computation are either too lazy or too stupid to read, but the following paper is a classic: [url]https://www.math.dartmouth.edu/~carlp/PDF/paper72.pdf[/url] There are also follow-up papers that have been published, which I will let people find themselves, if they are not too lazy. 50 PRP tests is ^%!@#%^!@#^ POINTLESS. I have discussed this many times on this forum. But of course, people don't read my posts either. Multiple PRP tests are [b][i]NOT[/i][/b] independent of one another. Once one does a few of them, further tests are pointless. READ the above paper. Of course, this subtle suggestion eludes people, so they continue to mindlessly apply unnecessary computer time (13 min x 50 in this case) under the delusion that they are somehow adding value. Rather than 50 <#!@&*#@&*@#> PRP tests, it would have been far better to have done a single Frobenius test or Lucas test. Oops! I forgot. Many of you haven't got a clue what the Frobenius or Lucas test is, because you can't be bothered to educate yourselves. Before people do any computations, a complete reading of the Crandall & Pomerance book and Hans Riesel's book [I could mention others, but I won't] should be a REQUIREMENT. <rant off> 
[QUOTE=henryzz;376197]If people want to be more sure, someone should do a SPRP test to a few bases.[/QUOTE]
Another clueless suggestion. One should do a SINGLE SPRP test followed by a single Frobenius or Lucas test. 
[QUOTE=BrianE;376187]Does this make the possibility that the cofactor is composite so vanishingly small that it would not occur in similar circumstances in many times the lifetime of the universe?
[/QUOTE] If you weren't too lazy you would actually READ the Kim & Pomerance paper.... and the Crandall & Pomerance book. I have given these references before. But no one ever seems to read them. Furthermore, if people could be @#@#&*@# bothered to ever do a Google search on the subject of primality testing, this paper and many others would turn up. [QUOTE] Or is there some interdependency between different bases when testing for probable primeness [/QUOTE] Ah! An intelligent question!! The Kim & Pomerance paper gives the answer. [QUOTE] so that there are some pseudoprimes which routinely pass the test in many bases? [/QUOTE] Read the paper. Look up Miller-Rabin. [QUOTE] (I have no understanding of this field and apologise in advance for my ignorance.)[/QUOTE] Stop apologizing and DO SOMETHING about it. READ!!!!!!! 
Thanks for the pointers, RDS.:smile:

Along with LaurV's 304 (overlapping*) PRP tests:
[CODE]./pfgw64 -tc -q"(2^576551-1)/4612409/64758208321/242584327930759"
PFGW Version 3.7.7.64BIT.20130722.x86_Dev [GWNUM 27.11]
Primality testing (2^576551-1)/4612409/64758208321/242584327930759 [N-1/N+1, Brillhart-Lehmer-Selfridge]
Running N-1 test using base 3
Running N+1 test using discriminant 11, base 1+sqrt(11)
(2^576551-1)/4612409/64758208321/242584327930759 is Fermat and Lucas PRP! (2127.9222s+0.0200s)[/CODE] and running my own test with GMP: [CODE]Is likely prime: (L+2)^(n+1)==7 (mod n, L^2-L+1)[/CODE] Running tests with Lucas components greatly reduces the chance that the number being tested is a Carmichael number. However, even if we did 304 octillion such tests it would be no primality proof, and so we cannot truly say this Mersenne number is "completely factored" :wink: (*) base-a PRP and base-b PRP implies base-ab PRP :geek: 
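The footnote about overlapping bases follows from the multiplicativity of modular exponentiation: pow(a*b, n-1, n) equals pow(a, n-1, n) * pow(b, n-1, n) mod n, so passing to bases a and b forces a pass to base ab. A small demonstration on the Carmichael number 561 (chosen purely for illustration):

```python
n = 561  # 3 * 11 * 17, the smallest Carmichael number: a Fermat PRP to every coprime base
a, b = 2, 5

# n passes the Fermat test to bases a and b individually...
assert pow(a, n - 1, n) == 1
assert pow(b, n - 1, n) == 1
# ...so it necessarily passes to base a*b as well: these three tests overlap.
assert pow(a * b, n - 1, n) == pow(a, n - 1, n) * pow(b, n - 1, n) % n == 1
```

This is one concrete sense in which multiple Fermat PRP tests are not independent pieces of evidence.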
Cofactor of M576551 passed "extra strong" BPSW as well as Paul's (L+2)^(n-1)=5+2x Frobenius test. With the usual caveats that this just makes us highly skeptical of it being composite.
Regarding OpenPFGW and Pari, there is no question that once above 10k or so digits, OpenPFGW is very, very fast at doing its PRP tests. But the test is relatively weak - it's a great pre-filter, especially if more than one base is used. Pari, assuming you're not living in the dark ages using a pre-2.3 version, does an "almost extra strong" BPSW test in its ispseudoprime function. This is a stronger test. For me this means more than PFGW passing lots of its tests. Pari is also not the fastest BPSW implementation (but none of those that I know of come close to OpenPFGW's test speed for numbers over 100k digits). Adding one of the Frobenius tests seems useful, to give another, different, test. (Reference welcome if this isn't true.) It's relatively expensive (about the same cost as a full BPSW), so not something I'd normally run unless I want some additional check. I don't think it hurts to run one random-base Miller-Rabin test after BPSW. People have differing opinions on the usefulness of this. Mathematica's documentation states that it adds a base-3 MR test to BPSW. I'm digressing on the digression at this point... 
[QUOTE=R.D. Silverman;376201]"May be" is irrelevant. We already know that it "may be".[/QUOTE]
Sigh. You've repeated the parts we are in agreement about, but completely ignored the question of how to apply your definition from Post #33. Let me try a more extreme example. As you have pointed out in other threads, n(x) = number of prime factors of x counting duplicates, and p(x, k) = k-th prime factor of x (1 <= k <= n(x)), are well-defined functions. Does the following represent x as a product of primes? [INDENT][TEX]x = \sum\limits_{k=1}^{n(x)} p(x,k)[/TEX][/INDENT] 
[QUOTE=wblipp;376253]Sigh. You've repeated the parts we are in agreement about, but completely ignored the question of how to apply your definition from Post #33. Let me try a more extreme example. As you have pointed out in other threads,
n(x) = number of prime factors of x counting duplicates, and p(x, k) = k-th prime factor of x (1 <= k <= n(x)), are well-defined functions. Does the following represent x as a product of primes? [INDENT][TEX]x = \sum\limits_{k=1}^{n(x)} p(x,k)[/TEX][/INDENT][/QUOTE] No. 
[QUOTE=wblipp;376253]Does the following represent x as a product of primes?[/QUOTE]
I believe you mean: [TEX]x = \prod_{k=1}^{n(x)} p(x,k)[/TEX] edit: or easier as [TEX]x = \prod_{p|x} p[/TEX] 
[QUOTE=science_man_88;376258] [TEX]x = \prod_{p|x} p[/TEX][/QUOTE]
When the LHS equals 12, the RHS equals 2x3 = 6, so something is not right there. 
[QUOTE=wblipp;376164]The issue is whether a factorization that contains a PRP could meet the definition. Not whether it could be proven to meet the definition  clearly not. But if the PRP is, in fact, although unbeknownst to us, a prime, then the factorization with the PRP is, unbeknownst to us, a complete factorization according to the definition. If not, then I'd like to understand what clause of the definition is violated.[/QUOTE]
Obviously it *could* meet the completely-factored definition ... "all" that is missing for it to actually do so is a proof of primality of the PRP. Absent such proof, it *does* not meet the definition. To use a legal analogy, the applicable phrase is "assumes facts not in evidence". 
Does fixing the summation/product mistake change your answer? Assuming not, we have
[QUOTE=wblipp;376253]Does the following represent x as a product of primes? [INDENT][TEX]x = \prod\limits_{k=1}^{n(x)} p(x,k)[/TEX][/INDENT][/QUOTE] [QUOTE=R.D. Silverman;376257]No.[/QUOTE] I'm relieved. I don't want this to satisfy the definition of fully factored. But as you point out, what we want doesn't matter - mathematics is about definitions. So my question is: what part of the definition does it fail? [QUOTE=R.D. Silverman;375973]A number is completely factored when it is represented as the product of primes.[/QUOTE] Is it that the individual multiplicands are not prime? Or is it that the product doesn't equal the number? Or that this is not a representation? Or something else? 
[QUOTE=alpertron;376269]When the LHS equals 12, the RHS equals 2x3 = 6, so something is not right there.[/QUOTE]
Doh, I forgot about prime powers. I can fix it though: [TEX]x=\prod_{p^i|n} p^i[/TEX] 
[QUOTE=science_man_88;376294]Doh, I forgot about prime powers. I can fix it though:
[TEX]x=\prod_{p^i|n} p^i[/TEX][/QUOTE] Using the same example: 12 = 2x4x3 = 24 (the first factor uses i=1 and the second one uses i=2). 
[QUOTE=alpertron;376301]Using the same example: 12 = 2x4x3 = 24 (the first factor uses i=1 and the second one uses i=2).[/QUOTE]
I get it, I suck. I was thinking p^i was the greatest power of p dividing n, but thanks for the insight. 
The usual way is the [TEX]\parallel[/TEX] operator, where p^e ∥ n means that p^e divides n but p^(e+1) does not divide n (p prime). So you can write
[TEX]n=\prod_{p^e\parallel n}p^e[/TEX] You could do it other ways. For example, [TEX]n=\prod_{p^e|n,\ e>0}p[/TEX] 
[QUOTE=ewmayer;376275]Obviously it *could* meet the completely-factored definition ... "all" that is missing for it to actually do so is a proof of primality of the PRP. Absent such proof, it *does* not meet the definition. To use a legal analogy, the applicable phrase is "assumes facts not in evidence".[/QUOTE]
BTW, I have no problem with the term "probable complete factorization" to describe the OP scenario. 
[QUOTE=ewmayer;376275]Obviously it *could* meet the completely-factored definition ... "all" that is missing for it to actually do so is a proof of primality of the PRP. Absent such proof, it *does* not meet the definition. To use a legal analogy, the applicable phrase is "assumes facts not in evidence".[/QUOTE]
To the OP - or to put it another way, that's yet another vote for "it isn't completely factored". When are you going to withdraw or at the very least amend your claim? 
[QUOTE=Gordon;376376]To the OP - or to put it another way, that's yet another vote for "it isn't completely factored".
when are you going to withdraw or at the very least amend your claim?[/QUOTE] Read post #97. 
[QUOTE=Gordon;376376]To the OP - or to put it another way, that's yet another vote for "it isn't completely factored".
when are you going to withdraw or at the very least amend your claim?[/QUOTE] If it's all about voting, then maybe we should all now vote on whether M(2976221) is a proven prime (considering that we cannot prove that the Peano axioms for the natural numbers are consistent and therefore that proofs about them using Peano arithmetic could be incorrect).:devil: 
[QUOTE=Gordon;376376]To the OP - or to put it another way, that's yet another vote for "it isn't completely factored".
when are you going to withdraw or at the very least amend your claim?[/QUOTE] I see yet again someone has nipped in and changed the title of the thread. Will whoever did it stand up and say why they did it? It gets really tiring, you know. Making it look like the OP never claimed to have completely factored it doesn't fool anyone. 
You get more and more pleasant with every post.

[QUOTE=Jayder;376394]You get more and more pleasant with every post.[/QUOTE]
You don't know me; we've never corresponded by email, text, voice or in person. You don't get to have an opinion on me, least not one that's worth a damn anyway. Truth hurts sometimes and it needs to be faced up to. Trying to artificially save someone's face by changing titles after the fact is underhand and deceitful. 
So let's say that I wanted to factor 45, and my computer told me that:
45 = 5 * 9 but it is unable to factorise 9 due to technical limitations. Thankfully I came up with a really good probable prime test for small numbers: if a number between 2 and 14 is odd, then it is probably a prime number. Since 9 is both between 2 and 14 and an odd number, it's a PRP using this test! So I can now claim that 45 is fully factored! Essentially you need to remember: probably != definitely 
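The toy test is facetious, but the point survives with real tests too: a number can pass a genuine Fermat PRP test and still be composite. The classic example, sketched here for illustration:

```python
n = 341  # 11 * 31, composite
print(pow(2, n - 1, n))  # 1: 341 passes the base-2 Fermat PRP test...
print(n % 11)            # 0: ...yet it is definitely not prime
```

So "PRP" asserts only "probably prime", exactly as the post says: probably != definitely.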
[QUOTE=blahpy;376433]probably != definitely[/QUOTE]Thanks for clarifying that. We were all a bit unsure about that. But you have cleared it up nicely.

I am just curious why it is that you seem to be taking this [I]personally[/I]. It was immediately clear to me, and likely everybody else, what Dario meant. Yes, the title was wrong, but his actual post clears it up immediately. His intent was clear from the get-go. Right there, in his opening post, it says it's a probable prime [U]four times[/U] and refers to probable primes six times. I would be similarly confounded and find you as unpleasant if you were continuously ragging on somebody for making a particularly bad spelling mistake on Facebook. It is always good to be correct, and correcting people should (I believe) always be welcome if done tactfully, but there comes a point (around the 10th time mentioning it or so) where it gets very grating and unpleasant. Speaking strictly for myself, anyway.
If this were a small article in a newspaper and despite all your protesting the error went unaddressed, I would understand the annoyance and be right there beside you. But in this context and on this forum, I am very perplexed, especially when the error was admitted to and seems to have been fixed. It's clear we view this forum very differently. Though not everybody sees it this way, I view this forum as a [I][B]relaxed[/B][/I] environment where people can share in their number-related [U]hobbies[/U], and learn a bit of mathematics and programming if they care to. In Dario's post I see excitement at finding such a large PRP, and a desire to share this with friends and interested ones. My internal response to this topic was, "While the title isn't accurate, this is an awesome find!" Not for a second did I think of hounding him about the title or his possible misconception. And the title changing was likely done with no intention of deceit! You seem to hold this forum to a far higher standard than I do. Your view may be as valid as mine, but it is not one that I can fully understand. I don't know you and I [U]sincerely[/U] apologise if I am being presumptuous or if my comment offended you. However, you have presented [I]a part[/I] of yourself for all to see on this public forum. I am certainly allowed to have an opinion [I]on what you have shared[/I], however shallow it may be. If I see somebody being very loud and rude to a waitress because they were given the wrong dish, I am not going to think, "I don't know that person. Maybe their daughter is ill and they are under a lot of stress." No, I am going to think that they are rude. Is it a fully informed opinion? No. It doesn't take into account the reasons behind the rudeness. But I am certainly allowed to base an opinion on what I have seen of a person, and adjust it accordingly. Now, does my opinion have to mean anything to you? Absolutely not. If you find me long-winded or stupid, you're free to think so. 
Actually, you'd probably be right! Maybe you'd like to share why you find this mistake so offensive? I'm a reasonable guy, I might just agree with you. 
[QUOTE]No, I am going to think that they are rude. Is it a fully informed opinion?[/QUOTE]So, "probably rude" or "completely rude"?
:razz: 
Reading this thread convinces me that what we need are rigidly defined areas of doubt and uncertainty.

[QUOTE=Xyzzy;376438]So, "probably rude" or "completely rude"?
:razz:[/QUOTE] Definition: A Gordon is a person who passes the Jayder test for possible rudeness but is in fact not rude, but expressible as a product of factors: concern for exact use of language and distaste for mutating thread titles. Exercise: prove that there exists more than one Gordon. [QUOTE=xilman;376443]Reading this thread convinces me that what we need are rigidly defined areas of doubt and uncertainty.[/QUOTE] I'm not sure about this. 
[QUOTE=Jayder;376436]I am just curious why it is that you seem to be taking this [I]personally[/I]. It was immediately clear to me, and likely everybody else, what Dario meant. Yes, the title was wrong, but his actual post clears it up immediately. His intent was clear from the get-go. Right there, in his opening post, it says it's a probable prime [U]four times[/U] and refers to probable primes six times. I would be similarly confounded and find you as unpleasant if you were continuously ragging on somebody for making a particularly bad spelling mistake on Facebook. It is always good to be correct, and correcting people should (I believe) always be welcome if done tactfully, but there comes a point (around the 10th time mentioning it or so) where it gets very grating and unpleasant. Speaking strictly for myself, anyway.
[/QUOTE] I want to add that only moderators can change titles. Once a 30-minute window has passed, I cannot edit the post or change the title (when it is the first post of a thread). So there is nothing I can do about this issue. 
[QUOTE=Xyzzy;376438]So, "probably rude" or "completely rude"?
:razz:[/QUOTE] LOL! Mike, what this forum needs is a set of buttons on any displayed post, that we can press to show our agreement with what has been said, our disagreement, our thanks to the person for posting, that we find the post funny, and so on. 
[QUOTE=BudgieJane;376459]Mike, what this forum needs is a set of buttons on any displayed post, that we can press to show our agreement with what has been said, our disagreement, our thanks to the person for posting, that we find the post funny, and so on.[/QUOTE]Do you think this is some sort of social networking site or something? Didn't you read RDS's posts about how this is a mathematical discussion, and that precision and pedantry are of paramount importance?
:bob: 
[QUOTE=retina;376463]Do you think this is some sort of social networking site or something? Didn't you read RDS's posts about how this is a mathematical discussion, and that precision and pedantry are of paramount importance?
[/QUOTE] I thought that was only valid in the "Math" subforum. 
[QUOTE=BudgieJane;376459]LOL!
Mike, what this forum needs is a set of buttons on any displayed post, that we can press to show our agreement with what has been said, our disagreement, our thanks to the person for posting, that we find the post funny, and so on.[/QUOTE] On the whole, I think the forum needs a scoring system like it needs to be infested with dyspeptic porcupines. I have found a source willing to supply the porcupines, if required. 
[QUOTE=retina;376463]Do you think this is some sort of social networking site or something? Didn't you read RDS's posts about how this is a mathematical discussion, and that precision and pedantry are of paramount importance?
:bob:[/QUOTE] Yes, I did. But sometimes we need some sort of levity to clear the air a bit. [Now someone's going to call me an idiot for saying that] 
[QUOTE=fivemack;376468]
I have found a source willing to supply the porcupines, if required.[/QUOTE] Dunno about the porcupines, but there is this porpuquine [url]http://mathcs.wilkes.edu/~rpryor/mth231/RecurrencesandRecursion.pdf[/url] 
[QUOTE=BudgieJane;376470]Dunno about the porcupines, but there is this porpuquine
[url]http://mathcs.wilkes.edu/~rpryor/mth231/RecurrencesandRecursion.pdf[/url][/QUOTE]I can define "mutual recursion", but only in terms of "recursion, mutual". Is that OK? "preceded by its negation yields FALSE" preceded by its negation yields FALSE. Just quining a phrase there The above paragraph is based on something I found in "Copper, Silver, Gold: An Indestructible Metallic Alloy" in Egbert B Gebstadter or, possibly, in a work referenced in its index. 
Hey, knock it off. We're not here to have fun, we're here to do mathematics stuff and make sure that each and every word is ontopic, true and accurate without deception, mistake or falsiness. :Gordon:

[QUOTE=retina;376474]Hey, knock it off. We're not here to have fun, we're here to do mathematics stuff and make sure that each and every word is ontopic, true and accurate without deception, mistake or falsiness. :Gordon:[/QUOTE]
But, ... [url]http://3.bp.blogspot.com/qJ_vaX1ZejU/UJzk8VgxnjI/AAAAAAAAFc0/eDqkbIFIXPA/s1600/math+is+fun.jpg[/url] 
[QUOTE=retina;376474]Hey, knock it off. We're not here to have fun, we're here to do mathematics stuff and make sure that each and every word is ontopic, true and accurate without deception, mistake or falsiness. :Gordon:[/QUOTE]
At least in the DATA forum, want some [URL="http://www.mersenneforum.org/forumdisplay.php?f=7"]fun and frolics?[/URL] 
I am still interested in why this
[TEX]n=\prod_{p^e\parallel n}p^e[/TEX] fails to meet the requirements of this definition [QUOTE=R.D. Silverman;375973]Start by defining "completely factored". My definition would be: A number is completely factored when it is represented as the product of primes.[/QUOTE] One of the issues is that "completely factored" is not a property of the number; it's a property of our knowledge. The unmathematical word "when" is in the definition because our knowledge can change over time. I think the expression does not fail the "prime" or "product" components, so it must fail the "represented" part. We need "represented" to mean something like "represented in decimal notation", or some equivalent wording that precludes a representation whose evaluation requires "first factor the number." Is there a standard way of saying that? 
[QUOTE=wblipp;376498]I think the expression does not fail the "prime" or "product" components, so it must fail the "represented" part. We need "represented" to mean something like "represented in decimal notation", or some equivalent wording that precludes a representation whose evaluation requires "first factor the number." Is there a standard way of saying that?[/QUOTE]
"Explicitly represented", in the sense of the fundamental theorem of arithmetic. 
[QUOTE=R.D. Silverman;375973]You are bandying words and undefined terminology.
Start by defining "completely factored". My definition would be: A number is completely factored when it is represented as the product of primes. Since the number in question has not been represented as the product of primes, then it most definitely has NOT been completely factored.[/QUOTE] What's wrong with this?

C = A and B

where A, B, and C are the propositions:

A: N is represented as a product
B: Each multiplicand in the product is a prime number
C: N is completely factored

Then (not C) = (not A) or (not B). Bob has asserted (not C) but cannot prove (not A) nor (not B). 
This is a good framework for understanding your difference of opinions.
Bob says: "There is no proof of (B), therefore (not B)."
William says: "There is no proof of (not B), therefore (B)."

The second argument seems far less convenient. By this logic, any sufficiently large number (for example 2[SUP]2[SUP]195[/SUP][/SUP]+1) has no proof of (not B) and is therefore completely factored. If you want to disprove that F195 is completely factored, you have to prove that the cofactor is composite, which you can't. 
[QUOTE=Batalov;377899]William says: "There is no proof of (not B), therefore (B).[/QUOTE]
I have never claimed B. I have consistently claimed that both "B" and "not B" are unproven. My only complaint is that Bob has failed to follow implications of his own definition. The original poster briefly claimed "B." I don't think anybody has claimed "B" since early in the thread. 
[QUOTE=Batalov;377899]The second argument seems far less convenient. By this logic, any sufficiently large number (for example 2[SUP]2[SUP]195[/SUP][/SUP]+1) has no proof of (not B) and is therefore completely factored. If you want to disprove that F195 is completely factored, you have to prove that the cofactor is composite, which you can't.[/QUOTE]
This is one of the reasons I don't like Bob's definition. If the definitions don't give you the results you want, then you need to change the definitions. You cannot keep the definition and ignore the mathematical consequences of the definition. 
In [URL="http://en.wikipedia.org/wiki/Three-valued_logic#Logics"]three-valued logic[/URL], most of these (A, B, C) statements will fall into the Unknown value pretty fast.
What Bob is saying, I think, is (C) [TEX]\ne[/TEX] True. What we all seem to agree with, is that presently (C) evaluates as Unknown. As well as, obviously, (not C) evaluates as Unknown, too. 
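The three-valued reading above can be sketched in a few lines of code (my own illustration, not from the thread); `and3` and `not3` are hypothetical helper names implementing Kleene's strong connectives, with `None` standing for Unknown:

```python
# Kleene's strong three-valued logic: True, False, None (= Unknown).
def not3(x):
    return None if x is None else (not x)

def and3(x, y):
    if x is False or y is False:   # one False side decides the conjunction
        return False
    if x is None or y is None:     # otherwise any Unknown makes it Unknown
        return None
    return True

# A: "N is represented as a product"  (known True)
# B: "each multiplicand is prime"     (presently Unknown)
C = and3(True, None)
print(C, not3(C))  # both C and (not C) evaluate as Unknown
```

This matches the point above: with B Unknown, both (C) and (not C) come out Unknown, and neither side can assert its claim.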
Can someone draw the Venn diagrams? I think that is really imperative for a proper understanding of what is going on here.
Please define your axioms just for the sake of clarity… You should also comment on whether incompleteness does or does not apply. 
1 Attachment(s): [image, the Venn diagram discussed in the following posts]

This informative graphic has provoked much cackling and chortling around this musically inclined household! :missingteeth: :goodposting:

[QUOTE=Xyzzy;377944][COLOR=White].[/COLOR][/QUOTE]
:rofl: :goodposting: Wonderful! the best thing I have seen posted for a long while! (not only on this forum!). The guy who conceived that diagram was genial! Thank you for sharing it! 
[QUOTE=LaurV;372028]Bwaaa haaa haaa! This is genial. :rofl:[/QUOTE]
[QUOTE=LaurV;372414]:missingteeth::missingteeth::rofl: :rofl: [B]Genial![/B] [/QUOTE] [QUOTE=LaurV;377993]The guy who conceived that diagram was [B]genial[/B]! [/QUOTE] :pb: Look it up in the dictionary, maybe? :razz: 
[QUOTE=Batalov;377994]
Look it up in the dictionary, maybe? :razz:[/QUOTE] Who knows? Perhaps the designer of the diagram is a pleasant person, as well as intelligent/inspired/whatever. :innocent: 
I, also, would expect the originator of that diagram to be genial.

Really?!? I think that he was probably Gentile.

[QUOTE=Batalov;377994]<snip photo>Look it up in the dictionary, maybe? :razz:[/QUOTE]
Genial! :yucky: 
I [I]did[/I] wait for the 13th ("lucky") "genial" quote.
Too soon? :yucky::razz: Check it out for yourself. Forum search engine is UR friend. (Now, 14, to be exact.) 
Wait… genitals?

PRP testers, get 'em while they are hot: ~35 new tests are up, all under M(1M), some under M(100k).
EDIT: You can thank my hero TJAOI for those btw. 
After executing about 20000 PRP tests I've just completed the range of Mersenne numbers with exponents less than 2 million, as can be seen on [url]http://www.mersenne.ca/prp.php?assigned_distribution=1[/url] .
This range will be complete only for a short time, because every day there are new known prime factors of Mersenne numbers with exponents less than two million. 
Great work, alpertron! You found a couple of nice gems along the way. I think that even with a year's worth of new factors in this range (below 2M), the new PRP tests needed will be negligible work. Any plans to go to 3M :smile:?

[QUOTE=bloodIce;379411]Great work, alpertron! You found couple of nice gems on the way. I think that even for a year of factors in this range (below 2M) the new PRP tests needed will be negligible work. Any plans to go to 3 :smile:?[/QUOTE]
Thanks, but completing the 2M to 3M range will need a lot of time (a few CPU years). That should be done by several people. When done, I think that at least one PRP will be discovered. 
Alpertron, would you consider running PRP tests in the 2M-3M range for exponents that have 4 or more factors? ~650 tests.
[QUOTE=bloodIce;379411]new PRP tests needed will be negligible work. Any plans to go to 3 :smile:?[/QUOTE] You might be surprised! User TJAOI is posting a lot of ECM factors (~1475 total to date), all under 1M, ~1100 of them since June 1st. A lot of exponents get multiple factors, but I would expect >100 tests a week depending on progress, in addition to a few other users who seem to regularly post a couple a week. 
[QUOTE=flagrantflowers;379447]Alpertron, would you consider running PRP tests in the 2M-3M range for exponents that have 4 or more factors? ~650 tests.
[/QUOTE] These are the Mersenne numbers that are the product of known prime factors and a known PRP: [QUOTE=http://www.mersenne.ca/prp.php]
M130,439 = 260879 x PRP39261
M136,883 = 536581361 x PRP41198
M157,457 = 4612545359 x 358012521626153 x PRP47376
M173,867 = 52536637502689 x PRP52326
M221,509 = 292391881 x PRP66673
M271,211 = 613961495159 x PRP81631
M271,549 = 238749682487 x PRP81734
M406,583 = 813167 x PRP122388
M432,457 = 1672739247834685086279697 x PRP130159
M576,551 = 4612409 x 64758208321 x 242584327930759 x PRP173528
M684,127 = 23765203727 x PRP205933
M696,343 = 11141489 x 36009913139329 x PRP209600
M1,010,623 = 12602017578957977 x PRP304212
M1,168,183 = 54763676838381762583 x PRP351639
M1,304,983 = 52199321 x PRP392832
[/QUOTE] So it appears that it is not more probable that the cofactor is PRP when the Mersenne number has many known prime factors. 
[QUOTE=alpertron;379457]So it appears that it is not more probable that the cofactor is PRP if the Mersenne number has many prime factors.[/QUOTE]
But what percentage of your candidates have only one factor? In other words, what is the percentage of positive PRP results for all single-factor exponents, two factors, three, etc.? In your small set there are two 2-factor exponents and one 3-factor exponent. There are far fewer exponents with 2-3 factors than with a single factor. 
[QUOTE=flagrantflowers;379462]But what percentage of your candidates have only one factor? In other words, what is the percentage of positive PRP results for all single-factor exponents, two factors, three, etc.?
In your small set there are two 2-factor exponents and one 3-factor exponent. There are far fewer exponents with 2-3 factors than with a single factor.[/QUOTE] I went to [url]http://www.mersenne.ca/prp.php?show=2&min_exponent=900000&max_exponent=1000000[/url] (the first 1000 Mersenne numbers with exponents greater than 900,000), selected all lines, copied them, and pasted them into a text editor, deleting all extra lines at the top and bottom. The number of lines is 3427. Removing 1000 lines that contain Mxxx,xxx and 1000 lines of "no", we get 1427 lines; that is, there are 1427 known factors, so there are many Mersenne numbers with at least 2 prime factors in that range. 
Is this the appropriate place to ask about the PRP stuff?
As far as I can tell PRP refers to "probable prime" which, not too surprisingly, is a number that passes enough primality criteria (relative to a base, so a probable prime base 2 or base 13 or whatever?) that we believe it to be prime, but we're not sure? When finding factors, we stumble across some little ones but we're sometimes left with a gigantic number and we don't know whether it's prime or not (just like the Mersenne numbers) or what its factors are (again just like the Mersenne numbers). So is this PRP business part of the ECM effort to factor the smaller Mersennes? 
For the [U]size[/U] of these numbers, if they pass a few-bases PRP test, then the chances are very high that they are prime. I mean "very high" as in 99 followed by a dot, then a very long string of nines, so long it won't fit on your screen line, and then the percent sign at the end. This is because the probability of a PRP being a PSP (pseudoprime, i.e. composite) decreases with the size of the number: larger PRPs have a lower probability of being composite.
But we can't be sure till somebody proves them prime, most likely never, as proving them prime is not very important. More important would be if someone proved a theorem like "a composite factor of a Mersenne number with prime exponent cannot be 3-PRP", for example (it won't work for base 2: 2047 is 2-PRP, and yet composite). Or: "a composite proper factor of a Mersenne number with prime exponent cannot be 2-PRP". But these fall more into the "open questions" category (they can't even be called "conjectures") and we have no idea if they are true. There is no reason why they would be true, in fact. So, some PRP factor of a Mersenne number with prime exponent can still be composite.

We are talking here about numbers with a thousand digits at least, and many people fail to imagine the [U]size[/U] of those numbers. A number with 100 digits is enough to write down the number of all atoms in the known universe. People are doing this (finding PRPs of this type) for fun.

Related to the ECM part, one can try to run ECM on such PRPs, but the chances of getting a factor are close to zero. Like a zero, followed by a dot, followed by a hundred zeroes, followed by a 1, then the percent sign.

On the other hand, some ECM factors are found daily for OTHER Mersenne numbers in that range, and that leaves new [U]cofactors[/U] available for PRP testing. For example, if I test now the cofactor of Mx, where the exponent x is a prime in the range, and I find it to be composite (non-PRP), then the job is not "done" yet. Someone may find a factor of it tomorrow, so I would have to come back to the range and test whether the new cofactor is still composite, or not (a new PRP test). That is what the people here are talking about. 
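The 2047 example above is easy to check directly (a small sketch of my own, not code from the thread): 2047 = M11 = 23 x 89 passes the base-2 Fermat test yet is composite, and base 3 exposes it.

```python
# 2047 = 2^11 - 1 = 23 * 89: composite, yet a base-2 Fermat probable prime.
n = 2047
print(pow(2, n - 1, n))  # 1  -> passes the base-2 Fermat test (2-PRP)
print(pow(3, n - 1, n))  # not 1 -> the base-3 test exposes it as composite
print(n == 23 * 89)      # True
```

The base-2 result follows because 2^11 = 2048 is congruent to 1 (mod 2047) and 11 divides 2046, which is exactly why composite factors of Mersenne numbers can never be ruled out by a base-2 test alone.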
[QUOTE=LaurV;379475]For example, if I test now the cofactor of Mx, where the exponent x is a prime in the range, and I find it to be composite (NONPRP), then the job is not "done" yet. Someone can find a factor of it tomorrow, so I would have to come back to the range and test [COLOR=Red]if the new cofactor is still composite[/COLOR], or not ([COLOR=Red]a new PRP test[/COLOR]).[/QUOTE]
All one needs to do is [U]keep all residues[/U]: 3^Mp (mod Mp) or even just their last 64 bits. When a new factor F is reported, one can check if this new factor leads to a PRP almost instantly: calculate 3^prod(F[SUB]i[/SUB]) (mod Mp) and if it matches, do the PRP test (and probably best in a different base, because for base 3 it is already almost surely in the hat). GIMPS actually has all LL residues. Iirc from EWMayer, LL residues can also be used, but the interested party will have to dig up all details. 
[QUOTE=Batalov;379476]GIMPS actually has all LL residues. Iirc from EWMayer, LL residues can also be used, but the interested party will have to dig up all details.[/QUOTE]
I don't believe that. Who reported full residues to GIMPS? What GIMPS has is the last hex digits of each residue, which can't be of any use here. 
[QUOTE=ewmayer;342684]Testing the cofactor for rigorous primality is expensive, but PRP-testing (starting with the full LL residue) can be done similarly cheaply as cofactor-PRP testing is done for Fermat numbers (starting with a Pépin-test residue).
In the late 1990s I adapted the procedure used for Fermats to PRP-test a bunch of M(p) cofactors (and discovered numerous PRP cofactors up to my search limit of a then-lofty 20k digits or so). My version of the test needed a Pépin-style test residue to be generated for the target M(p), but Peter Montgomery [URL="http://ndatech.com/mersenne/archives/digest/v01_0368.txt"]showed me how to use an LL-test residue[/URL], thus eliminating the redundant LL-test/Pépin-style mod-powering effort.[/QUOTE] Scroll the linked discussion until "Saving storage when testing cofactors of Mp". I remembered correctly: the storage space [I]can[/I] be saved, but it is not the lowest 64 bits that should have been saved but the few hundred [I]highest[/I] bits. So, as GIMPS "free-of-charge" residues go now, they probably won't help. Because the LL test is a test in a weird base [TEX]\omega,\ \bar{\omega}[/TEX] (in disguise), the result is more twisted than in simple base 3. (Peter Montgomery showed how to use the upper bits... I wonder if the lower bits could be an equivalent fingerprint. Tonight I won't think about it.) Anyway, Dario most likely has all Pépin-test residue RES64s now in his logs (up to a certain limit), so he has everything he needs for very fast tests of new incoming factors within his range. 
[QUOTE=Batalov;379476]All one needs to do is [U]keep all residues[/U]: 3^Mp (mod Mp) or even just their last 64 bits. [/QUOTE]
I don't see how the last 64 bits help.

Let's say F divides Mp. By Fermat, 3^(Mp/F) = 3 (mod (Mp/F)). Raising both sides to the power F, we get 3^Mp = 3^F (mod Mp/F). Now if we have the full residue 3^Mp (mod Mp), we can easily calculate 3^Mp (mod Mp/F). But the last 64 bits? Are you saying 3^Mp = 3^F (mod Mp/F) ==> 3^Mp = 3^F (mod Mp)? I just tried it with M2311 (77567729423209 x 4514379640917651135021865565129 x PRP652). The LHS holds, but the RHS doesn't. 
[QUOTE=axn;379483]I don't see how the last 64 bits help.
Let's say F divides Mp. By Fermat, 3^(Mp/F) = 3 (mod (Mp/F)). Raising both sides to the power F, we get 3^Mp = 3^F (mod Mp/F). Now if we have the full residue 3^Mp (mod Mp), we can easily calculate 3^Mp (mod Mp/F). But the last 64 bits? Are you saying 3^Mp = 3^F (mod Mp/F) ==> 3^Mp = 3^F (mod Mp)? I just tried it with M2311 (77567729423209 x 4514379640917651135021865565129 x PRP652). The LHS holds, but the RHS doesn't.[/QUOTE] Anyway, for numbers in this range, it is easier to perform a PRP test than to find a new prime factor of the Mersenne number. There is no double-check of PRP results, so there is still the doubt of having incorrect residues. This is why I would prefer to perform a fresh PRP test on the new cofactor. 
I did some rudimentary math on the PRP data at mersenne.ca for the odds of finding a positive PRP result given the number of factors found (three or fewer versus four or more).
Below M(2M) there are 3384 exponents with 4 or more known factors; of these, 78 have a positive PRP result: a ~2.3% chance of a positive PRP result given four or more factors.

For three factors or fewer: a total of ~110801 exponents factored; subtracting the exponents with four or more factors leaves ~107417. There are 203 positive PRP results for 3 factors or fewer, which gives a ~0.19% chance of a positive PRP result given three or fewer factors.

[B]So the odds of finding a positive PRP with more than three factors are about ten times higher than for exponents with three factors or fewer.[/B]

This analysis does not differentiate between one, two and three factors. The data is not readily available, but I get the feeling that this would further lower the odds of a positive PRP for a single factor. Either way it seems that running PRP tests on exponents with numerous factors is more fruitful than on exponents with a single factor, or a couple of factors.

EDIT: there seem to be only ~75 positive PRP tests for a single factor. This is only ~25% of the positive PRP tests found to date. Yet the large majority of exponents have only one factor. 
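The arithmetic in the post above is easy to reproduce (counts taken verbatim from the post; the variable names are mine):

```python
# Counts quoted in the post, for exponents below M(2M):
prp_many, total_many = 78, 3384      # exponents with 4 or more known factors
prp_few = 203
total_few = 110801 - 3384            # exponents with 3 or fewer factors (approx.)

rate_many = prp_many / total_many    # ~2.3%
rate_few = prp_few / total_few       # ~0.19%
print(f"{rate_many:.2%} vs {rate_few:.2%}, ratio ~{rate_many / rate_few:.0f}x")
```

The raw ratio comes out near 12x, which is the order-of-magnitude gap the post describes, before any of the corrections for unequal factoring effort that the replies below raise.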
[QUOTE=flagrantflowers;379686]Did some rudimentary math on the PRP data @mersenne.ca for the odds of finding a positive PRP result given the number of factors found (three or less versus four or more).
[/QUOTE] No, you did NOT do any math. What you did was mindless numerology on a small sample set, and then you jumped to some clearly erroneous conclusions. Your conclusions, if correct, would violate Erdős-Kac. Stop babbling. Stop the pseudo-math. Stop pretending that what you are doing is math. Stop equating mindless numerology with "performing rudimentary math". If, in fact, you think that you did some math, please show it to us. What you have presented is not math and does not constitute meaningful analysis of the data. 
[QUOTE=flagrantflowers;379686]Did some rudimentary math on the PRP data @mersenne.ca for the odds of finding a positive PRP result given the number of factors found (three or less versus four or more).
Below M(2M) there are 3384 exponents with 4 or more factors, of these 78 have a positive PRP result. ~2.3% chance of a positive prp result given four or more factors. For three factors or less: total of ~110801 exponents factored, subtract exponents with more than four factors leaves us with ~107417. There are 203 positive PRP results for 3 factors or less, gives: ~0.19% chance of a positive PRP result given three or less factors. [B]So the odds of finding a positive PRP with more than three factors is about ten times higher than for exponents with three factors or less. [/B] This analysis does not differentiate between one, two and three factors. The data is not readily available but I get the feeling that this would further lower the odds of a positive PRP for a single factor. Either way it seems that running PRPs on exponents with numerous factors is more fruitful than exponents with a single, or couple, of factors. EDIT: there seem to only be ~75 positive PRP tests for a single factor. This is only ~25% of the positive PRP tests found to date. Yet the large majority of exponents only have one factor.[/QUOTE] Notice that in the range you are considering, the lowest numbers have much more factoring effort done, so I think the statistics are not correct. It is like comparing apples and oranges. For example, if you consider the range 500K to 2M you will find a completely different picture. 
[QUOTE=alpertron;379712]Notice that in the range you are considering, the lowest numbers have much more factoring effort done, so I think the statistics are not correct. It is like comparing apples and oranges. For example, if you consider the range 500K to 2M you will find a completely different picture.[/QUOTE]
It isn't even that! He babbles about numbers with 3 factors vs. numbers with 4 factors, when all he is looking at is numbers with 3 SMALL factors; i.e. almost all of the numbers with three factors are not even included in his data, because most of them have factors whose size lies beyond his search range. What he is doing is MEANINGLESS. 
There is a formula in section 4.5.4 (Factoring into Primes) of Knuth's TAOCP that computes the probability that the second-largest prime factor is less than some bound (expressed in terms of log B / log N, where B is the bound and N is the composite number).
That can be used here, but it appears that the formula is not easy to compute. For example, we could fix the bound at 10[SUP]30[/SUP] and then, for all prime exponents between 1M and 2M, add all the probabilities. That sum would be (approximately) the expected number of PRP cofactors in the mentioned range that we could find in a reasonable amount of time. 
[QUOTE=alpertron;379712]Notice that in the range you are considering, the lowest numbers have much more factoring effort done, so I think the statistics are not correct. It is like comparing apples and oranges. For example, if you consider the range 500K to 2M you will find a completely different picture.[/QUOTE]
I considered ignoring <M(10K), as you're correct that there are few exponents there with more than one factor, and the picture does drastically change. But if you ignore that range, the odds of positive PRP results as a function of the number of factors are even more disparate between 1-3 and 4 or more. I find it interesting that there are obviously much fewer (as a percentage) exponents with fewer than three factors versus four or more. You're correct, I didn't do any "math". I calculated some uncorrected percentages. The order-of-magnitude difference, despite my "pseudo-math", seems too large to simply be dumb "numerology". 
[QUOTE=R.D. Silverman;379714]It isn't even that! He babbles about numbers…[/QUOTE]
Unfortunately we only have PRP tests completed up to M(2M), so going beyond that range doesn't make sense, as I would be including mostly candidates without a PRP result (positive or negative). All I'm saying is that exponents with a positive PRP result and more than three factors make up a very large portion of the positive tests, despite the fact that there are many more numbers with one factor. I'm not trying to say that this means anything. I'm trying to say that this is functionally what the odds are on the tests done to date. This is obviously from this sample set and doesn't, ideally, represent the population of all <M(2M). I don't pretend to understand the underlying factorization theorems. 
[QUOTE=alpertron;379739]There is a formula on section 4.5.4 (Factoring into Primes) of Knuth's TAOCP that computes the probability of having the second largest prime factor less than some bound (expressed as log B / log N, B is the bound, N is the composite number).
That can be used here, but it appears that the formula is not easy to compute. For example, we could fix the bound at 10[SUP]30[/SUP] and then, for all prime exponents between 1M and 2M, add all the probabilities. That sum would be (approximately) the expected number of PRP cofactors in the mentioned range that we could find in a reasonable amount of time.[/QUOTE] The formula in the book is:

[tex]G(\beta)\, = \,\int\limits_{0}^{\beta}\left(G\left(\frac{t}{1-t}\right)-F\left(\frac{t}{1-t}\right)\right)\,\frac{dt}{t}[/tex]

For values of [tex]\beta[/tex] near zero, the function F can be ignored, as can be seen in the graph on the page following the definition. We can also use t/(1-t) = t as an approximation. Differentiating both sides, we get:

[tex]G'(t) \approx \frac{G(t)}{t}[/tex]

[tex]\frac{G'(t)}{G(t)} \approx \frac{1}{t}[/tex]

[tex]\log G(t) \approx C + \log(t)[/tex]

[tex]G(t) \approx Kt[/tex]

From the table of values of G(t) in the book, we can assign K = 1.8. I wrote the following program in the UBASIC language:

[CODE]
10 S=0
20 P=nxtprm(1000000)
30 B=(30*log(10))/(P*log(2))
40 S=S+1.8*B
50 P=nxtprm(P)
60 if P<2000000 then 30
70 print S
[/CODE]

The program prints the number 8.78. This is a crude approximation, but it appears that there are several PRPs to be found in the range 1M to 2M in a reasonable amount of time by finding more prime factors. 
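The UBASIC sketch above ports directly to Python (my own translation, not code from the thread; a plain Eratosthenes sieve stands in for UBASIC's `nxtprm`):

```python
import math

def expected_prps(lo, hi, digits, K=1.8):
    """Sum K * (log B / log Mp) over prime exponents lo <= p < hi, with
    B = 10**digits: the G(t) ~ K*t approximation of the expected number
    of PRP cofactors reachable by factoring to that digit bound."""
    # Sieve of Eratosthenes up to hi to replace nxtprm().
    sieve = bytearray([1]) * (hi + 1)
    sieve[0] = sieve[1] = 0
    for i in range(2, int(hi ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = b"\x00" * len(range(i * i, hi + 1, i))
    total = 0.0
    for p in range(lo, hi):
        if sieve[p]:
            t = digits * math.log(10) / (p * math.log(2))  # beta = log B / log 2^p
            total += K * t
    return total

print(round(expected_prps(1_000_000, 2_000_000, 30), 2))  # ~8.78, matching the post
```

The sum is essentially 1.8 * (30 log 10 / log 2) * sum(1/p) over primes between 1M and 2M, which is why the crude linear approximation of G already lands on the 8.78 figure.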
A more accurate result can be found using the method of undetermined coefficients.
[tex]G'(t)\, = \,G\left(\frac{t}{1-t}\right)\frac{1}{t}[/tex]

[tex]G(t) = Kt + Lt^2 + O(t^3)[/tex]

[tex]\frac{t}{1-t} = t + t^2 + O(t^3)[/tex]

[tex]K + 2Lt + O(t^2) = \left(K(t+t^2+O(t^3)) + L(t^2+O(t^3)) + O(t^3)\right)\frac{1}{t}[/tex]

[tex]K + 2Lt + O(t^2) = K + (K+L)t + O(t^2)[/tex]

[tex]L = K[/tex]

[tex]G(t) = 1.8t(1+t) + O(t^3)[/tex]

The result is again 8.78. 
Using the formulas above I computed the expected number of PRPs to be found in different ranges. Notice that most Mersenne numbers are not factored up to these bounds, because the factoring stopped when the first prime factor was found.
[CODE]
Range      Known PRPs   25 digits   30 digits   35 digits
1K-2K          45           14          17          20
2K-5K          26           17          20          24
5K-10K         16           12          14          16
10K-20K        10           11          13          15
20K-50K         8           13          16          18
50K-100K        7            9          11          13
100K-200K       5            9          10          12
200K-500K       5           11          13          15
500K-1M         3            8           9          11
1M-2M           3            7           9          10
2M-3M           0            4           5           6
3M-4M           0            3           3           4
4M-5M           0            2           3           3
5M-10M          0            7           8           9
[/CODE] 
[QUOTE=alpertron;379793]The formula in the book is:
[tex]G(\beta)\, = \,\int\limits_{0}^{\beta}\left(G\left(\frac{t}{1-t}\right)-F\left(\frac{t}{1-t}\right)\right)\,\frac{dt}{t}[/tex] For the values of [tex]\beta[/tex] near zero, the function F can be ignored, as can be seen in the graph in the page following the definition. We can also make t/(1-t) = t as an approximation. Differentiating both sides we get: [tex]G'(t) \approx \frac{G(t)}{t}[/tex] [tex]\frac{G'(t)}{G(t)} \approx \frac{1}{t}[/tex] [tex]\log G(t) \approx C + \log(t)[/tex] [tex]G(t) \approx Kt[/tex] [/QUOTE] This ignores C. exp(C) might be quite large. We don't know the value of C. 
[QUOTE=alpertron;379805]A more approximate result can be found using the method of undetermined coefficients.
[tex]G'(t)\, = \,G\left(\frac{t}{1-t}\right)\frac{1}{t}[/tex] [tex]G(t) = Kt + Lt^2 + O(t^3)[/tex] [tex]\frac{t}{1-t} = t + t^2 + O(t^3)[/tex] [tex]K + 2Lt + O(t^2) = \left(K(t+t^2+O(t^3)) + L(t^2+O(t^3)) + O(t^3)\right)\frac{1}{t}[/tex] [tex]K + 2Lt + O(t^2) = K + (K+L)t + O(t^2)[/tex] [tex]L = K[/tex] [tex]G(t) = 1.8t(1+t) + O(t^3)[/tex] The result is again 8.78.[/QUOTE] We do not know that the implied constant in the O() estimate is small. Without knowing what the constant is, any numerical estimate is just a SWAG. 
[QUOTE=R.D. Silverman;379832]This ignores C. exp(C) might be quite large. We don't know the value of C.[/QUOTE]
I found the value of K = exp(C) from the table of values of G(β), as I wrote several posts above. [QUOTE]We do not know that the implied constant in the O() estimate is small. Without knowing what the constant is, any numerical estimate is just a SWAG.[/QUOTE] The value of t in these cases is less than 0.0001, so I think that the O(t^3) term should be small as well. As I stated in a previous post, the result was the same whether or not the quadratic coefficient was used. When using the quadratic coefficient, the values are exact to three significant digits for t < 0.1, as can be seen in the table of values of G(t) in the book. All calculations for the table include both the linear and quadratic coefficients. 
[QUOTE=alpertron;379835]I found the value of K = exp(C) from the table of values of G(b) as I wrote several posts above.
The value of t in these cases is less than 0.0001, so I think that O(t^3) should be small as well. As I stated in a previous post, the result was the same when using the quadratic coefficient or not. When using the quadratic coefficient, the values are exact to three significant digits for t<0.1 as can be seen on the table of values of G(t) in the book. All calculations for the table include both the linear and quadratic coefficient.[/QUOTE] It looks good. But I hesitate to draw any conclusions with B this small, relative to the size of the original composite(s). R. Guy's "Law of Small Numbers" kicks in. Your work gives reasonable estimates, but we don't know if they apply for factors this small relative to the original composite. 
Powered by vBulletin® Version 3.8.11
Copyright ©2000  2023, Jelsoft Enterprises Ltd.