mersenneforum.org  

Old 2018-01-21, 12:37   #1
siegert81
 
Dec 2010

2×37 Posts
Maximum number of prime factors (3 questions)

Hello, an amateur mathematician here...

Let's say we are interested in investigating a "random" large odd number we know next to nothing about, such as 1,990,292,873.

We then begin trial division by the primes in order, hoping to unlock some of its mysteries...

Division by 3 yields a remainder of 2.
Division by 5 yields a remainder of 3.
Division by 7 yields a remainder of 2.
Division by 11 yields a remainder of 8.
Division by 13 yields a remainder of 10.

So far, no luck in finding a factor...

We then wonder what the maximum number of prime factors this number could have, given the information we've generated.

Calculating the 8th root of 1,990,292,873 gives us ~14.5333

Given that we've ruled out all prime factors less than the 8th root, we can conclude that our large random number has at most 7 prime factors, counted with multiplicity: if it had 8 or more, its smallest prime factor would be no larger than the 8th root, and we have now excluded every prime in that range. This is a new piece of information.

Calculating the 7th root of 1,990,292,873 gives us ~21.30182

Division by 17 yields a remainder of 6.
Division by 19 yields a remainder of 9.

Again, no luck with factoring, but given we've ruled out all prime factors less than the 7th root, we can conclude that our large random number has at most 6 prime factors. This reduces the possibilities and, in my mind, represents greater information; a new level, if you will.

The 6th root is ~35.466
The 5th root is ~72.4
The 4th root is ~211.2
The cube root is ~1257.879
The square root is ~44612.7

So performing divisions by 23, 29, and 31 will limit the number to a maximum of 5 prime factors.
Performing divisions by 37 - 71 will limit the number to a maximum of 4 prime factors.
Performing divisions by 73 - 211 will limit the number to a maximum of 3 prime factors.
Performing divisions by 223 - 1,249 will limit the number to a maximum of 2 prime factors (a semiprime).
Finally, performing divisions by 1,259 - 44,587 would complete the factoring process.
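
Here is a small Python sketch of this bookkeeping (the helper name is mine, purely for illustration). Once every prime below q has been ruled out, a number with k prime factors, counted with multiplicity, would have to be at least q^k, so the current "level" is the largest k with q^k <= N:

Code:
def max_prime_factor_count(N, q):
    """Largest k with q**k <= N: the maximum possible number of prime factors
    of N (counted with multiplicity) once every prime below q has been ruled out."""
    k = 0
    while q ** (k + 1) <= N:
        k += 1
    return k

N = 1990292873
for q in (17, 23, 37, 73, 223, 1259, 44617):   # smallest prime not yet ruled out at each stage
    print(q, max_prime_factor_count(N, q))
# prints 17 7, 23 6, 37 5, 73 4, 223 3, 1259 2, 44617 1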

I now have a question. Are there any ways (or improvements) of limiting the maximum number of prime factors of an integer, aside from the above method of trial factoring up to the Nth root, where N-1 would be the new maximum number of prime factors?

Personally, I've never seen this kind of question raised, and I know of no algorithms or methods aside from determining the actual factors (or primality) of the number.

A second, related question I have is whether or not this question has been raised for the remaining co-factor of F(12) (the Fermat number 2^(2^12) + 1). I believe this co-factor is 1,133 digits long. If mathematicians can't find the next factor, perhaps they could at least determine what "level" they're on and focus on "leveling up" as they gain more information about the F(12) co-factor. Of course, I'm sure such factoring "levels" could be tracked for all sorts of particular numbers.

A third, related question I have is whether or not it's somehow easier to complete the factorization of a number if a new limit on the maximum number of prime factors is reached. Alternatively, is such information of any use, or would it be nothing more than a fun way to track one's progress as they factor a random large integer?
Old 2018-01-21, 12:58   #2
science_man_88
 
 
"Forget I exist"
Jul 2009
Dumbassville

20C0₁₆ Posts

Quote:
Originally Posted by siegert81

Division by 3 yields a remainder of 2.
Division by 5 yields a remainder of 3.
Division by 7 yields a remainder of 2.
Division by 11 yields a remainder of 8.
Division by 13 yields a remainder of 10.
CRT could get you a remainder class mod a primorial from some of these. Then maybe find out all you can about the factors of that remainder.
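
For example, a quick sketch in Python (the crt helper is just illustrative): the five remainders above pin N down to a single residue class modulo the primorial 3·5·7·11·13 = 15015.

Code:
from math import prod

def crt(remainders, moduli):
    """Combine x ≡ r_i (mod m_i) for pairwise coprime m_i into x mod prod(m_i)."""
    M = prod(moduli)
    x = 0
    for r, m in zip(remainders, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # pow(Mi, -1, m) is the inverse of Mi mod m
    return x % M, M

print(crt([2, 3, 2, 8, 10], [3, 5, 7, 11, 13]))
# (9578, 15015): N ≡ 9578 (mod 15015), and indeed 1990292873 % 15015 == 9578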
Old 2018-01-21, 13:02   #3
ATH
Einyen
 
 
Dec 2003
Denmark

5504₈ Posts

Regarding F12, you can follow the ECM progress here (the top line, 4096):
https://www.mersenne.org/report_ecm/

All 112,000 curves have been done at the 60-digit level (B1=260M), which means the chance/risk of missing a 60-digit factor is 1/e ~ 37%.

But then 68,721 curves have been done at the 65-digit level (B1=800M), which lowers that chance greatly, so there is only a very small chance/risk that a 60-digit factor has been missed, and an infinitesimal chance that an even smaller factor exists.
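
For reference, a minimal sketch of the model behind those percentages, assuming (as is usual) that each curve independently has probability 1/C of finding a factor of the target size, where C is the expected curve count for that digit level:

Code:
def miss_probability(curves_run, curves_expected):
    """Chance that a factor of the target size survives curves_run independent curves."""
    return (1 - 1 / curves_expected) ** curves_run   # roughly exp(-curves_run / curves_expected)

# Full t60 as described above: 112,000 curves run out of ~112,000 expected for a 60-digit factor.
print(miss_probability(112_000, 112_000))   # ~0.368, i.e. about 1/e ~ 37%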

If you want to run ECM on F12 using Prime95 here is the worktodo line:
Code:
ECM2=1,2,4096,1,800000000,80000000000,1,"114689,26017793,63766529,190274191361,1256132134125569,568630647535356955169033410940867804839360742060818433"
The "1" after 80000000000 is the number of curves you want to run.

Last fiddled with by ATH on 2018-01-21 at 13:03
Old 2018-01-21, 14:32   #4
Dr Sardonicus
 
 
Feb 2017
Nowhere

3⁴·41 Posts

Quote:
Originally Posted by siegert81
I now have a question. Are there any ways (or improvements) of limiting the maximum number of prime factors of an integer, aside from the above method of trial factoring up to the Nth root, where N-1 would be the new maximum number of prime factors?
You could probably gain some slight improvement using the actual sizes of primorials, and possibly checking whether your N is a pure power. (Be it noted, your example value is not considered "large" by today's standards. I'm not sure what diminutive would be considered appropriate -- "microscopic," perhaps.)

Another old-fashioned approach is to find quadratic residues of your number, since a quadratic residue mod N will be a quadratic residue mod each prime factor of N. Each residue eliminates about half the candidate trial factors.
Old 2018-01-21, 18:55   #5
CRGreathouse
 
 
Aug 2006

11·13·41 Posts

Quote:
Originally Posted by siegert81
Are there any ways (or improvements) of limiting the maximum number of prime factors of an integer, aside from the above method of trial factoring up to the Nth root, where N-1 would be the new maximum number of prime factors?
All the methods I know of involve factoring the number, in part or whole. In particular, you can either rule out small factors, or you can factor the whole thing and know exactly.

Trial division is the obvious approach to ruling out small factors. Strassen's algorithm is faster than trial division past some point if you need to guarantee that a number has no small prime factors. If you're OK with high probability (say, the chance of failure is at most one in a trillion trillion) you can use ECM very effectively. You can even use it deterministically, but only to relatively small bounds -- Chelli shows that this can be done to 2^32.

If you want to factor the number entirely, do a small amount of trial division, a bit of ECM, SQUFOF (or rho), a bunch more ECM, and then finish whatever is left with SIQS (if under 100 digits or so) or NFS.
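
As a concrete illustration of the "rho" step, here is a minimal Pollard rho sketch in Python (untuned; a real pipeline would use an optimized implementation):

Code:
from math import gcd
from random import randrange

def pollard_rho(n):
    """Return a nontrivial factor of a composite n (minimal Floyd-cycle sketch)."""
    if n % 2 == 0:
        return 2
    while True:
        c = randrange(1, n)
        f = lambda x: (x * x + c) % n
        x = y = 2
        d = 1
        while d == 1:
            x = f(x)
            y = f(f(y))
            d = gcd(abs(x - y), n)
        if d != n:        # d == n means this c failed; try another
            return d

print(pollard_rho(1990292873))   # a nontrivial factor of the example number above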

Quote:
Originally Posted by siegert81
A second, related question I have is whether or not this question has been raised for the remaining co-factor of F(12) (the Fermat number 2^(2^12) + 1). I believe this co-factor is 1,133 digits long. If mathematicians can't find the next factor, perhaps they could at least determine what "level" they're on and focus on "leveling up" as they gain more information about the F(12) co-factor. Of course, I'm sure such factoring "levels" could be tracked for all sorts of particular numbers.
A lot of ECM has been done on F(12). The expected number of 60-digit factors is log(log(10^60)) - log(log(10^59)) = 0.016807.... If it had such a prime factor, the odds of it being missed in t60 or the partial t65 are about 0.76%. Our updated probability is
\[\frac{0.016807\ldots\cdot0.0076\ldots}{1 - 0.016807\ldots + 0.016807\ldots\cdot0.0076\ldots} = 0.000128\ldots\]
or about one in 7787. It's actually a bit less likely than that because the factor might have been found in t55 or lower.

If you repeat the calculations for a 55-digit factor, say, it would be significantly less likely.
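
The update above as a quick sketch; with the rounded 0.76% figure it comes out near one in 7,700 (the one in 7787 above presumably comes from a less-rounded miss probability):

Code:
from math import log

prior = log(log(10**60)) - log(log(10**59))   # expected number of 60-digit factors, ~0.016807
miss  = 0.0076                                # chance the ECM done so far missed such a factor

posterior = prior * miss / (1 - prior + prior * miss)
print(prior, posterior, 1 / posterior)        # ~0.016807, ~0.00013, roughly one in 7,700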

Quote:
Originally Posted by siegert81
A third, related question I have is whether or not it's somehow easier to complete the factorization of a number if a new limit on the maximum number of prime factors is reached. Alternatively, is such information of any use, or would it be nothing more than a fun way to track one's progress as they factor a random large integer?
It doesn't make it easier to factor, but it might give you information of other kinds. If you want to prove that N isn't divisible by the seventh power of a prime, say, it suffices to trial divide (or Strassen or whatever) up to the sixth root of N and then test if it's a seventh power (and if so, if the seventh root is prime or not). Booker, Hiary, & Keating have a novel algorithm to tell if a number is divisible by a square which avoids factoring, and I think it's reasonably quick in practice (compared to NFS on numbers without small prime factors).
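
The "test if it's a seventh power" step is easy to get wrong with floating point; here is a minimal exact-integer sketch (helper names are just for illustration):

Code:
def integer_kth_root(n, k):
    """Largest r with r**k <= n, by binary search in exact integer arithmetic."""
    if n < 2:
        return n
    lo, hi = 1, 1 << (n.bit_length() // k + 1)   # hi**k is guaranteed to exceed n
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if mid ** k <= n:
            lo = mid
        else:
            hi = mid - 1
    return lo

def is_kth_power(n, k):
    r = integer_kth_root(n, k)
    return r ** k == n, r

print(is_kth_power(3**7, 7))         # (True, 3)
print(is_kth_power(1990292873, 7))   # (False, 21)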
Old 2018-01-22, 02:37   #6
siegert81
 
Dec 2010

2×37 Posts

Quote:
Another old-fashioned approach is to find quadratic residues of your number, since a quadratic residue mod N will be a quadratic residue mod each prime factor of N. Each residue eliminates about half the candidate trial factors.
Interesting. Let's assume that 672 is a quadratic residue (mod N). How would we then know which trial factors to skip over, or how would we rapidly eliminate these trial factors?

Quote:
Strassen's algorithm is faster than trial division past some point if you need to guarantee that a number has no small prime factors.
I've never heard of Strassen's algorithm. I'll have to search for it.

Quote:
Booker, Hiary, & Keating have a novel algorithm to tell if a number is divisible by a square which avoids factoring
That's interesting. It appears that it's conditional on GRH.

Quote:
If you're OK with high probability (say, the chance of failure is at most one in a trillion trillion) you can use ECM very effectively.
A high probability would not satisfy my condition. I'd want a guarantee of no factors below a certain root of the number. That is what would produce definitive knowledge about the maximum number of prime factors - a "leveling up" so to speak. Obviously more ECM curves represent more progress, but not the definitive metric of progress I'm seeking.

As I've thought about this topic more I've noticed that if a number fails a Fermat test, it must have at least 2 prime factors. I also noticed that if a number is a Carmichael number, it must have at least 3 prime factors. Such observations just make me wonder if more could be said or discovered about the range of prime factors a number could be limited to. Perhaps investigation down this path could lead to something.

I've also noticed that semiprimes have an interesting property. Assume N = p*q is a semiprime and that B and N are relatively prime. Then B^(N+1) must be congruent to B^(p+q) (mod N). I don't know how meaningful this is though.
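
A quick numeric check of that congruence in Python (the primes here are chosen only for illustration):

Code:
p, q = 101, 211              # distinct primes, so N = p*q is a squarefree semiprime
N = p * q
for B in (2, 3, 5, 7, 10):   # each B is coprime to N
    assert pow(B, N + 1, N) == pow(B, p + q, N)
print("B^(N+1) == B^(p+q) (mod N) for all B tested")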

Last fiddled with by siegert81 on 2018-01-22 at 02:39 Reason: Quotation was missing.
Old 2018-01-22, 02:46   #7
a1call
 
 
"Rashid Naimi"
Oct 2015
Remote to Here/There

3·617 Posts

Quote:
Originally Posted by siegert81

I've also noticed that semiprimes have an interesting property. Assume N = p*q is a semiprime and that B and N are relatively prime. Then B^(N+1) must be congruent to B^(p+q) (mod N). I don't know how meaningful this is though.
Can you please provide a numeric example?
Thank you in advance.
Old 2018-01-22, 05:08   #8
CRGreathouse
 
 
Aug 2006

11·13·41 Posts

Quote:
Originally Posted by siegert81
Interesting. Let's assume that 672 is a quadratic residue (mod N). How would we then know which trial factors to skip over, or how would we rapidly eliminate these trial factors?
Dr Sardonicus could probably answer this better. But the basic idea is that you can use quadratic residuosity to determine which congruence classes of primes can be kept (if the qr is small), or you can just use the Jacobi symbol on each prime if the qr is large. For 672, there are 192 congruence classes relatively prime to 672, and you need only keep the classes r for which the Kronecker symbol (672/r) is +1, i.e. the classes whose primes p have 672 as a quadratic residue. In this case, that means:

1, 11, 13, 17, 19, 25, 29, 41, 47, 53, 61, 79, 89, 107, 115, 121, 127, 139, 143, 149, 151, 155, 157, 167, 169, 179, 181, 185, 187, 193, 197, 209, 215, 221, 229, 247, 257, 275, 283, 289, 295, 307, 311, 317, 319, 323, 325, 335, 337, 347, 349, 353, 355, 361, 365, 377, 383, 389, 397, 415, 425, 443, 451, 457, 463, 475, 479, 485, 487, 491, 493, 503, 505, 515, 517, 521, 523, 529, 533, 545, 551, 557, 565, 583, 593, 611, 619, 625, 631, 643, 647, 653, 655, 659, 661, 671

(Also, potentially, the individual primes 2, 3, or 7.) So for example, knowing that 672 is a quadratic residue mod N proves that 5 does not divide N, while it's perfectly possible that 673 would divide N (since 673 = 1 mod 672 and 1 is a quadratic residue mod 672).
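
Here is a minimal sketch of the filtering itself, using Euler's criterion directly instead of a precomputed table (helper names are illustrative):

Code:
def legendre(a, p):
    """Legendre symbol (a/p) for an odd prime p, via Euler's criterion."""
    a %= p
    if a == 0:
        return 0
    return 1 if pow(a, (p - 1) // 2, p) == 1 else -1

def can_divide(p, known_qr):
    """An odd prime p can divide N only if the known quadratic residue of N
    is also a quadratic residue mod p (or is divisible by p)."""
    return legendre(known_qr, p) != -1

R = 672   # assume 672 is known to be a quadratic residue mod N
print(can_divide(5, R))     # False: 5 is ruled out as a trial divisor
print(can_divide(673, R))   # True: 673 (≡ 1 mod 672) is not excluded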

Quote:
Originally Posted by siegert81
I've never heard of Strassen's algorithm. I'll have to search for it.
Here's a quick overview:
https://math.stackexchange.com/a/185657/1778

Various other algorithms that might be of interest, variants/extensions/alternatives:
Hittmeir 2015 (see also talk)
Hiary 2014
Rubinstein 2013/Kloosterman sums (blog entry)
Costa & Harvey 2012 (see also their very accessible talk)
Castagnos, Joux, Laguillaumie, & Nguyen 2009
Bostan, Gaudry, & Schost 2007
Boneh, Durfee, & Howgrave-Graham 1999
McKee 1996
Lenstra 1984
Shanks/SQUFOF
Lehman 1974

You'd have to decide which of these could work for you.

Quote:
Originally Posted by siegert81
That's interesting. It appears that it's conditional on GRH.
Right. It did seem like the sort of thing you were looking for in terms of deciding a hard arithmetic property without factorization.

Quote:
Originally Posted by siegert81
A high probability would not satisfy my condition. I'd want a guarantee of no factors below a certain root of the number. That is what would produce definitive knowledge about the maximum number of prime factors - a "leveling up" so to speak. Obviously more ECM curves represent more progress, but not the definitive metric of progress I'm seeking.
It's hard because the performance gap is so large. If you could trial divide to 10^13 in a day on a consumer machine, trial dividing to 10^60 with
  • the entire Earth's population
  • calculating since the beginning of the universe
  • if everyone had a supercomputer a billion times faster than the consumer machine
  • assuming trial division on the ~60 digit numbers was as fast as on the ~13 digit numbers
would only be one 500 quadrillionth of the way done.

But with ECM it's not all that hard to take a number to t60.

Quote:
Originally Posted by siegert81
As I've thought about this topic more I've noticed that if a number fails a Fermat test, it must have at least 2 prime factors. I also noticed that if a number is a Carmichael number, it must have at least 3 prime factors. Such observations just make me wonder if more could be said or discovered about the range of prime factors a number could be limited to. Perhaps investigation down this path could lead to something.
Failing a Fermat test just means it's composite. Being Carmichael is a severe restriction on the number. If you have lots of information on a number (like knowing that it's Carmichael) you might very well be able to say more.

Quote:
Originally Posted by siegert81
I've also noticed that semiprimes have an interesting property. Assume N = p*q is a semiprime and that B and N are relatively prime. Then B^(N+1) must be congruent to B^(p+q) (mod N). I don't know how meaningful this is though.
I think this requires that N is a squarefree semiprime, p < q.

Last fiddled with by CRGreathouse on 2018-01-22 at 05:14 Reason: add BGS link
Old 2018-01-22, 11:42   #9
science_man_88
 
 
"Forget I exist"
Jul 2009
Dumbassville

10000011000000₂ Posts

Quote:
Originally Posted by CRGreathouse
I think this requires that N is a squarefree semiprime, p < q.
When I reduce the exponents with Fermat's little theorem, I get B^(N+1) ≡ B^(q+1) ≡ B^(p+q) (mod p) and
B^(N+1) ≡ B^(p+1) ≡ B^(p+q) (mod q).

In the case p = q, this gives B^2 on both sides (mod p).
Old 2018-01-22, 14:24   #10
Dr Sardonicus
 
 
Feb 2017
Nowhere

3⁴·41 Posts

Quote:
Originally Posted by siegert81
Interesting. Let's assume that 672 is a quadratic residue (mod N). How would we then know which trial factors to skip over, or how would we rapidly eliminate these trial factors?
CRGreathouse covered it pretty well. I can add a small refinement. If you can factor your residue, you can drop odd square factors. You might also be able to drop factors of 4, until your reduced number is a "fundamental discriminant." In the case of 672, dividing out a factor of 4 leaves 168, and that's your fundamental discriminant. Since it's so small, you can determine the residue classes of p (mod 168) for which (168/p) = +1. There are 24 such classes (mod 168), namely 1, 11, 13, 17, 19, 25, 29, 41, 47, 53, 61, 79, 89, 107, 115, 121, 127, 139, 143, 149, 151, 155, 157, and 167. This is half of the 48 residue classes relatively prime to 168.

As CRGreathouse also pointed out, if your quadratic residue R is too large to prepare a table of residue classes mod R in advance, you can quickly evaluate the symbol (R/p) for each p.

A traditional method of producing quadratic residues (mod N) is to develop the simple continued fraction for sqrt(N). This produces residues no larger than (if memory serves) 2*sqrt(N).
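
A minimal sketch of that expansion in Python (helper name mine): the convergent numerators h of the continued fraction for sqrt(N) satisfy h^2 ≡ ±d (mod N) with d < 2*sqrt(N), so each step yields a small (signed) quadratic residue.

Code:
from math import isqrt

def cf_small_residues(N, count=8):
    """Return pairs (x, r) with x^2 ≡ r (mod N) and |r| < 2*sqrt(N), from the
    continued fraction of sqrt(N).  Sketch only; assumes N is not a perfect square."""
    a0 = isqrt(N)
    m, d, a = 0, 1, a0
    h_prev, h = 1, a0              # convergent numerators h_{k-1}, h_k
    out = []
    for k in range(count):
        m = d * a - m
        d = (N - m * m) // d       # now d = d_{k+1}
        a = (a0 + m) // d
        out.append((h % N, (-1) ** (k + 1) * d))   # h_k^2 ≡ (-1)^(k+1) d_{k+1} (mod N)
        h_prev, h = h, a * h + h_prev
    return out

N = 1990292873
for x, r in cf_small_residues(N):
    assert (x * x - r) % N == 0
    print(x, r)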

BTW, Pari-GP factored your example number 1990292873 in a small fraction of a millisecond. It is the product of two primes. (Setting the debug level high enough reveals that Pollard rho disclosed one of the factors.)
Old 2018-01-22, 16:00   #11
CRGreathouse
 
 
Aug 2006

13347₈ Posts

Quote:
Originally Posted by Dr Sardonicus
BTW, Pari-GP factored your example number 1990292873 in a small fraction of a millisecond. It is the product of two primes. (Setting the debug level high enough reveals that Pollard rho disclosed one of the factors.)
~66 microseconds on my machine.