2003-05-28, 18:52   #4
cheesehead ("Richard B. Woods", Aug 2002, Wisconsin USA)

Re: Faster way to do LLT?

Quote:
I know that S(n-1) can be extremely large in comparison to M(n),
It's not always appreciated just how large the unMODed S(i) values get.

The estimated total number of particles (electrons, protons, neutrons, muons, ...) in the known universe is less than 10^100, and 10^100 is less than 2^400. So even if we could use every particle in the universe to store one binary bit, the largest number we could store would be less than 2^(2^400).
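(As a quick sanity check, sketched in Python with its built-in big integers -- this isn't from any LLT code, just exact arithmetic on the bounds above:)

```python
# Verify the particle-count arithmetic exactly with big integers.
assert 10**100 < 2**400      # since 2^400 = (2^10)^40 > (10^3)^40 = 10^120

# One bit per particle gives at most 2^400 bits of storage, so the
# largest representable value would be 2^(2^400) - 1.
print(len(str(2**400)))      # 2^400 has 121 decimal digits
```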

Let's look at the unMODed S(i) values:

S(1) = 4 = 2^2

S(2) = (2^2)^2 - 2 = 14 > 2^3

S(3) > (2^3)^2 - 2 > 2^5

S(4) > (2^5)^2 - 2 > 2^9

...

For k > 1, S(k) > 2^(2^(k-1)+1)

So S(1000) > 2^(2^999 + 1), which is already way, way above the "universal" storage limit.
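For what it's worth, the bound is easy to verify for small k with a few lines of Python (a sketch using the standard recurrence S(1) = 4, S(k+1) = S(k)^2 - 2, with the mod step left out):

```python
# Unmodded Lucas-Lehmer sequence: S(1) = 4, S(k+1) = S(k)^2 - 2,
# with no "mod M(p)" reduction, so the values explode.
def unmodded_S(limit):
    s = 4
    for k in range(1, limit + 1):
        yield k, s
        s = s * s - 2

print([s for k, s in unmodded_S(4)])     # [4, 14, 194, 37634]

# Check the lower bound S(k) > 2^(2^(k-1) + 1) for k = 2..20.
for k, s in unmodded_S(20):
    if k > 1:
        assert s > 2 ** (2 ** (k - 1) + 1)
# By k = 20 the value already takes more than half a million bits.
```

The bit length roughly doubles at every step, which is exactly why every real LLT implementation reduces mod M(p) after each squaring.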

And S(1000) isn't very far along the path toward S(several-million).

Edit: I just found an estimate that if the known universe (which is mostly empty space) were packed solid with neutrons (which can be packed more tightly together than other types of particles), it would take 10^128 neutrons. Let's be generous and say it took 10^200 neutrons. 10^200 is less than 2^800, so the largest storable value would still be below 2^(2^800) -- and since 2^999 + 1 is far larger than 2^800, S(1000) still wouldn't fit.

It's not a matter of using more hard disk space. :)