mersenneforum.org (https://www.mersenneforum.org/index.php)
-   Cunningham Tables (https://www.mersenneforum.org/forumdisplay.php?f=51)
-   -   5- Table Discussion and OddPerfect.org (https://www.mersenneforum.org/showthread.php?t=4425)

ewmayer 2006-04-17 17:13

[QUOTE=R.D. Silverman]Allow me to quote Hamming:

The purpose of computing is insight, not numbers.

If this computation leads to any new mathematical insights about the
problem, I will applaud it heartily. Until then, it is just mindless computing.[/QUOTE]
I think a little more care is required in applying Hamming's maxim here - after all, a skeptic could level a virtually identical accusation at the Cunningham table efforts - to what extent does each new factorization lead to appreciable progress on the theoretical/algorithmic/software fronts, and to what extent is it just another in-itself-completely-useless addition to Sam Wagstaff's "stamp collection?" (As some wag whose name I don't recall put it.)

I don't think you can so easily divorce the "mindless computation" aspect of these kinds of projects from the "genuine advance in insight" aspect - at the very least the number-crunching aspects help us continually improve/debug our algorithms and software implementations thereof, and while that's going on, one hopes that there is a parallel continual ferment on the theoretical side, often spurred by frustration at the seeming intractability of "the really big" computations according to the state of the current art.

I do take your (implied) point about the dangers of computation becoming a mere end in itself though.

R.D. Silverman 2006-04-17 17:28

[QUOTE=ewmayer]I think a little more care is required in applying Hamming's maxim here - after all, a skeptic could level a virtually identical accusation at the Cunningham table efforts - to what extent does each new factorization lead to appreciable progress on the theoretical/algorithmic/software fronts, and to what extent is it just another in-itself-completely-useless addition to Sam Wagstaff's "stamp collection?" (As some wag whose name I don't recall put it.)

I don't think you can so easily divorce the "mindless computation" aspect of these kinds of projects from the "genuine advance in insight" aspect - at the very least the number-crunching aspects help us continually improve/debug our algorithms and software implementations thereof, and while that's going on, one hopes that there is a parallel continual ferment on the theoretical side, often spurred by frustration at the seeming intractability of "the really big" computations according to the state of the current art.

I do take your (implied) point about the dangers of computation becoming a mere end in itself though.[/QUOTE]

I agree with you about the Cunningham project. While it holds great
historical interest for me personally, the factorizations per se are not
really useful. I did promise Dick Lehmer that I would push to finish the
base 2 tables [up to the current limits of the project].

I use the project as a vehicle for tinkering with my NFS code [and earlier
for my MPQS code etc]. Early on, it was also a good reason for me to learn
some mathematics. Then when NFS came along, it was a reason for
me to learn some algebraic field theory.

In a paper still under referee review, I showed (theoretically) how to
find an improved sieve region for line sievers. I have a related argument
for lattice sievers [I adjust the SIZE of the sieve region as the special-q
varies]. It takes a lot of data to see if the improvement is measurable
given the many uncertainties in the run-time. Every time I do a factorization
I devote one extra machine to just "sampling" various sieve-region changes
to compare yields against the actual factorization.

BTW, it was Oliver Atkin who first coined the phrase "Wagstaff's Stamp
Collection".

philmoore 2006-04-17 23:28

[QUOTE=R.D. Silverman]Allow me to ask:

What insight is gained by this computation?

If this computation leads to any new mathematical insights about the
problem, I will applaud it heartily. Until then, it is just mindless computing.
[/QUOTE]

Algorithmic improvements do not necessarily depend upon new mathematical insights. For example, many of George Woltman's improvements to his multiplication FFTs resulted from efficient use of cache memory. Any new implementation of the Brent, Cohen, and te Riele algorithm will undoubtedly require some clever programming, and should be fertile ground for an aspiring computationalist who wants to hone their skills.

That said, I don't think that one can discount the possibility that someone working on such a project might also develop a new mathematical insight. The last few years have seen quite a few new papers on odd perfect numbers, most of them computationally based, and although I am not an expert in this area, I have studied several of these papers and have been pleased to see the application of some very clever ideas.

Pascal Ochem 2008-04-24 11:04

[QUOTE=R.D. Silverman;77866]
On the other hand, if raising the bound helps convince people of the
unlikelihood that OPN's exist, and thereby dissuades such people from
trying to actually find an OPN, then I will also applaud the effort.[/QUOTE]

Let us take Pomerance's heuristic as a measure of this unlikelihood.
[url]http://oddperfect.org/pomerance.html[/url]
As I understand it,
if the OPN is N=p^e*m^2 with p prime and gcd(p,m)=1, and if we can prove m > B,
then the heuristic expects about log(B)/B odd perfect numbers.

So, if the goal is to raise the unlikelihood, it is more efficient to modify
the (Brent, Cohen, te Riele) method so that it gives a lower bound
on m rather than on N.

For example, proving m > 10^150 should be a lot easier than
proving N > 10^600 while still giving the same unlikelihood:
150*log(10)/10^150 odd perfect numbers.
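The comparison above can be checked numerically. A minimal sketch (the function name is my own; the formula is just the log(B)/B estimate from the post, with log(10^150) = 150*log(10)):

```python
from math import log

def expected_opn_count(B: float) -> float:
    """Heuristic expected number of odd perfect numbers N = p^e * m^2
    whose square-root part m exceeds the proven bound B, per the
    log(B)/B estimate attributed to Pomerance above."""
    return log(B) / B

# Bound m > 10^150, as in the post: about 150*log(10)/10^150
print(expected_opn_count(1e150))  # roughly 3.45e-148
```

So a bound of 10^150 on m alone already pushes the heuristic expectation far below 1, which is the point of bounding m rather than N.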

