mersenneforum.org  

Go Back   mersenneforum.org > Factoring Projects > Factoring

Old 2007-02-14, 16:21   #1
R.D. Silverman
 
 
Nov 2003

2²×5×373 Posts
Default Anniversary

Hello All,

This is my 25th Anniversary of working on the Cunningham project.

I wish that I had a new result to report, but I am currently lacking in
CPU resources.

I mailed my first factor (a p13 of 3,193- via P-1) to Sam Wagstaff
25 years ago, about a week after I started working on the project.
It was found on a 16-bit Zilog Z-8000 minicomputer.
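For readers who haven't met the method: Pollard's P-1 finds a prime factor p of N whenever p - 1 is a product of small primes, which is exactly the situation in which a p13 can fall quickly to it. A minimal sketch (on a toy composite, not 3,193- itself):

```python
from math import gcd

def pollard_p_minus_1(n, bound=10000):
    """Pollard's P-1: find a prime factor p of n when p - 1 is
    "smooth", i.e. built entirely from primes below `bound`."""
    a = 2
    for j in range(2, bound):
        # After this step, a = 2^(2*3*...*j) mod n. Once the
        # accumulated exponent is a multiple of p - 1, Fermat's
        # little theorem gives a ≡ 1 (mod p), so gcd exposes p.
        a = pow(a, j, n)
        d = gcd(a - 1, n)
        if 1 < d < n:
            return d  # nontrivial factor found
    return None  # no factor found (or all factors hit at once)

# 1643 = 31 * 53, and 31 - 1 = 2*3*5 is very smooth,
# so the factor 31 falls out almost immediately.
print(pollard_p_minus_1(1643))
```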

In that time we have progressed from where it was hard to factor numbers
of 50 digits to where it is now hard to factor numbers of 250 digits.
This comparison is a bit unfair however, because the former refers to
general numbers (done in 1982 via CFRAC-SEA!!), while the latter
refers to Cunningham numbers done with SNFS. Perhaps a better choice
for an upper bound would be about 170 digits (via GNFS).

This improvement is due to roughly equal contributions from improved
computer technology and from new algorithms. Consider, for example,
that even if we had NFS available to us in 1982, we could not have
implemented it on machines of that time, owing primarily to space, rather
than CPU, limitations. It would have been a theoretical curiosity. Also,
many of our improvements have come not only from improved computers, but
also from improved computer availability.

May the next 25 years bring a similar factor of 3 to 4 to our efforts!!!!!!

It will not happen without new algorithms, but we can all hope.
It's now been 18 years since our last new algorithm.

It would certainly be nice if we had some new talent or if a new generation
of factorers came along. The difficulty with this notion is that the bar
is set very high in terms of even the easiest remaining Cunningham numbers.
It is hard to make a contribution with only a small number of computers.
And the learning curve is very steep.

For those who want to see something of the history of the project,
Sam Wagstaff has put all of the old pages, and some of the cover letters
at:

http://homes.cerias.purdue.edu/~ssw/cun/oldp/index.html

Bob
Old 2007-02-14, 17:26   #2
ewmayer
2ω=0
 
 
Sep 2002
República de California

2D33₁₆ Posts
Default

Happy quarter-century-of-factoring to you, Bob!

I also find it interesting that after the multiple breakthrough factoring algorithms of the 70s (Pollard's p-1, Williams' p+1, Shanks' SQUFOF, Brillhart/Morrison's CFRAC) and the 80s (Pomerance's QS and its multiple-polynomial extension by yourself and Montgomery, Lenstra's ECM, Pollard's NFS), virtually all the progress made since (and it has been impressive - don't get me wrong) has come from refinements of the known methods, ever-increasing computational power, exploitation of parallelism, and so forth.

Perhaps the glory years of the aforementioned two decades spoiled us, and it is unreasonable to expect a brand-new-and-improved factoring algorithm every few years. But one can hope.

And of course the real breakthrough factoring algorithm (at least for integers "small" enough to be explicitly stored) is already known - with quantum computing we have the needed algorithm, we simply lack the needed hardware.

But it would be nice if at least one or two qualitatively new-and-better algorithms for classical computers were found before QC-based hardware blows the doors off of everything.
Old 2007-02-14, 18:15   #3
jasonp
Tribal Bullet
 
 
Oct 2004

2²·883 Posts
Default

Quote:
Originally Posted by R.D. Silverman View Post
It would certainly be nice if we had some new talent or if a new generation of factorers came along. The difficulty with this notion is that the bar is set very high in terms of even the easiest remaining Cunningham numbers. It is hard to make a contribution with only a small number of computers. And the learning curve is very steep.
Bob,

Many happy returns on being in for the long haul. And I'm working as fast as I can :)

jasonp
Old 2007-02-14, 18:18   #4
grandpascorpion
 
 
Jan 2005
Transdniestr

503 Posts
Default

Here's to perseverance!
Old 2007-02-14, 18:28   #5
ewmayer
2ω=0
 
 
Sep 2002
República de California

3·7·19·29 Posts
Default

Quote:
Originally Posted by grandpascorpion View Post
Here's to perseverance!
Or in the case of a likely-intractable problem, "perverseverance."
Old 2007-02-14, 18:34   #6
grandpascorpion
 
 
Jan 2005
Transdniestr

503 Posts
Default

Yes, he is due some perverse severance pay if he ever retires from the project.
Old 2007-02-14, 19:11   #7
xilman
Bamboozled!
 
 
"𒉺𒌌𒇷𒆷𒀭"
May 2003
Down not across

2×3×1,733 Posts
Default

Quote:
Originally Posted by R.D. Silverman View Post
This is my 25th Anniversary of working on the Cunningham project.

[...]

It would certainly be nice if we had some new talent or if a new generation
of factorers came along. The difficulty with this notion is that the bar
is set very high in terms of even the easiest remaining Cunningham numbers.
It is hard to make a contribution with only a small number of computers.
And the learning curve is very steep.
Congratulations.

I don't remember when my first factor was sent in, but it must have been
about 15 years ago.

The bar seems to have been high since I've been contributing. There
are, of course, at least two ways around the problem. I've exploited
both of them.

First is to contribute to or (preferably, IMO) organise a collaborative
effort. There have been several such over the years, including three
incarnations of NFSNET, The Cabal, and the MullFac group.

The other way is to snaffle the relatively easy factorizations when they
become available. Some become available through ECM reducing a
composite to a range where modest QS or NFS resources are sufficient.
Others become available when the acquisition of new hardware allows the
easier SNFS targets to become feasible. A case in point is that my
latest main machine, an AMD64-3500+, has enough memory (2.5GB) to let me
complete the linear algebra on the first holes, and is fast enough to
let me do so in a reasonably short time. Sieving is performed on that
machine and a collection of antiques, going as far back as a PPro,
sundry PII and PIII, and a Sparc-10. Even without the antiques, it will
only take me a few months to complete a first hole.

The learning curve is indeed steep if you wish to write your software ab
initio. However, that's not a requirement. Several excellent
implementations of ECM and/or NFS are available as source code
(including one of yours, I believe) and the learning curve can be as
gentle as desired. Running black-box software in fire-and-forget mode
is an excellent way to get started, in my opinion, and more ambitious
activities can wait until resources (time, education, enthusiasm, etc)
permit.
Old 2007-02-15, 09:07   #8
BotXXX
 
 
Aug 2003
Europe

2×97 Posts
Default

Quote:
Originally Posted by R.D. Silverman View Post
This is my 25th Anniversary of working on the Cunningham project.
Congratulations on the quarter-of-a-century of factoring enthusiasm for the Cunningham project. Hopefully in the near future there will be a breakthrough new algorithm that speeds up factoring in all areas. But at the same time, the current algorithms are already very advanced, having been improved, optimized, and tweaked over the years.

Quantum computing might indeed be the next improvement we see, but until the hardware becomes available we will need to stick with the current solutions.
Old 2007-02-15, 11:30   #9
BotXXX
 
 
Aug 2003
Europe

2·97 Posts
Default

And my eye just stumbled on this article from D-Wave Systems:
http://www.dwavesys.com/index.php?ma...t01returnid=21

Not sure how this first Orion chip with only 16 qubits would be interesting for factoring large numbers, but they are planning a 512-qubit system for summer next year, with a possible 1024-qubit system before the end of 2008, which would probably be available for rent to anyone with enough cash.
Old 2007-02-15, 12:40   #10
xilman
Bamboozled!
 
 
"𒉺𒌌𒇷𒆷𒀭"
May 2003
Down not across

10398₁₀ Posts
Default

Quote:
Originally Posted by BotXXX View Post
Not sure how this first Orion chip with only 16 qubits would be interesting for factoring large numbers.
Not at all.

It could factor numbers as large as 127, assuming I remember properly how these things work. That is, a 2n+1 qubit computer is required to factor an n-bit integer.

A 1024-qubit computer is where it starts to get interesting, as far as integer factorization is concerned.
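Taking the recalled 2n+1 resource count at face value (actual qubit requirements for Shor's algorithm vary by implementation, so treat this as the post's assumption), the arithmetic works out as:

```python
def max_factorable_bits(qubits):
    # Assumed rule from the post above: factoring an n-bit integer
    # needs 2n + 1 qubits, so the largest usable n is (qubits - 1) // 2.
    return (qubits - 1) // 2

for q in (16, 512, 1024):
    n = max_factorable_bits(q)
    print(f"{q} qubits -> {n}-bit integers (largest: {2**n - 1})")
```

For 16 qubits this gives n = 7, i.e. integers up to 2^7 - 1 = 127, matching the figure quoted above; a 1024-qubit machine would reach 511-bit (roughly 154-digit) integers.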

Paul
Old 2007-02-15, 17:22   #11
FactorEyes
 
 
Oct 2006
vomit_frame_pointer

2³×3²×5 Posts
Default

Quote:
Originally Posted by jasonp View Post
Bob,

Many happy returns on being in for the long haul...

jasonp
Yes. Also, thanks for your theoretical contributions, as well as the time you spend plonking down interesting stuff around here.

I think Akruppa said something about the factoring itch: the compulsion to bust a composite down to primes. I have a chronic case of this, though less acute than the severe cases around here. For some reason, it isn't enough merely to know whether some big honker is composite.

I'm a mildly-educated dilettante, but I check in every few years and skim a paper or two, and am astonished at how things have changed. I read about P-1, CFRAC, and SQUFOF as an undergraduate. P-1 seems like the biggest value for your money among these early ones, particularly since the wonderful ECM is a supercharged realization of the same idea.

MPQS took advantage of the falling price of memory, using little beyond quadratic reciprocity and iterative methods for solving quadratic congruences. "Wow! You mean, that actually works?" was my first reaction. MPQS has several points at which the asymptotic behavior could be bad but isn't, and places where the fixed cost of solving n equations, where n is the size of the factor base, could be really large but isn't.
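Those "iterative methods for solving quadratic congruences" are what seed the sieve: for each factor-base prime p one needs the roots of x² ≡ N (mod p), which tell the sieve at which positions p divides the polynomial values. The standard tool is the Tonelli-Shanks algorithm; a compact sketch (the function name is mine):

```python
def sqrt_mod_p(n, p):
    """Tonelli-Shanks: solve x^2 ≡ n (mod p) for an odd prime p,
    assuming n is a quadratic residue mod p (Euler's criterion)."""
    assert pow(n, (p - 1) // 2, p) == 1, "n must be a residue mod p"
    if p % 4 == 3:                       # easy case: direct formula
        return pow(n, (p + 1) // 4, p)
    # write p - 1 = q * 2^s with q odd
    q, s = p - 1, 0
    while q % 2 == 0:
        q //= 2
        s += 1
    # find any quadratic non-residue z (half of all residues qualify)
    z = 2
    while pow(z, (p - 1) // 2, p) != p - 1:
        z += 1
    m, c, t, r = s, pow(z, q, p), pow(n, q, p), pow(n, (q + 1) // 2, p)
    while t != 1:
        # find least i with t^(2^i) == 1, then shrink the 2-part
        i, t2 = 0, t
        while t2 != 1:
            t2 = t2 * t2 % p
            i += 1
        b = pow(c, 1 << (m - i - 1), p)
        m, c, t, r = i, b * b % p, t * b * b % p, r * b % p
    return r

print(sqrt_mod_p(10, 13))  # 6 and 7 are the two square roots of 10 mod 13
```

The other root is always p minus the returned one; a quadratic sieve marks both arithmetic progressions for each factor-base prime.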

It is surprising that NFS is still tops, 17-18 years after it first surfaced. I hope this changes, but not too abruptly. If the QC technology makes all this obsolete, I'll be sorry to see some of these methods become relics.

Factorization is both glamorous and unloved. It doesn't seem to be a tenure-enhancing pursuit, perhaps because it doesn't fall within a narrow subdiscipline, but it's also in its infancy. I hope we have another wave of progress in sparse matrices or factorization of ideal classes, or maybe a departure from the ritual of seeking immense piles of smooth numbers as the cure to what ails us.

The weirdest thing about recent history is that we have gone from special multi-processor machines to run CFRAC on 65-digit integers, to the point where 120-digit composites are fodder for personal computers, and a 95-digit job causes anxiety only because it forces a choice among three different methods.

Put another way: in the early 90s, a generic 70-digit integer was near the frontier of what a single PC could do. These days we routinely pull factors of 70 digits or more from large composites. When I think about how much worse a 70-digit job is than a 65-digit job, this is shocking progress.

