#1
P90 years forever!
Aug 2002
Yeehaw, FL
19·397 Posts
This post is primarily for akruppa, the ECM/P-1 expert, but others are welcome to advise.
The new Primenet server can hand out ECM assignments and record results. I want to implement a simple scheme that doesn't have to be optimal - just not too wasteful. Here is my effort - please help improve it. The database keeps a "total effort" value which is the cumulative total of "curves * B1". The server hands out ECM assignments for B1 = 50000 if total_effort < 300 * 50000, B1 = 250000 if total_effort < 750 * 250000, B1 = 1000000 if total_effort < 1800 * 1000000, etc. Three major questions: 1) How do I simply adjust the "curves * B1" effort when a user uses prime95 with a non-standard B2 (100 * B1)? 2) Can the "total_effort" formula of curves * B1 be improved. 3) Is the table for switching to higher B1's correct? Would a table with more entries be better? Would a continuously increasing B1 be better? Thanks for any help. The target ECM candidates are larger Mersennes, say M1200 - M999999, since Cunningham's are better handled by GMP-ECM. If the formula works for Cunningham numbers and there is any ECM work prime95 can usefully do by the time the client and server are ready that is a bonus. |
#2
Sep 2005
127₁₀ Posts
Sounds interesting...
I can see it now: George (Woltman)'s first project - find all the prime Mn. George's second project - find the factorizations of all Mn! :-)

Thanks for all the help you've given me over at Mersenneplustwo (http://bearnol.is-a-geek.com/Mersenn...neplustwo.html) already [e.g. I find your excellent mprime very useful for my purposes, even though it doesn't (currently) have network facilities for factoring], and I wish you good luck with this new development (maybe it can even benefit me directly in future? - which would be fantastic - I'm currently using ECMNet and GMP-ECM, but this doesn't (yet) work too well with GWNUM).

In the meantime, if you are including even exponents in your efforts, would you mind letting me know of any factors/progress you find that might be of interest to me? Or perhaps, to avoid duplication of effort (or stepping on my toes :-) ), you might consider some system to avoid those few exponents I'm already pursuing - otherwise I'm afraid my own efforts will be swamped by the mighty army of gimpsters!

Thanks again,
James
#3
"Nancy"
Aug 2002
Alexandria
2,467 Posts
Hi George,
the method you propose isn't all wrong. A better work estimate than sum-of-B1 values could be obtained by estimating each curve's chance of success for different factor sizes, as we are doing for the Cunningham tables now. So long as people don't do heaps of curves at small B1, or only a few curves at very high B1, the sum-of-B1 is OK.

As a first approximation, the effect of B2 != 100*B1 can be handled by scaling a curve's effort by roughly (log(B2/B1)/log(100))^1.3. Again, this is not a rigorous analysis, but it is usually about right.

Doing the B1 escalation in more, smaller steps may improve efficiency a little, but I expect not by much. I'll look into this in more detail when my exams are over, in late April.

Alex
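The B2 correction Alex describes could be folded into the server's bookkeeping like this. A sketch only: the exponent 1.3 and the log-ratio form are the rough heuristic from his post, not a rigorous value, and the function names are hypothetical.

```python
import math

def b2_weight(b1, b2):
    """Rough effort multiplier for a non-default B2, per the approximation
    (log(B2/B1) / log(100))**1.3. Equals 1.0 at the default B2 = 100*B1."""
    return (math.log(b2 / b1) / math.log(100)) ** 1.3

def adjusted_effort(curves, b1, b2):
    """Credit curves * B1, scaled for a non-standard stage-2 bound."""
    return curves * b1 * b2_weight(b1, b2)
```

With the default B2 = 100*B1 the weight is exactly 1, so the scheme reduces to plain curves * B1; a larger-than-default B2 earns proportionally more credit.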
#4
Nov 2003
2²·5·373 Posts
Quote:
I solved this problem completely in my joint paper with Sam Wagstaff: "A Practical Analysis of the Elliptic Curve Factoring Algorithm", Math. Comp. If you can't get a copy, send me your snail mail address and I will send a hard copy. I no longer have a soft copy.

When I was laid off from RSA I was basically hustled out the door (along with everyone else) and wound up leaving a lot of stuff on the RSA computers. Included among the stuff that got left behind were soft copies of the TeX originals for all of my papers.

My severance agreement with RSA precludes my making any comments about the company, so I will refrain from expressing my opinion about how they treated everyone during the layoff.
#5
Aug 2002
Buenos Aires, Argentina
2·683 Posts
If you want to distribute your paper, I'd recommend scanning it; then you can send it through e-mail.

It seems strange to me that you didn't have a backup of your papers at home. From your comments it appears that even while you were still working at RSA, a fatal error on your office hard disk would have lost your papers' originals.
#6
Bamboozled!
"𒉺𒌌𒇷𒆷ð’€"
May 2003
Down not across
25042₈ Posts
Quote:
Your second paragraph suggests that you don't understand how corporations protect their assets. Do you really think that RSA wouldn't be backing up their workers' machines?

You may be right. If so, and if that is still RSA's policy, I would advise selling any of their stock you may own.

Paul
#7
Aug 2002
Buenos Aires, Argentina
10101010110₂ Posts
Well, most corporations only back up information stored on their servers. You are supposed to upload your sensitive information to the server for it to be backed up.
#8
∂²ω=0

Sep 2002
República de California
10110101111111₂ Posts
Quote:
Due to similar considerations about where best to keep one's "personal IP", as it were, for the past 5 years I've made a habit of stashing my work in progress on a friend's ftp server. I find it very convenient to have local copies on both my home and work PCs, but at least once a week I send anything I've modified recently and don't want to risk losing to the 3rd-party server. A bit of a pain, but as long as you're not dealing with an industrial-strength number of changed files every week, still quite manageable. Of course the more-paranoid companies out there will have mechanisms in place that would make such a thing difficult (if not impossible) to do.

Bob, I realize this won't help you with the TeX documents stored on your former machine at RSA, but if you're talking about research papers (which are in the public domain) and are still on speaking terms with anyone there, have you considered just politely asking whether they would allow you to make copies of your non-work-related data (under their supervision, of course, i.e. someone there vets all the files in question before handing them off to you)? You obviously know the folks there better than anyone here, but if the data really are important to you, it's an approach that might be worth considering. At worst you'd be no worse off than you are at present.
#9
Nov 2003
16444₈ Posts
Quote:
Been there. Done that. They reformatted the hard drives on my work computers shortly after I left...
#10
"William"
May 2003
New Haven
2·7·13² Posts
Quote:
Practical ECM has advanced since this paper was written, with the addition of polynomial extensions in stage 2 - it isn't clear how much of an impact this has.

Also be aware that many people have reported difficulty getting numbers out of this paper. There is a thread here at MersenneForum dedicated to discussion of this paper, in which multiple people have reported being unable to recreate the tables. In a different thread, Bob has recently agreed that the formula for the two-parameter Dickman function in the paper is wrong, although he is confident the work used the correct formula and asserts that other people have been able to recreate the tables.
#11
Aug 2002
Buenos Aires, Argentina
1366₁₀ Posts
Quote:
Similar Threads

| Thread | Thread Starter | Forum | Replies | Last Post |
| What Bounds to choose, and what are Bounds | 144 | Information & Answers | 5 | 2017-03-15 13:36 |
| Extending P-1 Bounds | TObject | Software | 4 | 2012-10-10 17:42 |
| ECM bounds | Unregistered | Information & Answers | 4 | 2011-07-17 20:18 |
| unusually low P-1 bounds | ixfd64 | PrimeNet | 9 | 2011-04-02 18:45 |
| Bounds explanation | Uncwilly | Lounge | 4 | 2011-04-01 19:15 |