2006-01-20, 10:14  #1 
2·3^{2}·7·19 Posts 
design factoring algorithms
All the factoring algorithms are based on number-theory concepts. Nothing wrong with this, but we can't succeed with number theory alone. Why not try to design factoring algorithms based on machine learning, genetic algorithms, etc.? At least try to embed these concepts in existing algorithms to improve their performance. I see some papers on Monte Carlo based factoring algorithms, but these are not popular yet.
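(To make the suggestion concrete: here is a toy sketch, not a serious proposal, of what a genetic algorithm attacking factoring might look like. The function name `ga_factor` and all parameters are made up for illustration; fitness is just the remainder n mod candidate, and the thread's point stands, since this is hopeless compared to real sieving methods.)

```python
import random

def ga_factor(n, pop_size=200, generations=500):
    """Toy genetic algorithm: evolve candidate divisors of n.
    Fitness is the remainder n % c; a candidate with n % c == 0 is a factor."""
    lo, hi = 2, int(n ** 0.5) + 1
    pop = [random.randrange(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: n % c)          # smallest remainder first
        if n % pop[0] == 0:
            return pop[0]                      # exact divisor found
        survivors = pop[: pop_size // 2]
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = random.sample(survivors, 2)
            child = (a + b) // 2               # "crossover": average two parents
            if random.random() < 0.3:          # mutation: random nudge
                child += random.randrange(-50, 51)
            children.append(min(max(child, lo), hi - 1))
        pop = survivors + children
    return None                                # gave up

print(ga_factor(101 * 103))   # may find 101, or fail and print None
```

Note the fitness landscape is essentially flat: a candidate one away from a true factor scores no better than a random one, which is exactly why evolutionary search has nothing to grip on here.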

2006-01-20, 13:46  #2 
Aug 2002
Buenos Aires, Argentina
1494_{10} Posts 
Well, if you can show us that the computer can learn how to factor faster than GNFS, I will gladly program that algorithm.
Wait!! In that case the computer should also be able to program itself, so I don't have to work this time. 
2006-01-20, 13:58  #3  
Tribal Bullet
Oct 2004
110111100011_{2} Posts 
Quote:
Besides, most combinatorial algorithms are exponential-time, and the best current factoring algorithms do much better than that. People can solve the traveling salesman problem for large numbers of cities because approximation algorithms give a 'good enough' solution, but close doesn't count with factoring. That being said, this territory is largely unexplored. There are plenty of good books on combinatorial optimization, but the subject is really hard. jasonp Last fiddled with by jasonp on 2006-01-20 at 13:59 
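(The "close doesn't count" point can be shown in two lines: an exact divisor shares a nontrivial gcd with n, while a guess that is off by one reveals nothing at all. Small illustrative numbers below.)

```python
from math import gcd

n = 101 * 103          # 10403, a small semiprime
exact = 101            # a true factor
near = 100             # a guess that is "99% of the way there"

print(gcd(n, exact))   # → 101  (nontrivial factor recovered)
print(gcd(n, near))    # → 1    (almost-correct guess gives nothing)
```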

2006-01-20, 17:47  #4  
Bamboozled!
"๐บ๐๐ท๐ท๐ญ"
May 2003
Down not across
10110110000110_{2} Posts 
Quote:
Paul 

2006-01-20, 19:22  #5  
Tribal Bullet
Oct 2004
3^{2}·5·79 Posts 
Quote:
jasonp 

2006-01-20, 20:00  #6  
"Bob Silverman"
Nov 2003
North of Boston
2^{2}×1,877 Posts 
Quote:
the method can be improved. 

2006-01-20, 22:29  #7  
Tribal Bullet
Oct 2004
3^{2}×5×79 Posts 
Quote:
- The only practical choices for polynomials are a pair of quadratics and the polynomials produced by the base-m method. Both Murphy and Crandall/Pomerance believe that other choices of polynomials (say, a pair of cubics, or degree 4 / degree 2) can have better yield, especially if they're chosen so that their roots almost coincide. Does nobody know how to go about looking for such oddball polynomials? I certainly don't, but this has the potential to drastically increase sieving speed if C&P are to be believed.

- Is there a way to have a hybrid between line sieving and lattice sieving? Can data structures like a compressing radix tree (i.e. Hewlett-Packard's Judy library) replace a sieve as a practical way of handling extremely long lines? What about using Judy to sieve for several special-Q at once? What about making each special-Q a product of several small primes? If each small prime has several roots, then each combination of roots is a separate special-Q, and you get more special-Q that are small (and so keep the relation yield high for longer).

- Nobody understands the behavior of cycles generated in the filtering phase. What if you took large primes found during sieving and turned them into the next special-Q for use in a lattice siever? Would suddenly having lots more of each large prime cause a cycle explosion sooner? (One of my friends is a chemist, and when I explained the cycle explosion problem and how it looks like a phase change, he said that in physical phenomena a phase change somehow needs a 'seed' to get started.)

I've wondered about these for a while, and they may go nowhere. But the point is that these ideas do not sound obviously dumb to me, and nobody has looked at them seriously. Why try to apply some unrelated field to factoring when the basics still need exploring?

jasonp 
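(A sketch of the composite special-Q idea above: if Q = p1·p2 and a sieving polynomial has roots mod p1 and mod p2, each pairing of roots lifts via CRT to a distinct root mod Q, so one pair of small primes yields several special-Q lattices. The brute-force root finder and the toy polynomial x² − n are illustrative choices, not anything from an actual siever.)

```python
def roots_mod_p(n, p):
    """All square roots of n modulo a small prime p (brute force)."""
    return [r for r in range(p) if (r * r - n) % p == 0]

def crt(r1, p1, r2, p2):
    """Combine x ≡ r1 (mod p1) and x ≡ r2 (mod p2) into x mod p1*p2."""
    inv = pow(p1, -1, p2)                         # p1^{-1} mod p2
    return (r1 + p1 * ((r2 - r1) * inv % p2)) % (p1 * p2)

n = 15                                            # toy value to take roots of
p1, p2 = 7, 11                                    # two small primes, Q = 77
combined = [crt(a, p1, b, p2)
            for a in roots_mod_p(n, p1)           # roots 1, 6 mod 7
            for b in roots_mod_p(n, p2)]          # roots 2, 9 mod 11
# 2 roots mod 7 × 2 roots mod 11 = 4 distinct special-Q roots mod 77
print(sorted(combined))                           # → [13, 20, 57, 64]
```

Each of the four combined roots r satisfies r² ≡ 15 (mod 77), so a single pair of small primes hands the siever four lattices instead of one.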

2006-01-20, 22:38  #8  
"Nancy"
Aug 2002
Alexandria
2,467 Posts 
Quote:
Alex Last fiddled with by akruppa on 2006-01-21 at 07:29 

2006-01-20, 23:47  #9  
"Bob Silverman"
Nov 2003
North of Boston
16524_{8} Posts 
Quote:


2006-01-20, 23:52  #10  
"Bob Silverman"
Nov 2003
North of Boston
2^{2}·1,877 Posts 
Quote:
Finding such polynomial pairs requires very difficult lattice basis reduction computations. If it is possible to find such pairs, it will require extensive precomputation. None of the proposed ideas is going to have a dramatic impact. All of them only affect the o(1) term in the run time. At best, they will give only a constant-factor speed improvement. It all depends on the size of the norms being sieved, and none of these ideas reduces the asymptotic size of those norms. One could get a speed improvement from two cubics if both have 3 real roots, but it will not be that large. Dan's idea of replacing sieving with ECM is TOTALLY impractical. 
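(For reference, the o(1) term mentioned here is the one in the standard heuristic NFS running time:)

```latex
L_N\!\left[\tfrac{1}{3},\, c\right] \;=\; \exp\!\Big( (c + o(1))\,(\ln N)^{1/3} (\ln\ln N)^{2/3} \Big),
\qquad c = \left(\tfrac{64}{9}\right)^{1/3} \approx 1.923 .
```

Improvements that leave the norms asymptotically unchanged move only the o(1) inside the exponent, which is why they can at best buy a constant factor in practice.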

2006-01-21, 01:51  #11  
Tribal Bullet
Oct 2004
3^{2}·5·79 Posts 
Quote:
For the record, the qsieve package has an MPQS implementation whose factor base grows over time to incorporate large primes found during sieving. From what I can tell of the results available online, it doesn't seem to help, but it does reduce the number of cofactors that need factoring via non-sieving means. I suppose I should have stated up front that I'd be happy with a constant-factor improvement. MPQS is a constant-factor improvement over QS (I think), and one is practical while the other really is not. Quote:
Another possibility for the linear algebra stage: is it possible to leverage the huge literature on sparse matrix methods to make the linear algebra easier? CWI's filtering algorithms don't reference the scientific computing literature on this subject at all; is that because there's nothing worth borrowing? jasonp 

Similar Threads  
Thread  Thread Starter  Forum  Replies  Last Post 
Factoring projects algorithms  What is Aliqueit?  Romuald  Factoring  75  2016-08-27 09:46 
General factoring algorithms not using sieving  mickfrancis  Factoring  21  2016-02-22 19:38 
Implementing Factoring Algorithms  Raman  Hobbies  45  2009-05-11 05:11 
Overview of Factoring Algorithms  ThiloHarich  Factoring  0  2007-09-02 20:32 
factoring algorithms are patented?  koders333  Factoring  1  2006-01-19 20:04 