2020-05-02, 15:28   #53
Uncwilly
My 2 centavos: any overhead that is not error-checking related should not be more than 0.5%. Beyond that, the loss vs. gain doesn't make sense.

What is the potential gain (speed-wise) to the project? Based upon examination of the "expected number of factors", we don't have systematic cheating. We have had a few idiots, and the total loss due to them is nearly nil.

Does this capture missed factors due to errors (thus speeding the project)? No.

Does this find false factors? No.

Is this an actual residual of the calculation, or just a manufactured one? It is not a natural byproduct of the calculation the way the LL residue is.



This is my very naive concept.
Why can't the program take the result of the mod operation at certain selected k values (or certain classes of k) and fold them into a rolling operation, like a CRC? The k values to be selected could be computed from the exponent. Doing the operation every 10,000 k's tested should not put an undue burden on the run, and it would prove that at least those items were done in sequence.
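A minimal sketch of what I mean (names and structure are mine, and the `stride` parameter stands in for "every 10,000 k's"; a real TF program sieves k by class and candidate primality, which this toy loop skips). Each candidate factor of 2^p − 1 has the form q = 2kp + 1; the residue 2^p mod q is the "mod operation" result, and every stride-th residue is rolled into a CRC32:

```python
import zlib

def tf_with_checksum(p, k_start, k_end, stride=10000):
    """Trial-factor 2^p - 1 over k in [k_start, k_end), folding the
    residue 2^p mod q at every stride-th k into a rolling CRC32 as
    evidence the range was processed in sequence."""
    checksum = 0
    factors = []
    for k in range(k_start, k_end):
        q = 2 * k * p + 1          # candidate factor form for Mersenne numbers
        r = pow(2, p, q)           # the mod-operation result
        if r == 1:
            factors.append(q)      # q divides 2^p - 1
        if k % stride == 0:
            # roll this residue into the checksum, chaining from the
            # previous value so order of computation matters
            data = r.to_bytes((r.bit_length() + 7) // 8 or 1, 'little')
            checksum = zlib.crc32(data, checksum)
    return factors, checksum
```

For example, with p = 11 and k from 1 to 9, the loop finds the known factors 23 (k = 1) and 89 (k = 4) of 2^11 − 1 = 2047, and the returned checksum is reproducible by anyone re-running the same range, which is the whole point: a mismatched checksum flags a range that was skipped or done out of order.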