mersenneforum.org > Prime Search Projects > Twin Prime Search
Old 2007-01-05, 17:25   #188
ValerieVonck
Mar 2004
Belgium
7·11² Posts

I just got a "Daily quota(2) exceeded" message.
Is this normal?
Old 2007-01-05, 17:52   #189
Rytis
Nov 2006
2³×11 Posts

No; it means that you are returning invalid results. The standard daily quota is 1500; every invalid result halves the quota, and every valid result doubles it (back up to a maximum of 1500). This is done to stop bad computers from trashing lots of work.

Might have just been a glitch with your PC?
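For what it's worth, the quota rule described above can be sketched in a few lines (an illustrative sketch only; the function and names are mine, not the actual server code):

```python
MAX_QUOTA = 1500  # standard daily quota, per the description above

def update_quota(quota, result_valid):
    """Halve the daily quota on an invalid result; double it,
    capped at MAX_QUOTA, on a valid one."""
    if result_valid:
        return min(quota * 2, MAX_QUOTA)
    return max(quota // 2, 1)

# A host returning a string of bad results quickly loses its quota:
quota = MAX_QUOTA
for _ in range(10):
    quota = update_quota(quota, result_valid=False)
print(quota)  # the quota has collapsed to 1
```

The point of the exponential halving is that a flaky machine gets throttled within a day or two, while a healthy one recovers its full quota just as fast.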
Old 2007-01-05, 18:27   #190
ValerieVonck
Mar 2004
Belgium
7×11² Posts

Quote:
Originally Posted by Rytis View Post
No; it means that you are returning invalid results. The standard daily quota is 1500; every invalid result halves the quota, and every valid result doubles it (back up to a maximum of 1500). This is done to stop bad computers from trashing lots of work.

Might have just been a glitch with your PC?
Strange... I've only just got a new case with a new PSU?
Old 2007-01-05, 20:24   #191
smh
"Sander"
Oct 2002
52.345322,5.52471
29·41 Posts

Quote:
Originally Posted by Rytis View Post
100% in my post means that all tasks are now sent in two copies. We are not looking back at old ones (although I will surely feel bad if we missed the prime in the previous ranges!).
This means that you have set up double checking, which wastes 50% of the resources.

Why are you doing this?
Who asked you to do this?

Several people have told you it's wasting resources.

I'm off doing some other factorizations for the time being.
Old 2007-01-05, 20:43   #192
MooMoo2
"Michael Kwok"
Mar 2006
2235₈ Posts

There's always the manual reservations available for those who don't want double checking.
Old 2007-01-06, 03:28   #193
Skligmund
Dec 2006
Anchorage, Alaska
2·3·13 Posts

SMH:

Wow. Okay. So let's assume that one of the non-checked results from only 5M ago was a twin prime, but wasn't noted as such due to a bad result. This would mean (if we don't double-check) that we would do twice the amount of work to find the next one. All we can hope for at this moment is that we did not miss a twin prime, and to make sure we don't miss one going forward. The work required to make up for a missed twin prime is more than what I think we want...

You seem to be reacting in a fairly hostile way. The feeling I get when I read your post is that you are very angry, not thinking rationally about the situation, and quitting because you don't like it.

Personally, I don't see a problem with error checking. If it has been shown factually that there is a noticeable number of erroneous results, one must take corrective action. Tell me why this should not be so.

Last fiddled with by Skligmund on 2007-01-06 at 03:28
Old 2007-01-06, 04:16   #194
Prime95
P90 years forever!
Aug 2002
Yeehaw, FL
17×487 Posts

Quote:
Originally Posted by Skligmund View Post
So let's assume that one of the non-checked results from only 5M ago was a twin prime, but wasn't noted as such due to a bad result...

...and not thinking rationally about the situation, and quitting because you don't like it.

Personally, I don't see a problem with error checking. If it has been shown factually that there is a noticeable number of erroneous results, one must take corrective action. Tell me why this should not be so.
SMH's logic is correct. The fact that you *may* have missed a twin prime due to a bad result is irrelevant.

Look at it this way: Your goal is to find a world record twin prime in as short a time as possible. I hand you a sieved range of 10M and tell you that someone searched the lower 5M and found 90% (10% error rate) of the primes and there was no twin. Now, given your goal, would you rescan the lower 5M or would you scan the higher 5M where you *know* you will find 10 times as many new primes and 10 times as many chances of finding that twin prime?

I think SMH's frustration is that 50% of the project's CPU power is searching that lower 5M where chances of success are 10 times less than searching in virgin territory.
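The 10x figure falls straight out of the numbers in the example above. A quick illustrative sketch (the per-range prime count is made up; only the ratio matters):

```python
# Lower 5M: searched once with a 10% error rate, so ~10% of its primes
# were missed. Upper 5M: untouched. How productive is each to search now?
primes_per_range = 100.0   # illustrative primes per 5M range
error_rate = 0.10

recheck_yield = primes_per_range * error_rate  # new primes a recheck can find
virgin_yield = primes_per_range                # new primes in the virgin range

print(virgin_yield / recheck_yield)  # 10.0: virgin territory is 10x as productive
```

The same ratio applies to the chance of finding the twin, since each new prime found is an independent chance at a twin.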
Old 2007-01-06, 05:13   #195
Skligmund
Dec 2006
Anchorage, Alaska
2×3×13 Posts

Well, I guess it is just my nature as an aircraft mechanic to be sure about everything that is done. I was never good at leaving things to chance, and always wanted to KNOW what everything was, is, and will be.

I just noticed I have 300-400 error results from one computer in two days. Fortunately for me, I think almost all of them were caught. I have since repaired the problem and don't expect any more errors from any of my computers. If the double-checking had not been in effect, they would have passed through as valid. That is a whole 1M miscalculated. That's one heck of a gap to be missing, IMHO.

Just some opinions from me; I'll keep crunching one way or the other. :D
Old 2007-01-06, 07:31   #196
jmblazek
Nov 2006
Earth
2⁶ Posts

Quote:
Originally Posted by Prime95 View Post
I think SMH's frustration is that 50% of the project's CPU power is searching that lower 5M where chances of success are 10 times less than searching in virgin territory.

There's a "little" adjustment to that statement. Using your example, 50% of the project's CPU power is NOT re-searching that lower 5M (rechecking what's already been done); rather, 50% of the project's CPU power IS double-checking the virgin territory. And it is my understanding that this is only being done to determine where the discrepancies are and what the failure rate is.

Nowhere has Rytis stated that old WUs were being re-issued, and I have not noticed any "old" WUs on my machines. He actually states in a previous post, "We are not looking back at old ones." He's just attempting to understand the current failure situation.

I feel confident that as soon as he gets a reasonable idea of what a "normal" failure rate is, or even finds who or what is causing the discrepancies, he'll return to 90%/10%. Maybe through this process he'll even feel confident enough to raise it to 95%/5% or higher. Can anyone shed some light on what a "normal" failure/discrepancy rate is?

Or maybe someone can come along with logic strong enough to guarantee a twin before 25G with a 100% first pass.

I'm sure there's an explanation for these discrepancies... patience will discover it.

p.s. The average twin prime density is one twin every 13.6G. If a twin was skipped below 3G, then statistically another 10.6G to 24.2G would need to be searched to find the next one. Of course, if we knew the twin was below 3G, the obvious answer would be to double-check 0G-3G; heck, even a triple check would still come out better. But we don't know... so let's hope the error rate is extremely low or can be pinpointed to a specific cause.

If the twin is found tomorrow, then we've wasted a lot of CPU cycles talking about this.
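The figures in the p.s. can be reproduced with a couple of lines (numbers taken from the post itself):

```python
avg_gap = 13.6   # average spacing: one twin per 13.6G (figure from the post)
searched = 3.0   # already-searched range (G) where a twin may have been missed

# If the missed twin sits below 3G, the next twin is expected, on average,
# somewhere between (13.6 - 3) and (13.6 - 3) + 13.6 G further on:
extra_low = avg_gap - searched
extra_high = extra_low + avg_gap
print(round(extra_low, 1), round(extra_high, 1))  # 10.6 24.2
```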
Old 2007-01-06, 08:03   #197
MooMoo2
"Michael Kwok"
Mar 2006
1,181 Posts

Quote:
Originally Posted by jmblazek View Post
Can anyone shed some light on what a "normal" failure/discrepancy rate is?
GIMPS's failure rate is about 1-2%. Since our workunits are a lot smaller, the expected failure rate should be lower than 0.01%.
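One way to see why smaller workunits should mean a lower per-workunit failure rate: if hardware errors arrive at a roughly constant rate per hour of computation, the chance that a given workunit is corrupted scales with its length. A sketch (the 200x size ratio is purely illustrative, not a measured figure):

```python
gimps_rate = 0.015   # ~1-2% failure per GIMPS test (figure from the post)
size_ratio = 200.0   # suppose a GIMPS test is ~200x the work of one of our
                     # workunits (illustrative assumption, not measured)

# For a small per-hour error probability p, the chance a workunit of
# length t is corrupted is ~ p*t, so it scales linearly with size:
our_rate = gimps_rate / size_ratio
print(our_rate < 0.0001)  # True: consistent with "lower than 0.01%"
```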
Old 2007-01-06, 08:32   #198
b2riesel
5·1,931 Posts

Hello SMH, George, Rytis and everyone else here in the TwinPrime Search Forum. jmblazek brought this thread to my attention and I'd like to give my opinion on the deal.

I've read what George and SMH have said about wasting CPU cycles on a double check. At first I didn't agree, because my brain is still in Riesel Sieve land, where a missed prime can create massive headaches and tons of unnecessary work. However, from what I've read and from chatting with Rytis... this project is a little different.

I've been mulling over the idea of new users/hosts having to complete an audit batch of workunits for double checking before they are allowed the privilege of crunching first-run workunits. Say you had to complete 10 workunits that must ALL match before you can crunch your first workunit on the main effort. This would be a fair compromise. It would show that your computer passes certain tests before it wastes effort crunching with software that is VERY picky about CPUs, memory, power supplies, heat issues and so forth. Many people believe that if their computer never crashes it is good for anything... and most of us know that isn't the case when it comes to prime finding.

If your computer is sending out bad residues... then you most likely have a hardware problem that needs attention. I can't tell you how many times a simple change of PSU has made most of our bad-residue problems go away. Or a simple memory-timing tweak in the motherboard BIOS.

So... I believe you should think about an audit period for each new user and new host. Then dedicate a few machines to auditing random users/hosts full time. Let the bulk of your users steam full ahead on the effort of finding that mega twin prime. I think this is a compromise that everyone can at least call a starting point toward an agreement.
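Server-side, the proposed audit period might look something like this (a purely hypothetical sketch; the class and function names are mine and don't correspond to anything in Rytis's actual server):

```python
AUDIT_WORKUNITS = 10  # suggested probation length: 10 all-matching results

class Host:
    """A crunching host working through its audit period."""
    def __init__(self):
        self.matched = 0      # consecutive matching double-check results
        self.trusted = False  # eligible for first-run workunits?

def record_audit_result(host, matches_partner):
    """Count a double-check result; all 10 must match, so a single
    mismatch restarts the probation from scratch."""
    if not matches_partner:
        host.matched = 0
        return
    host.matched += 1
    if host.matched >= AUDIT_WORKUNITS:
        host.trusted = True

def next_work_type(host):
    """Trusted hosts crunch first-run work; everyone else double-checks."""
    return "first-run" if host.trusted else "double-check"
```

A few dedicated machines could then be pointed at random trusted hosts' results to keep spot-auditing them full time, as suggested above.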

Lee Stephens
Head Cheese at Riesel Sieve
www.rieselsieve.com