mersenneforum.org  

Old 2011-09-19, 22:28   #34
Christenson
 
 
Dec 2010
Monticello

5×359 Posts
Default

Quote:
Originally Posted by Brian-E View Post
Could saving intermediate residues potentially cause an issue with less certain integrity of double-checking? While the PrimeNet server could and should keep the intermediate residues non-accessible to general users, the very existence of the extra information might lead to questions about whether people were still somehow getting access to it and thereby fraudulently performing a double-check from a point somewhere during the LL test (like towards the end) instead of starting at the beginning.
I think we'd need a little bit of cryptographic help from P95 here... that is, a BLOCK is DC'd, not the whole exponent... and you'd have to pull off whatever cryptographic trick P95 does now to ensure that you aren't just guessing at the residue.
Old 2011-09-20, 09:01   #35
Brian-E
 
 
"Brian"
Jul 2007
The Netherlands

6305₈ Posts
Default

Quote:
Originally Posted by Christenson View Post
I think we'd need a little bit of cryptographic help from P95 here... that is, a BLOCK is DC'd, not the whole exponent... and you'd have to pull off whatever cryptographic trick P95 does now to ensure that you aren't just guessing at the residue.
Yes, the security can be achieved technically with already known techniques. And yes, this is already being done out of necessity for the final result as far as LL tests are concerned (last two hex digits of the residue not available before successful double check is completed). But the ICT world is littered with instances of security being breached, very often through human error in applying the security procedures. Extra information which needs to be kept secure implies extra risk.
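The withheld-digits mechanism Brian-E mentions (the last two hex digits of a residue kept private until a successful double check) might look roughly like this. The function names and the 64-bit hex format are illustrative assumptions, not PrimeNet's actual code:

```python
def mask_residue(res64_hex: str, digits_withheld: int = 2) -> str:
    """Publish a residue with its final hex digits withheld until
    a successful double check confirms the full value."""
    return res64_hex[:-digits_withheld] + "_" * digits_withheld

def double_check_matches(first_res64: str, dc_res64: str) -> bool:
    # Only the full, private residues are compared; the public,
    # masked form never reveals the withheld digits.
    return first_res64 == dc_res64

# Example with a made-up 64-bit residue:
public = mask_residue("3A97D51ECB04F268")
# public == "3A97D51ECB04F2__"
```

A double-checker who only ever saw the masked form would have a 1-in-256 chance of guessing the withheld digits, which is the point of keeping them back.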

Last fiddled with by Brian-E on 2011-09-20 at 09:03 Reason: hex digits, not bits
Old 2011-09-20, 10:02   #36
xilman
Bamboozled!
 
 
"𒉺𒌌𒇷𒆷𒀭"
May 2003
Down not across

10,753 Posts
Default

Quote:
Originally Posted by Brian-E View Post
Extra information which needs to be kept secure implies extra risk.
True. It also implies extra inconvenience. TANSTAAFL.

However, a client could create a public key pair when it is initialised. The server registers the public key and uses it to verify all important communications from the client, which, of course, must sign them with the private key. A residue is not sent en clair but, rather, concatenated with a nonce and the result signed; that signed message is then encrypted to the server's public key. The server verifies the signature and discards the nonce. The nonce is there to discourage replay attacks, but this feature could be removed if such attacks are not thought to be important.
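The scheme Paul describes can be sketched in a few lines. This is only an illustration: the `sign_residue`/`verify_report` names are made up, an HMAC over a shared key stands in for the public-key signature (to keep the sketch standard-library-only), and the final encryption to the server's public key is omitted:

```python
import hmac
import hashlib
import secrets

# Client side: sign residue||nonce. (A real public-key signature, as
# proposed above, would use the client's private key here.)
def sign_residue(key: bytes, res64_hex: str) -> dict:
    nonce = secrets.token_hex(16)
    msg = (res64_hex + nonce).encode()
    tag = hmac.new(key, msg, hashlib.sha256).hexdigest()
    return {"residue": res64_hex, "nonce": nonce, "tag": tag}

# Server side: verify the tag, reject replayed nonces, then discard
# the nonce (here: remember it, so a replay is detected).
seen_nonces = set()

def verify_report(key: bytes, report: dict) -> bool:
    if report["nonce"] in seen_nonces:
        return False  # replay attack
    msg = (report["residue"] + report["nonce"]).encode()
    expected = hmac.new(key, msg, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, report["tag"]):
        return False  # forged or corrupted report
    seen_nonces.add(report["nonce"])
    return True
```

Replaying the same report a second time fails because the server remembers the nonce; a real deployment would also have to bound the growth of the nonce set.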


Paul
Old 2011-09-20, 11:37   #37
Brain
 
 
Dec 2009
Peine, Germany

101001011₂ Posts
Default

Quote:
Originally Posted by NBtarheel_33 View Post
Yeah, I don't know what they are talking about either. Not sure why we would be doing tests on 3M exponents.
When I used the term "3M" I meant "FFT of length 3M" i.e. current LL test range.
Old 2011-09-20, 16:59   #38
NBtarheel_33
 
 
"Nathan"
Jul 2008
Maryland, USA

2133₈ Posts
Default

Quote:
Originally Posted by Brian-E View Post
Yes, the security can be achieved technically with already known techniques. And yes, this is already being done out of necessity for the final result as far as LL tests are concerned (last two hex digits of the residue not available before successful double check is completed). But the ICT world is littered with instances of security being breached, very often through human error in applying the security procedures. Extra information which needs to be kept secure implies extra risk.
Suppose that Cray joins GIMPS (a guy can dream, can't he?) and starts turning in 100x the daily LL tests we get now. Wouldn't this pose the same security risks that you're discussing here - basically, a higher volume of information that needs to be kept secure per time interval?

Keeping 100 LL tests secure, in theory, ought to be the same as keeping 100 partial residues secure.

On the other hand, of the million-plus LL tests and tens-of-millions of factoring assignments that GIMPS has produced over its lifespan, what percentage of these results have been shown to be patent fakes? Moreover, in nearly sixteen years of existence, how many security threats have been waged on GIMPS? How about SETI? How about Folding@Home?

What's more sexy: "Hey baby, I just brought down AT&T's entire network" or "Hey baby, I just screwed up a bunch of 20-million-digit numbers belonging to some math geeks"?
Old 2011-09-20, 21:44   #39
aketilander
 
 
"Γ…ke Tilander"
Apr 2011
Sandviken, Sweden

1000110110₂ Posts
Thumbs up Saving capacity

There will be several advantages to splitting the really big LL assignments into smaller pieces.

1. Today a number of people dedicate more than one core to finishing a large first-time LL assignment in a reasonable time. This is a much less efficient use of the capacity. If we have intermediate files saved, a DC assignment could start with any part, so if you would like to speed up a DC you could dedicate different parts to different cores without the loss of capacity. Because of this, the DC wave may in the end catch up with the LL wave, since doing a DC would be much more efficient than doing a first-time LL.

2. Errors will be a growing problem as the assignments grow. I fear that the error rate for really large LL assignments will be a really large problem. If we have a mismatch between the first-time LL and the DC, with intermediate files saved, we could continue from the last point having a match and would not need to start another DC from the beginning. Again we would save capacity.

3. Looking at the project as a whole, I find the number of interrupted assignments which never finish is quite substantial. If we had intermediate files backed up on the server we would again save capacity; today the waste due to unfinished assignments is quite substantial.

I think the security problems could be managed.
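Point 2 amounts to comparing the two runs' saved residues checkpoint by checkpoint and resuming from the last index where they still agree. A minimal sketch, with hypothetical checkpoint lists standing in for the saved intermediate files:

```python
def last_matching_checkpoint(first_run, double_check):
    """Return the index of the last checkpoint at which the two runs'
    intermediate residues agree, or -1 if none do. A mismatched rerun
    can then resume from that checkpoint instead of iteration 0."""
    last = -1
    for i, (a, b) in enumerate(zip(first_run, double_check)):
        if a != b:
            break
        last = i
    return last

# Hypothetical checkpoint residues: the runs diverge at index 2,
# so the rerun can restart from checkpoint 1 rather than the start.
first = ["a1", "b2", "c3", "d4"]
dc    = ["a1", "b2", "x9", "d4"]
# last_matching_checkpoint(first, dc) -> 1
```

The saving is largest when the divergence happens late: only the tail of the test after the last agreeing checkpoint has to be redone.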

Last fiddled with by aketilander on 2011-09-20 at 21:47
Old 2011-09-20, 21:59   #40
Brian-E
 
 
"Brian"
Jul 2007
The Netherlands

CC5₁₆ Posts
Default

Quote:
Originally Posted by NBtarheel_33 View Post
Suppose that Cray joins GIMPS (a guy can dream, can't he?) and starts turning in 100x the daily LL tests we get now. Wouldn't this pose the same security risks that you're discussing here - basically, a higher volume of information that needs to be kept secure per time interval?

Keeping 100 LL tests secure, in theory, ought to be the same as keeping 100 partial residues secure.
Well, aside from the fact that we are hardly likely to see such a sudden huge increase in the LL work being turned in, I would think that it is more sensible to consider the volume of information to be secured per GHz-day of work, or perhaps per tested exponent. This is largely because if the integrity of the LL double checks were to be thrown into question and an investigation were needed, then the quantity of work that would need to be investigated would be important, not the length of absolute time that had passed while the tests were being done.


Quote:
On the other hand, of the million-plus LL tests and tens-of-millions of factoring assignments that GIMPS has produced over its lifespan, what percentage of these results have been shown to be patent fakes? Moreover, in nearly sixteen years of existence, how many security threats have been waged on GIMPS? How about SETI? How about Folding@Home?
No idea. But we've had strange, unexplained things happen. Such as small factors being found by P-1 which should have been found by the trial factoring which had supposedly already been done. (Maybe this was later explained by a bug found in the TF routines? Who remembers? Whatever, this sort of thing always throws up the possibility that someone has turned in results without carrying out the work.) And just a few months ago you may remember that someone here admitted to exploiting a bug which gave him huge amounts of credit on PrimeNet for turning in large factors apparently from P-1 but which were actually the product of two smaller factors found by TF. He said he did it to highlight the bug, but the credit he was getting was very nice too. So let's not rule out the possibility of fraud: it would only take one person to run riot for whatever reason to severely dent our confidence in the database of results.

Quote:
What's more sexy: "Hey baby, I just brought down AT&T's entire network" or "Hey baby, I just screwed up a bunch of 20-million-digit numbers belonging to some math geeks"?
Depends what turns you on.

I admit I'm being a touch paranoid and a spoil-sport. Pouring cold water on good ideas is a hobby of mine and it makes me really popular too. As aketilander says, the security problems can be managed. Maybe in the way Paul outlines.

Last fiddled with by Brian-E on 2011-09-20 at 22:01 Reason: Quote tags
Old 2011-09-21, 04:31   #41
Christenson
 
 
Dec 2010
Monticello

5·359 Posts
Default

I'd also suggest that a certain amount of diversity of workers should be required; that is, if I turn in the LL test, or part of the LL test, I should not be able to turn in the same part of the DC.

I don't have the same problem when finding factors, because these are so easily verified -- but the sort of thing Brian-E discusses should be checked into -- on a confidential basis -- because, if you have a systematic cheater, that is the only way to catch him or her. On the other hand, my factoring efforts had a dry spell lasting a month this summer... not clear if it was simply bad luck (e.g. TF'ing in well P-1'ed territory and P-1'ing in well TF'ed territory) or errors from the high temperatures, since when double-checks were run on the found factors, they were invariably confirmed.

Anyway, another factor found tonight, another DC that won't be needed.....but 43 that will still be needed...

Last fiddled with by Christenson on 2011-09-21 at 04:33
Old 2011-09-21, 05:53   #42
Dubslow
Basketry That Evening!
 
 
"Bunslow the Bold"
Jun 2011
40<A<43 -89<O<-88

7221₁₀ Posts
Default

Quote:
Originally Posted by axn View Post
Another data point: Today, M35 is at 274. That's a drop of 104 positions in 244 days. Linear projection gives me 2014 days to fall off top 5000 - that is 5.5 years! That means in practice, it'll be like 3-4 years!
http://primes.utm.edu/primes/lists/all.txt

It's now at 301, at 27pos/~50days ~ .5 pos/day --> ~25 years.
http://www.wolframalpha.com/input/?i...+days+to+years

Also, each of my P95 save files is about 6MB, which concurs with http://www.wolframalpha.com/input/?i...ion+bits+to+MB =~6.75MB for a 54M exponent. Or, at a bare minimum, a 332M exponent would have a ~45MB file size.

45MB*50,000 ~ 2.25 TB. That's 50,000 backup files of a 332M exponent. That costs a bit less than $100 at current HDD prices. We're all coming up with the idea that this is perfectly within our means. The only part I'd be worried about is bandwidth, because downloading a 45MB save file isn't trivial (yet).
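Those figures can be sanity-checked with a few lines of arithmetic (raw residue size only; a real save file carries some extra overhead, which is why 41.5 MB raw rounds up to the ~45 MB observed on disk):

```python
def residue_size_mb(exponent: int) -> float:
    # An LL residue for M(p) is a p-bit number: bits -> bytes -> MB
    return exponent / 8 / 1e6

mb_54m = residue_size_mb(54_000_000)    # 6.75 MB, matching the ~6MB files
mb_332m = residue_size_mb(332_000_000)  # 41.5 MB raw, ~45 MB with overhead

# 50,000 backup files of ~45 MB each:
total_tb = 45 * 50_000 / 1e6            # 2.25 TB
```

At 2011 HDD prices that is indeed under $100 of storage, so, as noted, the bandwidth to move 45 MB files around is the more serious constraint.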

EDIT: Whoops, seems I somehow missed there was a second page, along with its discussions. My points have all been brought up :P

Last fiddled with by Dubslow on 2011-09-21 at 06:02
Old 2011-09-22, 22:44   #43
Mr. P-1
 
 
Jun 2003

7·167 Posts
Default

Quote:
Originally Posted by Brian-E View Post
...we've had strange, unexplained things happen. Such as small factors being found by P-1 which should have been found by the trial factoring which had supposedly already been done. (Maybe this was later explained by a bug found in the TF routines? Who remembers?
I found one such, which did turn out to be the result of a bug. This does not prove that every such was the result of a bug.
Old 2011-09-23, 01:41   #44
davieddy
 
 
"Lucan"
Dec 2006
England

14512₈ Posts
Default

Quote:
Originally Posted by Mr. P-1 View Post
This does not prove that every such was the result of a bug.
I would have said a "bug" of some type would invariably describe such an occurrence.

David