Google Cloud Compute 31.4 Trillion Digits of Pi
Reposted a gazillion times already, but I had to drop it here.
Blogs:
[LIST]
[*][URL="http://www.numberworld.org/blogs/2019_3_14_pi_record/"]My Blog[/URL]
[*][URL="https://cloud.google.com/blog/products/compute/calculating-31-4-trillion-digits-of-archimedes-constant-on-google-cloud"]Google Cloud Blog[/URL]
[/LIST]
Stats:
[LIST]
[*]Decimal Digits: 31,415,926,535,897
[*]Hexadecimal Digits: 26,090,362,246,629
[*]Wall Time: 121 days (September 22, 2018 to January 21, 2019)
[*]Program: y-cruncher 0.7.6.9486 (17-SKX ~ Kotori AVX512-DQ)
[/LIST]
Hardware: Google Cloud Platform
[LIST]
[*]Primary Node: 1 x n1-megamem-96 (96 vCPU, 1.4TB) with 30TB of SSD
[*]Storage Nodes: 24 x n1-standard-16 (16 vCPU, 60GB) with 10TB of SSD each
[/LIST]
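For readers curious what a computation like this actually iterates, here is a minimal sketch of the Chudnovsky series that y-cruncher (like most modern Pi records) is built on. This toy version uses Python's `decimal` module with naive term-by-term summation; the real program uses binary splitting, FFT-based multiplication, and disk-backed arithmetic, none of which this attempts.

```python
# Toy Chudnovsky-series Pi computation. Illustrative only: real record
# attempts use binary splitting and out-of-core arithmetic; this naive
# loop is fine for a few thousand digits at most.
from decimal import Decimal, getcontext

def chudnovsky_pi(digits):
    """Return pi to `digits` decimal places via the Chudnovsky series."""
    getcontext().prec = digits + 10  # working precision with guard digits
    # pi = 426880*sqrt(10005) / sum_k a_k * (13591409 + 545140134*k)
    a_k = Decimal(1)      # k = 0 term of the alternating series
    a_sum = Decimal(1)    # running sum of a_k
    b_sum = Decimal(0)    # running sum of k * a_k
    # Each term contributes ~14.18 decimal digits.
    for k in range(1, digits // 14 + 2):
        a_k *= -(6 * k - 5) * (2 * k - 1) * (6 * k - 1)
        a_k /= k * k * k * 10939058860032000  # = 640320^3 / 24
        a_sum += a_k
        b_sum += k * a_k
    total = 13591409 * a_sum + 545140134 * b_sum
    pi = 426880 * Decimal(10005).sqrt() / total
    getcontext().prec = digits + 1
    return +pi  # unary plus rounds to the final precision
```

At 31.4 trillion digits this naive loop would never finish; the point is only to show the series behind the headline number.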
[QUOTE=Mysticial;510783][URL="https://cloud.google.com/blog/products/compute/calculating-31-4-trillion-digits-of-archimedes-constant-on-google-cloud"]Google Cloud Blog[/URL][/QUOTE]What would be the estimated cost of doing this "in the cloud" for someone as an ordinary user?
[QUOTE=retina;510801]What would be the estimated cost of doing this "in the cloud" for someone as an ordinary user?[/QUOTE]
Emma told me what that number was, but I'm not sure if I can disclose it. OTOH, some people on HN [URL="https://news.ycombinator.com/item?id=19387564"]calculated it to be $170k USD.[/URL] I'll just say that both figures are very high. And there's a lot of room to optimize it down should someone decide to replicate or beat it in the future via a cloud. It could be done in under $20k if you custom-built something specifically for such a computation.
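To make the $170k ballpark concrete, here is a hedged back-of-the-envelope version of that HN estimate. Every rate below is an assumed placeholder standing in for GCP on-demand list prices of the era, not actual billing data; the structure of the estimate (compute-hours plus SSD capacity over roughly four months) is the point, not the exact dollars.

```python
# Rough cloud-cost model for the 121-day run described above.
# Every rate here is an ASSUMED placeholder, not real GCP pricing;
# substitute current list prices before trusting the output.

hours = 121 * 24  # wall time: 121 days

# Assumed on-demand rates (USD) -- placeholders, not quotes:
primary_per_hour = 10.70   # 1 x n1-megamem-96 (assumption)
storage_per_hour = 0.76    # each of 24 x n1-standard-16 (assumption)
ssd_per_gb_month = 0.08    # local SSD per GB-month (assumption)

compute = hours * (primary_per_hour + 24 * storage_per_hour)
ssd_gb = 30_000 + 24 * 10_000                     # 30 TB + 24 x 10 TB
storage = ssd_gb * ssd_per_gb_month * (121 / 30)  # ~4 billing months

total = compute + storage
print(f"compute ~${compute:,.0f} + storage ~${storage:,.0f} = ~${total:,.0f}")
```

Under these assumed rates the total lands in the same six-figure ballpark as the HN thread; sustained-use discounts, egress, and snapshots would all move the number around.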
[QUOTE=Mysticial;510806]Emma told me what that number was, but I'm not sure if I can disclose it.
OTOH, some people on HN [URL="https://news.ycombinator.com/item?id=19387564"]calculated it to be $170k USD.[/URL] I'll just say that both figures are very high. And there's a lot of room to optimize it down should someone decide to replicate or beat it in the future via a cloud. It could be done in under $20k if you custom-built something specifically for such a computation.[/QUOTE]Yeah, that is pretty much the sort of difference I would have expected. Using the "cloud" is an expensive way to do things.
[QUOTE=retina;510811]Yeah, that is pretty much the sort of difference I would have expected. Using the "cloud" is an expensive way to do things.[/QUOTE]
The whole point of it was advertising. Google Cloud is in third place, and this is a cheap (for them) and viral way to remind the world that they exist. News media are already picking up the story ([URL="https://www.usatoday.com/story/tech/news/2019/03/14/pi-world-record-achieved-google-employee-emma-haruka-iwao/3160528002/"]USA Today[/URL], [URL="https://www.bbc.com/news/technology-47524760"]BBC[/URL], [URL="https://www.washingtonpost.com/business/2019/03/14/google-employee-breaks-guinness-world-record-calculating-trillion-digits-pi/"]Washington Post[/URL], etc). Not unlike what happens when we discover a new Mersenne prime and issue a press release. Except the public cares more about pi. Maybe it's just me, but it seems like kind of a ho-hum result. No new "thing" was discovered.
[QUOTE=GP2;510822]The whole point [/QUOTE]
+1. :goodposting:
Let me say, when the corporations of the world decide that the best way to get advertising is to lend a bit of their big iron toward pure mathematics to get some word-of-mouth for their latest thing, I'm all for it. They could have just as easily put the money into a teenage model, and where's the fun in that? :smile:
Edit: but next time, let's convince them to chase down zeta zeros, we could use an update.
[QUOTE=CRGreathouse;511259]Let me say, when the corporations of the world decide that the best way to get advertising is to lend a bit of their big iron toward pure mathematics to get some word-of-mouth for their latest thing, I'm all for it.[/QUOTE]
How much TF work would that have done, or how many LL tests on 100M-digit numbers?
[QUOTE=CRGreathouse;511259]Let me say, when the corporations of the world decide that the best way to get advertising is to lend a bit of their big iron toward pure mathematics to get some word-of-mouth for their latest thing, I'm all for it. They could have just as easily put the money into a teenage model, and where's the fun in that? :smile:
Edit: but next time, let's convince them to chase down zeta zeros, we could use an update.[/QUOTE]
This isn't the first time that Google has pulled a stunt like this. They did the same thing with [URL="https://shattered.io/"]SHAttered[/URL].

Obviously, they don't attempt something like this unless there's a reasonable probability of success to justify the resource investment. It's not just the hardware time; more importantly, it's the human cost of coordinating the whole thing. Google may have spent six figures of GCP time on this Pi computation, but I'm pretty sure they spent much more than that on employee time, considering the number of levels of management and VPs this had to go through on their side. So it's not as simple as throwing a dude's code on the cloud and letting it run for half a year. Emma spent a considerable amount of time working out the right configuration, and she had to deal with a lot of people in the company to make it happen. And since Google's reputation was at stake as well, I was told afterwards that they did independent cross-checking with other sources before they could even trust me and the program. All of this carries a high implicit cost.

-----

Pi and SHAttered were easy to see as high-probability successes. y-cruncher has done this 5 times already, so it's tried and proven. SHAttered would've been easy to assess once they had their algorithm and could draw a probability graph of hitting the collision given X amount of computing resources.

For something like finding the next Mersenne prime, it's a lot more uncertain. Setting aside the lower publicity a new Mersenne prime would generate compared to Pi or SHAttered, consider the practicality of it. We seem to be hitting a lot more Mersenne primes than we should, and there are notoriously large gaps at the smaller sizes. So it's really uncertain whether you could find a prime by throwing a ton of resources at it.

Likewise, Google probably doesn't want the public to know about failed projects. So if they threw a ton of resources into GIMPS, they'd either have to do it in secret (maintaining their own version of the database), or they'd collaborate with the GIMPS database and it would be apparent what's going on, including if they fail to find a prime.

One thing that would top both Pi and SHAttered is actually finding a non-trivial Zeta function zero off the critical line. But not a lot of people believe that's likely to happen.
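The "more Mersenne primes than we should" remark is usually made relative to the Lenstra-Pomerance-Wagstaff heuristic, which predicts roughly (e^gamma / ln 2) * ln x Mersenne-prime exponents up to x. A quick sketch of that comparison (the count of 51 known exponents below 10^8 is as of this thread, early 2019):

```python
# Lenstra-Pomerance-Wagstaff heuristic: the expected number of
# Mersenne-prime exponents p <= x is about (e^gamma / ln 2) * ln x.
import math

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def expected_mersenne_exponents(x):
    """Heuristic expected count of Mersenne-prime exponents up to x."""
    return math.exp(EULER_GAMMA) / math.log(2) * math.log(x)

expected = expected_mersenne_exponents(1e8)
print(f"heuristic expects ~{expected:.1f} exponents below 10^8")
# Compare: 51 Mersenne primes with exponent below 10^8 were known
# in early 2019, a bit above the heuristic's prediction.
```

The heuristic is only an average-density argument, so a local surplus of primes (or a long drought) says little about where the next one lies, which is exactly the planning problem described above.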
[QUOTE=Mysticial;511286]This isn't the first time that Google has pulled a stunt like this. They did the same thing with [URL="https://shattered.io/"]SHAttered[/URL].[/QUOTE]
I wouldn't class SHAttered as a stunt; that's a real security issue near and dear to their heart. Doing the public demo got their point across in a way they couldn't have done with pure marketing.
[QUOTE=Mysticial;511286] We seem to be hitting a lot more Mersenne primes than we should. And there are notoriously large gaps at the smaller sizes. So it's really uncertain if you would be able to find a prime by throwing a ton of resources at it. Likewise, Google probably doesn't want the public to know about the failed projects. So if they threw a ton of resources into GIMPS, they'd either have to do it in secret (maintaining their own version of the database), or they'd collaborate with the GIMPS database and it would be apparent what's going on including if they fail to find a prime.[/QUOTE]
Of course, Amazon did it, so it's not at all unreasonable to suggest Google might (though there are a lot of fish in the sea; I don't think we'll get that lucky).