mersenneforum.org > Great Internet Mersenne Prime Search > Hardware > Cloud Computing
2019-10-17, 19:34   #353   Corbeau
@kriesel Can I run your script while running the GPU72 trial factoring script on Colab? If so, do I need to manually add work in the worktodo file, or will it get assignments from the GPU72-reserved LL assignments?
2019-10-17, 19:44   #354   kriesel
Google Drive etc. mounting locally
One barrier, or rather slowdown, I've run into early on with Colab: handy as drag-and-drop of files between a local host I own and Google Drive is, compared to other methods, it still adds overhead, and the file duplication seems likely to cause errors. What I'd really like is:
  1. Ability to mount the relevant Google drives on a local system here
  2. Easy setup for that
  3. Multiple OS support for the host system OS
  4. Low host system overhead
  5. Multiple cloud storage type support, to unify usage of other cloud options such as Box, OneDrive, etc
  6. Adequate security that it does not compromise the local host system
  7. Acceptable convenience and performance
  8. Low cost (free is ideal)
  9. No requirement to encrypt the cloud storage, since encryption could prevent the Colab or other cloud application from usefully processing the data and returning results.
  10. (What did I miss?)
I think CloudMounter qualifies, after a quick initial skim. There are alternatives.
https://www.nextofwindows.com/how-to...isk-on-windows
Windows https://cloudmounter.net/mount-cloud-drive-win.html
Mac https://cloudmounter.net/
linux https://cloudmounter.net/mount-cloud-drive-linux.html

Some of the following may qualify, while some are quite a stretch, or are useful utilities for other purposes alongside mounting a cloud drive. https://www.topbestalternatives.com/cloudmounter/
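Whichever mounting tool wins, a quick sanity check before a worker touches the mounted drive can save a wasted session. A minimal Python sketch; the `/mnt/gdrive` path is only an assumed example, not a path any of these tools guarantees:

```python
import os

def drive_ready(mount_point):
    """True if something is actually mounted at mount_point and readable.

    Guards against a worker silently writing into an empty local
    directory because the cloud mount dropped or never came up.
    """
    return os.path.ismount(mount_point) and os.access(mount_point, os.R_OK)

# Example (hypothetical path): after mounting with CloudMounter or an
# alternative, check drive_ready("/mnt/gdrive") before launching work.
```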

Favorite solutions? Thoughts? Experience?

(Please avoid OS-bashing. Different people use different OSes for different reasons, that make sense to them.)
2019-10-17, 19:45   #355   PhilF
Quote:
Originally Posted by kriesel
And the rapid progress of this collaborative effort has been something to see.
That's probably what Google and Kaggle are thinking too...
2019-10-17, 19:52   #356   kriesel
Quote:
Originally Posted by Corbeau
@kriesel Can I run your script while running the GPU72 trial factoring script on Colab? If so, do I need to manually add work in the worktodo file, or will it get assignments from the GPU72-reserved LL assignments?
I don't use GPU72, so I have no idea about compatibility with it. I'm also unsure which script you're referring to; the mprime resume one at https://www.mersenneforum.org/showpo...0&postcount=3? If so, it uses PrimeNet, as set up by a variation of Dylan14's initial setup script, and gets and reports work on its own, given enough 12-hour sessions to finish an assignment. My first Colab 87M primality test is weeks away from completion at twice-daily restarts; it takes persistence. My understanding is that if you use CPU and GPU application script sections in the same work section on the Colab web page, one of them needs to run as a subprocess while the other runs as the script process. I'm about to try that as ath has described.
If you're using chalsall's reverse tunneling, you could probably instead use that to run mprime.
It appeals to me to have a single Colab script to relaunch, running one application as a subprocess and the other as the script process, to occupy the VM's CPU and GPU for the duration.
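The subprocess-plus-foreground pattern described above might look like this inside a Colab cell. A sketch only; the mprime and GPU command strings are placeholders, not the actual scripts from this thread:

```python
import shlex
import subprocess

def run_cpu_and_gpu(cpu_cmd, gpu_cmd):
    """Launch the CPU worker in the background, then run the GPU worker
    in the foreground until it exits (or the 12-hour session dies)."""
    cpu = subprocess.Popen(shlex.split(cpu_cmd))      # background, e.g. "./mprime -d"
    try:
        return subprocess.call(shlex.split(gpu_cmd))  # foreground, e.g. a CUDA app
    finally:
        cpu.terminate()                               # tidy up if we get this far
        cpu.wait()

# Harmless stand-ins to show the flow; real use passes the two workers:
run_cpu_and_gpu("sleep 30", "echo gpu side finished")
```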
For GPU applications, the same or similar client management applies in the Colab environment as when running on our own GPUs. A single pass through a Python script for result reporting and ensuring adequate work seems like a natural fit to me. There are other possibilities.
I regard the Colab thread scripts I've posted on my blog as a collaborative effort, not "mine", and try to give credit there for the originator.

Last fiddled with by kriesel on 2019-10-17 at 20:40
2019-10-17, 19:54   #357   kriesel
Quote:
Originally Posted by PhilF
That's probably what Google and Kaggle are thinking too...
Hence the disappearance of T4 availability? ("Give them only the older K80s, let's see what they come up with.")
2019-10-18, 04:29   #358   LaurV
Well, we should try to "come up with" running cudaLucas on it.
The K80 is a waste if used for TF. This card flies like a rocket at LL.
2019-10-18, 04:33   #359   chalsall
Quote:
Originally Posted by kriesel
And the rapid progress of this collaborative effort has been something to see.
The family's asleep. Back At Console for a little bit...

A book that I seriously internalized as a "youngin" was The Mythical Man-Month by Brooks.

Empirically, I have found that small ad hoc teams can often do better work, faster, than larger groups with hierarchical communication channels.

YMMV.

Last fiddled with by chalsall on 2019-10-18 at 04:52 Reason: s/I book/A book/; s/youngen/youngin/; # The risks of coding human when sleepy...
2019-10-18, 04:35   #360   chalsall
Quote:
Originally Posted by LaurV
Well, we should try to "come up with" running cudaLucas on it.
Don't disagree.

Why don't you code that up?
2019-10-18, 04:46   #361   axn
Quote:
Originally Posted by chalsall
Why don't you code that up?
BTDT. Got 3 DCs out of it. Major pain; Colab gets a conniption if you run it for long, and then you don't get a GPU instance for a while.
Pretty fast, though. I estimated that if you run it full time, you could get about 60 GhzDay/day, which is pretty much in line with https://www.mersenne.ca/cudalucas.php

EDIT: Even more impressive is Kaggle and its P100. That is not a 2x GPU like the K80, so in theory you should get the full 160 GhzDay/day. But for that, you need a file-hosting location from which to download the appropriate CUDALucas executable and libraries (cudart and cufft) -- Google Drive won't work.
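Back-of-the-envelope on the two throughput figures quoted in this post; the ratio is just arithmetic, both inputs are the estimates above:

```python
# ~60 GhzDay/day observed on Colab's K80 (one half of a 2x card),
# vs a theoretical 160 GhzDay/day on Kaggle's P100.
k80_effective = 60
p100_full = 160
speedup = p100_full / k80_effective
print(round(speedup, 2))  # prints 2.67
```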

Last fiddled with by axn on 2019-10-18 at 05:00
2019-10-18, 08:46   #362   bayanne
Quote:
Originally Posted by chalsall
Don't disagree.

Why don't you code that up?
Chapeau from me if you can :)
2019-10-18, 10:00   #363   LaurV
Well, I am digging, but Linux was never my strong point, and here the pain is storing and retrieving the checkpoint files. But I am learning...
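One way to attack the checkpoint problem: copy CUDALucas's state files to the mounted Google Drive after each session, and restore them at session start. A sketch under the assumption that the checkpoint files are named with a `c` or `t` prefix plus the exponent (check your CUDALucas version's actual naming); the two Drive paths in the comments are Colab's usual mount location, used as an example:

```python
import os
import shutil

def sync_checkpoints(src_dir, dst_dir, prefixes=("c", "t")):
    """Copy checkpoint-looking files (prefix + digits) from src_dir to dst_dir."""
    os.makedirs(dst_dir, exist_ok=True)
    copied = []
    for name in sorted(os.listdir(src_dir)):
        if name[:1] in prefixes and name[1:].isdigit():
            shutil.copy2(os.path.join(src_dir, name), os.path.join(dst_dir, name))
            copied.append(name)
    return copied

# Save after a run:   sync_checkpoints("/content/cudalucas", "/content/drive/MyDrive/cl")
# Restore at startup: sync_checkpoints("/content/drive/MyDrive/cl", "/content/cudalucas")
```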