#419
Feb 2005
Colorado
5·131 Posts
EDIT TO ADD: I just discovered the output files available in the output tab as you described. However, all the screen output was saved also.
Last fiddled with by PhilF on 2019-10-22 at 02:54
#420
P90 years forever!
Aug 2002
Yeehaw, FL
7,537 Posts
#421
Jun 2003
2·3·7·11² Posts
I haven't tried it. My use case finishes in 9 hrs. If for some reason it doesn't, I download the checkpoints and finish them off locally.
There is no obvious way to resume a committed session. In the output tab there is a "Download all" button. Perhaps reverse-engineering its URL and doing a wget of it within the notebook might be a (painful) way to do it?
#422
Jun 2003
2×3×7×11² Posts
Quote:
I close my Kaggle session after all 10 batches are committed. When I come back to them, I don't see any obvious way to see the screen output.
#423
Feb 2005
Colorado
28F₁₆ Posts
It is in the "Notebook" tab, just above the "Data" tab, which is just above the "Output" tab. It shows the original code and, below that, all the screen output.
#424
Einyen
Dec 2003
Denmark
2·1,579 Posts
On the page of the finished committed kernel, in the top right corner menu, there is an option to clone or duplicate to a new kernel. I have not yet tried whether it actually copies the files as well.
I have been using the Kaggle API to try to automate it:
https://www.kaggle.com/docs/api#crea...g-a-new-kernel
https://github.com/Kaggle/kaggle-api

Assuming you have python3, install the API with:
sudo apt-get install python3-pip
pip3 install --user kaggle

It will install to ~/.local/bin/kaggle. You need to download "kaggle.json" from your account page (https://www.kaggle.com/<username>/account) and place it in ~/.kaggle/

Then I have been able to download the output files from a finished kernel with:
~/.local/bin/kaggle kernels output <username>/<kernelname> -p tmpdir/

and then use "wput" to upload them to an FTP site (sudo apt-get install wput). Inside the kernel notebooks I use wget to download the files from the FTP site.

I tried to start a new kernel with the API but was not successful; I have not spent much time on it yet. Once that works, I think you can name kernels, which makes everything much easier to automate.
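The steps above could be tied together in one small script. This is a hedged sketch, not a tested workflow: `<username>`, `<kernelname>`, and the FTP URL are placeholders, and `DRY_RUN=1` makes it print the commands instead of running them, so nothing touches the network as written.

```shell
#!/bin/sh
# Sketch: fetch a finished kernel's output files with the Kaggle CLI,
# then push them to an FTP site with wput.
# <username>, <kernelname>, and FTP are placeholders (assumptions).
KAGGLE="$HOME/.local/bin/kaggle"
USER="<username>"
KERNEL="<kernelname>"
FTP="ftp://user:password@ftp.example.com/incoming/"
DRY_RUN=1   # set to 0 once the placeholders are filled in

# Either print the command (dry run) or actually execute it.
run() {
    if [ "$DRY_RUN" = 1 ]; then
        echo "would run: $*"
    else
        "$@"
    fi
}

run "$KAGGLE" kernels output "$USER/$KERNEL" -p tmpdir/
run wput tmpdir/* "$FTP"
```

With `DRY_RUN=0` this would run the same two commands quoted in the post above.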
#425
Jun 2019
Boston, MA
39₁₀ Posts
Today I'm attempting to commit on Kaggle and grab the checkpoint/results files via the Output tab, as others have suggested.
One last question: I assume GPU quota usage for committed sessions stops once the 6 hour limit is reached, and there's no need to go in and turn it off manually?
#426
Einyen
Dec 2003
Denmark
C56₁₆ Posts
It does not use GPU quota after it stops by itself.
But remember: right after you commit the script, the "interactive" GPU session might still use quota if you do not turn it off. So close the notice saying it was committed and then choose "Run" - "Power Off". Then you can check that the committed version is running at: https://www.kaggle.com/<username>/kernels
#427
"Eric"
Jan 2018
USA
2²·5³ Posts
On the top right corner of the output tab there is a button labeled "New Dataset Version", which pops open a window prompting you to name a new version of the dataset you used in the previous run. Then, before the next commit, delete the dataset from the input section by clicking the x next to the folder, re-import the same dataset you just updated with the new version, and you are good to go.
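The same version bump can probably be scripted with the Kaggle CLI's `datasets version` command (part of the kaggle-api tool linked earlier in the thread) instead of the web UI. A hedged sketch: the folder name and message below are made-up placeholders, and the command is only echoed, not run, so no credentials or network access are needed here.

```shell
# Hypothetical sketch: bump a dataset to a new version from the
# command line instead of the "New Dataset Version" button.
# DATASET_DIR and MSG are placeholders (assumptions).
DATASET_DIR="tmpdir"              # folder holding the new output files
MSG="checkpoint after 6h commit"  # version notes shown on Kaggle
echo "would run: kaggle datasets version -p $DATASET_DIR -m \"$MSG\""
```

The folder would also need the dataset's metadata file (`dataset-metadata.json`, which `kaggle datasets init` can create) for the real command to succeed.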
#428
If I May
"Chris Halsall"
Sep 2002
Barbados
9,767 Posts
#429
"Eric"
Jan 2018
USA
2²·5³ Posts
What I am seeing recently is that updating the dataset with new versions from the output takes a long time, so make sure it has finished before re-importing the dataset. The re-importing process wastes about 2-5 minutes of GPU quota, but I think that's insignificant enough considering you are only doing it 4 times a week (yes, I know, quite a lot tbh).
After an exponent in gpuowl is finished, I delete everything but the executable in the dataset version. Then, in the console, I create a worktodo.txt file, echo the line of work into it, and work from there.
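A minimal sketch of that console step, assuming gpuowl reads assignments from a worktodo.txt in its working directory; the PRP line is a placeholder for your actual assignment line from PrimeNet, not a real assignment.

```shell
# Hypothetical sketch: recreate worktodo.txt next to the gpuowl
# executable and echo the line of work into it.
# The assignment line is a placeholder (assumption).
echo "PRP=<assignment line from PrimeNet>" > worktodo.txt
cat worktodo.txt
```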
Similar Threads
| Thread | Thread Starter | Forum | Replies | Last Post |
| Alternatives to Google Colab | kriesel | Cloud Computing | 11 | 2020-01-14 18:45 |
| Notebook | enzocreti | enzocreti | 0 | 2019-02-15 08:20 |
| Computer Diet causes Machine Check Exception -- need heuristics help | Christenson | Hardware | 32 | 2011-12-25 08:17 |
| Computer diet - Need help | garo | Hardware | 41 | 2011-10-06 04:06 |
| Workunit diet ? | dsouza123 | NFSNET Discussion | 5 | 2004-02-27 00:42 |