2 useful Python programs (prerequisites: sudo apt install -y python3 python3-pip):
- gdown: download a large public file from Google Drive (curl/wget fail due to the security notice).
  pip install gdown
- gshell: navigate Google Drive as you do in a shell (gshell = Google Drive + Shell).
  pip install gshell (maybe discontinued)
To upload to GDrive, one can use https://labbots.github.io/google-drive-upload/.
Examples:
To download 1 large public file from Google Drive (5 GB):
# syntax: gdown https://drive.google.com/uc?id=FILE-ID
gdown https://drive.google.com/uc?id=1LC5iVcvgksQhNVJ-CbMigqXnPAaquiA2
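gdown can also be driven from Python, which is handy inside a larger script. A minimal sketch, assuming the same file ID as above; the output filename is an arbitrary choice:

import gdown

# Same public file ID as in the command-line example above.
url = "https://drive.google.com/uc?id=1LC5iVcvgksQhNVJ-CbMigqXnPAaquiA2"

# Output filename is arbitrary for this sketch.
gdown.download(url, "big_file.bin", quiet=False)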
To download 1 folder:
gshell init              # log in to your Google account
gshell ll                # list files
gshell download spam.txt # download a file -> you could script this to download all files listed by gshell ll
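Instead of scripting the gshell ll output by hand, recent gdown releases can fetch a whole public folder from Python. A rough sketch, assuming a gdown version that provides gdown.download_folder and a placeholder folder URL (replace FOLDER-ID with a real one):

import gdown

# Placeholder sharing URL; FOLDER-ID is not a real folder.
folder_url = "https://drive.google.com/drive/folders/FOLDER-ID"

# Downloads every file in the public folder into ./my_folder (name is arbitrary).
gdown.download_folder(folder_url, output="my_folder", quiet=False)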
I use gdown and it works well. I haven't tried gshell.
Note that if you use gdown more than ~20 times within 24 hours (the limit might depend on the file size), you get this error:
Access denied with the following error:
Too many users have viewed or downloaded this file recently. Please try accessing the file again later. If the file you are trying to access is particularly large or is shared with many people, it may take up to 24 hours to be able to view or download the file. If you still can't access a file after 24 hours, contact your domain administrator.
You may still be able to access the file from the browser:
https://drive.google.com/uc?id=1LC5iVcvgksQhNVJ-CbMigqXnPAaquiA2
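If a script hits this quota error, one crude workaround is to retry with a long back-off. A sketch under the assumption that a failed download either raises an exception or returns None, depending on the gdown version; the 24 attempts and the one-hour wait are arbitrary choices:

import time
import gdown

url = "https://drive.google.com/uc?id=1LC5iVcvgksQhNVJ-CbMigqXnPAaquiA2"

for attempt in range(24):                 # give up after roughly a day
    try:
        out = gdown.download(url, "big_file.bin", quiet=False)
    except Exception as err:              # some gdown versions raise on quota errors
        out = None
        print(f"attempt {attempt + 1} failed: {err}")
    if out is not None:                   # other versions return None on failure
        break
    time.sleep(3600)                      # wait an hour before retrying (arbitrary)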
ocamlfuse can handle this quota problem. I thought of downloading the files every time I need them, though that is time-consuming compared to mounting, but I have no other option. Alternatively: wget https://googledrive.com/host/file_id, which for you is wget https://googledrive.com/host/0B-Zc9K0k9q-WdEY5a1BCUDBaejQ. The solution is explained here in more detail.