What is the problem you are having with rclone?
I am aware of the 750GB-per-day upload limit when using rclone copy with Google Drive, and of the --drive-stop-on-upload-limit flag. What I am wondering is what level/account the 750GB applies to. I have a shared drive folder on my Google Drive that two colleagues and I use as a data archive, and a similar folder for another project with different collaborators. Does the 750GB limit apply to:

- each shared folder?
- each Google Drive user?
- the internet connection used?
- the computer the command is run from?
- something else?

I am not trying to cheat my way around the limit. Rather, I want to understand how to prioritize data archival across my various projects, so that I archive the right data within a given 24-hour period without interfering with my collaborators' ability to archive theirs.
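In the meantime, my working plan is to run the highest-priority copy first and cap it at the documented quota so it stops cleanly, leaving lower-priority copies for the next day. A sketch of the idea (the folder names below are placeholders for my own, and I am assuming the documented --max-transfer flag pairs sensibly with --drive-stop-on-upload-limit):

rclone copy --drive-stop-on-upload-limit --max-transfer 750G mygoogledrive:priority1data shared_googledrive:fulldatafolder

With --max-transfer set, rclone stops on its own once it has moved 750GB, even before Google returns a quota error.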
Thank you!
Run the command 'rclone version' and share the full output of the command.
rclone v1.53.3
- os/arch: linux/amd64
- go version: go1.15.5
Which cloud storage system are you using? (eg Google Drive)
Google Drive
The command you were trying to run (eg rclone copy /tmp remote:tmp)
rclone copy --drive-stop-on-upload-limit mygoogledrive:mydatafolder shared_googledrive:fulldatafolder
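To help decide which folder fits within a given day's quota, I have also been looking at rclone size, which reports the total bytes and object count under a path (same remote names as the command above):

rclone size mygoogledrive:mydatafolder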