What is the problem you are having with rclone?
I have several questions about how rclone works under the surface; I'll present each below.
What is your rclone version (output from `rclone version`)
This:
rclone v1.57.0
- os/version: Microsoft Windows 10 Home 1909 (64 bit)
- os/kernel: 10.0.18363.1645 (x86_64)
- os/type: windows
- os/arch: amd64
- go/version: go1.17.2
- go/linking: dynamic
- go/tags: cmount
and this:
rclone v1.57.0
- os/kernel: 5.4.104+ (x86_64)
- os/type: linux
- os/arch: amd64
- go/version: go1.17.2
- go/linking: static
- go/tags: none
- os/version: ubuntu 18.04 (64 bit)
Which cloud storage system are you using? (eg Google Drive)
I'm using Google Drive and One Drive, sometimes transferring between Google Drive and Team Drive.
The command you were trying to run (eg `rclone copy /tmp remote:tmp`)
rclone config
The rclone config contents with secrets removed.
[odrive]
type = onedrive
token = {"xxx"}
drive_id = xxx
drive_type = business
[drive1]
type = drive
client_id = xxx.apps.googleusercontent.com
client_secret = xxx
scope = drive
service_account_file = D:\SAKey_drive1\first-SA-credentials.json  # path to SA credentials file
team_drive =
[drive2]  # this is a team drive
type = drive
client_id = xxx.apps.googleusercontent.com
client_secret = xxx
scope = drive
service_account_file = D:\SAKey_drive2\first-SA-credentials.json  # path to SA credentials file
root_folder_id = xxx  # I'll explain below why I have to set this
team_drive =
These are my questions:
- Rclone related:
I've been using rclone to transfer between clouds for three days, and I'm curious what it does internally. For instance, when transferring between clouds (`src` to `des`) on the same provider using `copy`, does rclone download the files from `src` to my server and then upload them to `des`, or does it copy from `src` to `des` on the cloud provider's own servers?
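From skimming the docs (so this is an assumption on my part, not something I've verified), it looks like same-provider copies can be forced server-side with a flag; this is the command shape I'd try:

```shell
# Assumption from the docs (untested by me): this flag asks Google Drive to
# copy server-side even when source and destination are different remotes.
# Watching -P should reveal whether bytes actually pass through my machine.
rclone copy drive1:Backup drive2:Backup \
    --drive-server-side-across-configs -P
```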
For transferring between clouds that are not the same provider, e.g. `Google Drive` to `OneDrive`, I think rclone downloads the file to my server and then uploads it to `OneDrive`; is that why we have `--drive-chunk-size`? I read somewhere that a `OneDrive Business` account has no daily upload limit and `Google Drive` has a 10 TB download limit. So when transferring from `Google Drive` to `OneDrive`, we have the 10 TB `Google Drive` quota; by limiting the transfer speed to `~8.43 MB/s`, can we assume this task will run into no limits?
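As a sanity check on my own numbers (using the 10 TB/day figure above, and assuming the quota resets daily), the steady rate that would exhaust it is:

```shell
# Back-of-envelope: a 10 TiB/day quota spread evenly over 86400 seconds,
# expressed in hundredths of MiB/s (pure shell arithmetic).
echo $(( 10 * 1024 * 1024 * 100 / 86400 ))   # prints 12136, i.e. ~121.36 MiB/s
```

If that's right, a `--bwlimit 8.43M` cap sits far below the ceiling, so the quota shouldn't be what stalls my transfers.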
I'm planning to back up my `Google Drive` account (20 TB) to `OneDrive`. How can I optimize this task? I tried some transfers yesterday and the speed was quite slow (with my own client ID and secret); sometimes it sat at `0B/s` for many hours. I intend to leave it running, including while I'm sleeping, but since these long `0B/s` stretches happen so often, will I have to sync manually?
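For reference, this is the shape of the command I'd leave running overnight, with the tuning flags I'm considering (flag names are from the docs; the values are my guesses, not recommendations):

```shell
# Sketch of the overnight backup run (values are guesses, not tuned):
#   --transfers / --checkers : parallelism for transfers and listings
#   --fast-list              : fewer listing API calls on Google Drive
#   --bwlimit 8.4M           : stay under the daily quota discussed above
rclone copy drive1: odrive:Backup \
    --transfers 4 --checkers 8 --fast-list --bwlimit 8.4M \
    -P --log-file backup.log -vv
```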
- Google Drive related:
I'm using rclone to perform this task: copy from my drive (`drive1`) to a Shared Drive owned by a different account (`drive2`). This is my setup:
drive1: my source files
drive2: the account that owns the Shared Drive, which I configured as a Shared Drive in rclone
So which of `drive1`'s quotas (upload/download) gets counted when I `copy` from `drive1` to `drive2`? Do I need to create my own `Service Account` on `drive1`?
After exceeding `drive2`'s quota limit, I tried to figure out how to get around it, and I chose `Service Accounts (SA)`. This is the real problem: the tutorial I saw uses just a single `credentials.json` file, but when creating `SA`s for my project, each key created is a separate `key.json` downloaded to my machine. How do I combine them into one file so as to use multiple `SA`s when each one exceeds its quota limit?
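My current reading of the docs (an assumption, not verified): rclone accepts exactly one key per run via `--drive-service-account-file`, so "combining" keys would mean looping over the key files rather than merging the JSON, something like:

```shell
#!/bin/sh
# Assumption, not verified: rclone takes one key per run via
# --drive-service-account-file, so multiple SAs means looping over the
# key files, not merging them. "sa-keys" is a hypothetical folder
# holding the downloaded key.json files.
for key in ./sa-keys/*.json; do
    # --drive-stop-on-upload-limit makes this run abort when the daily
    # upload quota trips, so the loop retries with the next SA's quota.
    rclone copy drive1: drive2: \
        --drive-service-account-file "$key" \
        --drive-stop-on-upload-limit
done
```

Is this roughly how people do it, or is there a built-in way?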
My `Google Drive (drive1)` account is `GSuite` type; if I want to enable `SA` on it, is there anything I need to do?
I've searched many documents about how to use `SA` accounts, and I can't find any standard guide on how to interact with `SA` accounts on `Google Drive`.
When using an `SA` account, I don't need to authenticate myself, but I have to specify the `root` folder ID of the `Team Drive`, otherwise it will not work. As for `GSuite`'s `SA`, I can't list the folders at all; this is its log:
2021/11/23 05:14:18 DEBUG : rclone: Version "v1.57.0" starting with parameters ["rclone" "ls" "drive1:" "-vv"]
2021/11/23 05:14:18 DEBUG : Creating backend with remote "drive1:"
2021/11/23 05:14:18 DEBUG : Using config file from "C:\\Users\\usr\\AppData\\Roaming\\rclone\\rclone.conf"
2021/11/23 05:14:19 DEBUG : 6 go routines active
And then it stops. When I specify `root_folder_id` as my `team_drive` ID, the `ls` command works. I'm now thoroughly confused about how `Service Accounts` work; I'd be grateful if anyone could explain how they work.
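In case it helps anyone answer, this is my current working theory (purely my own inference, happy to be corrected): an `SA` is its own Google identity with its own empty "My Drive", so `drive1:` under an SA lists that empty drive and returns nothing. Pointing the remote at something the SA can actually see, via `team_drive` or `root_folder_id`, would be what makes `ls` work, e.g.:

```
[drive2]
type = drive
scope = drive
service_account_file = D:\SAKey_drive2\first-SA-credentials.json
team_drive = xxx
```

Here `xxx` is the shared drive ID, and the SA's e-mail address would have to be added as a member of that shared drive.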
Summary:
- How does rclone work internally?
- What are `Service Accounts`? How do they work, and how do I use `Service Accounts` to get around quota limits? A fully up-to-date guide for this would be appreciated!
- Questions about transfer quotas between clouds.
Thanks for reading such a wall of text, and thanks for answering my questions (if you do). Hope you have a nice day!