How rclone works, and questions about cloud-to-cloud transfers

What is the problem you are having with rclone?

I have several questions about how rclone works and what happens below the surface, and I will present each below.

What is your rclone version (output from rclone version)

This:

rclone v1.57.0
- os/version: Microsoft Windows 10 Home 1909 (64 bit)
- os/kernel: 10.0.18363.1645 (x86_64)
- os/type: windows
- os/arch: amd64
- go/version: go1.17.2
- go/linking: dynamic
- go/tags: cmount

and this:

rclone v1.57.0
- os/version: ubuntu 18.04 (64 bit)
- os/kernel: 5.4.104+ (x86_64)
- os/type: linux
- os/arch: amd64
- go/version: go1.17.2
- go/linking: static
- go/tags: none

Which cloud storage system are you using? (eg Google Drive)

I'm using Google Drive and OneDrive, and I sometimes transfer between Google Drive and a Team Drive.

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone config

The rclone config contents with secrets removed.

[odrive]
type = onedrive
token = {"xxx"}
drive_id = xxx
drive_type = business

[drive1]
type = drive
client_id = xxx.apps.googleusercontent.com
client_secret = xxx
scope = drive
service_account_file = D:\SAKey_drive1\first-SA-credentials.json  # path to SA credentials file
team_drive = 

[drive2]  # this is a Team Drive
type = drive
client_id = xxx.apps.googleusercontent.com
client_secret = xxx
scope = drive
service_account_file = D:\SAKey_drive2\first-SA-credentials.json  # path to a credentials file
root_folder_id = xxx  # see below for why I have to set the root folder ID
team_drive = 

These are my questions:

  • Rclone related:

So I have been using rclone to transfer between clouds for 3 days, and I'm curious what it does internally. For instance, when copying between two remotes on the same provider (src to dst) with copy, does rclone download the files from src to my server and then upload them to dst, or does it copy from src to dst on the cloud provider's own servers?

For transfers between clouds on different providers, e.g. Google Drive to OneDrive, I think rclone downloads each file to my server and then uploads it to OneDrive, which is why we have --drive-chunk-size? I read somewhere that OneDrive Business accounts don't have daily upload limits and Google Drive has a 10 TB daily download limit. So when transferring from Google Drive to OneDrive, we have the 10 TB Google Drive quota to work with; by limiting the transfer speed to ~8.43 MB/s, can we assume this task will hit no limits?
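For reference, the sustainable rate for a daily byte quota is just bytes per day divided by seconds per day. A quick sketch (the 750 GB/day upload figure is Google's published limit; the 10 TB/day download figure is the unofficial one mentioned above, and treating GB as 10^9 bytes is an assumption):

```python
def max_rate_mb_per_s(daily_quota_gb: float) -> float:
    """Highest constant rate (in MB/s, MB = 10^6 bytes) that stays
    under a daily quota of `daily_quota_gb` gigabytes (GB = 10^9 bytes)."""
    return daily_quota_gb * 10**9 / 86_400 / 10**6

# Google's published 750 GB/day upload limit:
print(round(max_rate_mb_per_s(750), 2))      # prints 8.68
# The unofficial 10 TB/day download figure:
print(round(max_rate_mb_per_s(10_000), 2))   # prints 115.74
```

So capping the transfer at roughly 8.5 MB/s (e.g. with rclone's --bwlimit flag) keeps a sustained run under the 750 GB/day upload limit; the ~8.43 MB/s figure above is in the same ballpark.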

I'm planning to back up my Google Drive account (20 TB) to OneDrive. How can I optimize this task? I tried some transfers yesterday and the speed was quite slow (using my own client ID and secret), sometimes 0 B/s for many hours.
I intend to leave it running unattended, for example while I'm sleeping, but since these long 0 B/s stalls happen often, will I have to sync manually?

  • Google Drive related:

I'm using rclone to perform this task: copy from my drive (drive1) to a Shared Drive owned by a different account (drive2). This is my setup:

drive1: my source files
drive2: the account that owns the Shared Drive, which I configured
as a Shared Drive in rclone

So which of drive1's quotas (upload/download) will be counted when I copy from drive1 to drive2? Do I need to create my own Service Account for drive1?

After exceeding drive2's quota limit, I tried to figure out how to work around it and chose Service Accounts (SAs). This is the real problem:
the tutorial I saw uses a single credentials.json file, but when I create SAs for my project, each key is downloaded to my machine as a separate key.json. How do I combine them into one file so that I can switch to another SA when one exceeds its quota limit?
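As far as I understand, the key files don't need to be merged: the drive backend reads whichever file service_account_file points at, so one common pattern is one remote per key. A sketch, with hypothetical remote names, paths, and IDs:

```ini
# rclone.conf sketch: one remote per service account key,
# all pointing at the same Shared Drive (names/paths are examples)
[drive2-sa1]
type = drive
scope = drive
service_account_file = D:\SAKeys\sa1.json
team_drive = xxx

[drive2-sa2]
type = drive
scope = drive
service_account_file = D:\SAKeys\sa2.json
team_drive = xxx
```

Alternatively, keep a single remote and swap the key at run time with the --drive-service-account-file flag, switching to the next key when one hits its quota.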

My Google Drive (drive1) account is a GSuite account; if I want to enable SAs, do I need to do this?

I've searched many documents about how to use SAs, and I can't find any standard guide on how to work with them on Google Drive.

When using an SA, I don't need to authenticate myself, but I have to specify the root folder ID on the Team Drive or it will not work. With the GSuite SA, I can't list the folder at all; this is its log:

2021/11/23 05:14:18 DEBUG : rclone: Version "v1.57.0" starting with parameters ["rclone" "ls" "drive1:" "-vv"]
2021/11/23 05:14:18 DEBUG : Creating backend with remote "drive1:"
2021/11/23 05:14:18 DEBUG : Using config file from "C:\\Users\\usr\\AppData\\Roaming\\rclone\\rclone.conf"
2021/11/23 05:14:19 DEBUG : 6 go routines active

And then it stops. When I specify root_folder_id as my team_drive_id, the ls command works. I'm now thoroughly confused about how Service Accounts work; I would be grateful if anyone could explain them and how they work.
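For what it's worth, that behaviour seems consistent with the config above: a service account is its own identity with no Drive content, so a drive remote with an empty team_drive and no root_folder_id has nothing to list. A sketch of the two usual ways to point an SA remote at content it can actually see (the IDs below are made-up placeholders):

```ini
[drive2]
type = drive
scope = drive
service_account_file = D:\SAKey_drive2\first-SA-credentials.json
# Either set the Shared Drive ID directly (the SA must be added
# as a member of the Shared Drive)...
team_drive = 0ABCdefGhIjKlUk9PVA
# ...or leave team_drive empty and point root_folder_id at a
# folder that has been shared with the SA's email address:
# root_folder_id = 1AbCdEfGhIjKlMnOpQrStUv
```

This would also explain why setting root_folder_id to the Team Drive ID made ls work: it gave the SA an entry point it had access to.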

Summary:

  • How does rclone work?

  • What are Service Accounts, and how do they work? How can Service Accounts be used to work around quota limits? A recently updated document that covers this would be appreciated!

  • Questions about transfer quotas between clouds.

Thanks for reading such a wall of text :smile:, and thanks for answering my questions (if you do). Hope you have a nice day!

If you copy between different providers, it generally goes through wherever you are running rclone.
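For two Google Drive remotes specifically, rclone can ask Google to copy server-side if you opt in, so the data never touches your machine. A sketch using the drive backend's server_side_across_configs option (also available as the --drive-server-side-across-configs flag), assuming both remotes can see the files:

```ini
[drive1]
type = drive
scope = drive
# Ask Google to copy directly between this remote and another
# drive remote instead of routing the data through this machine:
server_side_across_configs = true
```

This only helps drive-to-drive copies; Google Drive to OneDrive always goes through the machine running rclone.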

Not sure what you are asking here. The only published Google limit is 750GB upload per day. Everything else is conjecture/assumptions since it's not published.

Dunno. Post a log/command/more details as without any info, it's really a guessing game.

You can upload 750GB per day with Google.

We don't condone or advise working around any quota limits, as those are set by the provider.

You don't need a service account for this use. You can use your own account.

Without a full set of details, it's tough.

Best to pick one question and include all the information for one question and move on. Trying to put many things in a single post is confusing.

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.