I am about to copy some files from another Google Drive user. It will only be a data synchronization: I copy from him whatever is still missing from my own Drive collection.
The problem is: all of his uploads are encrypted with rclone, and he uses rclone mount. So far I have always copied files from one Drive user to another with the help of a self-made Google Apps Script. This does not copy physically, only virtually; however, the owner of the virtual copy actually becomes the user who executed the script, and the copy is not affected if the original is deleted or similar. This solution had the advantage that no traffic was generated and roughly 100 GB/minute was copied (the larger the individual files, the faster; the smaller, the slower), although each copied file still counted towards the 24-hour limit of 750 GB.
In the present case, I could simply download all the encrypted data, decrypt it, and then upload it to my Google Drive. But here is the catch: the total size of the collection is 240 TB.
But I only need a few thousand files (too many to select by hand), a few TB in total.
Does anyone have a suggested solution? If not, could I create a profile for the encrypted folder and a profile for my own Drive in rclone and then rclone copy from profile A to B? Or would rclone not decrypt on the fly, but pass the encrypted files on 1:1?
if both remotes use the same backend, then you can use https://rclone.org/crypt/#crypt-server-side-across-configs.
rclone will not need to download/decrypt and upload/encrypt.
if each remote points to a different backend, you can use https://rclone.org/crypt/#backing-up-a-crypted-remote
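for reference, a rough sketch of what that first setup could look like in rclone.conf. this is only an illustration with placeholder remote names, paths and passwords; it assumes both sides are crypt remotes over Google Drive and, as the linked page requires, that both crypt remotes use identical passwords:

```ini
# sketch only -- all names, paths and passwords are placeholders
[gdrive-src]
type = drive
server_side_across_configs = true

[gdrive-dst]
type = drive
server_side_across_configs = true

# both crypt remotes must share the same password/password2
[crypt-src]
type = crypt
remote = gdrive-src:encrypted
password = <obscured password>

[crypt-dst]
type = crypt
remote = gdrive-dst:encrypted
password = <obscured password>
```

with that, something like rclone copy crypt-src:folder crypt-dst:folder can move the still-encrypted blocks server side, without downloading, decrypting and re-uploading.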
I just don't understand what you mean by "same backend"?
I'm using Google Drive unencrypted; the source is using it encrypted.
create two remotes.
one for the crypted remote, called source:
one for your non-crypted remote, called dest:
rclone copy source:folder dest:folder to copy just the folders and/or files you want.
rclone will decrypt on the fly
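as a sketch, the two remotes could look like this in rclone.conf (every name, path and password below is a placeholder; the crypt password/password2 must come from the owner of the source data):

```ini
# sketch only -- placeholder names and values
[source-drive]
type = drive
# ...auth for the account that has read access to the shared data...

[source]
type = crypt
remote = source-drive:encrypted
password = <obscured crypt password>
password2 = <obscured salt, if one was set>

[dest]
type = drive
# ...auth for your own account...
```

then something like rclone copy source:folder dest:folder --dry-run shows what would be transferred; rclone reads through the crypt layer, so the files land decrypted in dest:.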
Thanks! But the problem is: these are around 7,500 files in 7,500 folders.
I have a table of all the files (with their folders) that I wish to copy.
Manual sorting would be too cumbersome.
Do you have a solution?
Something like: rclone processes an external list of files.
Or a way to move the wanted files in the source into *one* central folder (though that is only possible during rclone mount). You would then need a program that looks up these files (i.e. processes the list) and moves them.
is there a root folder, or a few root folders, for those 7,500 folders?
and you can read about filtering at https://rclone.org/filtering/
best to test filtering with rclone ls, not rclone copy.
Thanks man! You show how powerful rclone is!
All files are in just one root folder.
yes, rclone is very powerful.
do you want all the files in the root folder and its subfolders, or just some files in just some subfolders?
Only some files. As I saw on the filtering page, it is possible to put all the files I want to include into a txt file and reference that file in the rclone copy command.
This is a great solution.
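Since the wanted files are already in a table, that table can be turned into a --files-from list with a one-liner. A sketch, assuming a hypothetical CSV named table.csv with folder,filename columns (adapt the awk line to however your table is actually laid out):

```shell
# Hypothetical sample table -- replace with your real export.
cat > table.csv <<'EOF'
folder0001/movie-a.mkv
folder0002/movie-b.mkv
EOF

# --files-from wants one path per line, relative to the source root.
# With a two-column "folder,filename" CSV, join the columns with "/":
cat > table.csv <<'EOF'
folder0001,movie-a.mkv
folder0002,movie-b.mkv
EOF
awk -F',' '{ print $1 "/" $2 }' table.csv > files.txt
cat files.txt

# Then test the filter with ls before actually copying (not run here):
#   rclone ls source: --files-from files.txt
#   rclone copy source: dest: --files-from files.txt --dry-run
```

The --dry-run pass is the cheap way to confirm the list matches exactly the 7,500 files before spending any quota.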
Now I have plenty of accounts to do the copy job, since Google allows only 750 GB per 24 hours. All 20 accounts have read access to the source and write access to the destination.
Do I need to create a remote for every account, or can I switch accounts more easily after hitting the daily upload limit?
yes, filtering is a great solution
sorry, i cannot offer support about working around google drive limits.
This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.