I am about to copy some files from another Google Drive user. It will only be a data synchronization: I copy from him whatever is still missing from my own Drive collection.
The problem is: all of his uploads are encrypted with rclone, and he uses rclone mount. So far I have always copied files from one Drive user to another with a self-made Google Apps Script. This does not copy physically, but only virtually; however, the owner of the virtual copy becomes the user who ran the script, and the copy is not affected if the original is deleted or similar. This solution had the advantage that no traffic was generated and roughly 100 GB per minute were copied (the larger the individual files, the faster; the smaller, the slower), although each copied file still counted towards the 24-hour limit of 750 GB.
In the present case, I could simply download the entire encrypted data set, decrypt it and then upload it to my Google Drive. But here comes the catch: the total size of the collection is 240 TB.
But I only need a few thousand files (too many to select by hand), a few TB in total.
Does anyone have a suggested solution? If not: can I create a profile for the encrypted folder and a profile for my own Drive in rclone, and then rclone copy from profile A to profile B? Or would rclone not decrypt on the fly, but pass the encrypted files on 1:1?
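For what it's worth, a crypt remote layered over the shared folder should make rclone decrypt on the fly when it is used as the copy source. A minimal config sketch, assuming you have the uploader's crypt password(s) and that the remote names, folder name `encrypted`, and destination path are placeholders:

```
# ~/.config/rclone/rclone.conf  (all names here are examples)

[src-drive]
type = drive
# pointed at the shared, still-encrypted folder

[src-crypt]
type = crypt
remote = src-drive:encrypted
password = <obscured password obtained from the uploader>

[mydrive]
type = drive
```

With that in place, `rclone copy src-crypt: mydrive:collection` reads through the crypt layer, so the files arrive at the destination decrypted. Note this is not a server-side copy: the data is downloaded and re-uploaded through your machine.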
Thanks! But the problem is: these are around 7,500 files spread across 7,500 folders.
I have a table of all the files, with their folders, that I wish to copy.
Manual sorting would be too cumbersome.
Do you have a solution?
Like: rclone processes the external list of files
or:
Or: a way to move the wanted files at the source into *one* central folder (which is only possible via rclone mount). That would require a program that looks up these files (i.e. processes the list) and then moves them.
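Turning the table into something rclone can consume directly is probably simpler than moving files around. A sketch, assuming the table is exported as `files.csv` with hypothetical columns `folder,filename` (adjust to the real layout):

```shell
# Hypothetical sample of the exported table (replace with your real export)
printf 'folder,filename\nMovies/A,a.mkv\nSeries/B,b.mkv\n' > files.csv

# Build a files-from list for rclone: one path per line,
# relative to the source remote's root (skip the header row)
tail -n +2 files.csv | awk -F, '{print $1 "/" $2}' > files-from.txt
cat files-from.txt
```

Each output line is a path like `Movies/A/a.mkv`, which is the format rclone's `--files-from` filter expects.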
Is there a root folder (or a few root folders) above those 7,500 folders?
And you can read about filtering here: https://rclone.org/filtering/
It's best to test the filtering with rclone ls, not rclone copy.
Only some files. As I saw on the filtering page, it is possible to put all the files I want to include into a .txt file and reference that file in the rclone copy command.
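That is the `--files-from` flag. A sketch of the full command, assuming a crypt remote `src-crypt` over the shared data, a destination remote `mydrive`, and a list file `files-from.txt` with one path per line relative to the source root (all names are examples):

```
# Dry-run first to verify the list matches what you expect
rclone copy src-crypt: mydrive:collection --files-from files-from.txt --dry-run

# Then the real copy
rclone copy src-crypt: mydrive:collection --files-from files-from.txt --progress
```

Because only the listed files are touched, rclone never has to walk all 240 TB of the encrypted tree.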
This is a great solution.
Now I have plenty of accounts for the copy job, since Google allows only 750 GB of uploads per account per 24 hours. All 20 accounts have read access to the source and write access to the destination.
Do I need to create a remote for every account, or is there an easier way to switch accounts after hitting the daily upload limit?
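One remote per account (each stores its own OAuth token) is the straightforward route. An alternative sketch, assuming the 20 accounts are set up as Google Cloud service accounts with access to both Drives: the Drive backend's `--drive-service-account-file` flag swaps credentials per run without editing the config, and backend flags also apply to the drive remote wrapped inside a crypt remote. File names here are examples:

```
# Same remotes, different credentials each day (sa-01.json ... sa-20.json)
rclone copy src-crypt: mydrive:collection \
    --files-from files-from.txt \
    --drive-service-account-file sa-03.json
```

When one account hits the 750 GB limit, rerun with the next service-account file; rclone skips the files that were already transferred.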