I have about half a million text files on Google Drive. Under Windows 10 x64 I mount it as a virtual disk with rclone, using the default settings from the documentation. My own API key (client ID) has been obtained and configured, and the account is on the Google One 2 TB plan. I work with the files through Windows Explorer, FAR Manager 3.0 (with "Use system copy routine" enabled), and Total Commander 10.
I'm wondering which of the following operations might cause Google to throttle my requests:
- calculating the size of folders containing tens or hundreds of thousands of files (for example, when the "Show total copy progress indicator" option is enabled in FAR);
- searching for duplicates with the AllDup application (matching by "File extension" and "File size" only);
- deleting tens of thousands of files in one pass with the AllDup application;
- searching for empty and nearly empty folders with the "Remove Empty Directories" application;
- mass deletion of tens of thousands of folders and sub-1 KB files with the "Remove Empty Directories" application;
- moving thousands of small files in one bulk operation within the same cloud storage.
rclone.exe mount Google_Drive: h:
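For comparison, a hedged sketch of the same mount with flags that rclone documents for reducing Drive API traffic. The flag names are real rclone options, but the values below are illustrative assumptions, not tested recommendations for this workload:

```shell
# Sketch only: values are guesses, tune for your own setup.
# --dir-cache-time keeps directory listings cached, so folder-size scans
#   re-use cached metadata instead of re-listing everything via the API.
# --drive-pacer-min-sleep / --drive-pacer-burst tune rclone's built-in
#   rate limiter for the Google Drive backend.
rclone mount Google_Drive: h: --vfs-cache-mode writes --dir-cache-time 72h --drive-pacer-min-sleep 100ms --drive-pacer-burst 100
```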
P.S. The issue is that I do not know:
1. what file metadata Google exposes to local file managers via the API / rclone without the file having to be downloaded, and
2. whether it is true that, when a file is moved within one rclone-mounted virtual disk, only its location record changes, rather than the file being downloaded, deleted, and re-uploaded to the new place.
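On the last two questions (what metadata is available without downloading, and whether in-remote moves happen server-side), one way to check from the command line. These are standard rclone commands, but the paths are placeholders and the exact behavior on a given remote should be verified against rclone's Drive backend documentation:

```shell
# lsjson returns file metadata (name, size, modification time, MIME type)
# straight from the Drive API, without downloading file contents.
rclone lsjson Google_Drive:some/folder

# moveto between two paths on the SAME remote is performed server-side
# on Drive: only the file's parent reference is updated, so nothing is
# downloaded or re-uploaded. ("some/folder" etc. are placeholder paths.)
rclone moveto Google_Drive:some/folder/file.txt Google_Drive:other/folder/file.txt
```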
I apologize for the list of questions, but in my case they all relate to the single problem described here, for which I have not yet found an exhaustive answer. They may also come down to the behavior of the specific applications, in which case the questions should go to their developers; unfortunately, those developers could just as reasonably send me back to this resource.
Would a discussion of the optimal rclone settings for these tasks be better placed in a separate topic?