Rclone copy - API hits?

When checking for existing files in Gdrive, will rclone cause an API hit for every file it checks (likely causing an API ban)?

Say I have 20 TB of files (1,000+ files in 1,000+ directories) stored both locally and in Gdrive.
Now I want to make sure that anything new that is added locally is also stored in Gdrive, so I set up a script to copy the contents of the local root directory to Gdrive once a week (skipping files that already exist).
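
The script I have in mind is roughly the sketch below; the remote name gdrive: and the local path are just placeholders. As I understand it, rclone copy already skips files that are unchanged on the destination, --fast-list does the directory listing in fewer API calls on backends that support it, and --dry-run shows what would be transferred without uploading anything.

```bash
# Hypothetical weekly job; "gdrive:" and /mnt/media are placeholder names.
# rclone copy skips files that already exist unchanged on the remote.
# --fast-list: recursive listing in fewer API calls (uses more memory)
# --dry-run:   report what would be copied without transferring anything
rclone copy /mnt/media gdrive:media --fast-list --dry-run -v
```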

I am pretty sure you will hit some ‘access’ shadow limit, but there is not much you can do to prevent that. Just hit enter, let the script run, and drink some tea.
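
If you want to play it a bit safer, you can throttle rclone explicitly. Something along these lines; the flags are real rclone options, but the remote name, path, and values are made up, so tune them to your setup:

```bash
# --tpslimit caps API transactions per second,
# --bwlimit caps upload bandwidth.
rclone copy /mnt/media gdrive:media --fast-list --tpslimit 10 --bwlimit 8M -v
```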

@Enf0, I have some jobs running on a daily basis. One of these jobs, for example, runs a sync command for a folder with 18,652 files in 2,089 folders, and it runs smoothly. However, it uses Dropbox.
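
For what it's worth, that job is essentially just a scheduled rclone sync, something like the sketch below. The remote name dropbox:, the paths, and the schedule here are made up for illustration:

```bash
# Crontab entry that runs the sync every night at 03:00 and keeps a log.
0 3 * * * /usr/bin/rclone sync /data/projects dropbox:projects --log-file /var/log/rclone-sync.log -v
```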
