Backing up a large amount of data

I need to back up more than 50 TB from one Gdrive to another. I can't seem to understand how rclone behaves with the 750 GB daily limit. Does it pause the copying and wait until the limit resets, then resume, or does it stop completely?
If it stops, how can I make it resume from the same position it was at before the ban?

If you start rclone again, it will work out where it got to and restart copying. It won’t copy over stuff you have already copied.
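
In practice that just means re-running the same command; rclone compares the source and destination and only transfers what isn't there yet. A minimal sketch, assuming hypothetical remotes named gdrive-src: and gdrive-dst: (substitute your own remote names and paths):

```
# First run – stops when the daily upload quota runs out
rclone copy gdrive-src:backup gdrive-dst:backup --progress

# Later run – same command; anything already at the destination is skipped
rclone copy gdrive-src:backup gdrive-dst:backup --progress
```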

Consider using the --bwlimit flag so you don't hit the 750 GB limit each day, too.
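
For example (again assuming the hypothetical gdrive-src:/gdrive-dst: remotes from above), a sustained rate of about 8 MiB/s works out to roughly 700 GB per day, just under the quota:

```
# ~8 MiB/s * 86400 s ≈ 700 GB/day, staying under the 750 GB limit
rclone copy gdrive-src:backup gdrive-dst:backup --bwlimit 8M --progress
```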

This is bad advice IMO. You absolutely want to hit the limit. Why not? It's actually hard to reach 750 GB per day, even with unlimited bandwidth, when transferring a large number of small files, because of the vague, unpublished limit on API requests per minute.

If you're transferring a mix of large and small files, you'll sometimes want to max out your upload speed and race towards that 750 GB limit, but at other times it will take you 24 hours to upload 200,000-300,000 small files and still stay well below the 750 GB limit.
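
If you do want to race towards the limit rather than throttle, one option worth looking at (an alternative approach, not something suggested above) is letting rclone cut itself off at the quota:

```
# Run at full speed, but stop once ~750 GB has been transferred;
# --drive-stop-on-upload-limit makes the daily upload-limit error fatal,
# so rclone exits cleanly and the next day's run resumes where it left off
rclone copy gdrive-src:backup gdrive-dst:backup \
    --max-transfer 750G \
    --drive-stop-on-upload-limit \
    --progress
```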

It is ~30 files per second at the moment. After hitting the 750 GB limit it drops to roughly 4.

Do you mean backup or copy?

How are you doing the copying? That is, where is rclone running, and what speed connection does the machine running rclone have?

If there are some very large files (> 1 TB) you might be better off sharing them in Google Drive rather than transferring them.
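
If you go the sharing route, note that rclone can also see items shared with the destination account; a rough sketch, assuming the destination remote is called gdrive-dst::

```
# Once the large files are shared with the destination account,
# they appear under "Shared with me" and can be listed like this
rclone lsf gdrive-dst: --drive-shared-with-me
```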