Does anyone else have problems with rclone server-side copy?

I’m trying to copy a lot of medium-sized files spread across hundreds of folders, totalling around 2 TB. It has been running for weeks and still hasn’t completed.

Is this another case of undocumented Google Drive throttling?

In the logs I see errors similar to the ones I get when I upload more than 800 GB in a day with rclone move. But if that were the cause, then the next day, once the account is free of the temporary ban, server-side copy should work normally again, right? It doesn’t. I have seen days where rclone could server-side copy literally only 2-3 files, each around 200 MB.

This is the example command I use:

rclone copy -v --include-from cpjp.txt remote1:/J-Drama/ remote1:/TV/ --fast-list

Here J-Drama is a shared folder owned by another user, and TV is a folder owned by ‘remote1’.
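(For reference, the --include-from file is just a plain text list of rclone filter patterns, one per line. The folder names below are only illustrative, not the actual contents of my cpjp.txt; patterns like these would include everything under those folders.)

My Boss My hero/**
Nobuta wo Produce (2005) Complete (1440x1080 x264)/**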

I have heard from other users that it is very slow and there is definitely some quota involved. I thought it was 100GB/day but that figure may be out of date.

I did some tests. If the server-side copy is done within the same domain, with rclone having read and write permission on both the source and target directories, it copies a lot more files in a day; there definitely seems to be a quota based on both the number of files and the total file size.

In the case above, the copy is between 2 different domains, and rclone only has read access to the source directory. It takes a long time for rclone to copy even a few files, and it soon triggers the errors below.

2018-10-04 08:45:24 ERROR : My Boss My hero/My Boss My Hero ep06 (1280x720 DivX521).avi: Failed to copy: googleapi: Error 500: Internal Error, internalError
2018-10-04 08:47:28 ERROR : My Boss My hero/My Boss My Hero ep08 (1280x720 DivX521).avi: Failed to copy: googleapi: Error 500: Internal Error, internalError
2018-10-04 10:51:38 ERROR : My Boss My hero/My Boss My Hero ep09 (1280x720 DivX521).avi: Failed to copy: googleapi: Error 503: Backend Error, backendError
2018-10-04 10:53:25 ERROR : Nobuta wo Produce (2005) Complete (1440x1080 x264)/Nobuta wo Produce - 01 (1024x576 DivX511).avi: Failed to copy: googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded
2018-10-04 11:02:16 ERROR : My Boss My hero/My Boss My Hero ep10 finale (1280x720 DivX521).avi: Failed to copy: googleapi: Error 503: Backend Error, backendError
2018-10-04 11:05:24 ERROR : Nobuta wo Produce (2005) Complete (1440x1080 x264)/Nobuta wo Produce - 02 (1024x576 DivX511).avi: Failed to copy: googleapi: Error 500: Internal Error, internalError
Transferred: 0 / 57.486 GBytes, 0%, 0 Bytes/s, ETA -
Errors: 64
Checks: 12780 / 12780, 100%
Transferred: 0 / 56, 0%
Elapsed time: 36h38m49.1s
Transferring:

I wonder if Google throttles an account even harder when it keeps retrying server-side copies after getting the errors above, which I guess is rclone’s default behaviour…
It now seems to me that it’s faster to use the Google web UI or an Android app to do server-side copies, because those tools use the API without retrying over and over the way rclone does.
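If the retrying really is making things worse, it can at least be dialled down. Here is a rough variant of my command using the standard --retries, --low-level-retries and --tpslimit flags (the values are just a gentler guess, not something I have verified against the copy quota):

rclone copy -v --include-from cpjp.txt remote1:/J-Drama/ remote1:/TV/ --fast-list --retries 1 --low-level-retries 2 --tpslimit 2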

I’m copying an 8 TB shared folder and 750 GB gets copied to my account per day.

Example
Transferred: 0 / 5.5 TB, 0%, 0 Bytes/s, ETA -
Errors: 5
Checks: 12780 / 12780, 100%
Transferred: 35 / 556
Elapsed time: 20h38m49.1s
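At 750 GB a day, the whole 8 TB works out to roughly 8000 GB ÷ 750 GB/day ≈ 11 days of copying.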

Could you share the command you use? Are these copies between the same domain name?

If you could do that with server-side copy, then maybe it’s a CPU or RAM problem in my case?
But that can’t be it, because rclone uses almost no memory and the CPU is mostly idle.
Or is it the number of checks that rclone has to do…?

I’m using this command with rclone 1.43 on a Linux machine:

rclone -P copy goodrive:movies goodrive:mymovies

goodrive:movies is the shared folder. They aren’t in the same domain.
I don’t think the problem is RAM or CPU. I’m running this on my VPS with 256 MB of RAM.

Here is the screenshot. The files are being copied.

Thank you. Then I’m at my wits’ end, because I’m doing exactly the kind of thing you do, and I only transferred a few files before triggering the rate limit exceeded message and the other errors.
Although on the same machine, with a different Google Drive account, I am running heavy-duty tasks such as rclone move from my NAS to Google Drive, and at the same time Plex analysis, which downloads many terabytes of data every single day.

So now I’m guessing that Google may put some kind of limit on each IP address. I don’t know…

And that is the problem - we don’t know! I don’t think Google have stated anywhere the limits for various operations - they’ve all been worked out by observation of the system :frowning:

Maybe it could work using a proxy?
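rclone uses the standard Go proxy environment variables, so in principle it can be pointed at a proxy like this (the proxy address is made up, and I have no idea whether that actually gets around any per-IP limit):

export HTTPS_PROXY=http://user:pass@proxy.example.com:3128
rclone -P copy goodrive:movies goodrive:mymovies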