Too slow to start copying

Hello,

I am trying to use rclone to copy my files to Google Drive, but transfers are very slow when trying to copy my files:

In this picture you can see that during the first 20 seconds it hasn’t started copying anything yet.

Just wondering why it is so slow to start the copying process.

Kind regards

Since there aren’t any logs to look at:

  • Not using your own API key
  • Too many transfers
  • Network bottleneck
  • CPU bottleneck
  • Disk IO bottleneck

Those would be my guesses; some quick checks for the machine-side ones are sketched below.
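(A rough way to rule out the machine-side causes, assuming a typical Linux box; iostat needs the sysstat package, and the paths below are placeholders:)

# watch CPU usage while the copy runs
top
# watch disk throughput/utilisation of the source disk
iostat -x 2
# run the copy with progress and a debug log so the slow start is visible
rclone copy -P --log-level DEBUG --log-file /tmp/rclone.log /path/to/source remote:dest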

Hello,

Let me check if I can find any logs. At the moment it can’t be too many transfers, because it was also happening with just 2 transfers; I only increased the number to see what would happen.

Kind regards

I think it is a Google Drive problem; I am getting these errors in the log when using --log-level DEBUG:

2019/01/20 22:56:07 DEBUG : pacer: Rate limited, sleeping for 1.284606096s (1 consecutive low level retries)
2019/01/20 22:56:07 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expec$
2019/01/20 22:56:07 DEBUG : pacer: Rate limited, sleeping for 2.725776682s (2 consecutive low level retries)
2019/01/20 22:56:07 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expec$
2019/01/20 22:56:07 DEBUG : pacer: Resetting sleep to minimum 10ms on success
2019/01/20 22:56:07 DEBUG : pacer: Rate limited, sleeping for 1.5396131s (1 consecutive low level retries)
2019/01/20 22:56:07 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expec$
2019/01/20 22:56:07 DEBUG : pacer: Resetting sleep to minimum 10ms on success
2019/01/20 22:56:07 DEBUG : pacer: Rate limited, sleeping for 1.054260037s (1 consecutive low level retries)
2019/01/20 22:56:07 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expec$
2019/01/20 22:56:07 DEBUG : pacer: Resetting sleep to minimum 10ms on success
2019/01/20 22:56:10 INFO : paula/paula/1376003009889.jpg: Copied (new)
2019/01/20 22:56:10 DEBUG : pacer: Rate limited, sleeping for 1.913584237s (1 consecutive low level retries)
2019/01/20 22:56:10 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded)
2019/01/20 22:56:10 DEBUG : pacer: Rate limited, sleeping for 2.853655692s (2 consecutive low level retries)
2019/01/20 22:56:10 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded)

Any idea why?

Kind regards

You should use your own API key.

What’s the actual command you are running? You only included the logs this time.
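(As a stopgap while a personal client ID is being set up, the 403 rate-limit retries above can often be softened by lowering the request rate; a minimal sketch, assuming an rclone version with the global --tpslimit flag and using placeholder paths:)

# limit rclone to roughly 8 API transactions per second and fewer parallel transfers
rclone copy -P --transfers 4 --tpslimit 8 --log-level DEBUG --log-file /tmp/rclone.log /path/to/source drive:dest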

Hello,

What do you mean by using my own API key? rclone picked one automatically when the browser opened for logging into my Drive.

The commands I tried this time were:

rclone -P --transfers 20 copy --log-file /tmp/prueba.txt --log-level DEBUG /mnt/archivos/paula/ drive:paula
rclone -P --transfers 1 --checkers 1 copy --log-file /tmp/prueba.txt --log-level DEBUG /mnt/archivos/paula/ drive:paula

Kind regards

It’s listed here:

https://rclone.org/drive/#making-your-own-client-id

Hello,

I have already created my client ID and client secret for use in rclone, and I have added them, but it is still showing the same errors as before.

Kind regards

You have to run rclone config and redo it.

You can’t simply enter it into the config, or it keeps using the old key.
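(A rough sketch of redoing it interactively; the exact menu wording varies by rclone version:)

rclone config
# choose: e) Edit existing remote -> drive
# paste your own client_id and client_secret when prompted
# go through the auth step again so a new token is issued against your own client ID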

Are you sure?
According to the link you pasted, I could just edit the existing one to make it work:
It will show you a client ID and client secret. Use these values in rclone config to add a new remote or edit an existing remote.

How can I be sure if it is using mine or not?

As far as I know rclone loads the config once you execute the task. At the moment this is my config structure:
[drive]
type = drive
scope = drive
token = {"access_token":"**","token_type":"Bearer","refresh_token":"
client_id = ****
client_secret = ****

Kind regards
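(One hedged way to check which credentials the remote is using, assuming the remote is called "drive": print its stored settings and watch the request graphs in the Google API Console for your own project:)

# shows the stored client_id/client_secret for the remote (if any)
rclone config show drive
# if the traffic graphs for your own API Console project stay flat while copying,
# rclone is still running on the shared default key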

You have all the evidence in front of you: it’s not working.

If you’d like to run rclone config and test again, that would be awesome.

Says it right here as well:

Well, as I am using it in a console-only environment, I normally generate my configs on Windows and then copy them into the Linux environment. I assume that will also be fine, right?

Can’t say I’ve personally ever done that, but I’d assume it would be fine.

I just do it headless in a terminal on Linux.

Yes, but it seems I am not able to get this working once I reach this step:
Use auto config?

  • Say Y if not sure
  • Say N if you are working on a remote or headless machine or Y didn’t work
y) Yes
n) No
y/n> y
If your browser doesn’t open automatically go to the following link: http://127.0.0.1:53682/auth

That port on my VM is not answering when I access it from my PC, and I do not have a firewall enabled on the VM, so that is why I was doing it first on Windows and then copying the config to the Linux one.

You’d have to use SSH port forwarding, as 127.0.0.1 is a local address on the machine itself.

https://www.ssh.com/ssh/tunneling/example
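(A minimal sketch of that tunnel, assuming the VM is reachable over SSH as user@vm-host; rclone’s auth helper listens on port 53682:)

# run this on your PC, then open http://127.0.0.1:53682/auth in your local browser
ssh -L localhost:53682:localhost:53682 user@vm-host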

Hello,

I have made a new config without needing an SSH tunnel; it seems I selected the No option and it worked for me with this:

If your browser doesn’t open automatically go to the following link: https://accounts.google.com/o/oauth2/auth?access_type=********************************

Let me check the copy command now.

Kind regards

Wow,

It seems it is now working fine, and metrics are now showing in the Google Drive API console, so I assume it should be good :smiley:

Thank you so much

Oh that’s right, I forgot that even existed. Happy you got it working!