Not able to upload full external drive to Google Drive

What is the problem you are having with rclone?

I am trying to upload files (mostly raw files and videos) from an external SSD to Google Drive via the copy command. I want to keep a copy of the external drive's files on Google Drive.

For context, I have about 2.69 TiB of files to move (checked with the rclone size command). I used the copy command, but it does not appear to attempt to transfer all the files. It will start copying and show something like 2/443 GiB, which is obviously not the total amount of data on the drive. The total it decides to upload also seems pretty random, sometimes showing up as out of 600, or 1 TiB, etc. I cannot figure out why it is not trying to upload the whole drive, especially when the rclone size command seemingly knows the drive holds that much data.
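
For what it's worth, the size check was along the lines of:

rclone size "/Volumes/drive"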

Run the command 'rclone version' and share the full output of the command.

rclone v1.68.1

  • os/version: darwin 15.3.1 (64 bit)

  • os/kernel: 24.3.0 (arm64)

  • os/type: darwin

  • os/arch: arm64 (ARMv8 compatible)

  • go/version: go1.23.1

  • go/linking: dynamic

  • go/tags: cmount

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone copy "/Volumes/drive" gdrive:/Backups/ExternalDrive --progress --transfers=8 --checkers=16 --drive-chunk-size=128M

Please run 'rclone config redacted' and share the full output. If you get command not found, please make sure to update rclone.

[gdrive]
type = drive
scope = drive
token = XXX
team_drive = 

A log from the command that you were trying to run with the -vv flag

Paste log here

For sure the answer will be in the debug log.

Add -vv --log-file /path/to/rclone.log to your command; either you will see the problem yourself, or post the log here and somebody will have a look.

Most likely your transfer is being seriously throttled by Google. To start, create your own client_id, then recreate your remote.
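
After creating the client_id, the recreated remote's config should end up looking roughly like this (values are placeholders):

[gdrive]
type = drive
client_id = XXX.apps.googleusercontent.com
client_secret = XXX
scope = drive
token = XXX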

Also, delete these flags from your command: --transfers=8 --checkers=16 --drive-chunk-size=128M. I think it is too much for Gdrive. If all works with defaults, you can experiment later with increasing them. But maybe you will even have to set them lower than the defaults. You have to try.
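
With the defaults plus the debug log, the command would be something like this (the log path is just a placeholder):

rclone copy "/Volumes/drive" gdrive:/Backups/ExternalDrive --progress -vv --log-file /path/to/rclone.log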

Updating rclone to the latest version is also recommended. It is v1.69.1 now.

Just updated rclone to v1.69.1 now.

Sorry, I'm new to rclone and not really familiar with terminal stuff in general. What do I post from the log? Is the log created wherever I specify it to be? And what do I do about creating my own client ID?

I pasted the command as such: rclone copy "/Volumes/drive" gdrive:/Backups/ExternalDrive --progress --drive-chunk-size=128M -vv --log-file /Users/myuserfolder/rclone.log

I'm on macOS.

https://rclone.org/drive/#making-your-own-client-id


Keep in mind that Gdrive has a hard limit of something like 750 GiB per 24 hours.
Depending on the upload speed of your internet connection, you might hit that limit.
The workaround is to limit the upload bandwidth using --bwlimit=8.5M
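
For reference, 8.5M is roughly the daily cap spread over a full day: 750 GiB ≈ 768000 MiB, and 768000 MiB / 86400 s ≈ 8.9 MiB/s, so --bwlimit=8.5M keeps a continuous upload just under the limit.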

I understand the upload limit, but to my understanding, when that limit is hit, I can just rerun the command after the 24-hour cooldown, correct? Or is it just better to limit bandwidth?

Also, it still doesn't upload the full amount. I just tried and it showed a total of 1.1 TiB, still not the full 2.69 TiB.

Am I doing the copy command wrong? Should I limit it to one subfolder at a time?

Better to limit the bandwidth, as then there is no need to re-run the command multiple times.

Well, no idea, as no debug log was posted.
Try rclone check

No, that should not be required.

I know kapitaninsky mentioned this, but I'm unsure how to run it. Do I just run the copy command and add this after it? Where is the log, or does the path just create the log wherever I set it? And what do I send from that? Sorry, new to this.

Yes, just add it to the end of the copy command.


Yes, the log is created at whatever path you set.


No problem. To better understand what is going on, I would run
rclone check /Volumes/drive gdrive:/Backups/ExternalDrive --combined=/path/to/combined.txt
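
If I read the check docs right, each line in combined.txt gets a one-character prefix: = means the file is identical in source and destination, + means it is only in the source (not uploaded yet), - means it is only in the destination, * means it exists in both but differs, and ! means there was an error reading or hashing it. The + lines are what rclone still has to upload.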

Ok. But to clarify,

  1. what do I paste from the logs?
  2. do I need to finish the copy operation before I can paste from the logs, or do I just take the first 20 or so lines of whatever is there and paste it?

  1. create a new remote, let's call it gdrive2, as per https://rclone.org/drive/#making-your-own-client-id
  2. post the output of rclone config redacted gdrive2:

When I try to create the new remote and eventually configure, it gives me the error Access blocked: Authorization Error. Error 401: invalid_client

Request details: flowName=GeneralOAuthFlow

Does this mean I messed up the process of granting access somewhere? I feel like I double-checked every step.

I never saw that before, and there are just two or so mentions of it in the forum.
So, for now, let's skip that.


At this point, you need to use a debug log.
Search it for words such as ERROR, WARNING, pacer, retry, retries
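
On macOS you can filter those out of the log with something like this (retr catches both retry and retries):

grep -E "ERROR|WARNING|pacer|retr" /path/to/rclone.log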

Do you have more than 10,000 files to transfer? If yes, see

--max-backlog

from Global Flags. You can set --max-backlog bigger at the cost of using more memory (RAM).
Also, rclone only transfers and shows files transferred in the current run of rclone, so already uploaded files do not show in the transferred amount, nor are they queued to be uploaded again.
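
If the file count is the issue, raising the backlog on the copy command would look something like this (200000 is just an illustrative value, not a recommendation):

rclone copy "/Volumes/drive" gdrive:/Backups/ExternalDrive --progress --max-backlog 200000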

Copy tools like GoodSync, Gs Richcopy 360 or Syncback can handle this issue; search for them all.

I’ve run into the same problem, and it’s honestly been a huge headache. You’d think uploading an entire external drive would be straightforward with Rclone, but between permission issues, weird rate limits, and inconsistent behavior, it’s anything but.

What’s frustrating is the lack of clear error messages — it just stops or skips files without much context. I’ve tried tweaking flags, chunk sizes, even using service accounts, but the results are still unreliable. This kind of thing really kills confidence in what’s otherwise a powerful tool.

Would love to see better documentation or at least more robust error handling around these kinds of bulk uploads.

That's the best part: rclone is open source, so if you think the documentation can be improved, by all means, help out.

Most of the items you mention are Google API issues and not directly rclone things, unfortunately.