Does rclone with Google Drive have a file size limit?

What is the problem you are having with rclone?

Large files (around 450 GB) are not uploading.

What is your rclone version (output from rclone version)

rclone v1.51.0

  • os/arch: linux/amd64
  • go version: go1.13.7

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Ubuntu 18.04.4 LTS 64bit

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone copy

A log from the command with the -vv flag (eg output from rclone -vv copy /tmp remote:tmp)

I'm trying to run that and this is the output I am getting:
2020/03/23 07:52:54 DEBUG : rclone: Version "v1.51.0" starting with parameters ["rclone" "-vv" "copy" "/tmp" "remote:tmp"]
2020/03/23 07:52:54 DEBUG : Using config file from "/home/mrangel0/.config/rclone/rclone.conf"
2020/03/23 07:52:54 Failed to create file system for "remote:tmp": didn't find section in config file


When I test rclone copy with a simple test.txt file to my Google Drive (Team Drive) it works perfectly fine, but when I try a bigger file, like 450 GB, it looks like it starts normally but then sends me back to the prompt, and nothing is copied to Google Drive. It seems to me like a file size issue, but I don't know if there is a fix or adjustment for that in rclone.

What I am trying to accomplish:
I am using Duplicati to make local backups, and then I want to upload those backups to my Google Drive. I don't know why, but Duplicati usually creates one very small file (a few KB), which rclone copies just fine, and two more that are big, usually around 450 GB (I wish Duplicati created more files of a smaller size). Those big files can be copied just fine using the Google Drive website with drag and drop, but not with rclone.

Thanks in advance for any help.

hello and welcome to the forum,

the problem is that you are not using the correct name of the remote in your command,
which is why you are getting this error in the output:
didn't find section in config file

what is the exact name of the remote you created?
did you name it remote, or something else?

you can run rclone config and see a list of remotes
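
For context, the remotes are just the section names in rclone's config file, which is a plain INI file; the "didn't find section in config file" error means the name before the colon doesn't match any section. A quick sketch of listing them outside of rclone, assuming the default config path on Linux (check the "Using config file from" line in your -vv output for the actual location):

```python
# List rclone remote names by reading rclone.conf, which is a plain
# INI file where each section header is a remote name.
import configparser
import os

config_path = os.path.expanduser("~/.config/rclone/rclone.conf")

parser = configparser.ConfigParser()
parser.read(config_path)  # silently yields no sections if the file is missing

# Each section name is what goes before the colon in "remote:path".
for name in parser.sections():
    print(name)
```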

also, when testing, you can use --dry-run; that way you can see what rclone would do.
if the command and log look good, then you can remove the --dry-run

about the large files, you can use the chunker storage system.
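
For reference, a chunker remote wraps an existing remote and transparently splits large files into smaller chunks on upload. A minimal rclone.conf sketch (the section name gdrive_chunked, the wrapped path, and the chunk size here are just examples, not anything from your setup):

```ini
[gdrive_chunked]
type = chunker
# the existing Google Drive remote (and optional path) to wrap
remote = GDrive_2-DUPLICATI:chunked
# split files into chunks of at most this size on upload
chunk_size = 50G
```

You would then copy to gdrive_chunked: instead of the Drive remote directly, and rclone reassembles the chunks on download.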

Thank you very much for the quick response. I checked that in the config, and I am running it again; I will report soon what happened.

Confirmed the remote in the config, tried again, and got the same behavior: it looks like it starts working and then drops me back to the prompt. This is my command:

rclone copy /media/mrangel0/Plex6TB1/1-NAS/1-BACKUPS/PLEX_Home_Videos/duplicati-i4045b48949f14cd2968b521bdfdd25f8.dindex.zip.aes GDrive_2-DUPLICATI:PLEX_Home_Videos/

add the flag -vv for debug output and add the flag --progress
and run the command again

this is what i am getting:

2020/03/23 08:33:00 DEBUG : rclone: Version "v1.51.0" starting with parameters ["rclone" "-vv" "copy" "/media/mrangel0/Plex6TB1/1-NAS/1-BACKUPS/PLEX_Home_Videos/duplicati-i4045b48949f14cd2968b521bdfdd25f8.dindex.zip.aes" "GDrive_2-DUPLICATI:PLEX_Home_Videos/"]
2020/03/23 08:33:00 DEBUG : Using config file from "/home/mrangel0/.config/rclone/rclone.conf"
2020/03/23 08:33:01 DEBUG : duplicati-i4045b48949f14cd2968b521bdfdd25f8.dindex.zip.aes: Size and modification time the same (differ by -950.8µs, within tolerance 1ms)
2020/03/23 08:33:01 DEBUG : duplicati-i4045b48949f14cd2968b521bdfdd25f8.dindex.zip.aes: Unchanged skipping
2020/03/23 08:33:01 INFO :
Transferred: 0 / 0 Bytes, -, 0 Bytes/s, ETA -
Checks: 1 / 1, 100%
Elapsed time: 0.0s

2020/03/23 08:33:01 DEBUG : 5 go routines active
2020/03/23 08:33:01 DEBUG : rclone: Version "v1.51.0" finishing with parameters ["rclone" "-vv" "copy" "/media/mrangel0/Plex6TB1/1-NAS/1-BACKUPS/PLEX_Home_Videos/duplicati-i4045b48949f14cd2968b521bdfdd25f8.dindex.zip.aes" "GDrive_2-DUPLICATI:PLEX_Home_Videos/"]

so that looks good, no errors,

that file should already be uploaded to the remote.

not sure what you think the problem is?

wait a second, that one was copied; I grabbed a light one. let me try a heavy one, one sec. sorry

well, it looks like it is copying now; it is saying "sending chunk ####". I am definitely going to be using -vv all the time because it tells me that it is doing something. it's weird that last night it copied one of the files (the lightest) but then couldn't do the others. Well, this is a 400 GB file, so it is going to take some time. So far it looks good; I will report here if it finishes successfully.

thank you very much asdffdsa, that was the quickest response I got in a forum in my entire life lol.

sure, glad to help

and you can use the flag --progress
and you should read this page, to understand what can be done.

Google Drive has a max file size of something like 5 TB, I think, and rclone should be able to use all of that, so I doubt this is the problem.

If you are using any other layers in the chain here, one of those is more likely to be the root of the issue.
But it sounds like maybe you already solved it? Let us know if not :slight_smile:

Right, 450GB should be no problem.
I've uploaded a 500 GB bare-metal image before, and made it 95-ish% through a 1 TB upload several times (it always failed to finish due to power outages, so I eventually gave up).

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.