Source is 48GB / Destination is 102GB and still going - CentOS 7 to Google Drive

I'm using rclone inside a shell script to send a CentOS 7 backup folder to Google Drive. The purpose is to keep a remote clone of my server backups. The backup is running; however, the source folder is 48GB and so far the transfer is at 102GB and still going!

Command used:
rclone copy -l --log-file /path/to/log -v --filter-from /path/to/excludes /backups/ mydestination:

I'm using the drive.file scope in the config for Google Drive.

The source backups folder being copied contains tar archives, SQL dumps and (probably more relevant) a folder with rsync backups of the complete system (using laurent22/rsync-time-backup). There are a great many symlinks and hard links in this folder.

  • First run I used the --copy-links switch and it resulted in a gigantic folder. I had misunderstood the switch, and it appears the sym/hard links were followed and copied as full files/folders.

  • I'm now using the -l switch, assuming the symlinks are copied as links (not followed). The log file shows thousands of .rclonelink files, so it appears to be working as intended (quick check below).
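
For anyone checking the same thing, here's the quick throwaway test I mean (the paths are made up purely for illustration, not my real layout). With -l the symlink is uploaded as a small text file named after the link with a .rclonelink suffix, holding the link target instead of the file data:

mkdir -p /tmp/linktest
ln -s /backups/some/file.tar /tmp/linktest/file.tar
rclone copy -l /tmp/linktest/ mydestination:linktest
rclone ls mydestination:linktest
# prints something like: 22 file.tar.rclonelink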

Any ideas?


Transferred: 102.409G / 104.965 GBytes, 98%, 1.699 MBytes/s, ETA 25m40s
Errors: 318 (retrying may help)
Checks: 0 / 0, -
Transferred: 184907 / 194919, 95%
Elapsed time: 17h9m0s


hello and welcome to the forum,

  1. the backup has been running for over 17 hours. is there a chance that some of the files that are to be backed up are being modified while rclone is running?
  2. there are over 300 errors, why so many, what are they? (a quick way to check the log is below)
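
to see what those errors actually are, something like this against the log file you passed to --log-file should do (adjust the path to yours):

grep -c ' ERROR ' /path/to/log
grep ' ERROR ' /path/to/log | head -20

rclone writes error lines at the 'ERROR' level when run with -v, so the first command counts them and the second shows a sample.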

Hello

I'm also surprised by the time it's taken so far! This source folder only grows by a few MBs from one day to the next. It's only accessed briefly, once a day, when the backup(s) take place. Looking at the errors, it seems mainly down to not excluding certain folders/files, which I'm adding to the exclusions as they come up.
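
For reference, the excludes file uses rclone's --filter-from syntax; the entries below are placeholders rather than my real ones, just to show the shape of it (paths are relative to the source, /backups/):

- /lost+found/**
- /some-folder-to-skip/**
- *.tmp

Lines starting with "- " exclude matching paths; everything else is still copied.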

normally, you would start a new post using the question template.
if you had, you would be asked to answer the following questions.
please answer them, thanks

What is the problem you are having with rclone?

What is your rclone version (output from rclone version)

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Which cloud storage system are you using? (eg Google Drive)

The command you were trying to run (eg rclone copy /tmp remote:tmp)

A log from the command with the -vv flag (eg output from rclone -vv copy /tmp remote:tmp)

What is the problem you are having with rclone?

As per title and post.

What is your rclone version (output from rclone version )

rclone v1.47.0

Which OS you are using and how many bits (eg Windows 7, 64 bit)

CentOS Linux release 7.7.1908 (Core) x86_64

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp )

rclone copy -l --log-file /path/to/log -v --filter-from /path/to/excludes /backups/ mydestination:

A log from the command with the -vv flag (eg output from rclone -vv copy /tmp remote:tmp )

The current job has been running for 24h35m0s. Is it necessary to re-run from scratch? If required, I can attach the log, which is currently a 32 MB text file.

Evil, but I'd kill it, restart with -P, and target ONLY a single small-ish top-level directory to start with. Keep going bigger until it fails.

Using -P will talk to you (it's like rsync -P) and needs to be pointed at a terminal. (Will generate a bunch of junk if redirected to a file.)
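
Something like this, restarted against just ONE top-level folder first ("somedir" is only a placeholder, use whatever your actual top-level directories are called):

rclone copy -l -P --filter-from /path/to/excludes /backups/somedir mydestination:somedir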

You might also use "ncdu" for a manual local vs remote size compare. Or do an "ls -lR local" vs "rclone ls target:dir" and diff the two after a bit of data massage. If you didn't clean out your original copy of the linked files, I bet they're still there "helping out."
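
A rough way to do the size compare, assuming your remote really is called "mydestination" (du covers the local side, rclone size totals the remote, and rclone ncdu gives an ncdu-style browse of the remote):

du -sh /backups
rclone size mydestination:
rclone ncdu mydestination: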

And RESTARTING the copy will only copy new/modified files; it won't copy everything all over again (like rsync -rt). It WILL take a few seconds (minutes/hours) to skip over the existing files, doing what I think is a time/date/size check.

What is it?? NOT SURE HERE -- isn't it "rclone sync" that will DELETE all the extra files found on the remote? Unsure how it would handle "-l" vs actual files, though. --dry-run is your friend.
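
If you do try the sync route, run it with --dry-run first so nothing actually gets deleted, e.g. with the same flags as your copy:

rclone sync -l --dry-run --log-file /path/to/log -v --filter-from /path/to/excludes /backups/ mydestination:

sync makes the destination match the source (deleting remote files that no longer exist locally), so read the dry-run output carefully before running it for real.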

the version of rclone you are using is very old, please update and test again.

and using filters can be very confusing, so, as @C_B has suggested, test with --dry-run
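
if rclone came from the distro repo it will be well behind; the install script from rclone.org is one way to get current on centos 7 (your call whether you're happy piping a script to sudo), then confirm with rclone version:

curl https://rclone.org/install.sh | sudo bash
rclone version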
