Syncing of large catalog files

What is the problem you are having with rclone?

Merely a question. I've just started using rclone to back up my photos, and besides the photos themselves, the catalog files from Adobe Lightroom are essential to back up. I'm curious how rclone handles single large files. My catalog is a single .lrcat file approaching 800 MB. Whenever I work in Lightroom and close it, the catalog file will have changed. Will running rclone detect that the file was changed and upload the entire file, or is there some more granular detection of changed blocks/segments within a file (assuming Lightroom basically appends data to the catalog rather than rewriting the whole thing every time)? Are there any options or configurations better suited to this situation?
It's probably not a big deal to upload ~1 GB every time, but I'm mainly curious how it works and whether there are more or less ideal setups.

What is your rclone version (output from rclone version)


Which OS you are using and how many bits (eg Windows 7, 64 bit)

Windows 10

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

Paste command here

The rclone config contents with secrets removed.

Paste config here

A log from the command with the -vv flag

Paste log here

No, it's not block based. rclone copies the whole file on a change, as the majority of cloud providers do not offer block-based uploads.

What are you using now? I keep this in my rclone.conf for my Google Drive remote to help with uploads, as I have a lot of memory on my server:

chunk_size = 1024M

Other than that, defaults.
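For context, a setting like that goes under the remote's section in rclone.conf (the remote name below is just an example); it controls the size of each upload chunk to Google Drive, trading memory per transfer for throughput:

```
[gdrive]
type = drive
# larger upload chunks use more RAM per transfer but can speed up big uploads
chunk_size = 1024M
```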

you can write a very simple script that

  • starts lightroom
  • waits for lightroom to exit
  • runs rclone on the .lrcat
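A minimal sketch of that wrapper in Python; the Lightroom path, catalog path, and remote name are all assumptions you'd adjust for your own setup:

```python
"""Launch Lightroom, wait for it to exit, then back up the catalog with rclone.

All paths and the remote name below are placeholders, not real values.
"""
import subprocess

LIGHTROOM = r"C:\Program Files\Adobe\Adobe Lightroom Classic\Lightroom.exe"
CATALOG = r"C:\Users\me\Pictures\Lightroom\Catalog.lrcat"
REMOTE = "gdrive:backup/lightroom"


def backup_command(catalog: str, remote: str) -> list[str]:
    """Build the rclone invocation; copy only re-uploads if the file changed."""
    return ["rclone", "copy", catalog, remote, "-v"]


def main() -> None:
    # subprocess.run blocks until Lightroom exits, then the catalog is synced.
    subprocess.run([LIGHTROOM], check=True)
    subprocess.run(backup_command(CATALOG, REMOTE), check=True)


if __name__ == "__main__":
    main()
```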

you might try zipping the .lrcat and uploading that.
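A sketch of the zip step, assuming the same hypothetical catalog path; the .lrcat is a SQLite database and usually compresses well:

```python
"""Compress the Lightroom catalog before uploading it. Paths are placeholders."""
import os
import zipfile


def zip_catalog(catalog_path: str, zip_path: str) -> str:
    """Write the catalog into a deflate-compressed zip and return the archive path."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        # Store only the file name inside the archive, not the full path.
        zf.write(catalog_path, arcname=os.path.basename(catalog_path))
    return zip_path
```

You would then point rclone at the resulting .zip instead of the raw .lrcat.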

and if you are using rclone for backup, you might want to check out

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.