It's a new run with said parameter, producing a new log (I linked to it).
Regarding the single file: apparently that works without checksumming (I searched for a file that was checksummed before; it's the 5th from the top in my last screenshot):
2024/09/10 16:55:27 DEBUG : rclone: Version "v1.68.0" starting with parameters ["rclone" "sync" "Z:\\CT\\re251sg67nnhm135oad319cpmk\\smj2013l08uf4aqjtooq6ds6hk\\0s9irvrm7p7m8oq6594hrl70kk" "F:\\" "--progress" "-vv" "--dry-run" "--log-file" "C:\\Logs\\rclone.log"]
2024/09/10 16:55:27 DEBUG : Creating backend with remote "Z:\\CT\\re251sg67nnhm135oad319cpmk\\smj2013l08uf4aqjtooq6ds6hk\\0s9irvrm7p7m8oq6594hrl70kk"
2024/09/10 16:55:27 NOTICE: Config file "C:\\Users\\[snipped]\\AppData\\Roaming\\rclone\\rclone.conf" not found - using defaults
2024/09/10 16:55:27 DEBUG : fs cache: adding new entry for parent of "Z:\\CT\\re251sg67nnhm135oad319cpmk\\smj2013l08uf4aqjtooq6ds6hk\\0s9irvrm7p7m8oq6594hrl70kk", "//?/Z:/CT/re251sg67nnhm135oad319cpmk/smj2013l08uf4aqjtooq6ds6hk"
2024/09/10 16:55:27 DEBUG : Creating backend with remote "F:\\"
2024/09/10 16:55:27 DEBUG : fs cache: renaming cache item "F:\\" to be canonical "//?/F:/"
2024/09/10 16:55:27 DEBUG : 0s9irvrm7p7m8oq6594hrl70kk: Need to transfer - File not found at Destination
2024/09/10 16:55:27 NOTICE: 0s9irvrm7p7m8oq6594hrl70kk: Skipped copy as --dry-run is set (size 1023.250Mi)
2024/09/10 16:55:27 NOTICE:
Transferred: 1023.250 MiB / 1023.250 MiB, 100%, 0 B/s, ETA -
Transferred: 1 / 1, 100%
Elapsed time: 0.1s
2024/09/10 16:55:27 DEBUG : 3 go routines active
File not found at Destination
the source file is not found in the dest, so rclone will simply copy the file; no need for a checksum.
need to pick one trouble file, the same file i posted about, and test on that one single file only.
post a full debug log. re251sg67nnhm135oad319cpmk/058srvaumm56v2l6q44gc5m8mqujceocescn9putobrv1udqhke0
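something along these lines, following the same pattern as the log above — the path below is illustrative, substitute the actual full path of the trouble file on Z: (copy is used instead of sync so nothing on F:\ can be deleted once --dry-run is removed):

```shell
rclone copy "Z:\CT\re251sg67nnhm135oad319cpmk\path\to\trouble-file" "F:\" --dry-run -vv --log-file C:\Logs\rclone-onefile.log
```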
yeah, i used to use ATI, but about six years ago, i switched to the free veeam backup and replication.
and using veeam, i ran into the same/similar issue.
my workaround was to disable synthetic backups and a few other features.
so that once veeam created a backup file, full or incremental, veeam would never touch that backup file again.
then i run rclone copy --immutable to the cloud.
another side benefit: it makes it practical to archive backups to aws deep glacier.
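for reference, the whole workflow boils down to something like this — paths and the remote name are made up for illustration:

```shell
# 1. let veeam finish writing its backup file (synthetic fulls and other
#    file-rewriting features disabled, so a finished backup file never changes)
# 2. push to the cloud; --immutable makes rclone refuse the transfer and
#    report an error if an already-uploaded file was ever modified on the source
rclone copy "D:\VeeamBackups" remote:backups --immutable -v --log-file C:\Logs\rclone-veeam.log
```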
I have a working, proven, paid solution and am not going to invest much time in changing it (besides, IIRC "Veeam free" only provides a single job, but I have at least 2 to run).
A pity that (official) rclone has an issue in this (pseudo)local setup and is unable to fall back to date-and-size-only comparison. I guess I'll wait a bit, and if there's no solution I'd rather abandon or sell my Filen account than grow gray hair (I bought it only for the promise of "rclone integration" anyway, and an rclone fork is not the same: it's, again, only a proprietary client!).
pCloud and Koofr work fully OK with official rclone, so Filen wasn't necessary at all.
lol, try --magic
else, you should be able to write a simple script to avoid that initial checksum.
just sharing how i combine block-based backups with rclone+cloud.
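a rough sketch of such a script (bash syntax — on windows you'd run it under git bash or translate it to powershell; the source path is the one from this thread): copy only files that don't exist in the destination yet, so no compare/checksum pass is needed at all.

```shell
SRC="Z:/CT/re251sg67nnhm135oad319cpmk"
DST="F:/"

# list relative file paths on each side
rclone lsf -R --files-only "$SRC" | sort > src.txt
rclone lsf -R --files-only "$DST" | sort > dst.txt

# keep only the paths present in the source but not in the destination
comm -23 src.txt dst.txt > missing.txt

# copy just those files; since they don't exist in the destination,
# rclone never has to compare (and thus never has to checksum) them
rclone copy "$SRC" "$DST" --files-from missing.txt -v
```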
well, that might apply to the standalone agent.
the free edition of veeam backup and replication will protect up to 10 machines, any combination of physical, virtual, cloud. desktops or servers, windows, linux or macos.
each machine running the veeam agent can have multiple jobs.
also, it can back up SMB shares.
You mean to run the command twice, once with --size-only and once with "--date-only" (the latter doesn't exist though)? Or how do you think a bunch of commands (i.e. a script) could achieve a restriction that a single command cannot implement?