Logfile states that a file got moved, but it's still present and broken on the remote?

What is the problem you are having with rclone?

I'm moving files from the local filesystem to a Google Cloud Storage bucket with limited read/write access. With the account I created solely for this backup task, files can only be listed and uploaded, not read or deleted. This has worked fine for several weeks, but now it failed because of a file that already existed in the bucket (and couldn't be overwritten, which worked as designed).
The file in the bucket wasn't complete (4 MB instead of 40 MB). After I deleted it manually in the bucket and re-ran my script, the file was uploaded fully and also removed from the filesystem this time.

What bothers me is that the first logfile states that "secondFile.2021-10-08.tar.gz" was uploaded and deleted. So it seems this upload failed silently while the log says otherwise?! (I also check the exit code of the rclone command in the script, which was 0.)

Any ideas what could have happened here? I have now increased the log level to -vv, but I don't expect much more information if this issue happens again, as the details logged for a single transfer didn't increase...
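
For reference, the exit-code check in my script is roughly of this shape (just a sketch; the variable names are made up here and not taken from the actual script):

# run the nightly move and remember its exit status
rclone -v move --log-file="$LOGFILE" --min-age 183d "$SRC" "$DEST"
rc=$?
# abort the script if rclone reported any error
if [ "$rc" -ne 0 ]; then
    echo "rclone move failed with exit code $rc" >&2
    exit 1
fi

That check reported 0 for the run in question.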

Run the command 'rclone version' and share the full output of the command.

rclone v1.47.0
- os/arch: linux/amd64
- go version: go1.11.5

Which cloud storage system are you using? (eg Google Drive)

Google Cloud Storage with a special account that is only allowed to store and list files.
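
For reference, a custom role limited in that way could be created roughly like this (just a sketch; the role id and title are made up, only <MyProject> comes from the config below):

# custom role that can upload and list objects, but not read or delete them
gcloud iam roles create backupUploader --project=<MyProject> \
    --title="Backup Uploader" \
    --permissions=storage.objects.create,storage.objects.list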

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone -v move --log-file=<myLogFile> --min-age 183d <MySourceDir> <MyRemote>:<MyBucket>

The rclone config contents with secrets removed.

[MyRemote]
type = google cloud storage
service_account_file = /home/<user>/.config/rclone/<user@remote>.json
bucket_policy_only = true
location = EU
storage_class = NEARLINE
client_id = <UserID>
project_number = <MyProject>
object_acl = private
bucket_acl = private

A log from the command with the -vv flag

Logfile from 19.04, which states that "secondFile.2021-10-08.tar.gz" got moved:
...
2022/04/19 02:06:42 INFO  : firstFile.2021-10-13.tar.gz: Copied (new)
2022/04/19 02:06:42 INFO  : firstFile.2021-10-13.tar.gz: Deleted
2022/04/19 02:06:42 INFO  : secondFile.2021-10-08.tar.gz: Copied (new)
2022/04/19 02:06:42 INFO  : secondFile.2021-10-08.tar.gz: Deleted
2022/04/19 02:06:42 INFO  : thirdFile.2021-10-13.tar.gz: Copied (new)
2022/04/19 02:06:42 INFO  : thirdFile.2021-10-13.tar.gz: Deleted
...

Logfile from 20.04, which states that "secondFile.2021-10-08.tar.gz" already exists in the bucket:
...
2022/04/20 02:06:26 INFO  : randomFile.2021-10-14.tar.gz: Copied (new)
2022/04/20 02:06:26 INFO  : randomFile.2021-10-14.tar.gz: Deleted
2022/04/20 02:06:27 ERROR : secondFile.2021-10-08.tar.gz: Failed to copy: googleapi: Error 403: <User@project>.iam.gserviceaccount.com does not have storage.objects.delete access to the Google Cloud Storage object., forbidden
2022/04/20 02:06:27 ERROR : secondFile.2021-10-08.tar.gz: Not deleting source as copy failed: googleapi: Error 403: <user@project>.iam.gserviceaccount.com does not have storage.objects.delete access to the Google Cloud Storage object., forbidden
2022/04/20 02:06:34 INFO  : randomFile2.2021-10-14.tar.gz: Copied (new)
2022/04/20 02:06:34 INFO  : randomFile2.2021-10-14.tar.gz: Deleted
...

Did something put the file back? You'd want to check after rclone finishes and see if a file is there locally.

And when you come back the next day, before running it again, you can check the local file system once more.
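
For example, something along these lines (using the placeholders from your post) would show whether the file is back locally and how big the object in the bucket is:

# should fail with "No such file" after a successful move
ls -l <MySourceDir>/secondFile.2021-10-08.tar.gz
# lists size, modification time and name of every object in the bucket
rclone lsl <MyRemote>:<MyBucket>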

Rclone does a great job, but usually doesn't make files come back from being deleted :slight_smile:

That version of rclone is over three years old; best to update to the latest stable, v1.58.0.
The issue could be a bug that has since been fixed.
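
If the distribution package lags behind, the install script from https://rclone.org/install/ is one way to get the current stable build, assuming you are comfortable piping a script to a shell:

curl https://rclone.org/install.sh | sudo bash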

Well, I had the same thought, but I can't think of any job or cause that might put the file back in place after it got deleted. It also had the old timestamp from October last year, so that seems very unlikely to me.
However, the version from the Remi repository is indeed very old. I searched the rclone changelog and didn't find an obvious bug... but updating is a good idea nevertheless.

Is it possible this file was being created while rclone was being run?

It is very unlikely rclone uploaded a partial file as it checks the size and checksum after uploading.
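
If you want to double-check a single object by hand, something like this should work even with your restricted account, assuming the list permission is enough to read the object metadata (that part is an assumption on my side):

# checksum of the local file before it is moved
md5sum <MySourceDir>/secondFile.2021-10-08.tar.gz
# MD5 checksums of the objects in the bucket, as stored by Google Cloud Storage
rclone md5sum <MyRemote>:<MyBucket>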

> Is it possible this file was being created while rclone was being run?
> It is very unlikely rclone uploaded a partial file as it checks the size and checksum after uploading.

At least I can't think of any process that might copy or write files there with an old timestamp... the file definitely had a roughly six-month-old timestamp. Besides that, can I somehow log the size and checksum check for the file upload? That would be interesting if it happens again. Using -vvv just writes out:

2022/04/27 18:45:49 INFO  : <file>: Copied (new)
2022/04/27 18:45:49 INFO  : <file>: Deleted
2022/04/27 18:45:49 DEBUG : pacer: Reducing sleep to 14.741322ms

Maybe there's an obvious option, but I didn't spot it just now :slight_smile:

If it happens again, a -vv log of the upload should throw some light on what is happening.
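
For example, your command from the first post with the log level raised from -v to -vv (otherwise unchanged):

rclone -vv move --log-file=<myLogFile> --min-age 183d <MySourceDir> <MyRemote>:<MyBucket>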

If I try this with rclone 1.58, I get:

2022/04/27 09:54:45 DEBUG : file15.txt: Need to transfer - File not found at Destination
2022/04/27 09:54:45 DEBUG : file15.txt: md5 = 276fa3589aa7798d51a1e723a661daea OK
2022/04/27 09:54:45 INFO  : file15.txt: Copied (new)
2022/04/27 09:54:45 INFO  : file15.txt: Deleted

This shows the checksum of the transferred file.
