Rclone "Caching" AWS Credentials Per File

What is your rclone version (output from rclone version)

1.52.1

Which cloud storage system are you using? (eg Google Drive)

AWS S3

What is the problem you are having with rclone?

  • Our program is attempting to upload many files to S3 using Rclone RC (a sketch of these calls follows this list). An example command:
rc: "operations/copyfile": with parameters map[_group:<nil> dstFs:REMOTE-NAME:bucket-name/2654c16b1cd044e18de0ac45015ed37c/file/path/ dstRemote:File.txt srcFs:/path/to/input/ srcRemote:File.txt]
  • After refreshing the AWS credentials, config/update was called to update the credentials in Rclone:
rc: "config/update": with parameters map[name:REMOTE-NAME obscure:true parameters:map[access_key_id:ASIAXXXXXXXXMOR env_auth:false provider:AWS region:us-east-1 secret_access_key:Redacted session_token:Redacted]]
  • A file then failed to copy with "The AWS Access Key Id you provided does not exist in our records.", still referencing the original AWSAccessKeyId:
File.txt: Failed to copy: s3 upload: 403 Forbidden: <?xml version="1.0" encoding="UTF-8"?>
<Error><Code>InvalidAccessKeyId</Code><Message>The AWS Access Key Id you provided does not exist in our records.</Message><AWSAccessKeyId>ASIAXXXXXXXXRWO</AWSAccessKeyId><RequestId>DC3E05DCE4E23787</RequestId><HostId>2HZvmnLKSlUYzU4CmLJNa/2WaeQtaf2FdXWZPBhinOM/FhKjTUB1gzFNGbEEQqY05TTPgyZ5POk=</HostId></Error>
ERROR : rc: "operations/copyfile": error: s3 upload: 403 Forbidden: <?xml version="1.0" encoding="UTF-8"?>]
  • Every subsequent attempt to upload that same file returns the same error with the original AWSAccessKeyId (ASIAXXXXXXXXRWO). We confirmed that both the config file and config/dump show the new credentials.

  • Other files upload successfully using the updated credentials.

  • The only way to get the "stuck" file to upload is to restart the Rclone instance.
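For reference, a minimal sketch of the two RC calls above as HTTP requests against a local rclone rcd instance (the URL and port are assumptions, the credential values are placeholders; the parameters mirror the log lines):

```python
import requests

RC_URL = "http://localhost:5572"  # assumed address of the rclone rcd instance

# Upload one file (mirrors the operations/copyfile call in the log above).
requests.post(f"{RC_URL}/operations/copyfile", json={
    "srcFs": "/path/to/input/",
    "srcRemote": "File.txt",
    "dstFs": "REMOTE-NAME:bucket-name/2654c16b1cd044e18de0ac45015ed37c/file/path/",
    "dstRemote": "File.txt",
}).raise_for_status()

# Push refreshed STS credentials into the remote's config (mirrors the
# config/update call in the log above; credential values are placeholders).
requests.post(f"{RC_URL}/config/update", json={
    "name": "REMOTE-NAME",
    "obscure": True,
    "parameters": {
        "provider": "AWS",
        "env_auth": False,
        "region": "us-east-1",
        "access_key_id": "ASIA...",
        "secret_access_key": "...",
        "session_token": "...",
    },
}).raise_for_status()
```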

My questions:

  • Does Rclone "cache" the AWS credentials for a single file (even after the file completes, whether successfully or unsuccessfully)?
  • Is there any setting / flag / method to ensure Rclone does not remember files after an upload attempt?

Rclone caches the backends for 5 minutes when not in use.
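As a rough illustration of what that means (an assumed-behavior sketch for clarity, not rclone's actual code), the daemon effectively keeps one backend per remote string and expires it after five minutes of disuse, so a retry within that window reuses the backend that was built from the old credentials:

```python
import time

CACHE_TTL = 5 * 60  # backends expire after 5 minutes of disuse
_cache = {}         # remote string -> (backend, last_used_timestamp)

def get_backend(remote, make_backend):
    """Return a cached backend for `remote`, creating one if needed.
    `make_backend` stands in for constructing a backend from the
    *current* config -- credentials are read only at creation time."""
    now = time.time()
    # Evict entries that have been idle longer than the TTL.
    for key, (_, last_used) in list(_cache.items()):
        if now - last_used > CACHE_TTL:
            del _cache[key]
    if remote not in _cache:
        _cache[remote] = (make_backend(remote), now)
    backend, _ = _cache[remote]
    _cache[remote] = (backend, now)  # refresh the last-used timestamp
    return backend
```

On this model the cache is keyed by the remote string rather than by file; repeated retries of the same file would keep touching, and so keeping alive, the stale entry.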

I could make a remote control command to clear that cache.

Alternatively you could use a different name for the remote each time.
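A minimal sketch of that workaround, assuming a local rcd instance and mirroring the config/update parameter shape from the log above (the naming scheme and helper function are hypothetical):

```python
import time
import requests

RC_URL = "http://localhost:5572"  # assumed address of the rclone rcd instance

def create_fresh_remote(parameters):
    """Register refreshed credentials under a brand-new remote name so no
    cached backend for an old name can be reused (hypothetical helper)."""
    name = f"s3-{int(time.time())}"  # unique name per credential refresh
    requests.post(f"{RC_URL}/config/create", json={
        "name": name,
        "type": "s3",
        "obscure": True,
        "parameters": parameters,
    }).raise_for_status()
    return name  # then use f"{name}:bucket-name/..." as dstFs in copy calls
```

Old remotes could be removed afterwards with config/delete once their in-flight transfers finish.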

Maybe updating the config should clear the cache for that remote; that would fix the problem.

Thank you for your response - we will try renaming the remote after each refresh.

A couple of follow-up questions:

  • Assume some files are currently being copied by Rclone. If we refresh the credentials and copy to a new remote, will the in-flight files be able to finish successfully under the "old" remote?

  • "Rclone caches the backends for 5 minutes when not in use." -> After refreshing the credentials, new files were able to copying using the new credentials. However, the original file continued to fail since it was using the original credentials. Does your response mean that the backend is cached per-file? Since we retried copying the same file within 5 minutes, the file continued to use the cached (old) credentials?

Yes, that will work fine.

I found a bug in the caching which was fixed for 1.53.1, so I suggest you try that. The caching should work properly then.
