Can you try the latest beta? That part of the VFS code has been extensively re-written since v1.52.2.
If it is still a problem there then I'll need a bit more to go on to reproduce. Ideally a sequence of steps I can follow to reproduce the problem. Note that you can use rclone touch to set timestamps which might be useful in any repro!
I'm now running the latest beta, and still getting the same error:
2020/07/24 23:10:25 ERROR : [path to file]: vfs cache: failed to set modification time of cached file: chtimes /home/jm/.cache/rclone/vfs/[path to file]: no such file or directory
Essentially, my script crawls a web directory and checks each file in it against my rclone bucket; if the file is already there, it does not re-download it.
The relevant part is:
import os

def check_duplicates(download_path, content_length):
    found = False
    content_length = int(content_length)
    # size check only -- a stat, not a read of the file contents
    local_size = os.path.getsize(download_path)
    if local_size == content_length:
        found = True
    return found
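For reference, the check can be exercised on its own; this is a self-contained sketch where a temp file stands in for an already-downloaded file:

```python
import os
import tempfile

def check_duplicates(download_path, content_length):
    # same logic as above: compare local size to the expected Content-Length
    return os.path.getsize(download_path) == int(content_length)

# stand-in for a previously downloaded file
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"hello")
    path = f.name

print(check_duplicates(path, "5"))   # True  -> duplicate, skip download
print(check_duplicates(path, "99"))  # False -> sizes differ, download
os.remove(path)
```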
So, basically, I thought what should be happening is that rclone would be pulling the entire file into VFS to check its size. The VFS directory definitely fills with files.
I don't do any operations on these files other than check their length while using rclone mount, so I would expect them to be quickly purged (I don't see updated modification times or anything of that nature either). However, these errors suggest to me that rclone is actually not pulling the file into VFS?
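If it helps to see why a size check alone needn't populate the cache: os.path.getsize() is just a wrapper over os.stat(), i.e. a metadata-only call. As I understand it, on a mount it is the subsequent open()/read() that fetches contents, not the stat. A local-file sketch of the distinction (the filename is made up for illustration):

```python
import os

# local stand-in; on a VFS mount, only the open()/read() below would
# need the file contents -- the stat is answered from metadata
path = "stand_in.bin"
with open(path, "wb") as f:
    f.write(b"\x00" * 1024)

# metadata-only: both calls report the size without reading the file
assert os.path.getsize(path) == os.stat(path).st_size == 1024

# this is the actual content access
with open(path, "rb") as f:
    assert len(f.read()) == 1024

os.remove(path)
```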
I also get this error in the beta (but not the current release):
2020/07/24 23:17:14 ERROR :[path to file]: vfs cache: setModTime: failed to save item info: vfs cache item: failed to write metadata: open [cache path]/vfsMeta/[path to file]: no such file or directory
VFS doesn't download every file that the OS touches? I assumed any attempt to access the file (or its metadata) would cause it to be pulled into the VFS cache...
I do not run multiple instances of rclone on that machine. I do occasionally have that directory mounted with --read-only on my laptop whilst the directory is mounted with --vfs-cache-mode xxx on a cloud instance.
The script simply checks to see if the remote file exists locally by comparing path and size and, if so, it skips the download. I don't do anything else with already-downloaded files after they have been compared... if that routine finds a match based on those criteria, the script does not touch that saved file and does nothing to it. The script then restarts to check the next file in the directory.
What type of log are you looking for?
Rclone's output consists entirely of what I sent above, but I'm happy to send more.
Could this be a permissions issue? I do not run rclone as root. I do not use --allow-other either.
permissions on the cache are drwx------ 3 user user
A quick chmod 777 of the vfs directory didn't change anything, but if it's worth going down that rabbit hole, I can try.
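In case it's useful, here's a quick way to confirm what mode the cache directory actually has; the path below is hypothetical (a temp dir standing in for rclone's default of ~/.cache/rclone/vfs, or wherever --cache-dir points):

```python
import os
import stat
import tempfile

# stand-in for the rclone VFS cache directory
cache_dir = os.path.join(tempfile.mkdtemp(), "vfs")
os.mkdir(cache_dir, mode=0o700)

# drwx------ : readable/writable only by the owning user
print(stat.filemode(os.stat(cache_dir).st_mode))
```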
Besides 4 instances of my app running, I have a bash script that takes care of moving files from a local folder to the remote.
It runs continuously, but is only fed files that are about 3 minutes old since creation.
every now and then I see this in rclone.log
2020/08/14 18:45:48 INFO : <folder2/file> : Moved (server side)
2020/08/14 18:45:48 ERROR : <folder2/file>: Failed to set modification time of cached file: chtimes /caches/gdrive/vfs/vfs/<REMOTE>/crypto/<folder2/file>: no such file or directory
After the fact, I manually check if the file and folder exist, both in local and remote, and they do.
If this is, in fact, something different, let me know and I'll start another thread