--max-age / --files-from odd issue between 1.44/1.45 on Windows

Hello all, after the new year I decided to start fresh with 1.45 and look at improving rclone copy performance from Windows -> Box: large data set, small daily deltas.

As our proprietor suggested, I have been testing the following combo:

rclone lsf -R --exclude-from exclude.rclone --max-age 7d M:\Path\ > path-delta.txt

rclone --files-from path-delta.txt copy M:\path\ cache:_path_on_box --exclude-from exclude.rclone --no-update-modtime --size-only --max-size 14.9G --transfers 7 --ignore-checksum --stats-file-name-length 0 --stats 15s --checkers 12 --cache-db-path C:\Windows\Temp\rclone\cache-backend --config C:\rclone\rclone.conf -vv --log-file "path.log"

Two odd things are happening:

First off, the initial lsf is clearly picking up some old files and folders; as far as I can tell they have not even been opened for years. I confirmed this behavior by trying a few older rclone .exe files I have going back to 1.36, 1.41, 1.44 and a 1.44 beta. I can chalk that up to some Windows/NTFS weirdness, and it doesn't really impact anything in reality (except the copy run taking longer than it should).
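For reference, my understanding is that --max-age filters on modification time rather than last-access time, so files nobody has opened in years can still match if something has touched their modtimes. A listing with modtimes, using the same path and filters as the lsf above, should show what rclone is actually matching on:

rclone lsl --exclude-from exclude.rclone --max-age 7d M:\Path\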

When running the copy command with 1.45, however, I noticed two errors I had not seen before:

DEBUG : find failed: not found in either local or remote fs

2019/01/08 14:39:15 DEBUG : Cache remote cache:_blocstudio/Bloc_GS: new object 'Long Path/Links'
2019/01/08 14:39:15 DEBUG : Long Path/Links: find: error: couldn't find object (Links)
2019/01/08 14:39:17 DEBUG : find failed: not found in either local or remote fs

and this:

2019/01/08 12:50:25 DEBUG : Long/Path/to/file.txt: find: error: couldn't open parent bucket for long/path
(this is in the destination path)

These errors do not occur on the same files, and I have not found any pattern as to what would cause one or the other; as far as I can tell the files all exist in source and destination, and none has any weird naming that could be causing a problem.

I re-ran the copy command with 1.44 and it seems to be working as expected

thanks

Quick Update:

After successfully finishing a copy with 1.44, subsequent runs of lsf --max-age no longer seem to find those old folders. I have not been able to reproduce it with the various versions of rclone, so at this point I'm at least 50/50 on this being some kind of user error on my part that is glaringly obvious.

thanks

Did some more testing on a different folder set, and lsf was still picking up tons of old documents/folders (shrug), even after a backup, so that also seems like a red herring. Just FYI, thanks.

Sorry for the multiple addendums: I'm wondering if mixing 1.44 and 1.45 pointing at the same cache could be a problem?

That probably wants to have a --files-only in there too so it ignores directories.
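For example, the lsf line from your first post with --files-only added (same paths and filters, just the extra flag) would be:

rclone lsf -R --files-only --exclude-from exclude.rclone --max-age 7d M:\Path\ > path-delta.txt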

I suspect that is probably a consequence of the missing --files-only...

Those are both errors from the cache backend. I'm not 100% certain what is causing them though. Do you still get them? Flushing the cache is likely to get rid of them.
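One way to flush it, assuming the cache backend's --cache-db-purge flag, is to add that flag to the next copy run so the cached data is cleared on startup, e.g.:

rclone copy M:\path\ cache:_path_on_box --cache-db-purge --cache-db-path C:\Windows\Temp\rclone\cache-backend --config C:\rclone\rclone.conf -vv

Deleting the directory given to --cache-db-path by hand should have much the same effect.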

Originally I was doing --files-only; I guess I will have to confirm whether it's a bug or not. The problem with box.com is that, as far as I know, it can't create multiple levels of folders with one API call, so if --files-only finds files that are in folders that don't exist on Box yet, I think that was the reason for the "parent bucket" error...

Will test and confirm


I redid the lsf with --files-only and cleared the cache; it still shows the "bucket" and "local or remote fs" errors, but as you said that's strictly a cache error of some sort. All file copies do in fact complete correctly, so I think this was all a red herring so far :slight_smile: thanks

Glad you got it working :smile: