What is the problem you are having with rclone?
I am trying to copy files from S3 to S3 using rclone, but I am seeing "Entry doesn't belong in directory" errors for two different buckets. As far as I can see, this happens with both named and nameless virtual folders. I found a bug report about this earlier, but it seems it is still not fixed:
Entry doesn't belong in directory.. errors when using rclone sync with s3 · Issue #1621 · rclone/rclone · GitHub
Here are my errors
ERROR : eecId14015/: Entry doesn't belong in directory "eecId14015" (same as directory) - ignoring
ERROR : : Entry doesn't belong in directory "" (same as directory) - ignoring
Run the command 'rclone version' and share the full output of the command.
rclone v1.59.1 and rclone v1.59.4
Which cloud storage system are you using? (eg Google Drive)
The command you were trying to run (eg rclone copy /tmp remote:tmp)
./rclone --progress --log-file=ssssss-log.txt --log-level DEBUG copy config1:ssssss/ config2:testbucket/
The rclone config contents with secrets removed.
[config1]
type = s3
provider = Other
endpoint = https://test.com

[config2]
type = s3
provider = Other
region = other-v2-signature
endpoint = test2.com
A log from the command with the -vv flag
1st Error debug logs
2022/09/22 16:23:30 DEBUG : fs cache: renaming cache item "config1:ssssss/" to be canonical "config1:ssssss"
2022/09/22 16:23:30 DEBUG : Creating backend with remote "config2:testbucket/"
2022/09/22 16:23:30 DEBUG : fs cache: renaming cache item "config2:testbucket/" to be canonical "config2:testbucket"
2022/09/22 16:23:34 ERROR : eecId14015/: Entry doesn't belong in directory "eecId14015" (same as directory) - ignoring
2022/09/22 16:23:34 DEBUG : S3 bucket testbucket: Waiting for checks to finish
2nd Error debug logs
2022/09/22 16:35:55 ERROR : : Entry doesn't belong in directory "" (same as directory) - ignoring
2022/09/22 16:35:57 DEBUG : S3 bucket testbucket: Waiting for checks to finish
2022/09/22 16:35:57 DEBUG : S3 bucket testbucket: Waiting for transfers to finish
I think these ERRORs are harmless - you should find all the actual files got copied OK.
Hi ncw, thanks for the response. Unfortunately they are not harmless: none of the files under these virtual folders are getting copied. The folders are being ignored, as shown in the logs. Could you please help?
Here is the bucket difference.
./rclone size config1:bucket
Total objects: 1.324k (1324)
Total size: 93.311 MiB (97844187 Byte)
./rclone size config2:testbucket
Total objects: 66 (66)
Total size: 7.401 MiB (7760560 Byte)
What exactly do the keys of the objects look like? Do they start with a /? Or have double // in them?
Yep, that's right. This is how it looks when I check with the aws cli; it is the same with rclone as well.
# aws s3 ls s3://ssssss
2018-01-24 16:05:33 15877 1.jpg
Recursive list is as below
2017-07-15 13:27:20 44632 eecId14015//440763.json
2017-07-15 13:27:21 274128 eecId14015//440763.pdf
2017-07-15 13:27:22 11091 eecId14015//42809.json
2017-07-15 13:27:23 105801 eecId14015//42809.pdf
# aws s3 ls s3://test
2017-05-09 14:27:49 3929 090517-022749.008-c.pdf
2017-05-11 16:17:02 49749 110517-041702.013.png
Recursive list as below
2017-05-11 16:13:50 98397 //8.png
2017-05-11 16:18:05 31373 //F-1.png
2017-05-08 15:00:31 19585 /test/Template.xlsx
2017-05-08 15:00:31 52418 /test/Template17.xlsx
2017-05-08 15:00:30 52301 /test/Template7.xlsx
Rclone can't deal with these at the moment.
There is a plan for a fix which involves a bit of encoding the paths.
Can you rename them?
Thanks @ncw. Unfortunately, we can't rename them: they are used by an application that expects the same structure after the data is migrated. Is there any workaround in rclone until a fix rolls out?
Nothing which doesn't involve some coding!
Adding an encode "leading slash and multiple slashes" encoder wouldn't be too tricky and it would fix the problem.
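A minimal sketch of the idea (hypothetical function names, not rclone's actual code), assuming the encoder follows rclone's existing convention of substituting the full-width solidus ／ (U+FF0F) for a / that cannot appear literally in a path segment:

```python
FULLWIDTH_SOLIDUS = "\uFF0F"  # rclone's conventional stand-in for "/"

def encode_key(key: str) -> str:
    """Replace a leading '/' and the second '/' of every '//' with a
    printable stand-in, so the key splits into non-empty segments."""
    out = []
    prev = ""
    for i, ch in enumerate(key):
        if ch == "/" and (i == 0 or prev == "/"):
            out.append(FULLWIDTH_SOLIDUS)
        else:
            out.append(ch)
        prev = ch
    return "".join(out)

def decode_key(key: str) -> str:
    """Restore the original key (assumes U+FF0F never appears otherwise)."""
    return key.replace(FULLWIDTH_SOLIDUS, "/")

print(encode_key("eecId14015//440763.json"))  # eecId14015/／440763.json
print(encode_key("//8.png"))                  # ／／8.png
print(encode_key("/test/Template.xlsx"))      # ／test/Template.xlsx
```

With this mapping the keys round-trip losslessly, and the encoded names no longer contain empty path segments, so the directory listing logic sees well-formed entries.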