# Issue with lsf -R --files-only - first line is blank

rclone v1.51.0 on Windows and Linux

%rcmd% lsf -R --files-only remote:folder > destfiles.txt

The root folder has no files, but the first line of the output is blank.
This only happens when the root folder has no files;
it does not happen in subfolders.

a/a.txt
b/b.txt
c/c.txt
d/dd/ddd/d.txt


Strange! What backend is this?

If you do rclone lsjson -R --files-only remote:folder > destfiles.json does the first entry have an empty "Path" item? Can you post the first entry?

remote: is an alias. I thought that when posting scripts for other users, an alias would be easier to understand.
I tried to run lsjson and pipe the output to a file, but I was getting errors on the command line that did not show up in the JSON file.

So here is the raw output for both the alias and the actual remote,
both lsf and lsjson.

"remote": {
"remote": "wasabieast2:aliasremote",
"type": "alias"
}

------------start remote:\folder

C:\data\rclone\scripts\rr\other\test>C:\data\rclone\scripts\rclone.exe lsf -R --files-only remote:\folder

a/a.txt
b/b.txt
c/c.txt
------------end remote:\folder

------------start wasabieast2:aliasremote\folder

C:\data\rclone\scripts\rr\other\test>C:\data\rclone\scripts\rclone.exe lsf -R --files-only wasabieast2:aliasremote\folder

a/a.txt
b/b.txt
c/c.txt
------------end wasabieast2:aliasremote\folder

------------start remote:folder

C:\data\rclone\scripts\rr\other\test>C:\data\rclone\scripts\rclone.exe lsjson -R --files-only remote:folder
[
{"Path":"","Name":".","Size":0,"MimeType":"application/octet-stream","ModTime":"2020-03-28T14:37:52.610400200-04:00","IsDir":false,"Tier":"STANDARD"},
{"Path":"a/a.txt","Name":"a.txt","Size":22,"MimeType":"text/plain; charset=utf-8","ModTime":"2020-03-28T10:51:27.633838900-04:00","IsDir":false,"Tier":"STANDARD"},
{"Path":"b/b.txt","Name":"b.txt","Size":2,"MimeType":"text/plain","ModTime":"2020-03-27T17:21:40.772068600-04:00","IsDir":false,"Tier":"STANDARD"},
{"Path":"c/c.txt","Name":"c.txt","Size":58,"MimeType":"text/plain; charset=utf-8","ModTime":"2020-03-28T10:51:32.639289600-04:00","IsDir":false,"Tier":"STANDARD"}
]

C:\data\rclone\scripts\rr\other\test>echo ------------end remote:folder
------------end remote:folder

------------start wasabieast2:aliasremote\folder

C:\data\rclone\scripts\rr\other\test>C:\data\rclone\scripts\rclone.exe lsjson -R --files-only wasabieast2:aliasremote\folder
[
{"Path":"","Name":".","Size":0,"MimeType":"application/octet-stream","ModTime":"2020-03-28T14:37:57.420412000-04:00","IsDir":false,"Tier":"STANDARD"},
{"Path":"a/a.txt","Name":"a.txt","Size":22,"MimeType":"text/plain; charset=utf-8","ModTime":"2020-03-28T10:51:27.633838900-04:00","IsDir":false,"Tier":"STANDARD"},
{"Path":"b/b.txt","Name":"b.txt","Size":2,"MimeType":"text/plain","ModTime":"2020-03-27T17:21:40.772068600-04:00","IsDir":false,"Tier":"STANDARD"},
{"Path":"c/c.txt","Name":"c.txt","Size":58,"MimeType":"text/plain; charset=utf-8","ModTime":"2020-03-28T10:51:32.639289600-04:00","IsDir":false,"Tier":"STANDARD"}
]
------------end wasabieast2:aliasremote\folder

Is it possible you have an object called . in your wasabi bucket? That would certainly confuse rclone in this way!

Wasabi is an S3 clone.
What do you mean by an object?

I think the minimum length for a folder name is 3 characters.

So I deleted the folder named folder and created a new empty folder named newfolder.
I am still getting:

C:\data\rclone\scripts\rr\other\test>C:\data\rclone\scripts\rclone.exe lsjson -R --files-only wasabieast2:aliasremote\newfolder
[
{"Path":"","Name":".","Size":0,"MimeType":"application/octet-stream","ModTime":"2020-03-28T15:05:12.802113600-04:00","IsDir":false,"Tier":"STANDARD"}
]

What is strange is that rclone tree returns the correct info:

/
├── a
│   └── a.txt
└── b
    └── b.txt

2 directories, 2 files
------------start wasabieast2:aliasremote\newfolder

C:\data\rclone\scripts\rr\other\test>C:\data\rclone\scripts\rclone.exe lsf -R --files-only wasabieast2:aliasremote\newfolder

a/a.txt
b/b.txt
------------end   wasabieast2:aliasremote\newfolder

------------start wasabieast2:aliasremote\newfolder

C:\data\rclone\scripts\rr\other\test>C:\data\rclone\scripts\rclone.exe lsjson -R --files-only wasabieast2:aliasremote\newfolder
[
{"Path":"","Name":".","Size":0,"MimeType":"application/octet-stream","ModTime":"2020-03-28T15:34:42.066068300-04:00","IsDir":false,"Tier":"STANDARD"},
{"Path":"a/a.txt","Name":"a.txt","Size":22,"MimeType":"text/plain","ModTime":"2020-03-28T19:18:09.000000000Z","IsDir":false,"Tier":"STANDARD"},
{"Path":"b/b.txt","Name":"b.txt","Size":2,"MimeType":"text/plain","ModTime":"2020-03-28T19:31:12.000000000Z","IsDir":false,"Tier":"STANDARD"}
]
------------end   wasabieast2:aliasremote\newfolder

rclone ls is returning incorrect info:
notice there is a file of zero length with no name.

------------start wasabieast2:aliasremote\newfolder
C:\data\rclone\scripts\rr\other\test>C:\data\rclone\scripts\rclone.exe ls wasabieast2:aliasremote\newfolder
0
22 a/a.txt
2 b/b.txt
------------end wasabieast2:aliasremote\newfolder

rclone lsl is also returning incorrect info.

Notice the timestamp of the zero-length file:

each time I re-run the command, that timestamp changes to the current time the command is run.

------------start wasabieast2:aliasremote\newfolder
C:\data\rclone\scripts\rr\other\test>C:\data\rclone\scripts\rclone.exe lsl wasabieast2:aliasremote\newfolder
0 2020-03-28 15:48:48.985526100
22 2020-03-28 15:18:09.000000000 a/a.txt
2 2020-03-28 15:31:12.000000000 b/b.txt
------------end   wasabieast2:aliasremote\newfolder

I tried to replicate this on wasabi but I couldn't.

Can you describe exactly how you created the failing environment - how did you upload the files and create the directories - so I can try to reproduce it here?

I figured it out, and why you cannot recreate it.

If I run rclone.exe sync c:\path\to\local\folder\ wasabieast2:aliasremote\folder\
there is no problem: rclone creates the folder and
rclone.exe lsf -R --files-only wasabieast2:aliasremote\folder\ outputs as expected.

Wasabi has a rebranded version of CloudBerry Explorer, a GUI front end to S3 storage.
The problem happens when I use the explorer to create another folder:
the rclone lsf output has the blank line in it.

So I logged into the wasabi.com website, created another folder, then ran rclone lsf,
and again the same problem.

This is from the dump headers output when the folder was created via Wasabi.
There is extra content in the output;
the dump headers output when rclone creates the folder does not have this:

<Contents>
<Key>folder/</Key>
<LastModified>2020-03-29T15:49:33.000Z</LastModified>
<ETag>"dummy etag"</ETag>
<Size>0</Size>
<Owner><ID>dummy id</ID><DisplayName>dummy name</DisplayName></Owner>
<StorageClass>STANDARD</StorageClass>
</Contents>
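The marker entry above can be spotted mechanically: in a ListObjects response it is a zero-length object whose key ends in a slash. A minimal sketch, using a hypothetical trimmed XML body modelled on the dump above (ETag/Owner omitted for brevity; this is not rclone's actual code):

```python
import xml.etree.ElementTree as ET

# Hypothetical trimmed ListObjects-style response body, modelled on the
# dump output shown above.
xml_body = """<ListBucketResult>
  <Contents>
    <Key>folder/</Key>
    <Size>0</Size>
  </Contents>
  <Contents>
    <Key>folder/a.txt</Key>
    <Size>22</Size>
  </Contents>
</ListBucketResult>"""

def directory_markers(body):
    """Return keys that look like directory markers: zero-length
    objects whose key ends with a slash."""
    root = ET.fromstring(body)
    markers = []
    for contents in root.iter("Contents"):
        key = contents.findtext("Key")
        size = int(contents.findtext("Size"))
        if key.endswith("/") and size == 0:
            markers.append(key)
    return markers

print(directory_markers(xml_body))  # ['folder/']
```

By this heuristic the `<Key>folder/</Key>` entry with `<Size>0</Size>` above is a marker, while real files like `folder/a.txt` are not.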


It only happens when I run rclone lsf -R on a Wasabi-created folder.
When I run the same command on the remote itself, there is no problem:

rclone lsf --files-only -R wasabieast2:aliasremote
rclone.created.folder/a.txt
wasabi.created.folder/b.txt

So what is going on is that cloudberry is creating objects called directory/ probably to hold metadata. Rclone knows how to ignore these, but I think it is also creating one at the root called . - this is the cause of the empty line.

If you use, for example, the s3cmd tool to list the bucket, you'll see exactly which objects are in there.

Thanks.

I will try s3cmd, but what would s3cmd show that rclone dump headers would not?
If a folder named . was created, I do not see it in the dump headers.

CloudBerry is not the problem;
Wasabi is the problem.

• error - if I create the folder with CloudBerry
• error - if I create the folder via the Wasabi website
• no error - if rclone creates the folder

here is the output from s3cmd.

I could be wrong, but I noticed that there are only three entries, not four;
it seems that s3://aliasremote/rclone.created.folder is missing.

C:\data\C\s3cmd>python s3cmd ls s3://aliasremote --recursive
2020-03-29 17:50        22   s3://aliasremote/rclone.created.folder/a.txt
2020-03-29 17:27         0   s3://aliasremote/wasabi.created.folder/
2020-03-29 17:29         2   s3://aliasremote/wasabi.created.folder/b.txt


C:\data\C\s3cmd>python s3cmd ls s3://aliasremote
DIR   s3://aliasremote/rclone.created.folder/
DIR   s3://aliasremote/wasabi.created.folder/


Nothing, its output is just slightly easier to read!

Rclone doesn't create folders - that isn't a concept s3 has. S3 can only store objects and cloudberry/wasabi are emulating that by creating 0 length objects ending in /.
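To make that concrete, here is a small sketch (the key list is hypothetical, modelled on the s3cmd output above) of the two ways a "folder" can appear to exist in S3-style storage: implicitly, from the prefixes of real object keys, or explicitly, from a zero-length marker object ending in a slash:

```python
# Hypothetical key/size map modelled on the s3cmd listing above.
objects = {
    "rclone.created.folder/a.txt": 22,   # folder exists only implicitly
    "wasabi.created.folder/": 0,         # explicit marker made by the web app
    "wasabi.created.folder/b.txt": 2,
}

def implied_dirs(objs):
    """Directories implied by the prefixes of real (non-marker) keys."""
    dirs = set()
    for key in objs:
        if key.endswith("/"):  # skip explicit markers here
            continue
        parts = key.split("/")[:-1]
        for i in range(1, len(parts) + 1):
            dirs.add("/".join(parts[:i]) + "/")
    return dirs

def marker_dirs(objs):
    """Directories represented by explicit zero-length marker objects."""
    return {k for k, size in objs.items() if k.endswith("/") and size == 0}

print(sorted(implied_dirs(objects)))  # both folders
print(sorted(marker_dirs(objects)))   # only the web-app-created one
```

This matches the s3cmd listing: the rclone-created folder has no object of its own, while the web-app-created one has a zero-length marker entry.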

I managed to replicate the problem by creating the directory in the web app.

$ rclone lsf -R --files-only wasabi:rclone-test2/wasabi-folder

hello.txt

That blank line is rclone's internal representation of the current directory.

Interestingly, without -R I get this, which is probably more like what I expected:

$ rclone lsf --files-only wasabi:rclone-test2/wasabi-folder
2020/03/30 17:14:56 ERROR : : Entry doesn't belong in directory "" (same as directory) - ignoring
hello.txt


So what do you think rclone should do here?

• silently ignore the directory markers?
• give an ERROR?
• something else?

So I am not going crazy being stuck at home.

Is there a use case for a blank line in the output?

Such output is often fed back to rclone or used by other scripts.

For lsf --files-only, for sure, remove that directory marker.
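In the meantime, a consuming script can filter defensively. A sketch (the sample strings are hypothetical, modelled on the buggy output shown earlier in the thread) that drops the blank line from lsf output and the empty-Path entry from lsjson output:

```python
import json

# Hypothetical lsf -R --files-only output: note the spurious leading blank line.
lsf_output = "\na/a.txt\nb/b.txt\n"

# Drop empty lines before handing the paths to another tool.
files = [line for line in lsf_output.splitlines() if line.strip()]
print(files)  # ['a/a.txt', 'b/b.txt']

# Hypothetical lsjson output: skip the bogus root entry with an empty Path.
lsjson_output = (
    '[{"Path":"","Name":".","Size":0,"IsDir":false},'
    '{"Path":"a/a.txt","Name":"a.txt","Size":22,"IsDir":false}]'
)
entries = [e for e in json.loads(lsjson_output) if e["Path"]]
print([e["Path"] for e in entries])  # ['a/a.txt']
```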

I suggest "silently ignore the directory markers"

I don't think so.

The code does attempt to do that, but it is just failing at the root.

I've fixed that and you can find it here.