Absolute paths in lsf, lsjson

I would find it useful to have the ability to output (in lsf --format, or lsjson) the absolute path of an object. Currently only the path relative to the given source path is output.

(To be fair, I've only used rclone for Azure Blob Storage so far. I read in another post that rclone deliberately never uses absolute paths. Maybe that's an architectural choice that follows from abstracting over all those different filesystems, in which case I'm sorry for asking.)

A workaround is to move the source path into the --filter. But for Azure Blob Storage, for example, that means a lot of client-side filtering instead of sending a prefix to the server. Another workaround is to prefix the output of rclone yourself, or to pipe it through jq or similar.

The background for my request is compatibility with the MinIO client. We used MinIO's Azure gateway until it was abandoned, and Azure's own az-cli is not as nice as rclone. The MinIO client was able to output the full path of objects found.

I've started an attempt to implement this. It doesn't seem very hard for fs.Objects, because they already have Root() available via their ObjectInfo. A Dir does not carry that information, though. So before I continue, I thought: let's discuss it here first.

Are you feeding the output of rclone lsf into --filter somehow? Can you describe what you are doing? I always like to have use cases for new rclone features in mind.

Rclone paths are always relative to the root of the transfer. This is so you can sync drive:path with /home/me/myfiles and rclone has identical paths to deal with from the source and destination.

It would be a moment's work with jq to prefix the path yourself...

rclone lsjson . | jq '.[].Path |= "/my/new/prefix/" + .'
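To illustrate (a sketch with made-up paths, not output from a real remote): jq's update-assignment `|=` evaluates its right-hand side once per matched Path, which is what makes the prefixing work per element.

```shell
# Prefix every Path in an lsjson-style array of objects.
echo '[{"Path":"a.txt"},{"Path":"b/c.txt"}]' \
  | jq '.[].Path |= "/my/new/prefix/" + .'
```

This prints the same array with each Path rewritten to /my/new/prefix/a.txt and /my/new/prefix/b/c.txt.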

I'd calculate the addition to the path once and stick it in the opt.

I'm not sure a new feature is needed though - I'll need a good use case!

Thank you for the quick and elaborate response.
I cannot think of a use case that would not be solved by prefixing the output myself afterwards. (I tried, to no avail, to come up with a case where there would be duplicate names.)

For the record, this is an example of how I could use --filter with --include:

# all
$ rclone lsf --recursive --files-only --format p azblob:public
another/long/path/test.txt
some/long/path/test.txt

# only in some directory
$ rclone lsf --recursive --files-only --format p azblob:public/some
long/path/test.txt

# only in some directory, output full path
$ rclone lsf --recursive --files-only --format p --include='public/some/**' azblob:
public/some/long/path/test.txt

But as said, that would filter locally and perform worse. So I'll go with jq (or sed).

