Struggling to translate a working rclone command into a `rclone rc` one

What is the problem you are having with rclone?

rclone ls works, but what I think is the equivalent rclone rc operations/list call does not

Run the command 'rclone version' and share the full output of the command.

rclone v1.67.0
- os/version: ubuntu 22.04 (64 bit)
- os/kernel: 6.5.0-1023-aws (aarch64)
- os/type: linux
- os/arch: arm64 (ARMv8 compatible)
- go/version: go1.22.4
- go/linking: static
- go/tags: none

Which cloud storage system are you using? (eg Google Drive)

AWS S3

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone rc operations/list -vv fs="rclone_s3_target:" remote="bucketname/folder1/folder2/folder3"

Please run 'rclone config redacted' and share the full output. If you get command not found, please make sure to update rclone.

[rclone_s3_target]
type = s3
provider = AWS
env_auth = true
region = eu-central-1
storage_class = STANDARD
bucket_acl = private
acl = private
no_check_bucket = true

A log from the command that you were trying to run with the -vv flag

2024/08/08 12:39:35 DEBUG : rclone: Version "v1.67.0" starting with parameters ["rclone" "rc" "operations/list" "-vv" "fs=rclone_s3_target:" "remote=bucketname/folder1/folder2/folder3/"]
{
        "error": "error in ListJSON: AccessDenied: Access Denied\n\tstatus code: 403, request id: ZAQFPNPJMD7BHTJG, host id: 2JO1WEsj/jZjkYpIIJHkrMoAqYEfuT28YLpp55cZOvapxm7ZC66lqGN8GJxDTIVr/BAWx/T1b9UCkc4yydu22A==",
        "input": {
                "fs": "rclone_s3_target:",
                "remote": "bucketname/folder1/folder2/folder3/"
        },
        "path": "operations/list",
        "status": 500
}
2024/08/08 12:39:35 DEBUG : 6 go routines active
2024/08/08 12:39:35 Failed to rc: operation "operations/list" failed: error in ListJSON: AccessDenied: Access Denied
        status code: 403, request id: ZAQFPNPJMD7BHTJG, host id: 2JO1WEsj/jZjkYpIIJHkrMoAqYEfuT28YLpp55cZOvapxm7ZC66lqGN8GJxDTIVr/BAWx/T1b9UCkc4yydu22A==

rclone daemon config

[Unit]
Description=Rclone rcd service
After=network-online.target

[Service]
Type=notify
ExecStart=/usr/bin/rclone rcd \
  --rc-enable-metrics \
  --rc-no-auth \
  --log-level='DEBUG' \
  --log-file='/var/log/rclone-rcd.log'
Restart=on-failure
RestartSec=5
StartLimitInterval=60s
StartLimitBurst=3
TimeoutStartSec=150

[Install]
WantedBy=multi-user.target

Other info

Hello, hopefully you can find all the relevant configs, etc. above. I can list the contents of my bucket fine using the above config with "plain" rclone:

rclone ls rclone_s3_target:bucketname/folder1/folder2/folder3

I have not found many examples of rc commands, but I think the below is correct?

rclone rc operations/list -vv fs="rclone_s3_target:" remote="bucketname/folder1/folder2/folder3"

I do not know whether the Access Denied (status code 403) error I'm getting is coming from the rc daemon or from AWS. I doubt it is an AWS permissions issue, since rclone ls works, and so does aws s3 ls using the same env vars for auth.

I have tried running rcd with and without auth (when using auth I passed --rc-user and --rc-pass), and rclone rc with and without a trailing slash in remote, with and without the trailing colon in fs, and with and without quotes around both the fs and remote parameters.

Any ideas?

If you have your config locked to a single bucket, you'll want to include it in fs=, like this:

rclone rc operations/list -vv fs="rclone_s3_target:bucketname" remote="folder1/folder2/folder3"

Otherwise rclone will try to list the buckets, which may not be allowed.

That would be my guess.

If that doesn't work then run the rclone rcd with -vv --dump headers and send the log that it makes.
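As an aside, the same call can be exercised against the daemon's HTTP API directly, which helps tell daemon-side errors apart from S3-side ones. A minimal sketch of the JSON body that `rclone rc` POSTs to the operations/list endpoint (the bucket and folder names are the placeholders from this thread; the default listen address is localhost:5572):

```python
import json

# fs holds the remote name (plus the bucket, when access is locked to
# a single bucket); remote holds the path inside it.
payload = {
    "fs": "rclone_s3_target:bucketname",
    "remote": "folder1/folder2/folder3",
}

# This is the body POSTed to http://localhost:5572/operations/list
print(json.dumps(payload))
```

Sending that body with something like `curl -s -H 'Content-Type: application/json' -d @body.json http://localhost:5572/operations/list` (assuming --rc-no-auth) returns the same JSON error object shown in the log above, which makes it easy to see whether the 403 originates at the daemon or at AWS.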

Forgot to provide an update: the problem was me. I did not take into account that the rc daemon and my user session run in different contexts, meaning different sets of environment variables (which I was relying on for auth). I switched to putting the credentials in the configuration file for the s3 backend and that fixed the issue. I did not want to mess with system-wide AWS_ env vars since they might affect other resources that support them.
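For anyone who lands here with the same symptom, the fix amounts to moving the credentials into the backend section of rclone.conf instead of relying on env_auth. A sketch of what that section looks like (the key values below are placeholders, not real credentials):

```ini
[rclone_s3_target]
type = s3
provider = AWS
# env_auth = true removed: the rcd systemd service does not inherit
# the login session's AWS_* environment variables
access_key_id = XXXXXXXXXXXXXXXXXXXX
secret_access_key = XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
region = eu-central-1
storage_class = STANDARD
bucket_acl = private
acl = private
no_check_bucket = true
```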