rclone confusing my bucket name with folder name?

I'm using a Laravel php artisan command to run an rclone copy of a SQL database dump, but I'm getting the following error:

Failed to copy: s3 upload: 404 Not Found: <?xml version="1.0" encoding="UTF-8"?><Error><Code>NoSuchBucket</Code><Message></Message><BucketName>sql-backups</BucketName><RequestId>txxxxxxxxx-0065f424e0-e72ab73d-fra1b</RequestId><HostId>xxxxx-fra1b-fra1-zgxx</HostId></Error>

Notice the message says (I think) that no bucket named sql-backups was found, but sql-backups is a folder inside my bucket, which is called gabotron. Both the bucket and the folder exist (I can correctly ls their contents), yet for some reason rclone is looking for a bucket named sql-backups.

This is my code:

$remoteName = env('SPACES_BUCKET');
$bucketFoldername = 'sql-backups';
$datestring = date('Y_m_d_H');
$filename = "{$dirname}/{$database}-{$datestring}.sql.gz";

$configFile = "/home/GabotronES/configs/rclone/wooloveapp.conf";

// Rclone command for copy
$rcloneCopyCommand = "rclone --config={$configFile} copy {$filename} {$remoteName}:{$bucketFoldername}";


When I print $rcloneCopyCommand to the console I get this (which seems about right):

rclone --config=/home/Gabotron/configs/rclone/laraapp.conf copy /home/Gabotron/sql-backups/laraapp-2024_03_15_10.sql.gz gabotron:sql-backups

Why is rclone trying to find a bucket that doesn't exist, even though I'm specifying the bucket name correctly?

My config file (changed credentials):

type = s3
env_auth = false
access_key_id = xxx
secret_access_key = xxxx/xxxx
endpoint = fra1.digitaloceanspaces.com
acl = public-read

Thanks in advance.


welcome to the forum,

the correct syntax is remote_name:bucket_name/folder_name

the bucket name is missing, so you might try
$bucketFoldername = 'gabotron/sql-backups';

to view your storage structure, try rclone lsd, rclone ls, rclone tree, rclone ncdu
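Putting that together, here is a minimal sketch of how the destination should be built, assuming (as described above) that the rclone remote defined in laraapp.conf is named gabotron and the Spaces bucket is also named gabotron:

```shell
# Assumptions: the rclone remote in laraapp.conf is named "gabotron",
# and the DigitalOcean Spaces bucket is also called "gabotron".
remote="gabotron"
bucket="gabotron"
folder="sql-backups"
file="/home/Gabotron/sql-backups/laraapp-2024_03_15_10.sql.gz"
conf="/home/Gabotron/configs/rclone/laraapp.conf"

# rclone expects remote:bucket/folder -- the bucket comes first,
# then the folder inside it.
dest="${remote}:${bucket}/${folder}"

echo rclone --config="$conf" copy "$file" "$dest"
# -> rclone --config=/home/Gabotron/configs/rclone/laraapp.conf copy /home/Gabotron/sql-backups/laraapp-2024_03_15_10.sql.gz gabotron:gabotron/sql-backups
```

To double-check the layout first, rclone --config="$conf" lsd gabotron: lists the buckets on the remote, and rclone --config="$conf" lsd gabotron:gabotron lists the folders inside the bucket.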

