Error while syncing the bucket

What is the problem you are having with rclone?

It's not with rclone itself but with the buckets; I'm not sure how to find the source of the errors.

While syncing the bucket from Spaces to Minio I get these errors:

Failed to copy: Forbidden: Forbidden                                             
        status code: 403, request id: 168F838099C502C5, host id:

How do I find out why these errors are popping up? The transfer is working, it has uploaded around 500 GB, but what will happen to the objects that failed?

What is your rclone version (output from rclone version)

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Ubuntu 18

Which cloud storage system are you using? (eg Google Drive)

DigitalOcean Spaces and self-hosted Minio

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone sync digital:bucketname Minio:bucketname --progress --transfers 50 --checkers 50

Also, can you share the command to copy a single folder in the bucket, as the data in it is much more important?

E.g. for a folder named media_attachments, how do I run copy for only this folder?

The rclone config contents with secrets removed.

Not required 

A log from the command with the -vv flag

Don't have one

Hello,

403 errors are usually permission errors;
if rclone does not have permission to access a file, rclone cannot copy it.

Hard to know what is going on, as you didn't post the requested information, including a debug log.
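To generate one, something like this should work (a sketch using the remote and bucket names from your sync command; adjust as needed):

rclone sync digital:bucketname Minio:bucketname --progress -vv --log-file=rclone.log

The -vv flag turns on debug output and --log-file writes it to rclone.log so you can post it here.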

Rclone version

v1.55.1

For logs, I don't understand how to use the -vv flag, so can you add an example command?

One more thing: it seems to be an error from DigitalOcean,
as visiting some media directly results in this, while other media files are shown fine.

I will first resolve this issue, then come back if the error still persists.

<Error>
<Code>NoSuchKey</Code>
<BucketName>buckname</BucketName>
<RequestId>tx000000000000016f6162e-0060e5ac31-12280137-nyc3c</RequestId>
<HostId>12280137-nyc3c-nyc3-zg03</HostId>
</Error>

Can you share the command to copy only a certain folder? That way I can retrieve the missing files.

rclone copy source:bucket/folder dest:bucket/folder
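For example, with your remotes and the media_attachments folder you mentioned, it might look like this (assuming the same bucket name on both sides):

rclone copy digital:bucketname/media_attachments Minio:bucketname/media_attachments --progress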

If you can identify a single file with the error, then for testing you can use rclone copyto.
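For example, using a made-up file path just to illustrate:

rclone copyto digital:bucketname/media_attachments/example.jpg Minio:bucketname/media_attachments/example.jpg -vv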


Thanks, I think DigitalOcean is rate-limiting the API calls for the bucket; that may explain the error, as rclone has already transferred around 500 GB of data.

I will write here if I find something useful.

Hello,

I talked to DigitalOcean support; they said it could be because of a wrong access key and secret key, but running

rclone -vv ls digital:bucketname

doesn't throw any error and lists all the files without any issue.

One weird thing is that even after showing 100K errors it's still uploading, and the size is reaching almost the size of the source bucket.

DigitalOcean bucket size: 1.6 TB
Minio bucket: 1.3 TB

What is going on? :slightly_frowning_face: I checked the uploaded media files; they are actual files, opening and playing without any issue on Minio.

        status code: 403, request id: 168FE2D8866056DB, host id:                       
Transferred:      617.962G / 618.501 GBytes, 100%, 7.697 MBytes/s, ETA 1m11s           
Errors:           1684923 (retrying may help)                                          
Checks:            706956 / 717016, 99%                                                
Transferred:        93703 / 94344, 99%                                                 
Elapsed time:  22h50m20.4s                                                             
Checking:


No point in wasting time running a large sync that has over 100,000 errors.
Best to find a single file that has the 403 error and test it with rclone copyto.

Should be quick and easy to create a new set of keys.

With S3 permissions, it is possible to be able to list a filename but not be able to read the contents of that file.
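As a rough check with a single file (the path here is just a placeholder), you can compare listing versus reading:

rclone lsl digital:bucketname/media_attachments/example.jpg
rclone cat digital:bucketname/media_attachments/example.jpg > /dev/null

If the first command works but the second returns a 403, the key can list but not read.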


Although, what do you say about the actual uploaded content?

1.3 TB has been uploaded; that is what I can't get my head around. The total size is 1.6 TB.

I will create a new set of keys and test with a single file that throws that error.

When you find that single file, run a command like this to try to copy that file to local, and post the rclone.log:

rclone copyto digital:bucketname/file.txt /path/to/local/folder/file.txt --dump=bodies --retries=1 --low-level-retries=1 --log-level=DEBUG --log-file=rclone.log

