STOP and READ USE THIS TEMPLATE NO EXCEPTIONS - By not using this, you waste your time, our time and really hate puppies.
What is the problem you are having with rclone?
Trying to copy GitHub content from a local backup to an S3 bucket. Receiving an "object not found" error.
Run the command 'rclone version' and share the full output of the command.
./rclone --version
rclone v1.62.2
os/version: redhat 7.3 (64 bit)
os/kernel: 3.10.0-514.el7.x86_64 (x86_64)
os/type: linux
os/arch: amd64
go/version: go1.20.2
go/linking: static
go/tags: none
Yes, it seems to be the latest.
Which cloud storage system are you using? (eg Google Drive)
Amazon S3
The command you were trying to run (eg rclone copy /tmp remote:tmp)
./rclone copyto /ghedata/ghebackup/current aws-s3:pazes3ideasgitbkup/githubprod -l -P -v
or
./rclone sync /ghedata/ghebackup/current aws-s3:pazes3ideasgitbkup/githubprod -l -P -v
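For completeness, the same transfer can also be attempted as a dry run with full debug output (--dry-run and -vv are standard rclone flags), which shows what rclone would do without writing anything to the bucket:
./rclone copyto /ghedata/ghebackup/current aws-s3:pazes3ideasgitbkup/githubprod -l -P -vv --dry-run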
The rclone config contents with secrets removed.
type = s3
provider = AWS
env_auth = false
region = us-east-1
server_side_encryption = AES256
A log from the command with the -vv flag
rclone: Version "v1.62.2" starting with parameters ["./rclone" "copyto" "/ghedata/ghebackup/current" "aws-s3:pazes3ideasgitbkup/githubprod" "-l" "-P" "-vv"]
2023/04/14 09:12:11 DEBUG : Creating backend with remote "/ghedata/ghebackup/current"
2023/04/14 09:12:11 DEBUG : Using config file from "/home/ec2-user/.config/rclone/rclone.conf"
2023/04/14 09:12:11 DEBUG : local: detected overridden config - adding "{b6816}" suffix to name
2023/04/14 09:12:11 DEBUG : fs cache: adding new entry for parent of "/ghedata/ghebackup/current", "local{b6816}:/ghedata/ghebackup"
2023/04/14 09:12:11 DEBUG : Creating backend with remote "aws-s3:pazes3ideasgitbkup/"
2023/04/14 09:12:11 DEBUG : fs cache: renaming cache item "aws-s3:pazes3ideasgitbkup/" to be canonical "aws-s3:pazes3ideasgitbkup"
2023-04-14 09:12:11 ERROR : Attempt 1/3 failed with 1 errors and: object not found
2023-04-14 09:12:11 ERROR : Attempt 2/3 failed with 1 errors and: object not found
2023-04-14 09:12:11 ERROR : Attempt 3/3 failed with 1 errors and: object not found
Transferred: 0 B / 0 B, -, 0 B/s, ETA -
Errors: 1 (retrying may help)
Elapsed time: 0.0s
2023/04/14 09:12:11 INFO :
Transferred: 0 B / 0 B, -, 0 B/s, ETA -
Errors: 1 (retrying may help)
Elapsed time: 0.0s
2023/04/14 09:12:11 DEBUG : 2 go routines active
2023/04/14 09:12:11 Failed to copyto: object not found
Note: this did work yesterday and flew through the transfer. I was only looking to do an update, so I am somewhat unsure why this is happening today.
asdffdsa (jojothehumanmonkey), April 14, 2023, 1:29pm
if env_auth = false, then where is rclone to find access_key_id and secret_access_key?
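for reference, with env_auth = false rclone only reads the keys from the config file; with env_auth = true it uses the normal AWS credential chain instead (environment variables, or an instance role). A minimal sketch of the environment-variable route, assuming the remote were switched to env_auth = true:
export AWS_ACCESS_KEY_ID=REDACTED       # standard AWS variable names
export AWS_SECRET_ACCESS_KEY=REDACTED
./rclone lsd aws-s3:                    # quick check that the credentials can list buckets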
The access_key_id and secret_access_key lines are there, but to avoid revealing secrets I removed them from the post.
[aws-s3]
type = s3
provider = AWS
env_auth = false
access_key_id = REDACTED
secret_access_key = REDACTED
region = us-east-1
server_side_encryption = AES256
Both source and dest rclone ls seem to work:
rclone-v1.62.2-linux-amd64]$ ./rclone ls aws-s3:pazes3ideasgitbkup/githubprod | more
30412 audit-log/audit_log-1-2016-01-2.gz
7 audit-log/audit_log-1-2016-01-2.gz.size
168110 audit-log/audit_log-1-2016-02-2.gz
8 audit-log/audit_log-1-2016-02-2.gz.size
./rclone ls /ghedata/ghebackup/current | more
12719 authorized-keys.json
20155 enterprise.ghl
0 es-scan-complete
61 manage-password
26 mysql-binary-backup-sentinel
4378990532 mysql.sql.gz
Directory structure on the mount
df -h | grep githubprod
[ec2-user@ip-10-234-6-145 githubprod]$ df -h | grep paz
pazes3ideasgitbkup 1.0P 0 1.0P 0% /opt/githubback
[ec2-user@ip-10-234-6-145 githubprod]$ pwd
/opt/githubback/githubprod
[ec2-user@ip-10-234-6-145 githubprod]$ ls
audit-log manage-password settings.json
authorized-keys.json mysql-binary-backup-sentinel ssh-host-keys.tar
benchmarks mysql.sql.gz ssl-ca-certificates.tar
elasticsearch pages storage
enterprise.ghl password-pepper strategy
es-scan-complete redis.rdb uuid
git-hooks repositories version
logfile.txt saml-keys.tar
copy, copyto and sync all give the same error. Thanks for the help so far. Incidentally, the exact same command works on a different server, just with a different directory.
Even without the s3 remote (going directly against the OS mount, where the FUSE/S3 filesystem is mounted), it also does not work:
./rclone sync /ghedata/ghebackup/current /opt/githubback/githubprod -P -l
2023-04-14 09:52:32 ERROR : Attempt 1/3 failed with 1 errors and: object not found
2023-04-14 09:52:32 ERROR : Attempt 2/3 failed with 1 errors and: object not found
2023-04-14 09:52:32 ERROR : Attempt 3/3 failed with 1 errors and: object not found
Transferred: 0 B / 0 B, -, 0 B/s, ETA -
Errors: 1 (retrying may help)
Elapsed time: 0.1s
2023/04/14 09:52:32 Failed to sync: object not found
Going to unmount and remount the S3 filesystem and see if that is it...
asdffdsa (jojothehumanmonkey), April 14, 2023, 2:06pm
Make sure region is correct (a couple of commands to double-check it are shown below). Maybe try without server_side_encryption = AES256.
fwiw, there is no point testing rclone mount until rclone copy/sync works; it just adds another layer of confusion.
Not sure what that means, as changing the dir changes the command? So did you run a different rclone command on another machine, or what? Using the exact same config file?
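To double-check the region, ./rclone lsd aws-s3: will confirm the credentials can at least list the buckets, and if the aws cli happens to be installed (just a suggestion, not part of rclone) it can report the bucket's actual region:
aws s3api get-bucket-location --bucket pazes3ideasgitbkup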
ncw (Nick Craig-Wood), April 14, 2023, 2:22pm
Is /ghedata/ghebackup/current a symlink maybe?
You either want to use -L to copy whatever the symlink points to, or -l to copy the symlink itself. Probably -L in this case, so:
rclone copy /ghedata/ghebackup/current aws-s3:pazes3ideasgitbkup/githubprod -L -P -v
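A quick way to confirm what current actually is, using plain Linux commands rather than rclone:
ls -ld /ghedata/ghebackup/current        # prints "current -> target" if it is a symlink
readlink -f /ghedata/ghebackup/current   # shows the resolved path
For reference, -L is the short form of --copy-links (follow the symlink and copy its target) and -l is --links (copy the symlink itself, stored as a .rclonelink file).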
Thanks all. The solution was -L vs -l: current is indeed a link, and my notes were not as detailed as they needed to be. The -L change is cooking along nicely. You can consider this matter closed.
system (system), Closed, May 14, 2023, 3:04pm
This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.