helper
June 29, 2019, 6:17am
1
Hi,
I have a folder on S3 (backups) that is synced every day, but on the 1st of every month I need to copy this S3 folder to another S3 folder, /monthly/, to keep a monthly backup. I'm currently trying that with this command:
rclone copy bucket:bucket/backups/daily bucket:bucket/backups/monthly -P --tpslimit 150 --s3-upload-cutoff 800M --transfers 16 --size-only
But it takes a really long time.
How can I make this process faster?
And is there maybe a better command?
ncw
(Nick Craig-Wood)
June 29, 2019, 10:08am
2
I think that should be doing server side copies - is that the case?
How many files have you got?
If you have lots of memory you can try --fast-list, which might help.
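For reference, a sketch of the suggestion above, reusing the remote and bucket names from the original post (the --checkers value is an assumption to tune, not something from this thread):

```shell
# Same copy as before, but with --fast-list so rclone builds the object
# list with fewer List API calls. Note --fast-list holds the whole
# listing in memory (roughly on the order of 1 GB per million objects).
# --checkers 32 is an illustrative value for speeding up the
# "already transferred?" checks; adjust for your account's rate limits.
rclone copy bucket:bucket/backups/daily bucket:bucket/backups/monthly \
  -P --fast-list --size-only \
  --tpslimit 150 --transfers 16 --checkers 32
```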
helper
June 29, 2019, 12:08pm
3
Yes, I want to copy the folder server-side.
--fast-list fills the queue quickly enough; the problem is more the upload/copy speed:
Transferred: 114.846k / 1.421 GBytes, 0%, 489 Bytes/s, ETA 866h6m24s
Errors: 0
Checks: 1657 / 1657, 100%
Transferred: 127 / 3220, 4%
Elapsed time: 4m0.3s
ncw
(Nick Craig-Wood)
June 30, 2019, 7:01pm
4
I think there are some quite severe rate limits for server side copies, so maybe you've exceeded your quota. Check with -vv.
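A sketch of that debug run, again with the remote names from the original post (the log file path is just an example):

```shell
# -vv enables debug logging; the log should show for each file whether
# rclone performed a server side copy or had to download and re-upload.
# Writing to a log file makes it easier to grep for throttling errors
# (e.g. 503 SlowDown responses from S3).
rclone copy bucket:bucket/backups/daily bucket:bucket/backups/monthly \
  -P --fast-list --size-only \
  -vv --log-file=/tmp/rclone-monthly.log
```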
system
(system)
Closed
September 28, 2019, 7:01pm
5
This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.