Using rclone for s3 -> s3 copy

hi, looking for general guidance on how to get maximum speed for s3 -> s3 copies, and s3 copies in general.

i've tried --s3-upload-concurrency=20 and --s3-chunk-size=100M but get speeds of around 20MB/s, which is the same as the defaults. what other configuration can i change to try to speed things up? i have around 32 GB of RAM on the machine to work with, if that matters.

also, rclone --version: v1.66.0

welcome to the forum,

often the local internet is the limiting factor. can you post the result of a speed test?

when you posted, you were asked to answer a template of questions.
help us to help you and answer them....

Hi, download speed is around 400 MB and upload is 140-145 MB, using the Amazon test server on speedtest.net

Here is the template answered:

What is the problem you are having with rclone?

Relatively slow speeds observed when trying to use rclone to copy from s3 bucket to another s3 bucket.

Run the command 'rclone version' and share the full output of the command.

rclone v1.66.0
- os/version: darwin 12.7.5 (64 bit)
- os/kernel: 21.6.0 (arm64)
- os/type: darwin
- os/arch: arm64 (ARMv8 compatible)
- go/version: go1.22.1
- go/linking: dynamic
- go/tags: none

Which cloud storage system are you using? (eg Google Drive)

S3

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone copy :s3,access_key_id=redacted,secret_access_key=redacted,region=us-east-2:bucket-name-a/file.example.txt :s3,access_key_id=redacted,secret_access_key=redacted,region=us-east-2:bucket-name-b/file.example.txt -vv -P

Please run 'rclone config redacted' and share the full output. If you get command not found, please make sure to update rclone.

; empty config

Note: I do not currently have a config file created (but maybe having one would help?)

A log from the command that you were trying to run with the -vv flag

2024/07/15 02:33:18 DEBUG : rclone: Version "v1.66.0" starting with parameters ["rclone" "copy" ":s3,access_key_id=redacted,secret_access_key=redacted,region=us-east-2:bucket-name-a/file.example.txt" ":s3,access_key_id=redacted,secret_access_key=redacted,region=us-east-2:bucket-name-b/file.example.txt" "-P" "-vv"]
2024/07/15 02:33:18 DEBUG : Creating backend with remote ":s3,access_key_id=redacted,secret_access_key=redacted,region=us-east-2:bucket-name-a/file.example.txt"
2024/07/15 02:33:18 NOTICE: Config file "~/.config/rclone/rclone.conf" not found - using defaults
2024/07/15 02:33:18 DEBUG : :s3: detected overridden config - adding "{pJafe}" suffix to name
2024/07/15 02:33:18 NOTICE: s3: s3 provider "" not known - please set correctly
2024/07/15 02:33:19 DEBUG : fs cache: adding new entry for parent of ":s3,access_key_id=redacted,secret_access_key=redacted,region=us-east-2:bucket-name-a/file.example.txt", ":s3{pJafe}:bucket-name-a"
2024/07/15 02:33:19 DEBUG : Creating backend with remote ":s3,access_key_id=redacted,secret_access_key=redacted,region=us-east-2:bucket-name-b/file.example.txt"
2024/07/15 02:33:19 DEBUG : :s3: detected overridden config - adding "{GJrII}" suffix to name
2024/07/15 02:33:19 NOTICE: s3: s3 provider "" not known - please set correctly
2024/07/15 02:33:19 DEBUG : fs cache: renaming cache item ":s3,access_key_id=redacted,secret_access_key=redacted,region=us-east-2:bucket-name-b/file.example.txt" to be canonical ":s3{GJrII}:bucket-name-b/upload/file.example.txt"
2024/07/15 02:33:19 DEBUG : file.example.txt: Need to transfer - File not found at Destination
2024/07/15 02:33:19 DEBUG : file.example.txt: size: 593.035Gi, parts: 10000, default: 5Mi, new: 61Mi; default chunk size insufficient, returned new chunk size
2024/07/15 02:33:19 DEBUG : file.example.txt: open chunk writer: started multipart upload: [redacted]
2024/07/15 02:33:19 DEBUG : file.example.txt: multi-thread copy: using backend concurrency of 4 instead of --multi-thread-streams 4
2024/07/15 02:33:19 DEBUG : file.example.txt: Starting multi-thread copy with 9956 chunks of size 61Mi with 4 parallel streams
2024/07/15 02:33:19 DEBUG : file.example.txt: multi-thread copy: chunk 4/9956 (191889408-255852544) size 61Mi starting
2024/07/15 02:33:19 DEBUG : file.example.txt: multi-thread copy: chunk 2/9956 (63963136-127926272) size 61Mi starting
2024/07/15 02:33:19 DEBUG : file.example.txt: multi-thread copy: chunk 1/9956 (0-63963136) size 61Mi starting
2024/07/15 02:33:19 DEBUG : file.example.txt: multi-thread copy: chunk 3/9956 (127926272-191889408) size 61Mi starting
2024/07/15 02:33:53 DEBUG : file.example.txt: multipart upload wrote chunk 4 with 63963136 bytes and etag "..."
2024/07/15 02:33:53 DEBUG : file.example.txt: multi-thread copy: chunk 4/9956 (191889408-255852544) size 61Mi finished
2024/07/15 02:33:53 DEBUG : file.example.txt: multi-thread copy: chunk 5/9956 (255852544-319815680) size 61Mi starting
2024/07/15 02:33:54 DEBUG : file.example.txt: multipart upload wrote chunk 1 with 63963136 bytes and etag "..."
2024/07/15 02:33:54 DEBUG : file.example.txt: multi-thread copy: chunk 1/9956 (0-63963136) size 61Mi finished
2024/07/15 02:33:54 DEBUG : file.example.txt: multi-thread copy: chunk 6/9956 (319815680-383778816) size 61Mi starting
2024/07/15 02:33:55 DEBUG : file.example.txt: multipart upload wrote chunk 2 with 63963136 bytes and etag "..."
2024/07/15 02:33:55 DEBUG : file.example.txt: multi-thread copy: chunk 2/9956 (63963136-127926272) size 61Mi finished
2024/07/15 02:33:55 DEBUG : file.example.txt: multi-thread copy: chunk 7/9956 (383778816-447741952) size 61Mi starting
2024/07/15 02:33:56 DEBUG : file.example.txt: multipart upload wrote chunk 3 with 63963136 bytes and etag "..."
2024/07/15 02:33:56 DEBUG : file.example.txt: multi-thread copy: chunk 3/9956 (127926272-191889408) size 61Mi finished
2024/07/15 02:33:56 DEBUG : file.example.txt: multi-thread copy: chunk 8/9956 (447741952-511705088) size 61Mi starting
2024/07/15 02:34:22 DEBUG : file.example.txt: multipart upload wrote chunk 5 with 63963136 bytes and etag "..."
2024/07/15 02:34:22 DEBUG : file.example.txt: multi-thread copy: chunk 5/9956 (255852544-319815680) size 61Mi finished
2024/07/15 02:34:22 DEBUG : file.example.txt: multi-thread copy: chunk 9/9956 (511705088-575668224) size 61Mi starting
2024/07/15 02:34:25 DEBUG : file.example.txt: multipart upload wrote chunk 7 with 63963136 bytes and etag "..."
2024/07/15 02:34:25 DEBUG : file.example.txt: multi-thread copy: chunk 7/9956 (383778816-447741952) size 61Mi finished
2024/07/15 02:34:25 DEBUG : file.example.txt: multi-thread copy: chunk 10/9956 (575668224-639631360) size 61Mi starting
2024/07/15 02:34:26 DEBUG : file.example.txt: multipart upload wrote chunk 6 with 63963136 bytes and etag "..."
2024/07/15 02:34:26 DEBUG : file.example.txt: multi-thread copy: chunk 6/9956 (319815680-383778816) size 61Mi finished
2024/07/15 02:34:26 DEBUG : file.example.txt: multi-thread copy: chunk 11/9956 (639631360-703594496) size 61Mi starting
2024/07/15 02:34:26 DEBUG : file.example.txt: multipart upload wrote chunk 8 with 63963136 bytes and etag "..."
2024/07/15 02:34:26 DEBUG : file.example.txt: multi-thread copy: chunk 8/9956 (447741952-511705088) size 61Mi finished
2024/07/15 02:34:26 DEBUG : file.example.txt: multi-thread copy: chunk 12/9956 (703594496-767557632) size 61Mi starting
2024/07/15 02:34:39 DEBUG : file.example.txt: multipart upload wrote chunk 9 with 63963136 bytes and etag "..."
2024/07/15 02:34:39 DEBUG : file.example.txt: multi-thread copy: chunk 9/9956 (511705088-575668224) size 61Mi finished
2024/07/15 02:34:39 DEBUG : file.example.txt: multi-thread copy: chunk 13/9956 (767557632-831520768) size 61Mi starting
2024/07/15 02:34:57 DEBUG : file.example.txt: multipart upload wrote chunk 12 with 63963136 bytes and etag "..."
2024/07/15 02:34:57 DEBUG : file.example.txt: multi-thread copy: chunk 12/9956 (703594496-767557632) size 61Mi finished
2024/07/15 02:34:57 DEBUG : file.example.txt: multi-thread copy: chunk 14/9956 (831520768-895483904) size 61Mi starting
2024/07/15 02:35:00 DEBUG : file.example.txt: multipart upload wrote chunk 11 with 63963136 bytes and etag "..."
2024/07/15 02:35:00 DEBUG : file.example.txt: multi-thread copy: chunk 11/9956 (639631360-703594496) size 61Mi finished
2024/07/15 02:35:00 DEBUG : file.example.txt: multi-thread copy: chunk 15/9956 (895483904-959447040) size 61Mi starting
2024/07/15 02:35:00 DEBUG : file.example.txt: multipart upload wrote chunk 10 with 63963136 bytes and etag "..."
2024/07/15 02:35:00 DEBUG : file.example.txt: multi-thread copy: chunk 10/9956 (575668224-639631360) size 61Mi finished
2024/07/15 02:35:00 DEBUG : file.example.txt: multi-thread copy: chunk 16/9956 (959447040-1023410176) size 61Mi starting
2024/07/15 02:35:04 DEBUG : file.example.txt: multipart upload wrote chunk 13 with 63963136 bytes and etag "..."
2024/07/15 02:35:04 DEBUG : file.example.txt: multi-thread copy: chunk 13/9956 (767557632-831520768) size 61Mi finished
2024/07/15 02:35:04 DEBUG : file.example.txt: multi-thread copy: chunk 17/9956 (1023410176-1087373312) size 61Mi starting
Transferred:      906.969 MiB / 593.035 GiB, 0%, 8.149 MiB/s, ETA 20h40m4s
Transferred:            0 / 1, 0%
Elapsed time:      1m58.3s
Transferring:
 * file.example.txt:  0% /593.035Gi, 8.147Mi/s, 20h40m23s

This is just the first few minutes of the log after starting the command, but the upload stays at this speed throughout.
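As an aside, the "new: 61Mi" chunk size in the log follows from S3's 10,000-part limit on multipart uploads: rclone bumps the chunk size until the whole file fits in 10,000 parts. A quick sketch of that arithmetic (file size taken from the log; rounding up to a whole MiB is an assumption about rclone's exact rule):

```shell
# S3 multipart uploads allow at most 10,000 parts, so for a 593.035 GiB
# object the minimum part size is ceil(size / 10000); rounding that up to
# a whole MiB (assumed) gives the 61 MiB chunk size seen in the log.
FILE_SIZE=$(awk 'BEGIN { printf "%.0f", 593.035 * 1024^3 }')  # bytes
MIN_PART=$(( (FILE_SIZE + 9999) / 10000 ))                    # ceil(size/10000)
CHUNK_MIB=$(( (MIN_PART + 1024*1024 - 1) / (1024*1024) ))     # round up to MiB
echo "minimum chunk size: ${CHUNK_MIB} MiB"                   # 61 MiB
```

This matches the log exactly: 61 MiB is 63963136 bytes, the chunk size reported for each part.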

the log shows NOTICE: s3: s3 provider "" not known - need to fix that and test again.

for the copy command, the dest needs to be a bucket or directory, not filename.

need to fix that and test again.

I am now including --s3-provider=AWS in my command; this has addressed that NOTICE message.

for the copy command, the dest needs to be a bucket or directory, not filename.

I've redacted/replaced the real file names for privacy; the actual ones look more like s3://bucket-a/path/file.ext and s3://bucket-b/path/file.ext

I am still seeing the same speeds and the same style of log messages as before. Note that I am still not passing in any rclone.conf file (so defaults are used).

fwiw, for testing, create two remotes, one for the source and one for the dest.
it really simplifies debugging.
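For example, a minimal rclone.conf along those lines (remote names and key values are placeholders):

```ini
; hypothetical ~/.config/rclone/rclone.conf - one named remote per account
[src-s3]
type = s3
provider = AWS
access_key_id = XXX
secret_access_key = XXX
region = us-east-2

[dst-s3]
type = s3
provider = AWS
access_key_id = XXX
secret_access_key = XXX
region = us-east-2
```

then the copy becomes `rclone copy src-s3:bucket-name-a dst-s3:bucket-name-b -vv -P`, with no credentials on the command line and no "overridden config" suffixes in the log.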

  1. are the source and dest both in the same region?
  2. access_key_id=redacted - do the source and dest use the same access_key_id?

Both buckets are in the same region (us-east-2) but in different AWS accounts. I am using a different set of IAM keys to authenticate to each (a unique IAM key for the first bucket, and a unique IAM key for the second).

This is enforced by security policy (an IAM key must belong to the same account as the bucket it accesses).

yes, i do the same.

i would think the copy should be server-side, maybe try --server-side-across-configs

can you post a full debug log?
test a smaller file, perhaps 5GiB.
to keep the log small, increase the chunk size to something like 256M or larger.

When I pass --server-side-across-configs I get 403 Access Denied errors and the copy fails right at the start (the upload never begins and the program exits). Does that mean a server-side copy isn't actually possible, or is something misconfigured on my end?

Also, here is a debug log of me copying a 5 GB file from bucketA to bucketB, in different accounts but the same region. It took around 4m30s for the full copy.

I passed in: --s3-provider=AWS --s3-upload-concurrency=20 --s3-chunk-size=100M --s3-disable-checksum --transfers=20 -P and -vv

NOTE: I also tried the same command without the upload-concurrency, chunk-size, disable-checksum, and transfers options. That run took around 8m55s (log not shown)

2024/07/15 12:39:11 DEBUG : rclone: Version "v1.66.0" starting with parameters ["rclone" "copy" ":s3,access_key_id=redacted,secret_access_key=redacted,session_token=redacted,region=us-east-2:bucketA/5GBFile.txt" ":s3,access_key_id=redacted,secret_access_key=redacted,session_token=redacted,region=us-east-2:bucketB/5GBFile.txt" "--s3-provider=AWS" "--s3-upload-concurrency=20" "--s3-chunk-size=100M" "--s3-disable-checksum" "--transfers=20" "-P" "-vv"]
2024/07/15 12:39:11 DEBUG : Creating backend with remote ":s3,access_key_id=redacted,secret_access_key=redacted,session_token=redacted,region=us-east-2:bucketA/5GBFile.txt"
2024/07/15 12:39:11 NOTICE: Config file "~/.config/rclone/rclone.conf" not found - using defaults
2024/07/15 12:39:11 DEBUG : :s3: detected overridden config - adding "{xS_iI}" suffix to name
2024/07/15 12:39:11 DEBUG : fs cache: adding new entry for parent of ":s3,access_key_id=redacted,secret_access_key=redacted,session_token=redacted,region=us-east-2:bucketA/5GBFile.txt", ":s3{xS_iI}:bucketA"
2024/07/15 12:39:11 DEBUG : Creating backend with remote ":s3,access_key_id=redacted,secret_access_key=redacted,session_token=redacted,region=us-east-2:bucketB/5GBFile.txt"
2024/07/15 12:39:11 DEBUG : :s3: detected overridden config - adding "{Bv0zq}" suffix to name
2024/07/15 12:39:11 DEBUG : fs cache: renaming cache item ":s3,access_key_id=redacted,secret_access_key=redacted,session_token=redacted,region=us-east-2:bucketB/5GBFile.txt" to be canonical ":s3{Bv0zq}:bucketB/5GBFile.txt"
2024/07/15 12:39:11 DEBUG : 5GBFile.txt: Need to transfer - File not found at Destination
2024/07/15 12:39:12 DEBUG : 5GBFile.txt: open chunk writer: started multipart upload: .FOGNBNyDOaNZjzJrUzccGuG7B5ZqYJBsCqpscMms_81X5HTD4Q7XB4eXoErK50rDzFKP5i7LLFTdZDcbhsWdh5PoGpWOAdAPubzPoWzYxaYjLqB38XSHLESOy3QPN2K4zWoHrt2GnUdZy8NBlQ4E6tKCk1ejSzAeSg4JOD403E-
2024/07/15 12:39:12 DEBUG : 5GBFile.txt: multi-thread copy: using backend concurrency of 20 instead of --multi-thread-streams 4
2024/07/15 12:39:12 DEBUG : 5GBFile.txt: Starting multi-thread copy with 52 chunks of size 100Mi with 20 parallel streams
2024/07/15 12:39:12 DEBUG : 5GBFile.txt: multi-thread copy: chunk 18/52 (1782579200-1887436800) size 100Mi starting
2024/07/15 12:39:12 DEBUG : 5GBFile.txt: multi-thread copy: chunk 1/52 (0-104857600) size 100Mi starting
2024/07/15 12:39:12 DEBUG : 5GBFile.txt: multi-thread copy: chunk 6/52 (524288000-629145600) size 100Mi starting
2024/07/15 12:39:12 DEBUG : 5GBFile.txt: multi-thread copy: chunk 20/52 (1992294400-2097152000) size 100Mi starting
2024/07/15 12:39:12 DEBUG : 5GBFile.txt: multi-thread copy: chunk 9/52 (838860800-943718400) size 100Mi starting
2024/07/15 12:39:12 DEBUG : 5GBFile.txt: multi-thread copy: chunk 16/52 (1572864000-1677721600) size 100Mi starting
2024/07/15 12:39:12 DEBUG : 5GBFile.txt: multi-thread copy: chunk 17/52 (1677721600-1782579200) size 100Mi starting
2024/07/15 12:39:12 DEBUG : 5GBFile.txt: multi-thread copy: chunk 14/52 (1363148800-1468006400) size 100Mi starting
2024/07/15 12:39:12 DEBUG : 5GBFile.txt: multi-thread copy: chunk 19/52 (1887436800-1992294400) size 100Mi starting
2024/07/15 12:39:12 DEBUG : 5GBFile.txt: multi-thread copy: chunk 13/52 (1258291200-1363148800) size 100Mi starting
2024/07/15 12:39:12 DEBUG : 5GBFile.txt: multi-thread copy: chunk 8/52 (734003200-838860800) size 100Mi starting
2024/07/15 12:39:12 DEBUG : 5GBFile.txt: multi-thread copy: chunk 12/52 (1153433600-1258291200) size 100Mi starting
2024/07/15 12:39:12 DEBUG : 5GBFile.txt: multi-thread copy: chunk 7/52 (629145600-734003200) size 100Mi starting
2024/07/15 12:39:12 DEBUG : 5GBFile.txt: multi-thread copy: chunk 2/52 (104857600-209715200) size 100Mi starting
2024/07/15 12:39:12 DEBUG : 5GBFile.txt: multi-thread copy: chunk 3/52 (209715200-314572800) size 100Mi starting
2024/07/15 12:39:12 DEBUG : 5GBFile.txt: multi-thread copy: chunk 10/52 (943718400-1048576000) size 100Mi starting
2024/07/15 12:39:12 DEBUG : 5GBFile.txt: multi-thread copy: chunk 15/52 (1468006400-1572864000) size 100Mi starting
2024/07/15 12:39:12 DEBUG : 5GBFile.txt: multi-thread copy: chunk 4/52 (314572800-419430400) size 100Mi starting
2024/07/15 12:39:12 DEBUG : 5GBFile.txt: multi-thread copy: chunk 5/52 (419430400-524288000) size 100Mi starting
2024/07/15 12:39:12 DEBUG : 5GBFile.txt: multi-thread copy: chunk 11/52 (1048576000-1153433600) size 100Mi starting
2024/07/15 12:40:48 DEBUG : 5GBFile.txt: multipart upload wrote chunk 7 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:40:48 DEBUG : 5GBFile.txt: multi-thread copy: chunk 7/52 (629145600-734003200) size 100Mi finished
2024/07/15 12:40:48 DEBUG : 5GBFile.txt: multi-thread copy: chunk 21/52 (2097152000-2202009600) size 100Mi starting
2024/07/15 12:40:49 DEBUG : 5GBFile.txt: multipart upload wrote chunk 3 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:40:49 DEBUG : 5GBFile.txt: multi-thread copy: chunk 3/52 (209715200-314572800) size 100Mi finished
2024/07/15 12:40:49 DEBUG : 5GBFile.txt: multi-thread copy: chunk 22/52 (2202009600-2306867200) size 100Mi starting
2024/07/15 12:40:50 DEBUG : 5GBFile.txt: multipart upload wrote chunk 13 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:40:50 DEBUG : 5GBFile.txt: multi-thread copy: chunk 13/52 (1258291200-1363148800) size 100Mi finished
2024/07/15 12:40:50 DEBUG : 5GBFile.txt: multi-thread copy: chunk 23/52 (2306867200-2411724800) size 100Mi starting
2024/07/15 12:40:51 DEBUG : 5GBFile.txt: multipart upload wrote chunk 19 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:40:51 DEBUG : 5GBFile.txt: multi-thread copy: chunk 19/52 (1887436800-1992294400) size 100Mi finished
2024/07/15 12:40:51 DEBUG : 5GBFile.txt: multi-thread copy: chunk 24/52 (2411724800-2516582400) size 100Mi starting
2024/07/15 12:40:51 DEBUG : 5GBFile.txt: multipart upload wrote chunk 10 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:40:51 DEBUG : 5GBFile.txt: multi-thread copy: chunk 10/52 (943718400-1048576000) size 100Mi finished
2024/07/15 12:40:51 DEBUG : 5GBFile.txt: multi-thread copy: chunk 25/52 (2516582400-2621440000) size 100Mi starting
2024/07/15 12:40:52 DEBUG : 5GBFile.txt: multipart upload wrote chunk 6 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:40:52 DEBUG : 5GBFile.txt: multi-thread copy: chunk 6/52 (524288000-629145600) size 100Mi finished
2024/07/15 12:40:52 DEBUG : 5GBFile.txt: multi-thread copy: chunk 26/52 (2621440000-2726297600) size 100Mi starting
2024/07/15 12:40:55 DEBUG : 5GBFile.txt: multipart upload wrote chunk 17 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:40:55 DEBUG : 5GBFile.txt: multi-thread copy: chunk 17/52 (1677721600-1782579200) size 100Mi finished
2024/07/15 12:40:55 DEBUG : 5GBFile.txt: multi-thread copy: chunk 27/52 (2726297600-2831155200) size 100Mi starting
2024/07/15 12:40:57 DEBUG : 5GBFile.txt: multipart upload wrote chunk 18 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:40:57 DEBUG : 5GBFile.txt: multi-thread copy: chunk 18/52 (1782579200-1887436800) size 100Mi finished
2024/07/15 12:40:57 DEBUG : 5GBFile.txt: multi-thread copy: chunk 28/52 (2831155200-2936012800) size 100Mi starting
2024/07/15 12:40:57 DEBUG : 5GBFile.txt: multipart upload wrote chunk 5 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:40:57 DEBUG : 5GBFile.txt: multi-thread copy: chunk 5/52 (419430400-524288000) size 100Mi finished
2024/07/15 12:40:57 DEBUG : 5GBFile.txt: multi-thread copy: chunk 29/52 (2936012800-3040870400) size 100Mi starting
2024/07/15 12:40:59 DEBUG : 5GBFile.txt: multipart upload wrote chunk 14 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:40:59 DEBUG : 5GBFile.txt: multi-thread copy: chunk 14/52 (1363148800-1468006400) size 100Mi finished
2024/07/15 12:40:59 DEBUG : 5GBFile.txt: multi-thread copy: chunk 30/52 (3040870400-3145728000) size 100Mi starting
2024/07/15 12:41:03 DEBUG : 5GBFile.txt: multipart upload wrote chunk 1 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:41:03 DEBUG : 5GBFile.txt: multi-thread copy: chunk 1/52 (0-104857600) size 100Mi finished
2024/07/15 12:41:03 DEBUG : 5GBFile.txt: multi-thread copy: chunk 31/52 (3145728000-3250585600) size 100Mi starting
2024/07/15 12:41:04 DEBUG : 5GBFile.txt: multipart upload wrote chunk 2 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:41:04 DEBUG : 5GBFile.txt: multi-thread copy: chunk 2/52 (104857600-209715200) size 100Mi finished
2024/07/15 12:41:04 DEBUG : 5GBFile.txt: multi-thread copy: chunk 32/52 (3250585600-3355443200) size 100Mi starting
2024/07/15 12:41:04 DEBUG : 5GBFile.txt: multipart upload wrote chunk 20 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:41:04 DEBUG : 5GBFile.txt: multi-thread copy: chunk 20/52 (1992294400-2097152000) size 100Mi finished
2024/07/15 12:41:04 DEBUG : 5GBFile.txt: multi-thread copy: chunk 33/52 (3355443200-3460300800) size 100Mi starting
2024/07/15 12:41:04 DEBUG : 5GBFile.txt: multipart upload wrote chunk 12 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:41:04 DEBUG : 5GBFile.txt: multi-thread copy: chunk 12/52 (1153433600-1258291200) size 100Mi finished
2024/07/15 12:41:04 DEBUG : 5GBFile.txt: multi-thread copy: chunk 34/52 (3460300800-3565158400) size 100Mi starting
2024/07/15 12:41:05 DEBUG : 5GBFile.txt: multipart upload wrote chunk 4 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:41:05 DEBUG : 5GBFile.txt: multi-thread copy: chunk 4/52 (314572800-419430400) size 100Mi finished
2024/07/15 12:41:05 DEBUG : 5GBFile.txt: multi-thread copy: chunk 35/52 (3565158400-3670016000) size 100Mi starting
2024/07/15 12:41:07 DEBUG : 5GBFile.txt: multipart upload wrote chunk 8 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:41:07 DEBUG : 5GBFile.txt: multi-thread copy: chunk 8/52 (734003200-838860800) size 100Mi finished
2024/07/15 12:41:07 DEBUG : 5GBFile.txt: multi-thread copy: chunk 36/52 (3670016000-3774873600) size 100Mi starting
2024/07/15 12:41:08 DEBUG : 5GBFile.txt: multipart upload wrote chunk 15 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:41:08 DEBUG : 5GBFile.txt: multi-thread copy: chunk 15/52 (1468006400-1572864000) size 100Mi finished
2024/07/15 12:41:08 DEBUG : 5GBFile.txt: multi-thread copy: chunk 37/52 (3774873600-3879731200) size 100Mi starting
2024/07/15 12:41:08 DEBUG : 5GBFile.txt: multipart upload wrote chunk 9 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:41:08 DEBUG : 5GBFile.txt: multi-thread copy: chunk 9/52 (838860800-943718400) size 100Mi finished
2024/07/15 12:41:08 DEBUG : 5GBFile.txt: multi-thread copy: chunk 38/52 (3879731200-3984588800) size 100Mi starting
2024/07/15 12:41:09 DEBUG : 5GBFile.txt: multipart upload wrote chunk 11 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:41:09 DEBUG : 5GBFile.txt: multi-thread copy: chunk 11/52 (1048576000-1153433600) size 100Mi finished
2024/07/15 12:41:09 DEBUG : 5GBFile.txt: multi-thread copy: chunk 39/52 (3984588800-4089446400) size 100Mi starting
2024/07/15 12:41:17 DEBUG : 5GBFile.txt: multipart upload wrote chunk 16 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:41:17 DEBUG : 5GBFile.txt: multi-thread copy: chunk 16/52 (1572864000-1677721600) size 100Mi finished
2024/07/15 12:41:17 DEBUG : 5GBFile.txt: multi-thread copy: chunk 40/52 (4089446400-4194304000) size 100Mi starting
2024/07/15 12:42:10 DEBUG : 5GBFile.txt: multipart upload wrote chunk 21 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:42:10 DEBUG : 5GBFile.txt: multi-thread copy: chunk 21/52 (2097152000-2202009600) size 100Mi finished
2024/07/15 12:42:10 DEBUG : 5GBFile.txt: multi-thread copy: chunk 41/52 (4194304000-4299161600) size 100Mi starting
2024/07/15 12:42:10 DEBUG : 5GBFile.txt: multipart upload wrote chunk 24 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:42:10 DEBUG : 5GBFile.txt: multi-thread copy: chunk 24/52 (2411724800-2516582400) size 100Mi finished
2024/07/15 12:42:10 DEBUG : 5GBFile.txt: multi-thread copy: chunk 42/52 (4299161600-4404019200) size 100Mi starting
2024/07/15 12:42:10 DEBUG : 5GBFile.txt: multipart upload wrote chunk 23 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:42:10 DEBUG : 5GBFile.txt: multi-thread copy: chunk 23/52 (2306867200-2411724800) size 100Mi finished
2024/07/15 12:42:10 DEBUG : 5GBFile.txt: multi-thread copy: chunk 43/52 (4404019200-4508876800) size 100Mi starting
2024/07/15 12:42:12 DEBUG : 5GBFile.txt: multipart upload wrote chunk 27 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:42:12 DEBUG : 5GBFile.txt: multi-thread copy: chunk 27/52 (2726297600-2831155200) size 100Mi finished
2024/07/15 12:42:12 DEBUG : 5GBFile.txt: multi-thread copy: chunk 44/52 (4508876800-4613734400) size 100Mi starting
2024/07/15 12:42:22 DEBUG : 5GBFile.txt: multipart upload wrote chunk 22 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:42:22 DEBUG : 5GBFile.txt: multi-thread copy: chunk 22/52 (2202009600-2306867200) size 100Mi finished
2024/07/15 12:42:22 DEBUG : 5GBFile.txt: multi-thread copy: chunk 45/52 (4613734400-4718592000) size 100Mi starting
2024/07/15 12:42:34 DEBUG : 5GBFile.txt: multipart upload wrote chunk 38 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:42:34 DEBUG : 5GBFile.txt: multi-thread copy: chunk 38/52 (3879731200-3984588800) size 100Mi finished
2024/07/15 12:42:34 DEBUG : 5GBFile.txt: multi-thread copy: chunk 46/52 (4718592000-4823449600) size 100Mi starting
2024/07/15 12:42:36 DEBUG : 5GBFile.txt: multipart upload wrote chunk 30 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:42:36 DEBUG : 5GBFile.txt: multi-thread copy: chunk 30/52 (3040870400-3145728000) size 100Mi finished
2024/07/15 12:42:36 DEBUG : 5GBFile.txt: multi-thread copy: chunk 47/52 (4823449600-4928307200) size 100Mi starting
2024/07/15 12:42:40 DEBUG : 5GBFile.txt: multipart upload wrote chunk 34 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:42:40 DEBUG : 5GBFile.txt: multi-thread copy: chunk 34/52 (3460300800-3565158400) size 100Mi finished
2024/07/15 12:42:40 DEBUG : 5GBFile.txt: multi-thread copy: chunk 48/52 (4928307200-5033164800) size 100Mi starting
2024/07/15 12:42:41 DEBUG : 5GBFile.txt: multipart upload wrote chunk 28 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:42:41 DEBUG : 5GBFile.txt: multi-thread copy: chunk 28/52 (2831155200-2936012800) size 100Mi finished
2024/07/15 12:42:41 DEBUG : 5GBFile.txt: multi-thread copy: chunk 49/52 (5033164800-5138022400) size 100Mi starting
2024/07/15 12:42:42 DEBUG : 5GBFile.txt: multipart upload wrote chunk 32 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:42:42 DEBUG : 5GBFile.txt: multi-thread copy: chunk 32/52 (3250585600-3355443200) size 100Mi finished
2024/07/15 12:42:42 DEBUG : 5GBFile.txt: multi-thread copy: chunk 50/52 (5138022400-5242880000) size 100Mi starting
2024/07/15 12:42:43 DEBUG : 5GBFile.txt: multipart upload wrote chunk 25 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:42:43 DEBUG : 5GBFile.txt: multi-thread copy: chunk 25/52 (2516582400-2621440000) size 100Mi finished
2024/07/15 12:42:43 DEBUG : 5GBFile.txt: multi-thread copy: chunk 51/52 (5242880000-5347737600) size 100Mi starting
2024/07/15 12:42:43 DEBUG : 5GBFile.txt: multipart upload wrote chunk 29 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:42:43 DEBUG : 5GBFile.txt: multi-thread copy: chunk 29/52 (2936012800-3040870400) size 100Mi finished
2024/07/15 12:42:43 DEBUG : 5GBFile.txt: multi-thread copy: chunk 52/52 (5347737600-5368709120) size 20Mi starting
2024/07/15 12:42:46 DEBUG : 5GBFile.txt: multipart upload wrote chunk 31 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:42:46 DEBUG : 5GBFile.txt: multi-thread copy: chunk 31/52 (3145728000-3250585600) size 100Mi finished
2024/07/15 12:42:46 DEBUG : 5GBFile.txt: multipart upload wrote chunk 40 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:42:46 DEBUG : 5GBFile.txt: multi-thread copy: chunk 40/52 (4089446400-4194304000) size 100Mi finished
2024/07/15 12:42:47 DEBUG : 5GBFile.txt: multipart upload wrote chunk 35 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:42:47 DEBUG : 5GBFile.txt: multi-thread copy: chunk 35/52 (3565158400-3670016000) size 100Mi finished
2024/07/15 12:42:47 DEBUG : 5GBFile.txt: multipart upload wrote chunk 26 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:42:47 DEBUG : 5GBFile.txt: multi-thread copy: chunk 26/52 (2621440000-2726297600) size 100Mi finished
2024/07/15 12:42:47 DEBUG : 5GBFile.txt: multipart upload wrote chunk 33 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:42:47 DEBUG : 5GBFile.txt: multi-thread copy: chunk 33/52 (3355443200-3460300800) size 100Mi finished
2024/07/15 12:42:47 DEBUG : 5GBFile.txt: multipart upload wrote chunk 39 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:42:47 DEBUG : 5GBFile.txt: multi-thread copy: chunk 39/52 (3984588800-4089446400) size 100Mi finished
2024/07/15 12:42:51 DEBUG : 5GBFile.txt: multipart upload wrote chunk 37 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:42:51 DEBUG : 5GBFile.txt: multi-thread copy: chunk 37/52 (3774873600-3879731200) size 100Mi finished
2024/07/15 12:42:52 DEBUG : 5GBFile.txt: multipart upload wrote chunk 36 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:42:52 DEBUG : 5GBFile.txt: multi-thread copy: chunk 36/52 (3670016000-3774873600) size 100Mi finished
2024/07/15 12:42:56 DEBUG : 5GBFile.txt: multipart upload wrote chunk 52 with 20971520 bytes and etag "8f4e33f3dc3e414ff94e5fb6905cba8c"
2024/07/15 12:42:56 DEBUG : 5GBFile.txt: multi-thread copy: chunk 52/52 (5347737600-5368709120) size 20Mi finished
2024/07/15 12:43:17 DEBUG : 5GBFile.txt: multipart upload wrote chunk 44 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:43:17 DEBUG : 5GBFile.txt: multi-thread copy: chunk 44/52 (4508876800-4613734400) size 100Mi finished
2024/07/15 12:43:21 DEBUG : 5GBFile.txt: multipart upload wrote chunk 42 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:43:21 DEBUG : 5GBFile.txt: multi-thread copy: chunk 42/52 (4299161600-4404019200) size 100Mi finished
2024/07/15 12:43:21 DEBUG : 5GBFile.txt: multipart upload wrote chunk 43 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:43:21 DEBUG : 5GBFile.txt: multi-thread copy: chunk 43/52 (4404019200-4508876800) size 100Mi finished
2024/07/15 12:43:24 DEBUG : 5GBFile.txt: multipart upload wrote chunk 45 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:43:24 DEBUG : 5GBFile.txt: multi-thread copy: chunk 45/52 (4613734400-4718592000) size 100Mi finished
2024/07/15 12:43:24 DEBUG : 5GBFile.txt: multipart upload wrote chunk 41 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:43:24 DEBUG : 5GBFile.txt: multi-thread copy: chunk 41/52 (4194304000-4299161600) size 100Mi finished
2024/07/15 12:43:33 DEBUG : 5GBFile.txt: multipart upload wrote chunk 46 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:43:33 DEBUG : 5GBFile.txt: multi-thread copy: chunk 46/52 (4718592000-4823449600) size 100Mi finished
2024/07/15 12:43:33 DEBUG : 5GBFile.txt: multipart upload wrote chunk 47 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:43:33 DEBUG : 5GBFile.txt: multi-thread copy: chunk 47/52 (4823449600-4928307200) size 100Mi finished
2024/07/15 12:43:37 DEBUG : 5GBFile.txt: multipart upload wrote chunk 48 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:43:37 DEBUG : 5GBFile.txt: multi-thread copy: chunk 48/52 (4928307200-5033164800) size 100Mi finished
2024/07/15 12:43:38 DEBUG : 5GBFile.txt: multipart upload wrote chunk 50 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:43:38 DEBUG : 5GBFile.txt: multi-thread copy: chunk 50/52 (5138022400-5242880000) size 100Mi finished
2024/07/15 12:43:39 DEBUG : 5GBFile.txt: multipart upload wrote chunk 51 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:43:39 DEBUG : 5GBFile.txt: multi-thread copy: chunk 51/52 (5242880000-5347737600) size 100Mi finished
2024/07/15 12:43:39 DEBUG : 5GBFile.txt: multipart upload wrote chunk 49 with 104857600 bytes and etag "2f282b84e7e608d5852449ed940bfc51"
2024/07/15 12:43:39 DEBUG : 5GBFile.txt: multi-thread copy: chunk 49/52 (5033164800-5138022400) size 100Mi finished
2024/07/15 12:43:40 DEBUG : 5GBFile.txt: multipart upload ".FOGNBNyDOaNZjzJrUzccGuG7B5ZqYJBsCqpscMms_81X5HTD4Q7XB4eXoErK50rDzFKP5i7LLFTdZDcbhsWdh5PoGpWOAdAPubzPoWzYxaYjLqB38XSHLESOy3QPN2K4zWoHrt2GnUdZy8NBlQ4E6tKCk1ejSzAeSg4JOD403E-" finished
2024/07/15 12:43:40 DEBUG : 5GBFile.txt: Finished multi-thread copy with 52 parts of size 100Mi
2024/07/15 12:43:40 DEBUG : 5GBFile.txt: Src hash empty - aborting Dst hash check
2024/07/15 12:43:40 DEBUG : 5GBFile.txt: Dst hash empty - aborting Src hash check
2024/07/15 12:43:40 INFO  : 5GBFile.txt: Multi-thread Copied (new)
Transferred:            5 GiB / 5 GiB, 100%, 16.333 MiB/s, ETA 0s
Transferred:            1 / 1, 100%
Elapsed time:      4m29.0s
2024/07/15 12:43:40 INFO  : 
Transferred:            5 GiB / 5 GiB, 100%, 16.333 MiB/s, ETA 0s
Transferred:            1 / 1, 100%
Elapsed time:      4m29.0s

2024/07/15 12:43:40 DEBUG : 27 go routines active

I think what might be happening (if I am not mistaken) is that since I am passing in two sets of credentials, and there is no cross-account bucket policy giving either credential access to both buckets, a server-side copy is not possible; the transfer happens as a download and re-upload instead. That said, what options do I have to optimize the current setup?
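For reference, if that theory is right, a true server-side copy would need one set of credentials that both buckets trust. A minimal sketch of the kind of source-bucket policy that would grant the destination account's IAM user read access (account ID, user name, and bucket names are all placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "AllowCrossAccountRead",
    "Effect": "Allow",
    "Principal": { "AWS": "arn:aws:iam::DEST_ACCOUNT_ID:user/copy-user" },
    "Action": ["s3:GetObject", "s3:ListBucket"],
    "Resource": [
      "arn:aws:s3:::bucket-name-a",
      "arn:aws:s3:::bucket-name-a/*"
    ]
  }]
}
```

With that in place, the destination account's credentials could be used for both remotes, which is what a server-side CopyObject requires (the same credentials must read the source and write the destination).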

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.