Unable to copy files up to 5 GB to S3

What is the problem you are having with rclone?

Hello.
I'm facing errors when I copy files bigger than 1 GB from my local computer to AWS S3.
Generally speaking, the error is: Failed to copy: multi-thread copy: failed to write chunk: failed to upload chunk 1 with 5242880 bytes: NoSuchUpload: The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed.

I don't have any issue with files smaller than 1 GB. Here is the command I use for those files:

rclone copy C:\_WORK\_POC_rclone\File.bak qad-aws:Test

But when the file is bigger than 1 GB, the same command fails with the following error:

C:\_WORK\_POC_rclone\rclone-v1.66.0-windows-amd64>rclone copy C:\_WORK\_POC_rclone\File_Large.bak qad-aws:Test -vv
2024/04/03 15:26:55 DEBUG : rclone: Version "v1.66.0" starting with parameters ["rclone" "copy" "C:\\_WORK\\_POC_rclone\\File_Large.bak" "qad-aws:Test" "-vv"]
2024/04/03 15:26:55 DEBUG : Creating backend with remote "C:\\_WORK\\_POC_rclone\\File_Large.bak"
2024/04/03 15:26:55 DEBUG : Using config file from "C:\\Users\\obp\\AppData\\Roaming\\rclone\\rclone.conf"
2024/04/03 15:26:55 DEBUG : fs cache: adding new entry for parent of "C:\\_WORK\\_POC_rclone\\File_Large.bak", "//?/C:/_WORK/_POC_rclone"
2024/04/03 15:26:55 DEBUG : Creating backend with remote "qad-aws:Test"
2024/04/03 15:26:55 DEBUG : Resolving service "s3" region "us-east-2"
2024/04/03 15:26:56 DEBUG : File_Large.bak: Need to transfer - File not found at Destination
2024/04/03 15:26:56 DEBUG : File_Large.bak: multi-thread copy: disabling buffering because source is local disk
2024/04/03 15:26:56 INFO  : S3 bucket Test: Bucket "Test" created with ACL "private"
2024/04/03 15:26:58 DEBUG : File_Large.bak: open chunk writer: started multipart upload: yc2uCHfQoGxKEBSVVkZw7_qRYzRUquABUkTRnXP1MWFcDay.xYUZvK8kYQZuDG5fuUMkLcckr49l6h98kPNAJD1qAZNtvwK4ovI7wGVErZJuqWqulBja8cKvGnV5fo14
2024/04/03 15:26:58 DEBUG : File_Large.bak: multi-thread copy: using backend concurrency of 4 instead of --multi-thread-streams 4
2024/04/03 15:26:58 DEBUG : File_Large.bak: Starting multi-thread copy with 243 chunks of size 5Mi with 4 parallel streams
2024/04/03 15:26:58 DEBUG : File_Large.bak: multi-thread copy: chunk 4/243 (15728640-20971520) size 5Mi starting
2024/04/03 15:26:58 DEBUG : File_Large.bak: multi-thread copy: chunk 2/243 (5242880-10485760) size 5Mi starting
2024/04/03 15:26:58 DEBUG : File_Large.bak: multi-thread copy: chunk 3/243 (10485760-15728640) size 5Mi starting
2024/04/03 15:26:58 DEBUG : File_Large.bak: multi-thread copy: chunk 1/243 (0-5242880) size 5Mi starting
2024/04/03 15:26:58 DEBUG : File_Large.bak: Seek from 5242880 to 0
2024/04/03 15:26:58 DEBUG : File_Large.bak: Seek from 5242880 to 0
2024/04/03 15:26:58 DEBUG : File_Large.bak: Seek from 5242880 to 0
2024/04/03 15:26:58 DEBUG : File_Large.bak: Seek from 5242880 to 0
2024/04/03 15:26:58 DEBUG : File_Large.bak: multi-thread copy: chunk 2/243 failed: multi-thread copy: failed to write chunk: failed to upload chunk 2 with 5242880 bytes: NoSuchUpload: The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed.
        status code: 404, request id: CQWFZ397G2Y1J54W, host id: KL6WpWMUlVWEKNZD1eL0SdYE1GVld55XQ9QjDZ6ByiEazZRxQdGwNW3kn2ofhcePWKGGWuZVLxU=
2024/04/03 15:26:58 DEBUG : File_Large.bak: multi-thread copy: chunk 5/243 (20971520-26214400) size 5Mi starting
2024/04/03 15:26:58 DEBUG : File_Large.bak: multi-thread copy: chunk 4/243 failed: multi-thread copy: failed to write chunk: failed to upload chunk 4 with 5242880 bytes: RequestCanceled: request context canceled
caused by: context canceled
2024/04/03 15:26:58 DEBUG : File_Large.bak: multi-thread copy: chunk 3/243 failed: multi-thread copy: failed to write chunk: failed to upload chunk 3 with 5242880 bytes: RequestCanceled: request context canceled
caused by: context canceled
2024/04/03 15:26:58 DEBUG : File_Large.bak: multi-thread copy: chunk 1/243 failed: multi-thread copy: failed to write chunk: failed to upload chunk 1 with 5242880 bytes: RequestCanceled: request context canceled
caused by: context canceled
2024/04/03 15:26:58 DEBUG : File_Large.bak: Seek from 5242880 to 0
2024/04/03 15:26:58 DEBUG : File_Large.bak: multi-thread copy: chunk 5/243 failed: multi-thread copy: failed to write chunk: failed to upload chunk 5 with 5242880 bytes: RequestCanceled: request context canceled
caused by: context canceled
2024/04/03 15:26:58 DEBUG : File_Large.bak: multi-thread copy: cancelling transfer on exit
2024/04/03 15:26:58 DEBUG : File_Large.bak: multi-thread copy: abort failed: failed to abort multipart upload "yc2uCHfQoGxKEBSVVkZw7_qRYzRUquABUkTRnXP1MWFcDay.xYUZvK8kYQZuDG5fuUMkLcckr49l6h98kPNAJD1qAZNtvwK4ovI7wGVErZJuqWqulBja8cKvGnV5fo14": NoSuchUpload: The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed.
        status code: 404, request id: CQW7KYZ08MKC2C1N, host id: qECT2Xh9e1uCe1yJ0ZKW84UlkTjHJXWm0bgdEK6qIRqvkdEib2bt7NfTLt4FLHMoJwTXchRPzpw=
2024/04/03 15:26:58 ERROR : File_Large.bak: Failed to copy: multi-thread copy: failed to write chunk: failed to upload chunk 2 with 5242880 bytes: NoSuchUpload: The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed.
        status code: 404, request id: CQWFZ397G2Y1J54W, host id: KL6WpWMUlVWEKNZD1eL0SdYE1GVld55XQ9QjDZ6ByiEazZRxQdGwNW3kn2ofhcePWKGGWuZVLxU=
2024/04/03 15:26:58 ERROR : Attempt 1/3 failed with 1 errors and: multi-thread copy: failed to write chunk: failed to upload chunk 2 with 5242880 bytes: NoSuchUpload: The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed.
        status code: 404, request id: CQWFZ397G2Y1J54W, host id: KL6WpWMUlVWEKNZD1eL0SdYE1GVld55XQ9QjDZ6ByiEazZRxQdGwNW3kn2ofhcePWKGGWuZVLxU=
2024/04/03 15:26:58 DEBUG : File_Large.bak: Need to transfer - File not found at Destination
2024/04/03 15:26:58 DEBUG : File_Large.bak: multi-thread copy: disabling buffering because source is local disk
2024/04/03 15:27:00 DEBUG : File_Large.bak: open chunk writer: started multipart upload: C4p_vl5Cct3gD.2RE32NOy0rVm0n73X3nMWD9qpwl8qzoA.H81T9rxRiA86xYYK7L_hWf.SF1aXr33HJLDXeIil8VtlqENygfbNp6Zc.16tpH.yQLyxyT5WLlwfyRbfg
2024/04/03 15:27:00 DEBUG : File_Large.bak: multi-thread copy: using backend concurrency of 4 instead of --multi-thread-streams 4
2024/04/03 15:27:00 DEBUG : File_Large.bak: Starting multi-thread copy with 243 chunks of size 5Mi with 4 parallel streams
2024/04/03 15:27:00 DEBUG : File_Large.bak: multi-thread copy: chunk 4/243 (15728640-20971520) size 5Mi starting
2024/04/03 15:27:00 DEBUG : File_Large.bak: multi-thread copy: chunk 2/243 (5242880-10485760) size 5Mi starting
2024/04/03 15:27:00 DEBUG : File_Large.bak: multi-thread copy: chunk 1/243 (0-5242880) size 5Mi starting
2024/04/03 15:27:00 DEBUG : File_Large.bak: multi-thread copy: chunk 3/243 (10485760-15728640) size 5Mi starting
2024/04/03 15:27:00 DEBUG : File_Large.bak: Seek from 5242880 to 0
2024/04/03 15:27:00 DEBUG : File_Large.bak: Seek from 5242880 to 0
2024/04/03 15:27:00 DEBUG : File_Large.bak: Seek from 5242880 to 0
2024/04/03 15:27:00 DEBUG : File_Large.bak: Seek from 5242880 to 0
2024/04/03 15:27:00 DEBUG : File_Large.bak: multi-thread copy: chunk 4/243 failed: multi-thread copy: failed to write chunk: failed to upload chunk 4 with 5242880 bytes: NoSuchUpload: The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed.
        status code: 404, request id: ANH5FPRERJV8X0ZG, host id: jHUNcZ/VHeK+Y+n9+FfMpBRM/enu7wTtK8jc78dRKjU9KRstwutJvZBNZ1m2CTh+j2NmL6GtVL0=
2024/04/03 15:27:00 DEBUG : File_Large.bak: multi-thread copy: chunk 5/243 (20971520-26214400) size 5Mi starting
2024/04/03 15:27:00 DEBUG : File_Large.bak: multi-thread copy: chunk 2/243 failed: multi-thread copy: failed to write chunk: failed to upload chunk 2 with 5242880 bytes: RequestCanceled: request context canceled
caused by: context canceled
2024/04/03 15:27:00 DEBUG : File_Large.bak: multi-thread copy: chunk 3/243 failed: multi-thread copy: failed to write chunk: failed to upload chunk 3 with 5242880 bytes: RequestCanceled: request context canceled
caused by: context canceled
2024/04/03 15:27:00 DEBUG : File_Large.bak: multi-thread copy: chunk 1/243 failed: multi-thread copy: failed to write chunk: failed to upload chunk 1 with 5242880 bytes: RequestCanceled: request context canceled
caused by: context canceled
2024/04/03 15:27:00 DEBUG : File_Large.bak: Seek from 5242880 to 0
2024/04/03 15:27:00 DEBUG : File_Large.bak: multi-thread copy: chunk 5/243 failed: multi-thread copy: failed to write chunk: failed to upload chunk 5 with 5242880 bytes: RequestCanceled: request context canceled
caused by: context canceled
2024/04/03 15:27:00 DEBUG : File_Large.bak: multi-thread copy: cancelling transfer on exit
2024/04/03 15:27:01 DEBUG : File_Large.bak: multi-thread copy: abort failed: failed to abort multipart upload "C4p_vl5Cct3gD.2RE32NOy0rVm0n73X3nMWD9qpwl8qzoA.H81T9rxRiA86xYYK7L_hWf.SF1aXr33HJLDXeIil8VtlqENygfbNp6Zc.16tpH.yQLyxyT5WLlwfyRbfg": NoSuchUpload: The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed.
        status code: 404, request id: 2X01JYFPHNN7JKYB, host id: lUwRkJNFZtjdIhEtOT4mM8fPPokoUG1yKg9y7IBIvLdCk+vZIBhkoQEtw1Aiz55WoxGEFLArPv0NDIr5dDOS9A==
2024/04/03 15:27:01 ERROR : File_Large.bak: Failed to copy: multi-thread copy: failed to write chunk: failed to upload chunk 4 with 5242880 bytes: NoSuchUpload: The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed.
        status code: 404, request id: ANH5FPRERJV8X0ZG, host id: jHUNcZ/VHeK+Y+n9+FfMpBRM/enu7wTtK8jc78dRKjU9KRstwutJvZBNZ1m2CTh+j2NmL6GtVL0=
2024/04/03 15:27:01 ERROR : Attempt 2/3 failed with 1 errors and: multi-thread copy: failed to write chunk: failed to upload chunk 4 with 5242880 bytes: NoSuchUpload: The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed.
        status code: 404, request id: ANH5FPRERJV8X0ZG, host id: jHUNcZ/VHeK+Y+n9+FfMpBRM/enu7wTtK8jc78dRKjU9KRstwutJvZBNZ1m2CTh+j2NmL6GtVL0=
2024/04/03 15:27:01 DEBUG : File_Large.bak: Need to transfer - File not found at Destination
2024/04/03 15:27:01 DEBUG : File_Large.bak: multi-thread copy: disabling buffering because source is local disk
2024/04/03 15:27:03 DEBUG : File_Large.bak: open chunk writer: started multipart upload: tjgAGNszNz4mttBU8bLBVOWZrdIuuysn6XHOM22rXlAD367kQHRZ2DJ2fmi_RQJL_GQQYjL6U1SapjUCi5bH8GDf7.dKcXOOHM2iLwKxl2M3gOhvy_HiBrex78RfjoUT
2024/04/03 15:27:03 DEBUG : File_Large.bak: multi-thread copy: using backend concurrency of 4 instead of --multi-thread-streams 4
2024/04/03 15:27:03 DEBUG : File_Large.bak: Starting multi-thread copy with 243 chunks of size 5Mi with 4 parallel streams
2024/04/03 15:27:03 DEBUG : File_Large.bak: multi-thread copy: chunk 4/243 (15728640-20971520) size 5Mi starting
2024/04/03 15:27:03 DEBUG : File_Large.bak: multi-thread copy: chunk 2/243 (5242880-10485760) size 5Mi starting
2024/04/03 15:27:03 DEBUG : File_Large.bak: multi-thread copy: chunk 1/243 (0-5242880) size 5Mi starting
2024/04/03 15:27:03 DEBUG : File_Large.bak: multi-thread copy: chunk 3/243 (10485760-15728640) size 5Mi starting
2024/04/03 15:27:03 DEBUG : File_Large.bak: Seek from 5242880 to 0
2024/04/03 15:27:03 DEBUG : File_Large.bak: Seek from 5242880 to 0
2024/04/03 15:27:03 DEBUG : File_Large.bak: Seek from 5242880 to 0
2024/04/03 15:27:03 DEBUG : File_Large.bak: Seek from 5242880 to 0
2024/04/03 15:27:03 DEBUG : File_Large.bak: multi-thread copy: chunk 4/243 failed: multi-thread copy: failed to write chunk: failed to upload chunk 4 with 5242880 bytes: NoSuchUpload: The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed.
        status code: 404, request id: Q9R009THRBC0N3QZ, host id: HuZON2e85+QZDXAWq3J3aAd67DXacODyYcDEi8z7N5s3Fek4MkDoc73kvlTTMeTRLmV7crxe2mP4oRYwe5/wfA==
2024/04/03 15:27:03 DEBUG : File_Large.bak: multi-thread copy: chunk 5/243 (20971520-26214400) size 5Mi starting
2024/04/03 15:27:03 DEBUG : File_Large.bak: multi-thread copy: chunk 3/243 failed: multi-thread copy: failed to write chunk: failed to upload chunk 3 with 5242880 bytes: RequestCanceled: request context canceled
caused by: context canceled
2024/04/03 15:27:03 DEBUG : File_Large.bak: multi-thread copy: chunk 1/243 failed: multi-thread copy: failed to write chunk: failed to upload chunk 1 with 5242880 bytes: RequestCanceled: request context canceled
caused by: context canceled
2024/04/03 15:27:03 DEBUG : File_Large.bak: multi-thread copy: chunk 2/243 failed: multi-thread copy: failed to write chunk: failed to upload chunk 2 with 5242880 bytes: RequestCanceled: request context canceled
caused by: context canceled
2024/04/03 15:27:03 DEBUG : File_Large.bak: Seek from 5242880 to 0
2024/04/03 15:27:03 DEBUG : File_Large.bak: multi-thread copy: chunk 5/243 failed: multi-thread copy: failed to write chunk: failed to upload chunk 5 with 5242880 bytes: RequestCanceled: request context canceled
caused by: context canceled
2024/04/03 15:27:03 DEBUG : File_Large.bak: multi-thread copy: cancelling transfer on exit
2024/04/03 15:27:04 DEBUG : File_Large.bak: multi-thread copy: abort failed: failed to abort multipart upload "tjgAGNszNz4mttBU8bLBVOWZrdIuuysn6XHOM22rXlAD367kQHRZ2DJ2fmi_RQJL_GQQYjL6U1SapjUCi5bH8GDf7.dKcXOOHM2iLwKxl2M3gOhvy_HiBrex78RfjoUT": NoSuchUpload: The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed.
        status code: 404, request id: Q9REG3VAM152WZJG, host id: 
...
        status code: 404, request id: Q9R009THRBC0N3QZ, host id: HuZON2e85+QZDXAWq3J3aAd67DXacODyYcDEi8z7N5s3Fek4MkDoc73kvlTTMeTRLmV7crxe2mP4oRYwe5/wfA==
2024/04/03 15:27:04 INFO  :
Transferred:              0 B / 0 B, -, 0 B/s, ETA -
Errors:                 1 (retrying may help)
Elapsed time:         9.7s

2024/04/03 15:27:04 DEBUG : 7 go routines active
2024/04/03 15:27:04 Failed to copy: multi-thread copy: failed to write chunk: failed to upload chunk 4 with 5242880 bytes: NoSuchUpload: The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed.
        status code: 404, request id: Q9R009THRBC0N3QZ, host id: HuZON2e85+QZDXAWq3J3aAd67DXacODyYcDEi8z7N5s3Fek4MkDoc73kvlTTMeTRLmV7crxe2mP4oRYwe5/wfA==

But if I add "--s3-upload-cutoff=5000M --multi-thread-cutoff 5000M", it works perfectly...
If I add only "--s3-upload-cutoff=5000M" or only "--multi-thread-cutoff 5000M", it doesn't work...
And what is strange is that even with the cutoff set to 5000M, the file still seems to be uploaded in 2 parts (see the following log).

Command to copy a file bigger than 1 GB:

rclone copy --s3-upload-cutoff=5000M --multi-thread-cutoff 5000M C:\_WORK\_POC_rclone\File_Large.bak qad-aws:Test -vv
C:\_WORK\_POC_rclone\rclone-v1.66.0-windows-amd64>rclone copy --s3-upload-cutoff=5000M --multi-thread-cutoff 5000M C:\_WORK\_POC_rclone\File_Large.bak qad-aws:Test -vv
2024/04/03 15:22:31 DEBUG : rclone: Version "v1.66.0" starting with parameters ["rclone" "copy" "--s3-upload-cutoff=5000M" "--multi-thread-cutoff" "5000M" "C:\\_WORK\\_POC_rclone\\File_Large.bak" "qad-aws:Test" "-vv"]
2024/04/03 15:22:31 DEBUG : Creating backend with remote "C:\\_WORK\\_POC_rclone\\File_Large.bak"
2024/04/03 15:22:31 DEBUG : Using config file from "C:\\Users\\obp\\AppData\\Roaming\\rclone\\rclone.conf"
2024/04/03 15:22:31 DEBUG : fs cache: adding new entry for parent of "C:\\_WORK\\_POC_rclone\\File_Large.bak", "//?/C:/_WORK/_POC_rclone"
2024/04/03 15:22:31 DEBUG : Creating backend with remote "qad-aws:Test"
2024/04/03 15:22:31 DEBUG : qad-aws: detected overridden config - adding "{0h7Di}" suffix to name
2024/04/03 15:22:31 DEBUG : Resolving service "s3" region "us-east-2"
2024/04/03 15:22:31 DEBUG : fs cache: renaming cache item "qad-aws:Test" to be canonical "qad-aws{0h7Di}:Test"
2024/04/03 15:22:32 DEBUG : File_Large.bak: Need to transfer - File not found at Destination
2024/04/03 15:22:32 INFO  : S3 bucket Test: Bucket "Test" created with ACL "private"
2024/04/03 15:23:31 INFO  :
Transferred:      460.746 MiB / 1.182 GiB, 38%, 8.014 MiB/s, ETA 1m33s
Transferred:            0 / 1, 0%
Elapsed time:       1m1.4s
Transferring:
 *                                File_Large.bak: 38% /1.182Gi, 8.017Mi/s, 1m33s

2024/04/03 15:24:31 INFO  :
Transferred:      940.090 MiB / 1.182 GiB, 78%, 7.978 MiB/s, ETA 33s
Transferred:            0 / 1, 0%
Elapsed time:       2m1.4s
Transferring:
 *                                File_Large.bak: 77% /1.182Gi, 7.979Mi/s, 33s

2024/04/03 15:25:05 DEBUG : File_Large.bak: md5 = 07516c0f9f10023c7f5218e2a3505cd5 OK
2024/04/03 15:25:05 INFO  : File_Large.bak: Copied (new)
2024/04/03 15:25:05 INFO  :
Transferred:        1.182 GiB / 1.182 GiB, 100%, 8.072 MiB/s, ETA 0s
Transferred:            1 / 1, 100%
Elapsed time:      2m35.6s

2024/04/03 15:25:05 DEBUG : 5 go routines active

Finally, with files bigger than 5 GB it fails, even if I add "--s3-upload-cutoff=5000M --multi-thread-cutoff 5000M":

rclone copy --s3-upload-cutoff=5000M --multi-thread-cutoff 5000M C:\_WORK\_POC_rclone\File_HUGE.zip qad-aws:Test -vv
C:\_WORK\_POC_rclone\rclone-v1.66.0-windows-amd64>rclone copy --s3-upload-cutoff=5000M --multi-thread-cutoff 5000M C:\_WORK\_POC_rclone\File_HUGE.zip qad-aws:Test -vv
2024/04/03 15:41:15 DEBUG : rclone: Version "v1.66.0" starting with parameters ["rclone" "copy" "--s3-upload-cutoff=5000M" "--multi-thread-cutoff" "5000M" "C:\\_WORK\\_POC_rclone\\File_HUGE.zip" "qad-aws:Test" "-vv"]
2024/04/03 15:41:15 DEBUG : Creating backend with remote "C:\\_WORK\\_POC_rclone\\File_HUGE.zip"
2024/04/03 15:41:15 DEBUG : Using config file from "C:\\Users\\obp\\AppData\\Roaming\\rclone\\rclone.conf"
2024/04/03 15:41:15 DEBUG : fs cache: adding new entry for parent of "C:\\_WORK\\_POC_rclone\\File_HUGE.zip", "//?/C:/_WORK/_POC_rclone"
2024/04/03 15:41:15 DEBUG : Creating backend with remote "qad-aws:Test"
2024/04/03 15:41:15 DEBUG : qad-aws: detected overridden config - adding "{0h7Di}" suffix to name
2024/04/03 15:41:15 DEBUG : Resolving service "s3" region "us-east-2"
2024/04/03 15:41:15 DEBUG : fs cache: renaming cache item "qad-aws:Test" to be canonical "qad-aws{0h7Di}:Test"
2024/04/03 15:41:16 DEBUG : File_HUGE.zip: Need to transfer - File not found at Destination
2024/04/03 15:41:16 DEBUG : File_HUGE.zip: multi-thread copy: disabling buffering because source is local disk
2024/04/03 15:41:16 INFO  : S3 bucket Test: Bucket "Test" created with ACL "private"
2024/04/03 15:41:54 DEBUG : File_HUGE.zip: open chunk writer: started multipart upload: m6kqR4vwUZBAv.2LxW2ZYpxlQ4vP9DDlYeLis3ZCKqqeSLqJu_7ZGWYqncUhl2W9HS7bv1zyPXMjlHI8QMoPazmVQdw7uGQe1etIypZVi1hJpfmYv4x_HaXHLsiQCzP9
2024/04/03 15:41:54 DEBUG : File_HUGE.zip: multi-thread copy: using backend concurrency of 4 instead of --multi-thread-streams 4
2024/04/03 15:41:54 DEBUG : File_HUGE.zip: Starting multi-thread copy with 1749 chunks of size 5Mi with 4 parallel streams
2024/04/03 15:41:54 DEBUG : File_HUGE.zip: multi-thread copy: chunk 4/1749 (15728640-20971520) size 5Mi starting
2024/04/03 15:41:54 DEBUG : File_HUGE.zip: multi-thread copy: chunk 3/1749 (10485760-15728640) size 5Mi starting
2024/04/03 15:41:54 DEBUG : File_HUGE.zip: multi-thread copy: chunk 2/1749 (5242880-10485760) size 5Mi starting
2024/04/03 15:41:54 DEBUG : File_HUGE.zip: multi-thread copy: chunk 1/1749 (0-5242880) size 5Mi starting
2024/04/03 15:41:54 DEBUG : File_HUGE.zip: Seek from 5242880 to 0
2024/04/03 15:41:54 DEBUG : File_HUGE.zip: Seek from 5242880 to 0
2024/04/03 15:41:54 DEBUG : File_HUGE.zip: Seek from 5242880 to 0
2024/04/03 15:41:54 DEBUG : File_HUGE.zip: Seek from 5242880 to 0
2024/04/03 15:41:54 DEBUG : File_HUGE.zip: multi-thread copy: chunk 1/1749 failed: multi-thread copy: failed to write chunk: failed to upload chunk 1 with 5242880 bytes: NoSuchUpload: The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed.
        status code: 404, request id: 7Z5A6RCETR9K7PSR, host id: 3VG2I/9peKFqmgrVUOwcQN5yufEbgHRzTSLGMOejFDFOTPCrSpjHK3Sah1t3TacmmlhJWb+J1KE=
2024/04/03 15:41:54 DEBUG : File_HUGE.zip: multi-thread copy: chunk 5/1749 (20971520-26214400) size 5Mi starting
2024/04/03 15:41:54 DEBUG : File_HUGE.zip: multi-thread copy: chunk 2/1749 failed: multi-thread copy: failed to write chunk: failed to upload chunk 2 with 5242880 bytes: RequestCanceled: request context canceled
caused by: context canceled
2024/04/03 15:41:54 DEBUG : File_HUGE.zip: multi-thread copy: chunk 4/1749 failed: multi-thread copy: failed to write chunk: failed to upload chunk 4 with 5242880 bytes: RequestCanceled: request context canceled
caused by: context canceled
2024/04/03 15:41:54 DEBUG : File_HUGE.zip: multi-thread copy: chunk 3/1749 failed: multi-thread copy: failed to write chunk: failed to upload chunk 3 with 5242880 bytes: RequestCanceled: request context canceled
caused by: context canceled
2024/04/03 15:41:54 DEBUG : File_HUGE.zip: Seek from 5242880 to 0
2024/04/03 15:41:54 DEBUG : File_HUGE.zip: multi-thread copy: chunk 5/1749 failed: multi-thread copy: failed to write chunk: failed to upload chunk 5 with 5242880 bytes: RequestCanceled: request context canceled
caused by: context canceled
2024/04/03 15:41:54 DEBUG : File_HUGE.zip: multi-thread copy: cancelling transfer on exit
2024/04/03 15:41:54 DEBUG : File_HUGE.zip: multi-thread copy: abort failed: failed to abort multipart upload "m6kqR4vwUZBAv.2LxW2ZYpxlQ4vP9DDlYeLis3ZCKqqeSLqJu_7ZGWYqncUhl2W9HS7bv1zyPXMjlHI8QMoPazmVQdw7uGQe1etIypZVi1hJpfmYv4x_HaXHLsiQCzP9": NoSuchUpload: The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed.
        status code: 404, request id: 7Z56C4SPMM332S4R, host id: XU39rv5yFEcWzuhhA4/kwLfapb6QeIJobQIJEba9VNT7ouCEC6Z/WTOJ4AzQjZKLN8qxttrVlHY=
2024/04/03 15:41:54 ERROR : File_HUGE.zip: Failed to copy: multi-thread copy: failed to write chunk: failed to upload chunk 1 with 5242880 bytes: NoSuchUpload: The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed.
        status code: 404, request id: 7Z5A6RCETR9K7PSR, host id: 3VG2I/9peKFqmgrVUOwcQN5yufEbgHRzTSLGMOejFDFOTPCrSpjHK3Sah1t3TacmmlhJWb+J1KE=
2024/04/03 15:41:54 ERROR : Attempt 1/3 failed with 1 errors and: multi-thread copy: failed to write chunk: failed to upload chunk 1 with 5242880 bytes: NoSuchUpload: The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed.
        status code: 404, request id: 7Z5A6RCETR9K7PSR, host id: 3VG2I/9peKFqmgrVUOwcQN5yufEbgHRzTSLGMOejFDFOTPCrSpjHK3Sah1t3TacmmlhJWb+J1KE=
2024/04/03 15:41:54 DEBUG : File_HUGE.zip: Need to transfer - File not found at Destination
2024/04/03 15:41:54 DEBUG : File_HUGE.zip: multi-thread copy: disabling buffering because source is local disk
2024/04/03 15:42:15 INFO  :
Transferred:              0 B / 0 B, -, 0 B/s, ETA -
Transferred:            0 / 1, 0%
Elapsed time:       1m1.5s
Transferring:
 *                                 File_HUGE.zip: transferring

2024/04/03 15:42:35 DEBUG : File_HUGE.zip: open chunk writer: started multipart upload: p9jvppGgRwj5GC37zEc2R8PRagTOQwvRxmHExMaR7xD1DODkJnp4G.SgLGzu7Dt3RL7C70o9KZBjdWfddEBXHjKITgbq.Wlt8GmSjzr4sUXDfl4aMzxTmMEi70zgxi49
2024/04/03 15:42:35 DEBUG : File_HUGE.zip: multi-thread copy: using backend concurrency of 4 instead of --multi-thread-streams 4
2024/04/03 15:42:35 DEBUG : File_HUGE.zip: Starting multi-thread copy with 1749 chunks of size 5Mi with 4 parallel streams
2024/04/03 15:42:35 DEBUG : File_HUGE.zip: multi-thread copy: chunk 4/1749 (15728640-20971520) size 5Mi starting
2024/04/03 15:42:35 DEBUG : File_HUGE.zip: multi-thread copy: chunk 2/1749 (5242880-10485760) size 5Mi starting
2024/04/03 15:42:35 DEBUG : File_HUGE.zip: multi-thread copy: chunk 1/1749 (0-5242880) size 5Mi starting
2024/04/03 15:42:35 DEBUG : File_HUGE.zip: multi-thread copy: chunk 3/1749 (10485760-15728640) size 5Mi starting
2024/04/03 15:42:35 DEBUG : File_HUGE.zip: Seek from 5242880 to 0
2024/04/03 15:42:35 DEBUG : File_HUGE.zip: Seek from 5242880 to 0
2024/04/03 15:42:35 DEBUG : File_HUGE.zip: Seek from 5242880 to 0
2024/04/03 15:42:35 DEBUG : File_HUGE.zip: Seek from 5242880 to 0
2024/04/03 15:42:35 DEBUG : File_HUGE.zip: multi-thread copy: chunk 4/1749 failed: multi-thread copy: failed to write chunk: failed to upload chunk 4 with 5242880 bytes: NoSuchUpload: The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed.
        status code: 404, request id: 5F2DXXB1J8SN90Q7, host id: Pm88lyciWYa68YrzvbU7mzOYfSWlxkdTesdJlu3CG9aQGJAtiXKOYjl1WFPxNVI6Jul10Wg8EGCYboDNJyI5dg==
2024/04/03 15:42:35 DEBUG : File_HUGE.zip: multi-thread copy: chunk 5/1749 (20971520-26214400) size 5Mi starting
2024/04/03 15:42:35 DEBUG : File_HUGE.zip: multi-thread copy: chunk 3/1749 failed: multi-thread copy: failed to write chunk: failed to upload chunk 3 with 5242880 bytes: RequestCanceled: request context canceled
caused by: context canceled
2024/04/03 15:42:35 DEBUG : File_HUGE.zip: multi-thread copy: chunk 1/1749 failed: multi-thread copy: failed to write chunk: failed to upload chunk 1 with 5242880 bytes: RequestCanceled: request context canceled
caused by: context canceled
2024/04/03 15:42:35 DEBUG : File_HUGE.zip: multi-thread copy: chunk 2/1749 failed: multi-thread copy: failed to write chunk: failed to upload chunk 2 with 5242880 bytes: RequestCanceled: request context canceled
caused by: context canceled
2024/04/03 15:42:35 DEBUG : File_HUGE.zip: Seek from 5242880 to 0
2024/04/03 15:42:35 DEBUG : File_HUGE.zip: multi-thread copy: chunk 5/1749 failed: multi-thread copy: failed to write chunk: failed to upload chunk 5 with 5242880 bytes: RequestCanceled: request context canceled
caused by: context canceled
2024/04/03 15:42:35 DEBUG : File_HUGE.zip: multi-thread copy: cancelling transfer on exit
2024/04/03 15:42:36 DEBUG : File_HUGE.zip: multi-thread copy: abort failed: failed to abort multipart upload "p9jvppGgRwj5GC37zEc2R8PRagTOQwvRxmHExMaR7xD1DODkJnp4G.SgLGzu7Dt3RL7C70o9KZBjdWfddEBXHjKITgbq.Wlt8GmSjzr4sUXDfl4aMzxTmMEi70zgxi49": NoSuchUpload: The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed.
        status code: 404, request id: 5F29E779CF78KGRM, host id: uvTGFM94ILn0y74CQw0zppHxwNaFoLJahBHCfBrZ2JOK6L5DawNaTucQdMhTTcSMGGkTAmJ16eO3rOQTy2dYPQ==
2024/04/03 15:42:36 ERROR : File_HUGE.zip: Failed to copy: multi-thread copy: failed to write chunk: failed to upload chunk 4 with 5242880 bytes: NoSuchUpload: The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed.
        status code: 404, request id: 5F2DXXB1J8SN90Q7, host id: Pm88lyciWYa68YrzvbU7mzOYfSWlxkdTesdJlu3CG9aQGJAtiXKOYjl1WFPxNVI6Jul10Wg8EGCYboDNJyI5dg==
2024/04/03 15:42:36 ERROR : Attempt 2/3 failed with 1 errors and: multi-thread copy: failed to write chunk: failed to upload chunk 4 with 5242880 bytes: NoSuchUpload: The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed.
        status code: 404, request id: 5F2DXXB1J8SN90Q7, host id: Pm88lyciWYa68YrzvbU7mzOYfSWlxkdTesdJlu3CG9aQGJAtiXKOYjl1WFPxNVI6Jul10Wg8EGCYboDNJyI5dg==
2024/04/03 15:42:36 DEBUG : File_HUGE.zip: Need to transfer - File not found at Destination
2024/04/03 15:42:36 DEBUG : File_HUGE.zip: multi-thread copy: disabling buffering because source is local disk
2024/04/03 15:43:15 DEBUG : File_HUGE.zip: open chunk writer: started multipart upload: 7t6CzbQCt5W4mjHb2DGXRb8r4bBh6yU2_McGabIjuoUOiqsYEEZ4bNcE4ZZi1heUw2KJWMW5QMB6rK0XRY.xgF47XhQjHZxvbHuvuGaPH2ppch.cu1.ZzdLGcRdOkAx3
2024/04/03 15:43:15 DEBUG : File_HUGE.zip: multi-thread copy: using backend concurrency of 4 instead of --multi-thread-streams 4
...

2024/04/03 15:43:15 DEBUG : 6 go routines active
2024/04/03 15:43:15 Failed to copy: multi-thread copy: failed to write chunk: failed to upload chunk 4 with 5242880 bytes: NoSuchUpload: The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed.
        status code: 404, request id: 40DA7J308V9CQCM0, host id: Yl3jqxvttLi0ytwG0Yi1BjvdPXaIyiYrqroPh0FuCyh0kCrfQqNseFjE4D7nYsgEVox/l6esz7Q=


Thanks for your help!

Run the command 'rclone version' and share the full output of the command.

rclone v1.66.0
- os/version: Microsoft Windows 10 Enterprise 22H2 (64 bit)
- os/kernel: 10.0.19045.4170 (x86_64)
- os/type: windows
- os/arch: amd64
- go/version: go1.22.1
- go/linking: static
- go/tags: cmount

Which cloud storage system are you using? (eg Google Drive)

S3

Please run 'rclone config redacted' and share the full output. If you get command not found, please make sure to update rclone.

[qad-aws]
type = s3
provider = Other
env-auth = false
access_key_id = XXX
secret_access_key = XXX
session_token = XXX
region = us-east-2
location_constraint = us-east-2
acl = private
bucket_acl = private
endpoint = https://REMOVED.us-east-2.amazonaws.com/

welcome to the forum,

rclone is in transition from multi-thread copies being controlled by the backend to being controlled by the rclone core
fwiw, i intentionally disable that with --multi-thread-streams=0
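for example, something like this (untested sketch, just adding that flag to your copy command):

rclone copy --multi-thread-streams=0 C:\_WORK\_POC_rclone\File_Large.bak qad-aws:Test -vv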

Thanks,

I saw the topic you shared yesterday and tried the following workaround, but it doesn't work:

"This suggests a workaround for you though, raise --multi-thread-cutoff to something very large, say --multi-thread-cutoff 1P and this will disable scenario 3 uploads. Or set --multi-thread-streams 0 which should have the same effect I think."

perhaps that should be provider = AWS
i think that could be the issue

from the rclone source code

case "Other":
		useMultipartEtag = false

that should be env_auth = false
tho, not sure it is required, as false is the default value
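i.e. a sketch of the two suggested changes in the config, with your other lines left as they are:

[qad-aws]
type = s3
provider = AWS
env_auth = false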


ok, for a deeper look, --dump=headers --retries=1 -vv
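e.g. something like this, reusing your earlier command:

rclone copy --dump=headers --retries=1 -vv C:\_WORK\_POC_rclone\File_Large.bak qad-aws:Test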

Something a little weird is going on here.

The first chunk appears to say the upload ID is invalid, which is strange.

However, when we come to cancel the upload later, it says it doesn't exist.

So it is as if the multipart upload did not get created properly.

I guess it is something to do with the endpoint setting, which is the only odd thing in your config (apart from provider = Other).

Why do you have that? Does it work if you comment it out?

Hello,

Thanks for your replies.

@Nick: Without the endpoint, I get an invalid bucket error:

ERROR : File.bak: Failed to copy: failed to prepare upload: InvalidBucketName: The specified bucket is not valid.

So, first, I fixed my config file (env-auth ==> env_auth) and switched to provider = AWS as suggested:

[qad-aws]
type = s3
provider = AWS
env_auth = false
access_key_id = XXX
secret_access_key = XXX
session_token = XXX
region = us-east-2
location_constraint = us-east-2
acl = private
bucket_acl = private
endpoint = https://__REMOVED__.us-east-2.amazonaws.com/

Again, it works perfectly for small files (smaller than 1 GB), but for files larger than 1 GB I still get the same error. In addition, I now also get a certificate error: "tls: failed to verify certificate: x509: certificate is valid for *.s3.us-east-2.amazonaws.com"

Here is the log for that command (file larger than 1 GB):

rclone copy C:\_WORK\_POC_rclone\File_Large.bak qad-aws:Test
C:\_WORK\_POC_rclone\rclone-v1.66.0-windows-amd64>rclone copy C:\_WORK\_POC_rclone\File_Large.bak qad-aws:Test
2024/04/04 08:11:58 ERROR : File_Large.bak: Failed to copy: multi-thread copy: failed to write chunk: failed to upload chunk 3 with 5242880 bytes: RequestError: send request failed
caused by: Put "https://qad-cloud-support-only-00092315-non-prod-s3-bucket.qad-cloud-support-only-00092315-non-prod-s3-bucket.s3.us-east-2.amazonaws.com/Test/File_Large.bak?partNumber=3&uploadId=XFdt87AR1owk9HinrtMR.wHfSxgQSJE9uQSjEuEYXGt.UJz.qSWZCxCBWp2RN15Eoju9BFZ3._hkh0cSEKbV2ztVuEj17i0T.HD5j_.0xzVpU1Nomv6KF6234va8DUbz": tls: failed to verify certificate: x509: certificate is valid for *.s3.us-east-2.amazonaws.com, s3.us-east-2.amazonaws.com, *.s3-us-east-2.amazonaws.com, s3-us-east-2.amazonaws.com, *.s3.dualstack.us-east-2.amazonaws.com, s3.dualstack.us-east-2.amazonaws.com, *.s3.amazonaws.com, *.s3-control.us-east-2.amazonaws.com, s3-control.us-east-2.amazonaws.com, *.s3-control.dualstack.us-east-2.amazonaws.com, s3-control.dualstack.us-east-2.amazonaws.com, *.s3-accesspoint.us-east-2.amazonaws.com, *.s3-accesspoint.dualstack.us-east-2.amazonaws.com, *.s3-deprecated.us-east-2.amazonaws.com, s3-deprecated.us-east-2.amazonaws.com, not qad-cloud-support-only-00092315-non-prod-s3-bucket.qad-cloud-support-only-00092315-non-prod-s3-bucket.s3.us-east-2.amazonaws.com
2024/04/04 08:11:58 ERROR : Attempt 1/3 failed with 1 errors and: multi-thread copy: failed to write chunk: failed to upload chunk 3 with 5242880 bytes: RequestError: send request failed

Then I went back to provider = Other and added --dump=headers --retries=1 -vv as suggested, with a file larger than 1 GB. Here is the result:

C:\_WORK\_POC_rclone\rclone-v1.66.0-windows-amd64>rclone copy --dump=headers --retries=1 -vv C:\_WORK\_POC_rclone\File_Large.bak qad-aws:Test
2024/04/04 08:24:52 DEBUG : rclone: Version "v1.66.0" starting with parameters ["rclone" "copy" "--dump=headers" "--retries=1" "-vv" "C:\\_WORK\\_POC_rclone\\File_Large.bak" "qad-aws:Test"]
2024/04/04 08:24:52 DEBUG : Creating backend with remote "C:\\_WORK\\_POC_rclone\\File_Large.bak"
2024/04/04 08:24:52 DEBUG : Using config file from "C:\\Users\\obp\\AppData\\Roaming\\rclone\\rclone.conf"
2024/04/04 08:24:52 DEBUG : fs cache: adding new entry for parent of "C:\\_WORK\\_POC_rclone\\File_Large.bak", "//?/C:/_WORK/_POC_rclone"
2024/04/04 08:24:52 DEBUG : Creating backend with remote "qad-aws:Test"
2024/04/04 08:24:52 DEBUG : You have specified to dump information. Please be noted that the Accept-Encoding as shown may not be correct in the request and the response may not show Content-Encoding if the go standard libraries auto gzip encoding was in effect. In this case the body of the request will be gunzipped before showing it.
2024/04/04 08:24:52 DEBUG : Resolving service "s3" region "us-east-2"
2024/04/04 08:24:52 DEBUG : You have specified to dump information. Please be noted that the Accept-Encoding as shown may not be correct in the request and the response may not show Content-Encoding if the go standard libraries auto gzip encoding was in effect. In this case the body of the request will be gunzipped before showing it.
2024/04/04 08:24:52 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2024/04/04 08:24:52 DEBUG : HTTP REQUEST (req 0xc000672360)
2024/04/04 08:24:52 DEBUG : HEAD /Test/File_Large.bak HTTP/1.1
Host: qad-cloud-support-only-00092315-non-prod-s3-bucket.s3.us-east-2.amazonaws.com
User-Agent: rclone/v1.66.0
Authorization: XXXX
X-Amz-Content-Sha256: e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
X-Amz-Date: 20240404T062452Z
X-Amz-Security-Token: IQoJb3JpZ2luX2VjEF4aCXVzLWVhc3QtMSJHMEUCIQCi+ImAYLqzvyS7Iggn/10wY9oxAMdSbS9vvCcVLiiGKAIgHv57t4WzKrClCXJKrrES2nIPu8v21UaPNzGRNEmOEyEq9wIIh///////////ARABGgw0MjM0NTEwNjkyNDQiDOJTZT+0KGaOC11MISrLAlo9Sjx5UK9R1ODtdaHkWPDQ0y0NnM9vhd4mtFyUsxERmxZhOi+knbdWLLOJsSnAsMfCaSzB6EhNUU3GlmbtlL2a0sT6wkvJWq9/MMtjfSzUNXM0AmQdecKsr+V8Uxtb+iyrsAHLSt0YVvxVYYSlvkvtMwVydmj4qAPBYAzJMoAV5clziM0k3Ao8D0khSuAWRjr0LpKzLPnDcaXQN1Yt/JAiExod/qLGFr/jiasVBwelrHL/FHwAF+JvGa+UKvW715HGNyfrePbLNLNh4ENpOyzYZrB0VzOVnHotc7zMgg5ORUH2KGcfWFI2I/o32kdpQ9ySGhIkFcnGj7uNuCCUNKDZ9UcWPT3cMImpSlLK1DjIB9kCs2VX+BRubIXUFFg3fzS7SkVGwQADOX62Zko4rr18CUugLU9LBzuEFzQyOfhT7ImGt3jO0hOUyDgw6IG5sAY6pwGNq3BkjLw0/aU7mcdVsxruyyvGIp67rc2R3oNw4Epxt1asyJ3xGfZiOcXSBghaVFu1zOCj7y+r3TH6Zxh82RgRyb81YJlE1oWOTI5Sdq2zlqqHG2jZak34dnyVuekTjv3zomOsL+ptyrxFxn8GxM5QhHNX0UFSmYL28aaHrzX45juQzOB+klGYaQj2jPkX4w7JZoO7Z+tGQ1w+mIdoDjs4ZJY+pIgs5w==

2024/04/04 08:24:52 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2024/04/04 08:24:53 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2024/04/04 08:24:53 DEBUG : HTTP RESPONSE (req 0xc000672360)
2024/04/04 08:24:53 DEBUG : HTTP/1.1 404 Not Found
Connection: close
Content-Type: application/xml
Date: Thu, 04 Apr 2024 06:24:52 GMT
Server: AmazonS3
X-Amz-Id-2: /U41m4OHPq5O+ooA5Mghqdkgccuqu+D90q7cwIyOjxrGhbmCCi1A7NEV1+/cpyY188KqbMI3Dr8=
X-Amz-Request-Id: N2FZNFXVH45J2776

2024/04/04 08:24:53 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2024/04/04 08:24:53 DEBUG : File_Large.bak: Need to transfer - File not found at Destination
2024/04/04 08:24:53 DEBUG : File_Large.bak: multi-thread copy: disabling buffering because source is local disk
2024/04/04 08:24:53 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2024/04/04 08:24:53 DEBUG : HTTP REQUEST (req 0xc000b3a5a0)
2024/04/04 08:24:53 DEBUG : PUT /Test HTTP/1.1
Host: qad-cloud-support-only-00092315-non-prod-s3-bucket.s3.us-east-2.amazonaws.com
User-Agent: rclone/v1.66.0
Content-Length: 153
Authorization: XXXX
X-Amz-Acl: private
X-Amz-Content-Sha256: 70cae86320841ea73b0bdc759f99920c7caa405e61af2742575750c6586272c9
X-Amz-Date: 20240404T062453Z
X-Amz-Security-Token: IQoJb3JpZ2luX2VjEF4aCXVzLWVhc3QtMSJHMEUCIQCi+ImAYLqzvyS7Iggn/10wY9oxAMdSbS9vvCcVLiiGKAIgHv57t4WzKrClCXJKrrES2nIPu8v21UaPNzGRNEmOEyEq9wIIh///////////ARABGgw0MjM0NTEwNjkyNDQiDOJTZT+0KGaOC11MISrLAlo9Sjx5UK9R1ODtdaHkWPDQ0y0NnM9vhd4mtFyUsxERmxZhOi+knbdWLLOJsSnAsMfCaSzB6EhNUU3GlmbtlL2a0sT6wkvJWq9/MMtjfSzUNXM0AmQdecKsr+V8Uxtb+iyrsAHLSt0YVvxVYYSlvkvtMwVydmj4qAPBYAzJMoAV5clziM0k3Ao8D0khSuAWRjr0LpKzLPnDcaXQN1Yt/JAiExod/qLGFr/jiasVBwelrHL/FHwAF+JvGa+UKvW715HGNyfrePbLNLNh4ENpOyzYZrB0VzOVnHotc7zMgg5ORUH2KGcfWFI2I/o32kdpQ9ySGhIkFcnGj7uNuCCUNKDZ9UcWPT3cMImpSlLK1DjIB9kCs2VX+BRubIXUFFg3fzS7SkVGwQADOX62Zko4rr18CUugLU9LBzuEFzQyOfhT7ImGt3jO0hOUyDgw6IG5sAY6pwGNq3BkjLw0/aU7mcdVsxruyyvGIp67rc2R3oNw4Epxt1asyJ3xGfZiOcXSBghaVFu1zOCj7y+r3TH6Zxh82RgRyb81YJlE1oWOTI5Sdq2zlqqHG2jZak34dnyVuekTjv3zomOsL+ptyrxFxn8GxM5QhHNX0UFSmYL28aaHrzX45juQzOB+klGYaQj2jPkX4w7JZoO7Z+tGQ1w+mIdoDjs4ZJY+pIgs5w==
Accept-Encoding: gzip

2024/04/04 08:24:53 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2024/04/04 08:24:53 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2024/04/04 08:24:53 DEBUG : HTTP RESPONSE (req 0xc000b3a5a0)
2024/04/04 08:24:53 DEBUG : HTTP/1.1 200 OK
Content-Length: 0
Date: Thu, 04 Apr 2024 06:24:54 GMT
Etag: "b95c6c46ea8203dd0db622cdc5900277"
Server: AmazonS3
X-Amz-Id-2: af1XnhHVN4SxqHnu89OQF2jbwwLSlSoqFdImSHmRf0LYyewvsCBKQqF3Nr2oQt7wE7e9YtD3VjY=
X-Amz-Request-Id: N2FV8C1ZK80NMS9F
X-Amz-Server-Side-Encryption: AES256

2024/04/04 08:24:53 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2024/04/04 08:24:53 INFO  : S3 bucket Test: Bucket "Test" created with ACL "private"
2024/04/04 08:24:56 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2024/04/04 08:24:56 DEBUG : HTTP REQUEST (req 0xc000672c60)
2024/04/04 08:24:56 DEBUG : POST /Test/File_Large.bak?uploads= HTTP/1.1
Host: qad-cloud-support-only-00092315-non-prod-s3-bucket.s3.us-east-2.amazonaws.com
User-Agent: rclone/v1.66.0
Content-Length: 0
Authorization: XXXX
Content-Type: application/octet-stream
X-Amz-Acl: private
X-Amz-Content-Sha256: e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
X-Amz-Date: 20240404T062456Z
X-Amz-Meta-Md5chksum: B1FsD58QAjx/Uhjio1Bc1Q==
X-Amz-Meta-Mtime: 1711017352.843103
X-Amz-Security-Token: IQoJb3JpZ2luX2VjEF4aCXVzLWVhc3QtMSJHMEUCIQCi+ImAYLqzvyS7Iggn/10wY9oxAMdSbS9vvCcVLiiGKAIgHv57t4WzKrClCXJKrrES2nIPu8v21UaPNzGRNEmOEyEq9wIIh///////////ARABGgw0MjM0NTEwNjkyNDQiDOJTZT+0KGaOC11MISrLAlo9Sjx5UK9R1ODtdaHkWPDQ0y0NnM9vhd4mtFyUsxERmxZhOi+knbdWLLOJsSnAsMfCaSzB6EhNUU3GlmbtlL2a0sT6wkvJWq9/MMtjfSzUNXM0AmQdecKsr+V8Uxtb+iyrsAHLSt0YVvxVYYSlvkvtMwVydmj4qAPBYAzJMoAV5clziM0k3Ao8D0khSuAWRjr0LpKzLPnDcaXQN1Yt/JAiExod/qLGFr/jiasVBwelrHL/FHwAF+JvGa+UKvW715HGNyfrePbLNLNh4ENpOyzYZrB0VzOVnHotc7zMgg5ORUH2KGcfWFI2I/o32kdpQ9ySGhIkFcnGj7uNuCCUNKDZ9UcWPT3cMImpSlLK1DjIB9kCs2VX+BRubIXUFFg3fzS7SkVGwQADOX62Zko4rr18CUugLU9LBzuEFzQyOfhT7ImGt3jO0hOUyDgw6IG5sAY6pwGNq3BkjLw0/aU7mcdVsxruyyvGIp67rc2R3oNw4Epxt1asyJ3xGfZiOcXSBghaVFu1zOCj7y+r3TH6Zxh82RgRyb81YJlE1oWOTI5Sdq2zlqqHG2jZak34dnyVuekTjv3zomOsL+ptyrxFxn8GxM5QhHNX0UFSmYL28aaHrzX45juQzOB+klGYaQj2jPkX4w7JZoO7Z+tGQ1w+mIdoDjs4ZJY+pIgs5w==
Accept-Encoding: gzip

2024/04/04 08:24:56 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2024/04/04 08:24:56 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2024/04/04 08:24:56 DEBUG : HTTP RESPONSE (req 0xc000672c60)
2024/04/04 08:24:56 DEBUG : HTTP/1.1 200 OK
Transfer-Encoding: chunked
Date: Thu, 04 Apr 2024 06:24:57 GMT
Server: AmazonS3
X-Amz-Id-2: FCm/mMi7CWUy4k2PV6HGw7MuYWRX0ga5VivZ8f6r+W/9dZmiKvQzfAwZ7FB7+9L9HoS62HC9DxE=
X-Amz-Request-Id: XRY3DJ9A5V1X09F9
X-Amz-Server-Side-Encryption: AES256

2024/04/04 08:24:56 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2024/04/04 08:24:56 DEBUG : File_Large.bak: open chunk writer: started multipart upload: jTi9OphTRe3dW2Vh2KVlMVAKCiHRHcDlZokZMSo7NuEhRqMI1maqIGZXoy_YVBj6.EudglE4.NspFM_1_lxkFV2F9AjiI7q2oe1JoVWybkaI4s7pjYo55IVXLZSvvmiz
2024/04/04 08:24:56 DEBUG : File_Large.bak: multi-thread copy: using backend concurrency of 4 instead of --multi-thread-streams 4
2024/04/04 08:24:56 DEBUG : File_Large.bak: Starting multi-thread copy with 243 chunks of size 5Mi with 4 parallel streams
2024/04/04 08:24:56 DEBUG : File_Large.bak: multi-thread copy: chunk 4/243 (15728640-20971520) size 5Mi starting
2024/04/04 08:24:56 DEBUG : File_Large.bak: multi-thread copy: chunk 2/243 (5242880-10485760) size 5Mi starting
2024/04/04 08:24:56 DEBUG : File_Large.bak: multi-thread copy: chunk 1/243 (0-5242880) size 5Mi starting
2024/04/04 08:24:56 DEBUG : File_Large.bak: multi-thread copy: chunk 3/243 (10485760-15728640) size 5Mi starting
2024/04/04 08:24:56 DEBUG : File_Large.bak: Seek from 5242880 to 0
2024/04/04 08:24:56 DEBUG : File_Large.bak: Seek from 5242880 to 0
2024/04/04 08:24:56 DEBUG : File_Large.bak: Seek from 5242880 to 0
2024/04/04 08:24:56 DEBUG : File_Large.bak: Seek from 5242880 to 0
2024/04/04 08:24:56 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2024/04/04 08:24:56 DEBUG : HTTP REQUEST (req 0xc000673200)
2024/04/04 08:24:56 DEBUG : PUT /qad-cloud-support-only-00092315-non-prod-s3-bucket/Test/File_Large.bak?partNumber=4&uploadId=jTi9OphTRe3dW2Vh2KVlMVAKCiHRHcDlZokZMSo7NuEhRqMI1maqIGZXoy_YVBj6.EudglE4.NspFM_1_lxkFV2F9AjiI7q2oe1JoVWybkaI4s7pjYo55IVXLZSvvmiz HTTP/1.1
Host: qad-cloud-support-only-00092315-non-prod-s3-bucket.s3.us-east-2.amazonaws.com
User-Agent: rclone/v1.66.0
Content-Length: 5242880
Authorization: XXXX
Content-Md5: +MuF7HJgUwhekS2Xl9rJRw==
Expect: 100-continue
X-Amz-Content-Sha256: fbdfa30fc01c64300830678290062b13d5361cc3a95fce2e8058b5164db0b480
X-Amz-Date: 20240404T062456Z
X-Amz-Security-Token: IQoJb3JpZ2luX2VjEF4aCXVzLWVhc3QtMSJHMEUCIQCi+ImAYLqzvyS7Iggn/10wY9oxAMdSbS9vvCcVLiiGKAIgHv57t4WzKrClCXJKrrES2nIPu8v21UaPNzGRNEmOEyEq9wIIh///////////ARABGgw0MjM0NTEwNjkyNDQiDOJTZT+0KGaOC11MISrLAlo9Sjx5UK9R1ODtdaHkWPDQ0y0NnM9vhd4mtFyUsxERmxZhOi+knbdWLLOJsSnAsMfCaSzB6EhNUU3GlmbtlL2a0sT6wkvJWq9/MMtjfSzUNXM0AmQdecKsr+V8Uxtb+iyrsAHLSt0YVvxVYYSlvkvtMwVydmj4qAPBYAzJMoAV5clziM0k3Ao8D0khSuAWRjr0LpKzLPnDcaXQN1Yt/JAiExod/qLGFr/jiasVBwelrHL/FHwAF+JvGa+UKvW715HGNyfrePbLNLNh4ENpOyzYZrB0VzOVnHotc7zMgg5ORUH2KGcfWFI2I/o32kdpQ9ySGhIkFcnGj7uNuCCUNKDZ9UcWPT3cMImpSlLK1DjIB9kCs2VX+BRubIXUFFg3fzS7SkVGwQADOX62Zko4rr18CUugLU9LBzuEFzQyOfhT7ImGt3jO0hOUyDgw6IG5sAY6pwGNq3BkjLw0/aU7mcdVsxruyyvGIp67rc2R3oNw4Epxt1asyJ3xGfZiOcXSBghaVFu1zOCj7y+r3TH6Zxh82RgRyb81YJlE1oWOTI5Sdq2zlqqHG2jZak34dnyVuekTjv3zomOsL+ptyrxFxn8GxM5QhHNX0UFSmYL28aaHrzX45juQzOB+klGYaQj2jPkX4w7JZoO7Z+tGQ1w+mIdoDjs4ZJY+pIgs5w==
Accept-Encoding: gzip

Hope it helps!

Hello,

Any news?

bucket names need to be lowercase

why must you use that endpoint, what is the requirement?
none of my aws remotes use endpoint. only used with other s3 providers such as wasabi and idrive.

again, never had to use that.

given that rclone cannot create the session token and env_auth = false,
how are you creating the session token and inserting it into the config file?
for example, my backup script creates the session token and feeds it to rclone as an environment variable.
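for example, a rough sketch of that idea as a windows batch file (assuming the aws cli is installed and configured; rclone picks these up via the RCLONE_S3_* environment variables, which iirc take precedence over values in the config file):

rem one sts call returns a matched set of temporary credentials (key id, secret, token) that must be used together
for /f "tokens=1-3" %%A in ('aws sts get-session-token --duration-seconds 3600 --query "[Credentials.AccessKeyId,Credentials.SecretAccessKey,Credentials.SessionToken]" --output text') do (
    set "RCLONE_S3_ACCESS_KEY_ID=%%A"
    set "RCLONE_S3_SECRET_ACCESS_KEY=%%B"
    set "RCLONE_S3_SESSION_TOKEN=%%C"
)
rem then run rclone as usual, e.g.
rclone copy C:\_WORK\_POC_rclone\File_Large.bak qad-aws:Test -vv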

Hello,

Thanks for your reply.

Without the endpoint, I get an invalid bucket error

ERROR : File.bak: Failed to copy: failed to prepare upload: InvalidBucketName: The specified bucket is not valid.

Without the token, it doesn't work; I get a 403 error.

The only solution I found is to add the endpoint and the token. Without them, I cannot communicate with AWS.

And to create the token, I go to the AWS Console, section "Access key / Get credentials" and copy the key_id, access_key and session token.

Edit: https://REMOVED.us-east-2.amazonaws.com/ is not the exact endpoint. I have removed some sensitive information and replaced it with REMOVED to post on the forum.

i would start over, get this working.
--- create a new bucket using a valid name
--- no env_auth, session_token, location_constraint, acl, endpoint

fwiw, this is an aws remote from my config.

[aws_vserver03_en07_rcloner_remote]
type = s3
provider = AWS
access_key_id = 
secret_access_key = 
region = us-east-1
storage_class = DEEP_ARCHIVE

note: storage_class is not required.

With

[qad-aws-test]
type = s3
provider = AWS
access_key_id = XXX
secret_access_key = XXX
region = us-east-1
storage_class = DEEP_ARCHIVE

I get

2024/04/05 16:15:09 ERROR : Attempt 1/3 failed with 1 errors and: Forbidden: Forbidden
        status code: 403, request id: CV2NYKG5WASE09HQ, host id: GANycbPXJ6OwCEUbudtLvo1TwDF2vHirLIh7xcbLd4fpDauk7F5hT2qaqFE+8LiLuImQHa00IHQ=
2024/04/05 16:15:10 ERROR : Attempt 2/3 failed with 1 errors and: Forbidden: Forbidden
        status code: 403, request id: CV2HZ57CJ3JV2DZM, host id: SfvuRfmobuVGRgGPNyaZ/f+Fil2K01/NyQ1w6uOzmSTxr2bDc5NrichjrMLRKVQpm8EKbSXVijg=
2024/04/05 16:15:10 ERROR : Attempt 3/3 failed with 1 errors and: Forbidden: Forbidden
        status code: 403, request id: VGD1J8889S4B1693, host id: KWhKyoxICg05p8BmyU9wcUYxOsyeVgukt6YhgPrjhHE0bGu2rDqGMOHEUFTsj1LI9mPWjK09xhE=
2024/04/05 16:15:10 Failed to copy: Forbidden: Forbidden
        status code: 403, request id: VGD1J8889S4B1693, host id: KWhKyoxICg05p8BmyU9wcUYxOsyeVgukt6YhgPrjhHE0bGu2rDqGMOHEUFTsj1LI9mPWjK09xhE=

--- remove storage_class = DEEP_ARCHIVE
--- use --retries=1
--- try --s3-no-check-bucket
--- post the full debug output, which includes the exact command.

I removed DEEP_ARCHIVE.

So the command is:

rclone copy --s3-no-check-bucket --retries=1 C:\_WORK\_POC_rclone\File.bak qad-aws-test:Test -vv

And the result is:

C:\_WORK\_POC_rclone\rclone-v1.66.0-windows-amd64>rclone copy --s3-no-check-bucket --retries=1 C:\_WORK\_POC_rclone\File.bak qad-aws-test:Test -vv
2024/04/05 16:23:40 DEBUG : rclone: Version "v1.66.0" starting with parameters ["rclone" "copy" "--s3-no-check-bucket" "--retries=1" "C:\\_WORK\\_POC_rclone\\File.bak" "qad-aws-test:Test" "-vv"]
2024/04/05 16:23:40 DEBUG : Creating backend with remote "C:\\_WORK\\_POC_rclone\\File.bak"
2024/04/05 16:23:40 DEBUG : Using config file from "C:\\Users\\obp\\AppData\\Roaming\\rclone\\rclone.conf"
2024/04/05 16:23:40 DEBUG : fs cache: adding new entry for parent of "C:\\_WORK\\_POC_rclone\\File.bak", "//?/C:/_WORK/_POC_rclone"
2024/04/05 16:23:40 DEBUG : Creating backend with remote "qad-aws-test:Test"
2024/04/05 16:23:40 DEBUG : qad-aws-test: detected overridden config - adding "{Dn7qA}" suffix to name
2024/04/05 16:23:40 DEBUG : fs cache: renaming cache item "qad-aws-test:Test" to be canonical "qad-aws-test{Dn7qA}:Test"
2024/04/05 16:23:40 ERROR : Attempt 1/1 failed with 1 errors and: Forbidden: Forbidden
        status code: 403, request id: 45MHGHP330SDP09M, host id: 45w4synVaC387vgH/a6mWEcOymBo3iS1kqGYqw19FWt1KaO4U/Aiwkt+mKPsPHeeJRr8F+alwd4=
2024/04/05 16:23:40 INFO  :
Transferred:              0 B / 0 B, -, 0 B/s, ETA -
Errors:                 1 (retrying may help)
Elapsed time:         1.8s

2024/04/05 16:23:40 DEBUG : 4 go routines active
2024/04/05 16:23:40 Failed to copy: Forbidden: Forbidden
        status code: 403, request id: 45MHGHP330SDP09M, host id: 45w4synVaC387vgH/a6mWEcOymBo3iS1kqGYqw19FWt1KaO4U/Aiwkt+mKPsPHeeJRr8F+alwd4=

that is not a valid bucket name???

can you post the iam/bucket policies?

Sorry, I forgot the bucket name... But I still get the same message with the bucket:

Request

rclone copy --s3-no-check-bucket --retries=1 C:\_WORK\_POC_rclone\File.bak qad-aws-test:qad-cloud-support-only-00092315-non-prod-s3-bucket/Test -vv

Result

2024/04/05 16:48:02 DEBUG : rclone: Version "v1.66.0" starting with parameters ["rclone" "copy" "--s3-no-check-bucket" "--retries=1" "C:\\_WORK\\_POC_rclone\\File.bak" "qad-aws-test:qad-cloud-support-only-00092315-non-prod-s3-bucket/Test" "-vv"]
2024/04/05 16:48:02 DEBUG : Creating backend with remote "C:\\_WORK\\_POC_rclone\\File.bak"
2024/04/05 16:48:02 DEBUG : Using config file from "C:\\Users\\obp\\AppData\\Roaming\\rclone\\rclone.conf"
2024/04/05 16:48:02 DEBUG : fs cache: adding new entry for parent of "C:\\_WORK\\_POC_rclone\\File.bak", "//?/C:/_WORK/_POC_rclone"
2024/04/05 16:48:02 DEBUG : Creating backend with remote "qad-aws-test:qad-cloud-support-only-00092315-non-prod-s3-bucket/Test"
2024/04/05 16:48:02 DEBUG : qad-aws-test: detected overridden config - adding "{Dn7qA}" suffix to name
2024/04/05 16:48:02 NOTICE: S3 bucket qad-cloud-support-only-00092315-non-prod-s3-bucket: Switched region to "us-east-2" from "us-east-1"
2024/04/05 16:48:02 DEBUG : pacer: low level retry 1/2 (error BucketRegionError: incorrect region, the bucket is not in 'us-east-1' region at endpoint '', bucket is in 'us-east-2' region
        status code: 301, request id: 6F5RRSJQ4WXWTYRM, host id: PhhhyqGqEWQz0V4F7vegvN6ziNeUxPQ7B+RKWRYHekxYtQkiCMOLwiHn05BlLpq1kV0YFPf1JSc=)
2024/04/05 16:48:02 DEBUG : pacer: Rate limited, increasing sleep to 10ms
2024/04/05 16:48:03 DEBUG : pacer: Reducing sleep to 0s
2024/04/05 16:48:03 DEBUG : fs cache: renaming cache item "qad-aws-test:qad-cloud-support-only-00092315-non-prod-s3-bucket/Test" to be canonical "qad-aws-test{Dn7qA}:qad-cloud-support-only-00092315-non-prod-s3-bucket/Test"
2024/04/05 16:48:03 ERROR : Attempt 1/1 failed with 1 errors and: Forbidden: Forbidden
        status code: 403, request id: NQ1G8TEA99BA31RT, host id: UyQuvyE4uOJLCx8ifCdTGelAYWOItR3Zfs1Yzpy9NsDbDNJm9/T+d8iJJ6L+NNWqkzcR61bP4LrqCpAlBL1Uvg==
2024/04/05 16:48:03 INFO  :
Transferred:              0 B / 0 B, -, 0 B/s, ETA -
Errors:                 1 (retrying may help)
Elapsed time:         2.5s

2024/04/05 16:48:03 DEBUG : 6 go routines active
2024/04/05 16:48:03 Failed to copy: Forbidden: Forbidden
        status code: 403, request id: NQ1G8TEA99BA31RT, host id: UyQuvyE4uOJLCx8ifCdTGelAYWOItR3Zfs1Yzpy9NsDbDNJm9/T+d8iJJ6L+NNWqkzcR61bP4LrqCpAlBL1Uvg==

Not sure, but is this what you need?

{
    "Version": "2012-10-17",
    "Id": "Policy1651064685958",
    "Statement": [
        {
            "Sid": "Stmt1651064677217",
            "Effect": "Allow",
            "Principal": "*",
            "Action": [
                "s3:List*",
                "s3:Get*",
                "s3:*Object"
            ],
            "Resource": [
                "arn:aws:s3:::qad-cloud-support-only-00092315-non-prod-s3-bucket",
                "arn:aws:s3:::qad-cloud-support-only-00092315-non-prod-s3-bucket/*"
            ],
            "Condition": {
                "StringEquals": {
                    "aws:PrincipalAccount": "901366391253"
                }
            }
        }
    ]
}

need to fix that.

fwiw, for testing, really, you need to simplify things as much as possible.
each step of the way, you reveal yet another layer.
best to follow the rclone docs, get that working, then tweak it, one step at a time.

Hi,

Finally, I succeeded in pushing to AWS S3 whatever the file size!

To summarize: of course, I followed the official documentation, the Amazon S3 page, section "Configuration".
But it doesn't mention using the AWS "session_token". So, as it didn't work (Forbidden), I switched to provider "Other", and with "Other" it worked perfectly for small files... That was the root cause of the confusion :) ... After that, I tried a lot of configuration settings.

Maybe you could add a mention of the session_token to the documentation (in my case it is mandatory, not optional as the documentation says).

Anyway, the working configuration for me is this one (but again: session_token is mandatory!):

[qad-aws-test]
type = s3
provider = AWS
access_key_id = XXX
secret_access_key = XXX
session_token = XXX
region = us-east-2
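
And with that configuration, a plain copy works, e.g. (the same kind of command as before):

rclone copy C:\_WORK\_POC_rclone\File_HUGE.zip qad-aws-test:qad-cloud-support-only-00092315-non-prod-s3-bucket/Test -vv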

Thanks very much for your help!


good you got it working.

session tokens are optional and, based on forum posts, 99% of rclone users do not use them.
but you can create an issue at github with your proposed changes.

fwiw, given that you could upload some files, your issue was not about session tokens.
if it were about an invalid/expired session token, you would have gotten 403 forbidden errors.

so, i think the issue was with your original config
--- setting provider incorrectly.
--- using a strange value for endpoint, when endpoint was not even required.
--- using location_constraint, when it is not required.

Without the token, I get a "403 forbidden" error.

yes, that is expected behavior.

the session token is an advanced feature of s3 and it is always optional, never required.
we both chose to have a policy that requires it.
you create it via the aws console, i create it in python using boto3.