Unable to migrate huge S3 Bucket (500 million objects / around 100 TB)

hi,

Another way to reduce memory usage is to reduce --transfers.
https://rclone.org/s3/#multipart-uploads
"Multipart uploads will use --transfers * --s3-upload-concurrency * --s3-chunk-size extra memory"

And you might want to use multiple runs of rclone, along these lines (a scripted sketch follows the steps):

  1. get a list of source files with rclone lsf -R source: > file.lst (bare paths only, which is what --files-from expects)
  2. split file.lst into multiple files: file01.lst, file02.lst, ..., filexx.lst
  3. rclone copy source dest --files-from=file01.lst, repeated for each list file
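
A scripted sketch of those three steps (the remote names, bucket names, and the 1,000,000-lines-per-chunk split size are placeholder assumptions; GNU split is assumed for --additional-suffix):

    # 1. list source objects as bare paths, one per line (the format --files-from expects)
    rclone lsf -R src:source-bucket > file.lst

    # 2. split the listing into chunks of 1,000,000 paths: file-000.lst, file-001.lst, ...
    split -d -a 3 -l 1000000 --additional-suffix=.lst file.lst file-

    # 3. copy each chunk with its own rclone run (the runs can also be spread across machines)
    for f in file-*.lst; do
        rclone copy src:source-bucket dst:dest-bucket --files-from="$f" --transfers 4
    done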