Rclone using too much bandwidth while copying objects from AWS S3

What is the problem you are having with rclone?

I am using rclone to mount S3 buckets. The mount is very helpful for some ETL jobs, but when I start an ETL job the database connection fails due to the network load. After checking, I found that rclone is using a lot of network traffic, both inbound and outbound. Please help me resolve this issue.

Run the command 'rclone version' and share the full output of the command.

~/bin/rclone mount remote-test:test1-2etl-dev-bucket-us-west-2 /mnt/test_bucket --allow-other --dir-perms 0777 --file-perms 0777 --log-file=/mnt/rclone.log --log-level INFO &

Which cloud storage system are you using? (eg Google Drive)

S3

The command you were trying to run (eg rclone copy /tmp remote:tmp)

Paste command here

The rclone config contents with secrets removed.

Paste config here

A log from the command with the -vv flag

Paste log here

If you want rclone to use less network bandwidth, check out the --bwlimit flag

Hey Nick,

I already used --bwlimit with 1 GB, but it is still using a lot of bandwidth. I have since upgraded the system to 18 cores and 192 GB of memory. What would be the best bwlimit to use?

--bwlimit 1GB is 1 GByte/s

If you want rclone to run as fast as possible don't set --bwlimit.

If you want to leave bandwidth for other things, then set to 80% (say) of your bandwidth. So assuming your server has a 1 GBit/s connection, that is 125 MByte/s and 80% of that is 100 MByte/s so you'd set --bwlimit 100M.
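The arithmetic above can be sketched as a small helper. This is just an illustration of the conversion, not part of rclone; the function name and the 80% headroom factor are assumptions taken from the advice in this thread:

```python
# Sketch: convert a link speed in Mbit/s to a suggested --bwlimit value
# in MByte/s, leaving 20% of the link free for other traffic.

def suggest_bwlimit(link_mbit_per_s, headroom=0.8):
    """Return a suggested --bwlimit in MByte/s for a given link speed."""
    mbyte_per_s = link_mbit_per_s / 8  # 8 bits per byte
    return mbyte_per_s * headroom

# A 1 Gbit/s link: 1000 / 8 = 125 MByte/s, and 80% of that is 100 MByte/s,
# i.e. --bwlimit 100M as in the example above.
print(suggest_bwlimit(1000))
```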

I'm not sure exactly what your problem is, can you explain again please?

I am using rclone to mount S3 buckets on one of my EC2 instances. I use that instance to run some ETL jobs, and the files are located in the S3 bucket, accessed through the mount point. When I start the ETL job, I get database connection issues due to the network load. As far as I can tell, rclone is using a lot of bandwidth while copying files from the mount.

What is the upload bandwidth of your external connection to the internet?

Let's assume 100 Mbit/s

Divide this number by 8 to get 12.5 Mbyte/s.

Now reduce it a bit to leave some headroom on your upload. Let's say 10 Mbyte/s.

Now use that value as "--bwlimit 10M"
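For example, applied to the mount command from earlier in this thread (bucket name and paths as in the original post), the flag would be added like this:

```shell
# Same mount command as before, with a 10 MByte/s bandwidth cap added via --bwlimit.
~/bin/rclone mount remote-test:test1-2etl-dev-bucket-us-west-2 /mnt/test_bucket \
  --allow-other --dir-perms 0777 --file-perms 0777 \
  --bwlimit 10M \
  --log-file=/mnt/rclone.log --log-level INFO &
```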

If you don't know the bandwidth, experiment with the value. Start with 10M and halve or double it until you find the highest value that works and doesn't cause the DB connections to drop.