Which flag to stop a transfer after x number of files?

What is the problem you are having with rclone?

Making a script and want to have rclone stop transferring after x number of files. So every 100 files transferred, stop the transfer.
I see there's --max-transfer SizeSuffix, but that's in terms of file size. I want the files to complete during transfer and to copy at least a certain number of files, rather than limiting by size.

What is your rclone version (output from rclone version)

Which OS you are using and how many bits (eg Windows 7, 64 bit)

rclone v1.53.3

  • os/arch: linux/arm
  • go version: go1.15.5

Which cloud storage system are you using? (eg Google Drive)

GDrive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone copy source:sourcepath dest:destpath --transfers 4 --progress --(new flag)

The rclone config contents with secrets removed.

Not needed

A log from the command with the -vv flag

Not needed

There isn't a flag to limit the number of files.

You could generate a file list to transfer and feed that into rclone.

Unfortunately, the way I want my script to work, I'm always going to have data to be transferred, let alone TBs' worth. Would there be another flag that might do x amount over time?

Would --max-backlog technically work? Can I do --max-backlog 100 and it'll run 100 files at a time?

If you mean amount transferred over time then --bwlimit might be what you are looking for? That will limit the transfer to so many bytes per second. So if you set --bwlimit 1M rclone will transfer no more than 1 MB per second.

If you want to limit files per unit of time, you can't do that exactly, but --tpslimit will come pretty close. There are usually 2-4 transactions per uploaded file, so you can approximate a files-per-second limit with it.
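
For illustration, those flags would look something like this (the remote paths here are placeholders, not from this thread):

rclone copy source:sourcepath dest:destpath --bwlimit 1M
rclone copy source:sourcepath dest:destpath --tpslimit 2

With --tpslimit 2 and roughly 2-4 transactions per file, that works out to somewhere around one file every one to two seconds.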

That will mean rclone keeps 100 files in the backlog of files to transfer, so it probably isn't what you want.

No, --bwlimit isn't going to work how I want. I need max upload speed in my case. And isn't --tpslimit the same as --transfers?

For --max-backlog, I would like it to transfer 100 at a time. Would it look for 100 files in the queue, upload those, then stop? And once those files are uploaded, would they no longer count towards the 100-file queue?

No, TPS is transactions per second. So you have transfers, checkers, directory listings, file-size lookups, etc. all going on, and each of those counts as a transaction.

You can limit transfers/checkers, as that limits the number of things happening at the same time.

If you limit TPS, that impacts everything.

Say you have 1000 files.
You can configure the number of simultaneous uploads with --transfers.
--max-backlog is how far ahead it will get a list of things to transfer. So if you set --max-backlog to 10, it will iterate through those 1000 files and keep 10 files in the backlog as the next items to be uploaded.
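
As a sketch (placeholder remotes again, not from the thread):

rclone copy source:sourcepath dest:destpath --transfers 4 --max-backlog 10

That uploads 4 files at once and keeps 10 queued up behind them, but the command still runs until every file has been transferred.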

I see. So I think --tpslimit isn't going to be what I want. --max-backlog is similar to what I need. Let me explain exactly what the script does. I have photos and videos in a folder. Let's say it has 1000 files, a mix of videos and photos of various sizes.

What I'm wanting rclone to do is, let's say, upload 25 photos/videos at a time regardless of size. Then it'll run a totally different set of commands. After those commands are done, upload another 25 photos/videos, stop rclone completely, run the separate commands, then upload another 25. In a nutshell, in pseudocode:
rclone upload 25 photos
print done
rclone upload another set of 25 photos
print done

If I'm understanding correctly, with --max-backlog it'll upload 10 at a time like you mentioned, but it'll still be running the same rclone command until all 1000 files are uploaded in one shot.
So like this
rclone upload 10
print done

but "done" won't get printed until all 1000 files are uploaded

There still isn't a way to upload x number of items and stop, though, which was your leading question.

You'd have to use something else to get those 25 files and then pass them in with --files-from or something along those lines.

Hopefully a --queue flag can be added in an update :pray:

I'll keep looking into this since I need to back up files ASAP. Thanks for the guidance.

You can do this with a small amount of scripting...

First use rclone lsf to get a list of files which need transferring.

rclone lsf --files-only /path/to/source > files

Now in a loop

# Get the first 25
head -n 25 files > top25
# Cut those 25 lines off the top of `files` (tail -n +26 starts output at line 26)
tail -n +26 files > new.files
mv new.files files

Now transfer the data

rclone copy --files-from-raw top25 /path/to/source remote:destination

Putting that all together in a script

#!/bin/bash

SOURCE=/path/to/source
DEST=remote:dest
FILES=/tmp/files
BATCH=25

rclone lsf --files-only "${SOURCE}" > "${FILES}"

while [ -s "${FILES}" ]; do
    # Get the first ${BATCH} files
    head -n "${BATCH}" "${FILES}" > "${FILES}-batch"
    # Cut those ${BATCH} lines off the top of ${FILES}
    tail -n +"$((BATCH + 1))" "${FILES}" > "${FILES}-new"
    mv "${FILES}-new" "${FILES}"

    # Now transfer the batch
    rclone copy --files-from-raw "${FILES}-batch" "${SOURCE}" "${DEST}"
done
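
If you want your separate set of commands to run between batches, as in the pseudocode above, one way (the placeholder comment is mine, not from the thread) is to extend the end of the loop body:

    rclone copy --files-from-raw "${FILES}-batch" "${SOURCE}" "${DEST}"
    echo done

    # Placeholder: run your separate set of commands here,
    # then the loop picks up the next ${BATCH} files.

Each pass through the loop then matches one "upload 25, print done, run other commands" cycle.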
