Is it possible to resume an interrupted download or upload?

I want to write a new backend. Is it possible to resume an interrupted download or upload?
If I implement this by writing state into a temp file, how do I tell rclone about it?
Also, when will fs.PutStream() be called?
The items in fs/operations/multithread.go are not exported; does that mean they are only used internally?

Hi, anyone there...

When doing a copy, the source can break and rclone will reopen it, seek to the right point and retry. That is automatic and built into rclone for all backends.
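All the backend needs to do to support that is honour the seek/range open options in its Object.Open. A minimal sketch, assuming the usual backend imports ("context", "io" and the rclone fs package) and a hypothetical openFromOffset helper for your remote's API:

```go
// Open opens the object for reading, honouring the seek/range options
// rclone passes when it reopens the source to resume a broken transfer.
func (o *Object) Open(ctx context.Context, options ...fs.OpenOption) (io.ReadCloser, error) {
	var offset int64
	for _, option := range options {
		switch opt := option.(type) {
		case *fs.SeekOption:
			// rclone wants to restart reading from this offset
			offset = opt.Offset
		case *fs.RangeOption:
			// a real implementation should also honour opt.End
			offset = opt.Start
		default:
			if option.Mandatory() {
				fs.Logf(o, "Unsupported mandatory option: %v", option)
			}
		}
	}
	// openFromOffset is a placeholder for however your remote reads from
	// a byte offset (typically an HTTP Range request).
	return o.fs.openFromOffset(ctx, o.remote, offset)
}
```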

Rclone doesn't yet have a way to resume a download/upload if you stop rclone and restart it.

fs.PutStream() is called when rclone doesn't know the size of the file, for instance when using rclone rcat or when copying a Google document (we don't know their sizes until after we've downloaded them).
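If your backend can accept data of unknown length, you advertise that by implementing the optional fs.PutStreamer interface. A common minimal sketch is to delegate to Put, assuming your Put can cope with src.Size() == -1 (e.g. by doing a chunked upload):

```go
// PutStream uploads in to the remote described by src, where the size is
// not known in advance (src.Size() returns -1).
func (f *Fs) PutStream(ctx context.Context, in io.Reader, src fs.ObjectInfo, options ...fs.OpenOption) (fs.Object, error) {
	// Reuse the normal upload path, which is assumed to tolerate an
	// unknown size, e.g. by uploading in chunks.
	return f.Put(ctx, in, src, options...)
}
```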

That is the intention - you shouldn't need them from a backend. Which items were you interested in?

Thanks.
I wanted to know how to do multi-threaded uploads, but now I understand that rclone handles this itself.
No more questions for now.
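For reference, the main optional hook a backend can provide for rclone's built-in multi-thread copy is OpenWriterAt; everything in fs/operations/multithread.go drives it internally. A rough sketch, where newWriterAt stands in for whatever random-access write your remote's API offers:

```go
// OpenWriterAt opens the remote for random-access writes, which allows
// rclone's internal multi-thread copy to write several chunks in parallel.
// Backends that can't support this simply don't implement the method.
func (f *Fs) OpenWriterAt(ctx context.Context, remote string, size int64) (fs.WriterAtCloser, error) {
	// newWriterAt is hypothetical: it should return something implementing
	// io.WriterAt and io.Closer on top of your remote's API.
	return newWriterAt(ctx, f, remote, size)
}
```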


I'm going to write a backend for Baidu Net Disk (I've read rclone/rclone/issues/2099). It can upload quickly by checking the local file's md5sum, but I haven't found an interface for this in fs/fs.go, so I should add this upload method in fs.Put(), right?
Thinking about it again, I'm writing this just for backup, so there shouldn't already be an identical file on the cloud.

The mailru backend does this if you want to take a look.
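The general shape is to do the hash check inside Put() and fall back to a normal upload if the remote doesn't already know the content. A rough sketch, assuming the fs and fs/hash imports; rapidUpload and regularUpload are hypothetical stand-ins for whatever the Baidu API actually provides:

```go
// Put uploads in to the remote described by src, trying a hash-based
// "rapid upload" first and falling back to a regular data upload.
func (f *Fs) Put(ctx context.Context, in io.Reader, src fs.ObjectInfo, options ...fs.OpenOption) (fs.Object, error) {
	// Ask the source for its MD5 - rclone can compute this for local files.
	md5sum, err := src.Hash(ctx, hash.MD5)
	if err == nil && md5sum != "" {
		// rapidUpload is hypothetical: ask the remote to create the object
		// from a hash it already has, skipping the data transfer entirely.
		if o, err := f.rapidUpload(ctx, src.Remote(), src.Size(), md5sum); err == nil {
			return o, nil
		}
	}
	// Upload the data normally (regularUpload is also hypothetical).
	return f.regularUpload(ctx, in, src, options...)
}
```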

I would say, make it work first, then add the additional features :slight_smile:

OK, thanks. :+1:

