Is it a big no-no to download directly to an Rclone mount using wget or aria2c? How does the Rclone mount handle all the different segments and checks? How about resuming downloads on the mount?
The problem I’m having is that I’m trying to download about ~400 GB in ~7000 files to my Gdrive from HTTP, but the HTTP server doesn’t allow listing (i.e. adding it as an Rclone remote doesn’t work). I have all the URLs in a text file, one per line, which is great for aria2c/wget, but I can’t store much locally as I’m using a VPS with only a 50 GB drive.
What’s the best way around this?
p.s. It would be AMAZING if Rclone allowed similar option to aria2c/wget to download links from file (-i file.txt)!
rclone mount is quite restrictive in what it accepts: it can only write files that are opened write-only, and it can’t seek in them. So it is unlikely to work at the moment with torrent files.
Download to a local directory, then use rclone move when it is successful. rclone move will do retries too, unlike the mount, so it will be more reliable.
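On a small disk the stage-then-move approach can be batched so only a slice of the 400 GB is ever local at once. A minimal sketch (the remote name "gdrive:Download", the staging path, and the batch size are assumptions; the commands are echoed rather than executed so you can review them first — drop the echo to run them for real):

```shell
#!/bin/sh
# Stand-in for the real URL list (one URL per line).
printf '%s\n' \
  'http://example.com/a.lf3' \
  'http://example.com/b.lf3' > urls.txt

# Split the list into batches small enough to fit the 50 GB disk.
split -l 1 urls.txt batch_

# For each batch: download locally, then move the files off the VPS
# before starting the next batch. rclone move retries failed transfers.
for b in batch_*; do
  echo aria2c -i "$b" -d /tmp/staging --continue=true
  echo rclone move /tmp/staging gdrive:Download --transfers 4 -v
done
```

In real use `split -l` would be a few hundred lines per batch rather than one; the point is just that the staging directory is emptied by rclone move between batches.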
Ah! This looks promising, wasn’t even aware of that.
Just tried it, but it seems that Rclone still needs to list directories rather than downloading ‘blindly’? The problem is that the HTTP remote I’m using doesn’t allow listing directories. Wget and aria2c work because they don’t list directories, they just download the exact file you give them — is that not the same with Rclone?
2017/11/06 17:46:16 DEBUG : LPRD-0x000B0037-000000.lf3: Excluded from sync (and deletion)
2017/11/06 17:46:16 DEBUG : LPRD-0x000B001A-000000.lf3: Excluded from sync (and deletion)
2017/11/06 17:46:16 DEBUG : LPRD-0x000B0015-000000.lf3: Excluded from sync (and deletion)
2017/11/06 17:46:16 DEBUG : LPRD-0x000B000B-000000.lf3: Excluded from sync (and deletion)
2017/11/06 17:46:16 ERROR : : error reading source directory: error listing "": directory not found
2017/11/06 17:46:16 INFO : Google drive root 'Software/LF/Download': Waiting for checks to finish
2017/11/06 17:46:16 INFO : Google drive root 'Software/LF/Download': Waiting for transfers to finish
2017/11/06 17:46:16 ERROR : Attempt 3/3 failed with 1 errors
Ah, any idea which version still had that? And is it worth using if the Gdrive portion of the code is too outdated for things like rate limiting etc.?
Or would it be too cheeky to open a GitHub request to add that back in the future?
I think you’d have to go back to 1.35 — however, that doesn’t support the http remote!
I had to take that out for lots of good reasons, so I don’t think I would put it back in the sync command. However, I could envision a different command.
Thinking about it, I think you could use rclone copyto for this, copying one file at a time.
If you make an http remote then try rclone copyto httpsource:file drive:file and that should copy a single file with no directory traversing. You could script that easily enough.
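Scripting that over the URL list might look like the sketch below. It assumes the http remote (here called "httpsource:") is configured with its url pointing at the directory that contains the files, so only the filename remains as the path, and that "drive:Download" is the destination; the rclone command is echoed rather than run so the loop can be checked first:

```shell
#!/bin/sh
# Stand-in for the real URL list.
printf '%s\n' \
  'http://example.com/dir/LPRD-0x000B0037-000000.lf3' > urls.txt

while IFS= read -r url; do
  name=${url##*/}   # strip everything up to the last slash -> bare filename
  # copyto transfers exactly one named file, so no directory listing happens
  echo rclone copyto "httpsource:$name" "drive:Download/$name"
done < urls.txt
```

Because each copyto is independent, a failed file can simply be re-run from its line in the list, which also gives you crude resume support.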