I am running rclone with OneDrive and downloading series via Usenet.
The cache fills up while SABnzbd is extracting, which brings everything to a halt.
I need advice on what command to run so that I can download and extract files without crashing.
If I am doing something incorrectly, please advise.
Thanks
Run the command 'rclone version' and share the full output of the command.
Which cloud storage system are you using? (eg Google Drive)
OneDrive
The command you were trying to run (eg rclone copy /tmp remote:tmp)
```
rclone mount onedata1: /root/storage.onedata --vfs-cache-mode writes --vfs-cache-max-size 25G --allow-other &
```
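For reference, a mount sketch that keeps the cache bounded more aggressively. The flags are standard rclone VFS options; the size and age values are illustrative assumptions, not recommendations for this VPS. Note that `--vfs-cache-max-size` is a soft limit: files that are still open for writing are not evicted, which is likely why a large extraction can overrun it.

```shell
# Sketch only: flag values are assumptions to tune for a 40 GB disk.
rclone mount onedata1: /root/storage.onedata \
  --vfs-cache-mode writes \
  --vfs-cache-max-size 20G \
  --vfs-cache-max-age 1h \
  --vfs-cache-poll-interval 1m \
  --allow-other &
```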
#### The rclone config contents with secrets removed.
```
Name     Type
====     ====
onedata1 onedrive

e) Edit existing remote
n) New remote
d) Delete remote
r) Rename remote
c) Copy remote
s) Set configuration password
q) Quit config
e/n/d/r/c/s/q>
```
How do I generate a log file showing what's going on?
The only way would be for me to rerun everything and wait for the cache to fill up.
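A log can be captured by rerunning the same mount with rclone's logging flags added (`--log-file` and `--log-level` are standard rclone options; the log path here is an assumption):

```shell
# Same mount as above, plus debug logging written to a file.
rclone mount onedata1: /root/storage.onedata \
  --vfs-cache-mode writes --vfs-cache-max-size 25G --allow-other \
  --log-file /root/rclone.log --log-level DEBUG &
```

DEBUG is verbose; the resulting log should show the cache filling and any eviction attempts once extraction starts.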
Rclone works fine for smaller files and for streaming.
It's when Usenet files have to extract as they download that the cache fills up in no time, and then everything grinds to a halt. I am running an Oracle VPS which has about 40 GB of fixed SSD storage.
All I'm looking for is a suggestion on how to extract my large Usenet downloads (some of them are 60+ GB) to my OneDrive account without filling the cache.
I have attempted to mount the drive without vfs-cache, but I got I/O errors galore.
Someone suggested rclone move, something like this:

```
rclone move /home/cavedog/mda1/rdtclient/downloads gdrive0:/Uploads4 -v --transfers=2 --drive-chunk-size 256M
```

but I'm not sure how that is going to help.
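The idea behind that suggestion would be to take the mount (and its VFS cache) out of the extraction path entirely: let SABnzbd download and extract on local disk, then push the finished files to the remote. A sketch adapted to the onedata1: remote from the config above; the local path and destination folder are illustrative assumptions, and `--delete-empty-src-dirs` is a real rclone move flag that cleans up after the transfer:

```shell
# Assumes SABnzbd's complete/extract folder is on local SSD, not the mount.
rclone move /root/local-downloads/complete onedata1:/Media \
  -v --transfers=2 --delete-empty-src-dirs
```

The trade-off is that the local disk must briefly hold a full extracted download, so with 40 GB of SSD this only works if downloads are processed one at a time and cleaned up promptly.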