What is the problem you are having with rclone?
I have managed to fill up my Dropbox, where I have lots of files in lots of directories. The Dropbox program isn't syncing anymore and all subdirectories in the local "Dropbox folder" on my laptop appear empty. I'm trying to copy the entire remote to a local directory (other than the official "Dropbox folder"). I ran into problems with rclone, or rather with the remote, complaining about too many requests.
To overcome this I generated a list of all the top-level directories in the remote, starting from the output of rclone lsd, and wrote a Perl script which goes through this list and runs rclone copy once for each item, then sleeps for 10 seconds. The idea is to later empty the local "Dropbox folder", except for shared directories, and then do the same on the remote using the same script but running rclone delete on each item.
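For reference, this is roughly what the wrapper script looks like (a minimal sketch; the list file name and the local target path are placeholders rather than my actual ones):

#!/usr/bin/perl
use strict;
use warnings;

# One top-level directory name per line, derived from the output of `rclone lsd dropbox:`
my $list = 'toplevel-dirs.txt';
open my $fh, '<', $list or die "cannot open $list: $!";

while (my $dir = <$fh>) {
    chomp $dir;
    next unless length $dir;
    # Copy one top-level directory at a time, then back off before the next one
    system('rclone', 'copy', "dropbox:$dir", "$ENV{HOME}/dbx-clone/$dir") == 0
        or warn "rclone copy failed for $dir\n";
    sleep 10;
}
close $fh;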
I was in a bit of a fix, so I resorted to tools and techniques I already know, but my question now is how to make rclone itself pause between requests so that I don't get those "too many requests" complaints. I assume that should be possible.
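For example, I imagine something roughly along these lines, if an option like this exists (the --tpslimit flag and the value of one request per second are just my guess at what such a knob might look like, not something I've checked):

rclone copy --tpslimit 1 dropbox: ~/dbx-clone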
Also, could there be problems because rclone keeps the connection to the remote open, rather than making a new connection for each top-level directory on the remote as my Perl script does?
Run the command 'rclone version' and share the full output of the command.
Can't do this at the moment, as I can't get the command's output from my laptop to my phone. Sorry, I will fill this in later.
Which cloud storage system are you using? (eg Google Drive)
Dropbox
The command you were trying to run (eg rclone copy /tmp remote:tmp)
rclone copy dropbox: ~/dbx-clone
The rclone config contents with secrets removed.
Same problem: I can't get the info from my laptop to my phone, nor post from the laptop itself, as it is busy with the copying described above.
A log from the command with the -vv flag
Same problem.