Hi guys,
I am syncing to my Dropbox with
rclone sync /mnt/backup2/data/ dropbox:Backups/data --tpslimit 1 --transfers 1 --checkers 1 --progress --exclude-from "/home/myuser/rclone/filter.txt"
and getting errors like "error reading destination directory". It's not always the same directory; it seems random, and I don't see any problems with the destination directories.
What is causing this, and how can I prevent it?
Thank you
The full command you’re attempting to use
rclone sync /mnt/backup2/data/ dropbox:Backups/data --tpslimit 1 --transfers 1 --checkers 1 --progress --exclude-from "/home/myuser/rclone/filter.txt"
A logfile of rclone's output with personal information removed. If it is large, you can use services like pastebin.com. It's usually helpful to increase the logging with -v or -vv, depending on the detail needed.
2024/11/04 17:29:59 INFO : Starting transaction limiter: max 1 transactions/s with burst 1
2024/11/04 17:32:31 ERROR : Financies: error reading destination directory:
The rclone config you’re using. If you don’t know where to find it, check here. Before posting ensure you’ve removed any confidential information like credentials.
[dropbox]
type = dropbox
token = {"access_token":"Here is my access token","token_type":"bearer","refresh_token":"Here is my refresh token","expiry":"2024-11-04T18:40:55.110518942Z"}
What version of rclone you're using. It's also helpful to try rclone with the latest beta if you're using a stable release, to understand if your issue was recently fixed.
rclone v1.60.1-DEV
os/version: raspbian 12.7 (64 bit)
os/kernel: 6.6.51+rpt-rpi-v8 (aarch64)
os/type: linux
os/arch: arm64
go/version: go1.19.8
go/linking: dynamic
go/tags: none
asdffdsa (jojothehumanmonkey), November 4, 2024, 3:04pm
Welcome to the forum,
When you posted, there was a template of questions for you to answer. Please answer all of them so we can help you...
asdffdsa (jojothehumanmonkey), November 4, 2024, 11:05pm
adkins42:
rclone v1.60.1-DEV
that is an old, custom-compiled version.
rclone selfupdate
or
uninstall the old version and install the latest version
https://rclone.org/install/#script-installation
and test again.
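for reference, the script install from that page is a single documented one-liner, run with sudo:
sudo -v ; curl https://rclone.org/install.sh | sudo bash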
Thank you, I installed the latest version and that seemed to help. It ran for a long while, but now I'm getting these errors:
2024/11/05 19:09:37 ERROR : Dropbox root 'Backups/MyData': sync batch commit: failed to commit batch length 1: batch had 1 errors: last error: upload failed: too_many_write_operations
2024/11/05 19:09:37 ERROR : xyz.xaml: Failed to copy: upload failed: batch upload failed: upload failed: too_many_write_operations
2024/11/05 19:09:50 ERROR : Dropbox root 'Backups/MyData': sync batch commit: failed to commit batch length 1: batch had 1 errors: last error: upload failed: too_many_write_operations
2024/11/05 19:09:50 ERROR : abc.xaml: Failed to copy: upload failed: batch upload failed: upload failed: too_many_write_operations
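For reference, beyond what is suggested below: the Dropbox backend also exposes batching and pacing flags that are sometimes tuned when too_many_write_operations shows up. A sketch, with purely illustrative values:
rclone sync /mnt/backup2/data/ dropbox:Backups/data --dropbox-batch-mode sync --dropbox-batch-size 100 --dropbox-pacer-min-sleep 100ms --tpslimit 1 --transfers 1 --checkers 1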
asdffdsa (jojothehumanmonkey), November 5, 2024, 7:37pm
ok, please post the output of
rclone version
rclone config redacted dropbox:
top 20 lines of a debug log, not just snippets.
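a sketch of producing such a log, assuming the paths from your original command:
rclone sync /mnt/backup2/data/ dropbox:Backups/data -vv --log-file=/home/myuser/rclone/debug.log --exclude-from "/home/myuser/rclone/filter.txt"
head -20 /home/myuser/rclone/debug.log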
rclone version
rclone v1.68.1
os/version: raspbian 12.7 (64 bit)
os/kernel: 6.6.51+rpt-rpi-v8 (aarch64)
os/type: linux
os/arch: arm64 (ARMv8 compatible)
go/version: go1.23.1
go/linking: static
go/tags: none
rclone config redacted dropbox
[dropbox]
type = dropbox
token = XXX
Double check the config for sensitive info before posting publicly
top 20 lines of a debug log, not just snippets.
Well, I have many, many INFO lines from the copying, so I hope this is enough:
2024/11/05 00:08:18 INFO : Starting transaction limiter: max 1 transactions/s with burst 1
2024/11/05 01:25:58 NOTICE: too_many_requests/..: Too many requests or write operations. Trying again in 5 seconds.
2024/11/05 03:48:36 INFO : /4fe18ce34709afb657c07ff0e9b2wajj51fa57830: Copied (new)
2024/11/05 19:09:26 INFO : LAS.xaml.cs: Copied (new)
...
2024/11/05 19:09:37 ERROR : Dropbox root 'Backups/MyData': sync batch commit: failed to commit batch length 1: batch had 1 errors: last error: upload failed: too_many_write_operations
2024/11/05 19:09:37 ERROR : LAS_Request.xaml: Failed to copy: upload failed: batch upload failed: upload failed: too_many_write_operations
2024/11/05 19:09:50 ERROR : Dropbox root 'Backups/MyData': sync batch commit: failed to commit batch length 1: batch had 1 errors: last error: upload failed: too_many_write_operations
2024/11/05 19:09:50 ERROR : LAS_Request.xaml.cs: Failed to copy: upload failed: batch upload failed: upload failed: too_many_write_operations
...
2024/11/05 19:12:36 INFO : TouchKeyboardLayoutResources.de-DE.baml: Copied (new)
asdffdsa (jojothehumanmonkey), November 5, 2024, 8:43pm
looks like you are using an app id that is shared with all rclone users.
that could be the reason for the errors.
When you use rclone with Dropbox in its default configuration you are using rclone's App ID.
This is shared between all the rclone users.
create and use your own app id, and test again.
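a sketch of wiring the new app into the existing remote, where YOUR_APP_KEY and YOUR_APP_SECRET are placeholders for the values from the Dropbox App Console:
rclone config update dropbox client_id YOUR_APP_KEY client_secret YOUR_APP_SECRET
rclone config reconnect dropbox: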
I did everything as described in your link (created the app, set permissions), created a new connection, and added the app key and secret.
After 14 minutes I got this error:
2024/11/05 23:41:05 INFO : Starting transaction limiter: max 1 transactions/s with burst 1
2024/11/05 23:55:25 ERROR : FolderA/FolderB/FolderC/FolderB: error reading destination directory:
rclone config redacted dropbox:
[dropbox]
type = dropbox
client_id = XXX
client_secret = XXX
token = XXX
That's the only error within 40 minutes though!
asdffdsa (jojothehumanmonkey), November 6, 2024, 1:01am
ok, making progress.
at this point, not sure what the issue is.
the only thing i can suggest is using --dump=headers, for a deeper look.
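for example, combined with your original command (the log file path here is just an assumption):
rclone sync /mnt/backup2/data/ dropbox:Backups/data --dump headers -vv --log-file=/home/myuser/rclone/headers.log --tpslimit 1 --transfers 1 --checkers 1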
Hi,
thank you very much for your support!
the only thing i can suggest is using --dump=headers, for a deeper look.
I did this, and the request/response is:
2024/11/06 01:17:48 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2024/11/06 01:17:48 DEBUG : HTTP REQUEST (req 0x5c7894000d)
2024/11/06 01:17:48 DEBUG : POST /2/files/list_folder HTTP/1.1
Host: api.dropboxapi.com
User-Agent: rclone/v1.68.1
Content-Length: 236
Authorization: XXXX
Content-Type: application/json
Accept-Encoding: gzip
2024/11/06 01:17:48 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2024/11/06 01:19:40 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2024/11/06 01:19:40 DEBUG : HTTP RESPONSE (req 0x5c7894000d)
2024/11/06 01:19:40 DEBUG : Error: read tcp <<IP & PORT>>-><<IP & PORT>>: read: connection reset by peer
2024/11/06 01:19:40 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2024/11/06 01:19:40 DEBUG : pacer: low level retry 1/10 (error Post "https://api.dropboxapi.com/2/files/list_folder": read tcp <<IP & PORT>>-><<IP & PORT>>: read: connection reset by peer)
2024/11/06 01:19:40 DEBUG : pacer: Rate limited, increasing sleep to 20ms
asdffdsa (jojothehumanmonkey), November 6, 2024, 1:45pm
adkins42:
connection reset by peer
again, not sure exactly what is going on.
could be a networking issue on your end, or an issue on dropbox's end.
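for transient resets like this, rclone already retries (your pacer line shows low level retry 1/10); if needed, the retry knobs can be raised. a sketch with illustrative values:
rclone sync /mnt/backup2/data/ dropbox:Backups/data --retries 5 --low-level-retries 20 --timeout 5m --contimeout 2m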
again, not sure exactly what is going on.
could be a networking issue on your end, or an issue on dropbox's end.
Ok.
I'm trying to back up my most important data from my NAS to Dropbox, so it needs to be stable, but I'm not sure what to do now. Should I add "--ignore-errors"?
Would you recommend another cloud hosting service?
asdffdsa (jojothehumanmonkey), November 6, 2024, 2:59pm
never depend on that flag.
sure,
what is the total size of all data to be backed up?
outside of a one-time disaster, do you plan to ever download the data?
currently, do you keep copies of the data in multiple locations?
sure
If you do recommend something, can you be sure that I most likely won't have these issues?
what is the total size of all data to be backed up?
260 GB. Might grow a little; 1 TB should be fine.
outside of a one-time disaster, do you plan to ever download the data?
No
currently, do you keep copies of the data in multiple locations?
Yes, I have them mirrored once in the NAS and backed up to 2 external HDDs, but all in my house, with no other physical location.
asdffdsa (jojothehumanmonkey), November 6, 2024, 3:18pm
there are so many ways to store data in the cloud.
rclone has great support for S3 providers.
i keep recent backups in wasabi; in a disaster, they have great download speeds.
older data is stored in aws s3 deep glacier, approx $1.00/TiB/month.
idrive is a very good choice; they offer a free plan that you could test with.
and for backups, check the --backup-dir flag:
rclone sync /path/to/files remote:current --backup-dir=remote:archive/`date +%Y%m%d.%I%M%S`
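with that flag, any files that the sync would overwrite or delete in remote:current are moved into the dated archive path instead of being lost. a sketch of running it from cron, reusing the paths from this thread (note that % must be escaped in a crontab):
0 3 * * * rclone sync /mnt/backup2/data/ dropbox:Backups/data --backup-dir=dropbox:Backups/archive/$(date +\%Y\%m\%d) --log-file=/home/myuser/rclone/sync.log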