Error 4xx - Dropbox

I am new to rclone and I am having some difficulty synchronizing with Dropbox.
When I try to sync my folders, Dropbox randomly returns 4xx errors for some subfolders. I can't see any specific reason, because when I sync only those subfolders on their own, Dropbox syncs them normally.
The same error occurs with the "tree" command.
It seems to be an error specific to Dropbox, because I can sync these same folders without any issues on Google Drive and Koofr using rclone.
I also noticed that each time rclone retries the sync after one of these errors, it checks all the other files and folders again, doing unnecessary work.
Does anyone know why Dropbox returns this 4xx error and how to avoid it?
Is there any way for rclone to retry only the folders that failed, since sometimes the retry works, without having to check all the files again?

I'm using Debian 10 (64-bit), rclone v1.51.0 (os/arch: linux/amd64, go version: go1.13.7).
Also, I am using my own app ID (but the errors were happening before that too).
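
For reference, the command I run is essentially a plain sync along these lines (the remote name "dropbox:" is just how I refer to it here, and the destination path is only an example):

rclone sync /home/mdgp/data/Magis dropbox:Magis -v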

The log of my last sync (notice that the retries worked this time, but it is not always like that):

https://bin.disroot.org/?81b7cb786d932b52#D2YVYgD3Rwi4TWqwxbbC9M5XnK5cypSExAbfmDjKqAnd

Thank you very much for your help

Can you run the command with -vv and share the debug log?
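
If it helps, you can capture the full debug output to a file with something like this (the remote name and destination path are placeholders for your setup):

rclone sync /home/mdgp/data/Magis dropbox:Magis -vv --log-file=debug.log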

Sure!

https://bin.disroot.org/?77d3776091eee555#8UQ2A376aconrcto5f8V1UzhU18z8heBVqTMTf1S1ofb

Note that I only have about 8000 files, but rclone checked approximately twice that number.

There are no errors I can see in that log.

On your source, you can count the files by running something like:

find "/home/mdgp/data/Magis/" -type f | wc -l

Rclone retries the sync because there was an error:

2020/04/14 11:30:47 ERROR : m/UA/5I: error reading destination directory:

The error message is an HTML page though! I think it is the 404 page.

That looks like a bug in Dropbox rather than rclone. It might be worth checking the Dropbox developer forums to see what they have to say.

Is it always the same folder that is the problem? Can you try rclone lsf on it with -vv --dump bodies to try to get it to go wrong with a smaller log?
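
Something along these lines, using the folder from the error above and sending the debug output to a file (the remote name is a guess at your setup):

rclone lsf dropbox:m/UA/5I -vv --dump bodies 2> lsf-dump.log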

find "/home/mdgp/data/Magis/" -type f | wc -l

It shows 8846

Is it always the same folder that is the problem?

No, each time it shows different folders, which leads me to believe that it's a Dropbox problem handling many read requests. It really does seem to be a Dropbox-specific problem, since Google Drive and Koofr do not have the same issue with the same folders.
Since the retries sometimes work, I don't see any option but to increase the number of retries with --retries and let rclone try until it succeeds.
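
In other words, something like this, with the paths and remote name as examples only:

rclone sync /home/mdgp/data/Magis dropbox:Magis --retries 10 -v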

Can you try rclone lsf on it with -vv --dump bodies to try to get it to go wrong with a smaller log?

I tried it on that specific folder that you mentioned:
https://bin.disroot.org/?dda25722cdffa0e3#GMAvQ8ZEB5gikGdV47yZVNs7eZ2sApSFfan6xHBn5z1H

I can't see anything wrong with that dump.

I think you must be right: it is some kind of intermittent problem at Dropbox. A trawl through the forums might help.
