However! I might have found the problem - going on 1h13m now. When I was uploading the new config file via FTP, it seems it only got read/write permissions for the user (600). Note, the user running rclone and the user uploading via FTP are the same, so it should not give any permission errors, but it seems there is something going on.
I changed the permissions on the rclone config file to just allow everyone (777) and it seems to have worked on the first refresh.
I just checked the config now and it has been reset to file permission 600 after the token refresh. I’ll try to see if it can still refresh next time.
rclone resets the permissions to 600. However, it needs to write to the file, so if the file isn’t owned by the user running rclone then there will be trouble… There is an issue here which may be relevant: https://github.com/ncw/rclone/issues/1467
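If ownership is the issue, something like this should sort it (the config path and user name are assumptions - adjust to your setup):

```
# Check who owns the config and what mode it has
ls -l ~/.rclone.conf

# Make sure the user that actually runs rclone owns it
# (user/group here are placeholders)
sudo chown myuser:myuser ~/.rclone.conf

# With correct ownership 600 is fine - rclone will reset
# the mode to 600 on each token refresh anyway
chmod 600 ~/.rclone.conf
```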
What I could really do with is a log with -vv --dump-bodies of the problem happening. However that isn’t going to work well while doing lots of uploads! If you could do one with -vv --dump-headers that would be useful - I’d like to see the transaction that caused the 401.
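Something along these lines would do it (the remote name, source path, and log file name are just placeholders):

```
rclone copy /local/files acd:backup -vv --dump-headers --log-file rclone-debug.log
```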
I can try to make a synthetic test uploading some files really slowly to see if I can replicate the problem.
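My rough plan is to throttle the bandwidth so a single transfer outlasts the token lifetime, something like this (values are just for illustration):

```
# ~50 KB/s means a 1GB file takes well over an hour,
# which should force a token refresh mid-transfer
rclone copy bigfile.bin acd:test --bwlimit 50k -vv --dump-headers
```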
If you quit rclone after it has that error and restart it, does it start working again?
Making an issue is a good idea - it won’t get lost then!
Can confirm that it is working for now. Currently pulling all my data from ACD. Thank you @ncw and to the very kind rclone user who provided this proxy.
Amazing! That is working great and is really promising for the future.
I tried other programs suggested by Amazon that are so badly documented; @ncw, rclone is by far the best.
Thank you a thousand times.
If enough people abuse the system (i.e. storing >10TB), they will surely do the same, as Amazon did. Unlimited cloud storage is not a personal NAS as far as I can tell, unless you are prepared to pay ACD’s new prices.
With just 2 days left of my free ACD trial this could not have come at a better time. I was thinking of just re-downloading all my data again!
Currently getting between 42-50MB/s (~400Mb/s) pulling from ACD to Google, so I may just manage the full transfer of 12.3TB if Amazon don’t cut me off too quickly.
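Back-of-envelope (assuming ~45MB/s sustained): 12.3e12 bytes ÷ 45e6 bytes/s ≈ 273,000 seconds, so a bit over three days of continuous transfer, ignoring overhead.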
I’ve managed to track down one cause of this from a user who used the latest beta on their PC to get the token, but was running an older version in the place where the files were being copied. You need the latest beta in both places.
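An easy way to check is to run this on both machines and compare the versions reported:

```
rclone version
```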