I run a backup script every night (using rclone v1.42) which syncs files from GDrive to an SFTP server. It ran smoothly for almost a year, but for some weeks now most runs have stopped with fatal errors.
rclone sync gdrive: sftpserver:backup --backup-dir sftpserver:backup/$DAY -v --log-file=$LOGDIR/$NOW.log
In the log file I mostly see this type of error message:
ERROR : my/path: error reading destination directory: List failed: dirExists: couldn't initialise SFTP: ssh: unexpected packet in response to channel open: <nil>
Every month a copy job also runs, which still works smoothly:
rclone copy gdrive: sftpserver:backup -v --log-file=$LOGDIR/$NOW.log
On my side no changes were made to the setup. Perhaps the SFTP server of the backup provider now behaves somewhat differently. Do you have any hints on how to debug or solve this issue?
1.42 is pretty ancient. Try with the latest.
Do you have a limit on the number of connections to the SFTP server?
I will try the latest version, although 1.42 ran fine until a few weeks ago. Never change a running system.
I think the connection limit is set to 12 on the server. Do you think lowering --checkers could help?
I also wonder why rclone copy still seems to work fine. Is there any difference in SFTP connection handling?
Yes, if the connection limit is 12, you definitely want to lower the checkers and transfers to something smaller. That would be a good first step if you don't want to upgrade.
Rclone can use up to 12 connections by default (8 checkers and 4 transfers), so yes, lowering these would be a good idea.
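For reference, with a 12-connection cap the nightly sync could be run with reduced concurrency along these lines (remote names, paths, and the `$DAY`/`$NOW`/`$LOGDIR` variables are taken from the command in the original post; the exact values 2/2 are just one conservative choice):

```shell
# Lower concurrency so rclone stays well below the server's connection limit.
# Defaults are --checkers 8 and --transfers 4 (up to 12 connections total);
# 2 + 2 leaves plenty of headroom.
rclone sync gdrive: sftpserver:backup \
  --backup-dir "sftpserver:backup/$DAY" \
  --checkers 2 --transfers 2 \
  -v --log-file "$LOGDIR/$NOW.log"
```

If this turns out to be too slow, the values can be raised step by step while watching the log for the `couldn't initialise SFTP` errors.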
Since lowering transfers and checkers to 2 connections each, the last script runs have completed without any errors.
So it seems the backup provider decreased the connection limit a few weeks ago. But it doesn't matter; even with the lowered settings the sync runs fast enough.
Thanks for the help.