Update Create failed: sftp: "Bad message" (SSH_FX_BAD_MESSAGE)


What is the problem you are having with rclone?

I found some topics with this error message on this forum, but none seemed to really explain what happened or how can I avoid it.

I copied a series of folders to an rclone mount (it's a crypt remote that in turn connects to an SFTP server on a remote host). After transferring about 90000 files, I got ~50 errors like the one in the title:

2024/07/20 23:53:26 INFO : path/to/file/file.zip.veqaxin1.partial: Failed to remove failed partial copy: stat failed: "Bad message" (SSH_FX_BAD_MESSAGE)

2024/07/20 23:53:26 ERROR : path/to/file/file.zip Failed to copy: Update Create failed: sftp: "Bad message" (SSH_FX_BAD_MESSAGE)

They always seem to come in pairs. It's possible the issue is caused by the long path and filename, but the problem is that the file was moved from its original location to the rclone mount, so it no longer exists in either the original location or on the remote host.

I think I can retrieve the missing files from a backup, but it would be interesting to know how to prevent this in the future. I believe rclone should return the files to their original location if they couldn't be moved for whatever reason.

Run the command 'rclone version' and share the full output of the command.

rclone v1.66.0

  • os/version: debian 12.6 (64 bit)
  • os/kernel: 6.1.0-22-amd64 (x86_64)
  • os/type: linux
  • os/arch: amd64
  • go/version: go1.22.1
  • go/linking: static
  • go/tags: none

Which cloud storage system are you using? (eg Google Drive)

SFTP + Crypt on a remote virtual private server.

The command you were trying to run (eg rclone copy /tmp remote:tmp)

This is how I mount it:

 rclone mount "crypt rclone_VPS:" /media/rclone_VPS/ \
--allow-non-empty \
--rc \
--rc-web-gui \
--rc-web-gui-no-open-browser \
--rc-addr 0.0.0.0:5572 \
--rc-user myusername \
--rc-pass <redacted> \
--vfs-cache-mode full \
--vfs-cache-max-size 70G \
--vfs-cache-max-age 48h \
--cache-dir /mnt/md1/.rclone_cache/ \
--stats-one-line \
-P \
--stats 2s \
--log-level INFO \
--allow-other \
--uid 1000 \
--gid 1003 \
--dir-perms 0770 \
--file-perms 0770

The "rclone_VPS:" is just a regular sftp remote.

Please run 'rclone config redacted' and share the full output. If you get command not found, please make sure to update rclone.

[crypt rclone_VPS]
type = crypt
remote = rclone_VPS:/mnt/vdb1/data
password = XXX
password2 = XXX



[rclone_VPS]
type = sftp
host = XXX
user = XXX
key_pem = XXX
shell_type = unix
md5sum_command = md5sum
sha1sum_command = sha1sum
### Double check the config for sensitive info before posting publicly

A log from the command that you were trying to run with the -vv flag

  • Couldn't get a log file :(

welcome to the forum,

might try --inplace

--log-level DEBUG --log-file=/path/to/rclone.log
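Slotting those suggestions into the original mount command might look something like this (a sketch; the log path is an assumption, and everything else is taken from the command posted above):

```shell
# Hypothetical reworked mount: --inplace skips the .partial rename step,
# and DEBUG output goes to a file instead of the screen.
rclone mount "crypt rclone_VPS:" /media/rclone_VPS/ \
  --inplace \
  --log-level DEBUG \
  --log-file=/path/to/rclone.log \
  --vfs-cache-mode full \
  --vfs-cache-max-size 70G \
  --vfs-cache-max-age 48h \
  --cache-dir /mnt/md1/.rclone_cache/ \
  --allow-other \
  --uid 1000 --gid 1003 \
  --dir-perms 0770 --file-perms 0770
```

Note that `--log-file` and `-P`/`--stats-one-line` pull in different directions: progress output is for the screen, the log file is for diagnosis, so the stats flags are dropped here.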

Thanks. I was displaying the logs on screen using --stats, which is why it didn't save a log file.

Anyway, I think I figured out the cause: when using a crypt remote, the filename length limit seems to be 143 characters.
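A back-of-the-envelope sketch of where that limit comes from, assuming crypt's standard file name encryption as described in the rclone docs (names padded to 16-byte blocks, encrypted with EME, then base32-encoded) and a typical 255-character filesystem name limit:

```python
import math

FS_NAME_LIMIT = 255  # typical max filename length on Linux filesystems


def base32_len(nbytes: int) -> int:
    # Unpadded base32: 5 bytes of input become 8 output characters.
    return math.ceil(nbytes * 8 / 5)


def encrypted_name_len(plain_len: int) -> int:
    # Padding always adds between 1 and 16 bytes to reach a block boundary.
    padded = (plain_len // 16 + 1) * 16
    return base32_len(padded)


# A 143-char name fits, but 144 chars crosses a 16-byte block boundary,
# pushing the base32-encoded result past the 255-char filesystem limit.
print(encrypted_name_len(143))  # 231
print(encrypted_name_len(144))  # 256
```

So the jump is not gradual: padding grows in whole 16-byte blocks, and the block that a 144-character name lands in encodes to 256 base32 characters, one over the limit.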

My issue here is that there's no way to see the problem coming until it happens, and once it happens the file has already been lost.
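One way to catch this before a transfer is to scan the source tree for file names longer than 143 characters; a sketch using GNU find (the source path is hypothetical):

```shell
# List files whose name (last path component) exceeds 143 characters.
# find's -regex matches the whole path, so [^/]{144,} must be the final
# component; requires GNU find for -regextype.
find /path/to/source -regextype posix-extended -regex '.*/[^/]{144,}' -print
```

Anything this prints would hit the crypt name-length limit and should be renamed before being moved onto the mount.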

There should be a mechanism so that when rclone detects that a file moved onto a mount couldn't be uploaded, it leaves the file in its original location instead of in the limbo of the rclone cache.

you can use rclone move instead of moving files through the mount

tho, that would be nice, but i cannot imagine how that would work in practice.
perhaps try --magic ;)

I usually use the Rclone WebUI to check for errors, and that is how I first found out about the one in this post. What I miss is a way to manage those cases in the WebUI: it would be great to see the current upload/download queue, which files are stuck in there retrying, and to be able to cancel or retry specific files (much like the transfer queue in FileZilla).
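Since the mount above already runs with --rc, some of this can be inspected from the command line today; a sketch, with the address and credentials taken from the mount command earlier in the thread (the password stays redacted):

```shell
# Query the running mount's remote-control API for transfer statistics:
# in-progress transfers, error counts, and the last error message.
rclone rc --rc-addr 127.0.0.1:5572 \
  --rc-user myusername --rc-pass '<redacted>' \
  core/stats
```

This returns JSON including `errors`, `lastError`, and the currently `transferring` files, which covers the "has anything failed?" part of the wish, though not per-file cancel/retry.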

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.