Process killed on Synology NAS with 2 GB memory

What is the problem you are having with rclone?

The sync process gets "killed", I believe because the system has only 2 GB of memory.

Run the command 'rclone version' and share the full output of the command.

rclone v1.64.2

  • os/version: unknown
  • os/kernel: 3.10.108 (x86_64)
  • os/type: linux
  • os/arch: amd64
  • go/version: go1.21.3
  • go/linking: static
  • go/tags: none

Which cloud storage system are you using? (eg Google Drive)

drive + crypt

The command you were trying to run (eg rclone copy /tmp remote:tmp)

export RCLONE_CRYPT_REMOTE=gdrive:bucketname
export RCLONE_DRIVE_CHUNK_SIZE=128M
export RCLONE_FAST_LIST=1

rclone sync \
  --crypt-filename-encryption=off \
  --crypt-directory-name-encryption=false \
  --crypt-password "$CRYPT_PASSWORD" \
  --local-no-check-updated \
  --stats=10s \
  -v \
  --ignore-errors \
  --ignore-case \
  --filter-from filter.txt \
  /volume1 :crypt:
The rclone config contents with secrets removed.

[gdrive]
type = drive
client_id = XXX
client_secret = XXX
scope = drive.file
token = XXX
team_drive =
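
For context, the :crypt: at the end of the sync command is an on-the-fly crypt remote, configured here through the RCLONE_CRYPT_* environment variable and the --crypt-* flags rather than the config file. A persistent equivalent would look roughly like the sketch below; the [gcrypt] name is illustrative, and the password stored in the config has to be the obscured form (rclone obscure), not plain text.

[gcrypt]
type = crypt
remote = gdrive:bucketname
filename_encryption = off
directory_name_encryption = false
password = XXX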

Can you help me figure out what I should lower? Also, are the other defaults OK for such a limited system?

I'm trying out these values now, and it seems to work much better:

export RCLONE_DRIVE_CHUNK_SIZE=8M
# export RCLONE_FAST_LIST=1 disabled, as it uses a lot of memory!
export RCLONE_TRANSFERS=4
export RCLONE_CHECKERS=4
export RCLONE_BUFFER_SIZE=16M
export RCLONE_USE_MMAP=true
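
Those RCLONE_* environment variables map onto ordinary command-line flags, so the same tuning can also be written straight into the command. A sketch of the original command with the new values substituted in (and the crypt remote given as a flag instead of RCLONE_CRYPT_REMOTE):

rclone sync \
  --crypt-remote gdrive:bucketname \
  --drive-chunk-size 8M \
  --transfers 4 \
  --checkers 4 \
  --buffer-size 16M \
  --use-mmap \
  --crypt-filename-encryption=off \
  --crypt-directory-name-encryption=false \
  --crypt-password "$CRYPT_PASSWORD" \
  --local-no-check-updated \
  --stats=10s \
  -v \
  --ignore-errors \
  --ignore-case \
  --filter-from filter.txt \
  /volume1 :crypt: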

Hi, I also run rclone on a Synology box with 2 GiB of memory.

If those values work, what do you need help with?

As a side issue, that is an old version of rclone; you can try rclone selfupdate.
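
Something like this, assuming the binary was installed manually rather than through a package manager:

rclone version --check   # report the installed version and the latest available release
rclone selfupdate        # replace the binary with the latest stable release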

I basically used Claude Sonnet to help me come up with those values, but I wanted to ask here what is recommended. The stats report about 1 million files, and the sync takes about 2 hours with those values.
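
One way to confirm how much memory a run actually uses is to add --rc to the existing sync command and then query Go's memory statistics from a second shell while it runs. A sketch, assuming the default rc address (localhost:5572) and no rc authentication configured:

rclone rc core/memstats   # prints HeapAlloc, Sys and other Go runtime memory counters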
