Rclone auto-zips arbitrary files upon download

What is the problem you are having with rclone?

I upload 1000 tiny text files and 1000 tiny HTML files, even with the --no-gzip-encoding option, and when I cat them back I find that arbitrary files are downloaded as gzip data. Special note: when downloaded with s3cmd, the files are correct as-is (not gzipped), so cloud-vendor settings do not appear to be the problem. Additionally, the system admins confirm that all automatic compression on their end is turned off.
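One way to see whether the gzip header is set server-side or introduced during download is to inspect the object metadata directly. A hedged sketch (assumes rclone >= 1.59, which added the --metadata flag to lsjson; whether the Ceph backend surfaces content-encoding there is an assumption worth verifying):

```shell
#!/usr/bin/env bash
# Sketch: list per-object metadata and grep for any content-encoding header.
# The remote path matches the scripts below; --metadata is assumed to be
# supported by this rclone build.
set -euo pipefail

rclone lsjson --metadata "MYPROFILE:MYBUCKET/projects/tryme" \
    | grep -io 'content-encoding' || echo "no content-encoding headers found"
```

If this prints matches for only the 1-2 problem files, the header is stored on the object; if no object carries the header, the gzip wrapping is happening on the transfer path instead.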

Run the command 'rclone version' and share the full output of the command.

rclone v1.74.0-DEV

  • os/version: rocky 8.9 (64 bit)
  • os/kernel: 4.18.0-513.18.1.el8_9.x86_64 (x86_64)
  • os/type: linux
  • os/arch: amd64
  • go/version: go1.26.2
  • go/linking: static
  • go/tags: none

Which cloud storage system are you using? (eg Google Drive)

Redhat Ceph (s3)

The command you were trying to run (eg rclone copy /tmp remote:tmp)

```
### create small text/html files and upload

#!/usr/bin/env bash

set -euo pipefail

outd="MYPROFILE:MYBUCKET/projects/tryme"

for i in $(seq 1 1000); do
    html_file="${i}.html"
    txt_file="${i}.txt"

    cat > "$html_file" <<EOF
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Sequence ${i}</title>
</head>
<body>
    <p>Sequence Counter: ${i}</p>
</body>
</html>
EOF

    printf 'Hello world. My favorite number is %s\n' "$i" > "$txt_file"

    rclone copy "$html_file" "$outd" --no-gzip-encoding
    rclone copy "$txt_file" "$outd" --no-gzip-encoding

    rm -f "$html_file" "$txt_file"
done

exit 0


### then, cat the dir of files to find arbitrary files are gzipped upon download

### run1; receive warning 2026/05/08 10:45:20 NOTICE: 385.html: Not decompressing 'Content-Encoding: gzip' compressed file. Use --s3-decompress to override
rclone cat MYPROFILE:MYBUCKET/projects/tryme > output.txt

### run2; receive warning 2026/05/08 11:04:29 NOTICE: 1.html: Not decompressing 'Content-Encoding: gzip' compressed file. Use --s3-decompress to override
rclone cat MYPROFILE:MYBUCKET/projects/tryme > output.txt

### importantly, in the first run 385.html gives the warning while 1.html is fine; in the second run, 385.html is fine and 1.html gives the warning.
###   the files that download gzipped are arbitrary and no pattern emerges, other than roughly 1-2 files out of the 2000 incorrectly downloading as gzip.

### also important, I can download these files using s3cmd and they are all fine (no gzip), using this script:

#!/usr/bin/env bash
set -euo pipefail

BUCKET="MYBUCKET"
PREFIX="projects/tryme/"
OUTPUT="output.txt"

# clear output file
: > "$OUTPUT"

# list objects, extract keys, then stream each one
s3cmd ls "s3://$BUCKET/$PREFIX" | awk '{print $4}' | while read -r obj; do
    # skip directories (s3cmd may list them)
    [[ "$obj" == */ ]] && continue

    s3cmd get "$obj" - >> "$OUTPUT"
done

exit 0

### similarly, downloading the files individually via rclone cat produces many more gzipped files than
###   catting an entire directory's contents in one call; the problem is much more pronounced here.
#!/usr/bin/env bash
set -euo pipefail

OUTPUT="output.rclone.txt"

: > "$OUTPUT"

BASE="MYPROFILE:MYBUCKET/projects/tryme"

rclone lsf "$BASE" --recursive --files-only | while read -r file; do
    rclone cat "$BASE/$file" >> "$OUTPUT"
done

exit 0

### Finally, --s3-decompress is not a good option for me, since I also upload genuine gzip files that need to be downloaded as-is.
###   Suggestion: do not do people 'favors' by auto-zipping/unzipping; save and deliver content exactly as uploaded.

```

The rclone config contents with secrets removed.

```
[MYPROFILE]
type = s3
provider = Ceph
access_key_id = REDACTED
secret_access_key = REDACTED
endpoint = https://s3.kopah.uw.edu
use_already_exists = false

```

A log from the command with the -vv flag

```
2026/05/08 11:39:36 DEBUG : rclone: Version "v1.74.0-DEV" starting with parameters ["rclone" "cat" "MYPROFILE:MYBUCKET/projects/tryme" "-vv"]
2026/05/08 11:39:36 DEBUG : Creating backend with remote "MYPROFILE:MYBUCKET/projects/tryme"
2026/05/08 11:39:36 DEBUG : Using config file from "~/.config/rclone/rclone.conf"
2026/05/08 11:40:05 NOTICE: 369.html: Not decompressing 'Content-Encoding: gzip' compressed file. Use --s3-decompress to override
...
```

hi, have you tried that?

I have, thanks. It’s not too useful in my case as I often upload gzip files as well.

Try your test on AWS S3 or MinIO - I suspect this may be a Ceph oddity

That’s a good call, thank you. I ran the same test with rclone against AWS S3 and everything went through fine.
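Following the suggestion to try MinIO as well, a local instance can be targeted with a config stanza like this (a sketch; the endpoint and credentials are placeholders, not from this thread):

```
[minio-test]
type = s3
provider = Minio
access_key_id = REDACTED
secret_access_key = REDACTED
endpoint = http://127.0.0.1:9000
```

Re-running the upload/cat scripts against this remote would show whether the stray gzip wrapping reproduces outside of Ceph.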