I know there was discussion of this in the past, but before I get too deep into something I know little about (the rclone source), has there been any conclusion on adding support for gzip transcoding uploads for GCS?
For those wondering what this is:
This is documented here: https://cloud.google.com/storage/docs/transcoding
In summary, if you pass -z "exts" or -Z to gsutil cp, files are gzipped as a stream during upload and the object's Content-Encoding is set to gzip. This allows pre-compression for storage and also compressed HTTP serving of files from a static website.
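For example, the gsutil invocations look like this (bucket name is just a placeholder):

gsutil cp -z html,js,css ./site/* gs://my-bucket/
gsutil cp -Z ./site/* gs://my-bucket/

The first gzips only files with the listed extensions; the second gzips everything it copies.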
I just tested this with rclone manually:
gzip index.html
mv index.html.gz index.html
rclone copy ./index.html gcs:www.literature.org/test/ --header-upload "Content-Encoding: gzip"
The file shows as 2k instead of 5k in the browser. Then
rm index.html
wget https://www.literature.org/test/index.html
mv index.html index.html.gz
gunzip index.html.gz
and the file is intact. Accessing the page in a browser works as expected.
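To double-check the metadata without a browser, requesting the compressed representation directly should show the gzip headers (using my test URL from above):

curl -sI -H "Accept-Encoding: gzip" https://www.literature.org/test/index.html

and the response should include Content-Encoding: gzip along with the smaller Content-Length.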
My website is (as should be obvious from the above) served this way. One of the analytics suggestions was to gzip files: they are then served compressed to clients that support it, all decompression happens client-side, and there are a couple of mid-sized JS files which could do with compression (only 16k for the main one, but still...).
So, at this point I think the "plumbing" is mostly about processing options and applying gzip stream compression either to everything or to files that match a pattern (I would think using rclone's include/exclude syntax rather than Google's own extension syntax), while leaving the other metadata alone (except Content-Length, which I'm not sure is required)?
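In the meantime, something close can be scripted per file with rclone as it stands today; a rough sketch, assuming --header-upload behaves with rcat the same way it does with copy (paths are just my test example from above):

gzip -c ./index.html | rclone rcat gcs:www.literature.org/test/index.html --header-upload "Content-Encoding: gzip"

which streams the compressed bytes up without leaving a .gz file on disk.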
The other question is what flag name(s) to use?
I've started looking at the codebase, but what might take me a week may take someone else half an hour...
Peter