Overcoming WebDAV's limitations

What is the problem you are having with rclone?

I am a game mod developer and I am currently using rclone to synchronize files between different players, as a way to deploy updates to my peers.

However, I only recently noticed that plain WebDAV does not support modtime or checksum checks, which are quite important for me since I usually deploy minimal game file updates with no change in file sizes. I know this limitation is well documented, but is it possible to work around it?

I've noticed that on Teracloud, if I ask for a file's information with a HEAD request, I can get the file's mod-time. Would it be possible to incorporate this into rclone's WebDAV backend, the way checksums are supported for Owncloud and Nextcloud?

Or are there any other more creative workarounds that I can use to achieve my goals?
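
For what it's worth, one brute-force workaround I can see in the docs is check's --download flag, which compares actual file contents instead of relying on hashes, at the cost of downloading every file:

rclone check "teracloud-master:/game_files/patches" "I:/game_files/patches" --download

That should catch same-size content changes even without a common hash, though it obviously won't scale well to large trees.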

Thank you for your time.

Run the command 'rclone version' and share the full output of the command.

rclone v1.65.2

  • os/version: Microsoft Windows 11 Pro 22H2 (64 bit)
  • os/kernel: 10.0.22621.3155 (x86_64)
  • os/type: windows
  • os/arch: amd64
  • go/version: go1.21.6
  • go/linking: static
  • go/tags: cmount

Which cloud storage system are you using? (eg Google Drive)

I am currently using plain WebDAV from Teracloud. If possible, I would like to refrain from changing this.

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone check "teracloud-master:/game_files/patches" "I:/game_files/patches"
2024/03/07 23:53:33 ERROR : No common hash found - not using a hash for checks
2024/03/07 23:53:33 NOTICE: Local file system at //?/I:/game_files/patches: 0 differences found
2024/03/07 23:53:33 NOTICE: Local file system at //?/I:/game_files/patches: 2 hashes could not be checked
2024/03/07 23:53:33 NOTICE: Local file system at //?/I:/game_files/patches: 2 matching files

I have two files to sync; one has its file contents changed (without any size change).

Please run 'rclone config redacted' and share the full output. If you get command not found, please make sure to update rclone.

[teracloud-master]
url = https://wani.teracloud.jp/dav/xxx/
headers = Cookie,"tcs_ticket=xxxxxxx;Version=1;Path=/ds/dav/;Max-Age=3600;Secure"
type = webdav
vendor = other
user = XXX
pass = XXX
### Double check the config for sensitive info before posting publicly

A log from the command that you were trying to run with the -vv flag

2024/03/08 08:12:48 DEBUG : rclone: Version "v1.65.2" starting with parameters ["rclone" "check" "teracloud-master:/game_files/patches" "I:\\game_files_patches" "-vv"]
2024/03/08 08:12:48 DEBUG : Creating backend with remote "teracloud-master:/game_files/patches"
2024/03/08 08:12:48 DEBUG : Using config file from "C:\\Users\\descatal\\AppData\\Roaming\\rclone\\rclone.conf"
2024/03/08 08:12:48 DEBUG : found headers: Cookie,tcs_ticket=xxxxxxxx;Version=1;Path=/ds/dav/;Max-Age=3600;Secure
2024/03/08 08:12:48 DEBUG : fs cache: renaming cache item "teracloud-master:/game_files/patches" to be canonical "teracloud-master:game_files/patches"
2024/03/08 08:12:48 DEBUG : Creating backend with remote "I:\\game_files_patches"
2024/03/08 08:12:48 DEBUG : fs cache: renaming cache item "I:\\game_files_patches" to be canonical "//?/I:/game_files/patches"
2024/03/08 08:12:48 ERROR : No common hash found - not using a hash for checks
2024/03/08 08:12:48 DEBUG : Local file system at //?/I:/game_files/patches: Waiting for checks to finish
2024/03/08 08:12:48 DEBUG : PATCH_backup.json: OK - could not check hash
2024/03/08 08:12:48 DEBUG : PATCH.TBL: OK - could not check hash
2024/03/08 08:12:48 NOTICE: Local file system at //?/I:/game_files/patches: 0 differences found
2024/03/08 08:12:48 NOTICE: Local file system at //?/I:/game_files/patches: 2 hashes could not be checked
2024/03/08 08:12:49 NOTICE: Local file system at //?/I:/game_files/patches: 2 matching files
2024/03/08 08:12:49 INFO  :
Transferred:              0 B / 0 B, -, 0 B/s, ETA -
Checks:                2 / 2, 100%
Elapsed time:         1.0s

2024/03/08 08:12:52 DEBUG : 31 go routines active

welcome to the forum,

rclone supports modtime and checksum on the following webdav servers only:
Fastmail Files, Owncloud and Nextcloud

in addition, rclone can act as a webdav server using
rclone serve webdav
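
for example, a minimal read-only server could look like this (path and port are just placeholders):

rclone serve webdav /path/to/game_files --addr :8080 --read-only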

what is the website for Teraclone/Teracloud?
from what i could see, they use a custom webdav server, so you would need to modify the rclone source code.

Sorry, I meant Teracloud, it was a typo.

Teracloud is just a cloud hosting service, and it natively supports WebDAV. I am using it primarily because it is one of the only services that lets me distribute files to my player base in China, and it does not require users to authenticate to download files.

By modifying the rclone source code, do you mean adding a special support case for this hosting service?

yes, that is what i mean.
you can write the code yourself

or you could sponsor the author to add modtime support and possibly checksums.

Alright, gotcha. I'll see whether it is feasible for me to do that (I'm new to Go) or just switch storage providers.

Thanks for the answer

ok, let us know what you do in the end.

I've decided to bite the bullet and move to a proper S3 bucket service that works in China (Qiniu), and so far it works fine for the most part.

However, I have a question regarding the S3 secret keys being embedded in the configuration file. Since rclone requires both of my keys to access the bucket for synchronization, isn't it a security issue that my keys are exposed to end users if I need to embed the whole rclone application in my tool?

I know there's a Configuration Encryption option in rclone, but since my project is open source, the passphrase for the encryption would still live in the repo, which is not good. Is there any way for rclone to just use Qiniu's bucket as a one-directional, download-only sync source?

Sorry if this is a deviation from the original topic; I'll open a new topic if needed.

You can create multiple keys with various privileges. From your description, I would imagine that you need a "read only" key that you share in your repo, and an additional full-access key that you use yourself, to upload new content for example.
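
On AWS-style S3 services, a read-only policy for the shared key would look roughly like this (the bucket name is a placeholder; Qiniu's equivalent, if any, may differ):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::my-bucket"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::my-bucket/*"
    }
  ]
}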

Yes, that's what I planned to do, but afaik Qiniu does not have that functionality. The closest thing I can find is IAM role-based authorization, but it seems they use it primarily as a token generator for upload / download / manage operations, and I am not sure how to use those tokens with rclone.

Here's the documentation I found (in Chinese)

Sorry, I'm quite new to the S3 ecosystem, so I am not well versed in this stuff; if I missed anything, please let me know.

yes it is.
fwiw, with S3 remotes, i do not use a config file at all.
that config file is optional, not required by rclone.
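
for example, the whole remote can be passed as a connection string, with no config file on disk. a sketch, assuming qiniu's east-china S3 endpoint (double-check the endpoint for your region):

rclone lsd ":s3,provider=Qiniu,access_key_id=XXX,secret_access_key=XXX,endpoint=s3-cn-east-1.qiniucs.com:my-bucket"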

perhaps make the bucket public.
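
with a public bucket, the config you ship with your tool would need no secrets at all. per the rclone s3 docs, anonymous access is configured by leaving the keys blank (a sketch, assuming qiniu honours unsigned requests on public buckets):

[qiniu-public]
type = s3
provider = Qiniu
endpoint = s3-cn-east-1.qiniucs.com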

store the client id/secrets in a separate file that is kept local.

that is what my backup script does: it runs as client and server, and the client has to log in to the server to get the credentials.
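
a rough sketch of that pattern, with a completely hypothetical credentials endpoint (the url, response fields and jq parsing are all placeholders):

# hypothetical: your server hands out short-lived S3 credentials after login
CREDS=$(curl -s -u "$APP_USER:$APP_PASS" https://example.com/api/creds)
export RCLONE_S3_ACCESS_KEY_ID=$(echo "$CREDS" | jq -r .access_key_id)
export RCLONE_S3_SECRET_ACCESS_KEY=$(echo "$CREDS" | jq -r .secret_access_key)
rclone copy ":s3,provider=Qiniu,endpoint=s3-cn-east-1.qiniucs.com:my-bucket/patches" ./patches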

  1. create an IAM user that requires MFA login and session tokens.
  2. create the session token, MFA code and temporary client id/secret, and have rclone use that (see the sketch below).
    note, rclone cannot create the session tokens. you have to do that yourself.
    you can use the aws cli command line tool.
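
a minimal sketch of step 2 with the aws cli (the account id, mfa device and token code are placeholders), assuming a remote "mys3" configured with env_auth = true:

aws sts get-session-token --serial-number arn:aws:iam::123456789012:mfa/my-user --token-code 123456 --duration-seconds 3600
# export the temporary credentials from the returned Credentials block
export AWS_ACCESS_KEY_ID=ASIA...
export AWS_SECRET_ACCESS_KEY=...
export AWS_SESSION_TOKEN=...
rclone lsd mys3: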

also, i have python code that will create the session token and MFA code,
then feed it to rclone, restic and any other tool compatible with aws s3 profiles

