Escaping special characters

What is your rclone version (output from rclone version)

rclone v1.50.2

  • os/arch: windows/amd64
  • go version: go1.13.4

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Windows 10 x64

Which cloud storage system are you using? (eg Google Drive)

SFTP

This might be an edge case, but I figured I'd ask. I'm looking for an option like --escape-special-characters.

I have a server where I dump a lot of files. A lot of the directories have special characters in their names, such as {, }, [, ], etc. A quick search revealed that these characters have special meaning in rclone's filter patterns. I want to make a script that downloads new files but excludes everything listed in an exclude.txt file, using --exclude-from. After rclone finishes the download, the script will create a new exclude.txt file using rclone lsl remote: > exclude.txt. It would be great if rclone could escape the special characters in the exclude.txt file with \. Possible? Too much of an edge case?
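
Roughly, the script I have in mind would do this (paths are just examples, and the listing is the part that would need its special characters escaped):

```
rem 1) download anything not already listed in exclude.txt
rclone copy sftp:downloads C:\incoming --exclude-from exclude.txt

rem 2) regenerate the exclude list from the current server contents
rem    (lsl prepends size and date, so the script would also have to
rem    strip those columns, or use lsf instead)
rclone lsl sftp:downloads > exclude.txt
```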

--exclude-from is really meant for sets of filters, not complete pathnames, which is probably why such a feature doesn't exist. Used like that, you could end up matching potentially thousands of "patterns" against each name. Even if these are simple checks, doing thousands of them against every file the remote lists is going to use a lot of CPU. It's not the intended purpose at all, even if it does work, I suppose.

--files-from, on the other hand, is not for filters, so there the special characters will not be a problem for rclone.
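
For example, something along these lines would work without any escaping (remote name and paths are made up):

```
rem list every file on the remote, one path per line
rclone lsf sftp:downloads --recursive --files-only > files.txt

rem copy exactly those paths; the names are taken literally,
rem not interpreted as filter patterns
rclone copy sftp:downloads C:\incoming --files-from files.txt
```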

You could always just escape the characters yourself in the script. Simple text manipulation like that is usually not too hard to do.
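
A rough batch sketch of the idea (untested, and it only covers the bracket and brace characters you mentioned; * and ? are awkward to substitute in pure batch):

```
@echo off
setlocal disabledelayedexpansion
rem escape rclone filter metacharacters in each line of list.txt,
rem writing the result to exclude.txt
del exclude.txt 2>nul
for /f "usebackq delims=" %%L in ("list.txt") do (
    set "line=%%L"
    setlocal enabledelayedexpansion
    rem backslashes first, so the escapes added below aren't doubled
    set "line=!line:\=\\!"
    set "line=!line:[=\[!"
    set "line=!line:]=\]!"
    set "line=!line:{=\{!"
    set "line=!line:}=\}!"
    >>exclude.txt echo(!line!
    endlocal
)
```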

But your setup is pretty weird to begin with. Can you explain the reason for trying to do this and what problem you are trying to solve? Perhaps I can offer a better solution to the underlying problem rather than a fix for this strange workaround.

Thank you for your reply, and for the explanation. I'll try to explain my use case.
I have a remote server used as a file downloader. Later on I want to copy files from the remote server to a directory on my local computer, using a modified version of your most excellent Archivesync v1.3 script. I leave a copy of the original file behind on the remote server until I'm ready to delete it. I will be moving files out of the directory on my local computer, so I figured I would use rclone lsf to make a file listing everything I have already copied from the remote server, so that it won't be copied again. Sometimes the file and directory names on the remote server contain special characters, as noted in my original post. I'd like to escape those characters so that rclone understands them. Does that make sense?

Hmm, I understand. You want to kind of mark them as downloaded but keep a copy of them for a while anyway. I presume that this is an automatic process such that it is not practical to simply pick and choose what you want to download manually on a mount?

I would think that the easiest and cleanest way to solve this is just to use the --backup-dir functionality.

Then when you use a regular rclone move to download stuff, instead of actually being deleted the files can just go into an "alreadydownloaded" folder. That keeps your working directories neat and clean, containing only the useful stuff, but you can always go back and re-download something if needed.
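
As a rough sketch (remote name and folders are made up): rclone wants --backup-dir to be on the same remote as the destination of the transfer, so to keep the holding folder on the server itself, a copy followed by a server-side move gives the same "not deleted, just moved aside" effect:

```
rem pull everything down without touching the server copies
rclone copy sftp:downloads C:\incoming

rem then shift the server copies into the holding folder
rclone move sftp:downloads sftp:alreadydownloaded
```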

At least that sounds to me like it would achieve the same goal, but in a much simpler way.
I don't know what you are using the existing archive for (in reference to archivesync), but if you want to keep these things separate then there's no reason you can't just use a separate command for this purpose. It could even be used on a separate mount if that is how you prefer to access your files normally.

Does this sound reasonable?

Sorry for being vague! While that's certainly a great solution, the files downloaded to the remote server need to remain in their original directory because they are being seeded by a torrent client. Otherwise --backup-dir would have been perfect for the job.

I used your script as a learning tool for batch and as a template. It's been modified quite a lot, and I removed --backup-dir="%archivepath%\%date%".

I'm experimenting, poking at things, breaking things and generally having fun learning, while trying to come up with solutions I can use to make my life a little easier. What I'm trying to do is to find out if rclone can replace my old method of manually downloading files using an FTP client. :slight_smile:

Ah I see. I suspected that might be the case :stuck_out_tongue:

This is maybe a silly question, but you are aware of rclone's ability to mount the remote and present it as a normal hard drive in your OS, right? That obviously makes it really easy and convenient to manually pick the stuff you want, when you want it, straight from the server. Heck, you can potentially even run programs or view media straight from it too. The only reason to make a more complicated solution is if it has to be an automated system, so please clarify that first before I make further suggestions.
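
On Windows that's a one-liner (the drive letter is arbitrary, and it needs WinFsp installed):

```
rem present the remote as a local drive
rclone mount sftp: S:
```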

Absolutely, I've been using rclone mount with Google Drive for a while now and it works perfectly. I did not even consider that I could use it with an SFTP remote as well... I've tested it now and it works really well!

"Has to be" is maybe exaggerating a little. It would be very nice though, as I could set up scheduling for automation. I'm looking into the Find And Replace Text command-line utility (which has an amazing acronym) right now, and maybe it can solve what I'm trying to do. :wink:

Yeah, you can mount anything rclone can connect to, and it can connect to almost anything. It's quite nice for the convenience factor, and also for letting other applications access things they weren't otherwise designed for.

Sure, or if not, it's not that hard to script stuff like this in batch. No reason to re-invent the wheel if you find something that does it easily out of the box though. I might have to make a mental note of that utility myself; if you find it works well for a scenario like this then please let me know, as I will probably find a good use for it at some point.

Another idea that might work for you however is to use filtering.
If you just did an rclone copy --max-age 24h then you would only get files newer than 24 hours. Combining this with a daily pull from the server (or more frequent pulls with a lower --max-age) seems like it could be a workable solution too. If you wanted to be very precise about it, you could log a timestamp each time the operation runs, and even use that timestamp to calculate down to the millisecond what period to pull, letting you run these at arbitrary times, although that might be overkill for what you need.
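
For example (paths made up again), scheduled once a day:

```
rem daily pull of anything that appeared in the last 24 hours
rclone copy sftp:downloads C:\incoming --max-age 24h
```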

To avoid pulling down actively downloading files you might want to use both --max-age 24h and --min-age 30m, just so you don't grab anything half-finished. That problem could be solved even more cleanly, however, by using a "temp download folder" function in the torrent client itself, which can simply be excluded from rclone. That would guarantee it much better.
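
Something like this (the temp folder name is made up; match it to whatever your client uses):

```
rem skip anything younger than 30 minutes, and ignore the torrent
rem client's incomplete-downloads folder entirely
rclone copy sftp:downloads C:\incoming --max-age 24h --min-age 30m --exclude "/incomplete/**"
```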

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.