So, I’m new to seedboxes; I’m currently using one for a few torrents.
I discovered rclone, and it’s perfect for syncing to pCloud; however, I have one issue.
I want to do one of the following: only sync/copy new files/folders; skip a folder entirely if one with the same name already exists on the destination, even if the contents differ; or only sync/copy files/folders added since the last run. Is there a way to do this?
This is so my torrents can upload directly to the cloud, where I can then edit them, and the next time I sync/copy all the folders, it won’t mess with the ones I’ve just changed. If you know another way to do this, please let me know.
I’m new to SSH and Unix commands, so I’d need some serious guidance here.
Would --ignore-existing skip an existing folder entirely, or does it look at the individual files inside it?
Because if I rearranged files inside the folders and then ran with --ignore-existing, and the flag only checked whether the folder was exactly the same, it would copy the folder over again: the folder exists, but its contents aren’t identical. Am I right about that?
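To make this concrete, the kind of command I mean (the remote and folder names here are just examples):

```shell
# My understanding is that --ignore-existing skips individual files that
# already exist at the same path on the destination, not whole folders.
# So if I moved Show/ep1.mkv to Watched/ep1.mkv on pCloud, Show/ep1.mkv
# would no longer exist at its old path there and would get copied again.
rclone copy ~/downloads pcloud:torrents --ignore-existing --dry-run
```

(--dry-run just previews what would be transferred without doing it.)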
That doesn’t really solve the problem. To try to clear this up, I’ll do my best to explain:
In my case, I get very messy torrents (each consisting of one folder, with either many files or many subfolders inside it) directly on a seedbox, and then use rclone to sync them to my preferred cloud system. Then I want to be able to reorganise the contents of each torrent’s folder in my cloud system to tidy them up. The only problem is that the next time I copy, rclone will detect the changes and just re-copy everything.
The max age thing makes sense, I guess, and I can manage by only syncing every 5 days with --max-age 5d. I will only be syncing completed torrents, by the way.
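So the command I’d run every 5 days would be something like this (paths and remote names are just examples; I think the unit suffix on --max-age matters, since a bare number wouldn’t mean days):

```shell
# Copy only files whose modification time is within the last 5 days.
# --dry-run previews the transfer list before committing to it.
rclone copy ~/downloads pcloud:torrents --max-age 5d --dry-run
```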
If there’s a better way you know of, please let me know.
You could do what the last comment suggested too. I just don’t like dealing with timing things like that: if you change something on your destination within the interval those dates define, you’re going to mess up your destination with those files again.
The only comment I’ll make about that script is that if an rclone transfer fails, it’ll be skipped on the next run, since the approach assumes everything between the timestamps succeeded. The way you really should be doing this is to create an include file based on the contents of the directory, then have the script comb the rclone log file to determine success or failure and remove the successful files from the include file. That include file then serves as a queue.
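A minimal sketch of that queue idea. The file names (queue.txt, rclone.log) and the remote (pcloud:torrents) are hypothetical placeholders, and the rclone invocations are shown commented out since they depend on your remotes; only the log-filtering step is concrete:

```shell
#!/bin/sh
# Queue-based transfer sketch (assumed names: queue.txt, rclone.log).

# 1. Append any new top-level items to the queue, skipping ones already
#    queued. (For a folder, you'd queue the filter pattern "name/**" so
#    its contents match.)
# rclone lsf ~/downloads | while IFS= read -r name; do
#     grep -qxF "$name" queue.txt || printf '%s\n' "$name" >> queue.txt
# done

# 2. Transfer only the queued items, logging the result:
# rclone copy ~/downloads pcloud:torrents \
#     --include-from queue.txt --log-file rclone.log --log-level INFO

# 3. Drop queue entries that logged a success line ("<name>: Copied" in
#    the rclone log); anything that failed stays queued for the next run.
filter_queue() {
    queue="$1"; log="$2"
    while IFS= read -r entry; do
        grep -qF "$entry: Copied" "$log" || printf '%s\n' "$entry"
    done < "$queue"
}
# filter_queue queue.txt rclone.log > queue.new && mv queue.new queue.txt
```

The point of step 3 is that the queue only shrinks when the log confirms success, so a failed transfer is retried automatically on the next run instead of being silently skipped.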