What is the problem you are having with rclone?
We need to sync video and audio files from a local Mac (well, a whole bunch of Macs) to a centralized location for video editing, etc. We currently use `rclone` to do this, but I would like to make some changes. The process I'd like to follow is below:
Process:
- User records video and audio. These files are written to a folder that looks like `[YYYY-MM-DD SESS#]/[files]`.
- `rclone` runs (on a cronjob or whatnot), sees the SOURCE files, and pushes them to `DEST:to-be-processed/[machine #]/[YYYY-MM-DD SESS#]/[files]` (cloud storage); a sketch of this command follows the list.
- A human or a script moves the files out of `to-be-processed` to work their magic.
- The SOURCE files stay where they are. Subsequent runs of `rclone` do not attempt to upload them again.
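For concreteness, here is a sketch of the per-machine upload command I have in mind, and of why the last bullet doesn't hold with a plain `rclone copy`. The local path, bucket name, and machine number are made-up placeholders; only the remote name `GOOGLE_CLOUD_STORAGE` matches the config further down:

```
# First run: copies new session folders up to the "inbox"
# (all paths here are illustrative placeholders)
rclone copy "/Users/studio/Recordings" \
  "GOOGLE_CLOUD_STORAGE:my-bucket/to-be-processed/machine-01" -v

# Later, the processing side moves everything out of to-be-processed.
# SOURCE is unchanged, so a plain second run sees the files as missing
# on the remote and happily re-uploads hundreds of GB:
rclone copy "/Users/studio/Recordings" \
  "GOOGLE_CLOUD_STORAGE:my-bucket/to-be-processed/machine-01" -v
```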
Requirements:
- Automated. These machines are headless "video / podcast studios".
- Minimal delay. These files can be quite large (hundreds of GB), so once the files are written to disk, I would like `rclone` to run with minimal latency. I do this now with a combination of flock and a cronjob that runs once every few minutes; the crontab line is sketched after this list.
- Only uploads once. `rclone` is doing an amazing job of copying the files, restarting if things go wrong, etc., but cloud storage (for this purpose) is quite expensive, so it would be a major problem if files kept getting re-uploaded. Since we rely on an "inbox" of sorts for unprocessed files, out-of-the-box `rclone` will happily keep updating the DESTINATION files after we process them.
- Local files stay where they are. Local machines have multi-TB drives and we want to use them as a backup of last resort, so I don't want `rclone` to delete the SOURCE files, although it would be fine if it renamed them (so we wouldn't process them again) or moved them to a new local folder. They just need to be handy locally in case something terrible happens upstream, which rarely, but not never, happens; we can go to the source machine by hand and pluck the file. Maintaining a separate copy of the files in cloud storage for this purpose is prohibitively expensive (the local Mac has plenty of storage, so we are fine using it, with the appropriate caveat that it will eventually be deleted, etc.).
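For reference, the flock-plus-cron arrangement from the "minimal delay" bullet is roughly the crontab entry below. The lock file, local path, and bucket/machine names are illustrative, and macOS doesn't ship a `flock(1)` utility out of the box, so this assumes one is installed alongside rclone (the `/opt/homebrew/bin` paths are the Homebrew defaults on these arm64 Macs):

```
# Every 5 minutes; flock -n exits immediately if the previous run still
# holds the lock, so a multi-hour transfer of a large session never
# overlaps with itself
*/5 * * * * /opt/homebrew/bin/flock -n /tmp/studio-upload.lock /opt/homebrew/bin/rclone copy "/Users/studio/Recordings" "GOOGLE_CLOUD_STORAGE:my-bucket/to-be-processed/machine-01" --log-file=/tmp/studio-upload.log -v
```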
Any ideas about how best to do this?
Run the command 'rclone version' and share the full output of the command.
```
rclone --version
rclone v1.69.1
- os/version: darwin 15.3.2 (64 bit)
- os/kernel: 24.3.0 (arm64)
- os/type: darwin
- os/arch: arm64 (ARMv8 compatible)
- go/version: go1.24.0
- go/linking: dynamic
- go/tags: none
```
Which cloud storage system are you using? (eg Google Drive)
GCP Cloud Storage
The command you were trying to run (eg `rclone copy /tmp remote:tmp`)
N/A
Please run 'rclone config redacted' and share the full output. If you get command not found, please make sure to update rclone.
```
[ATEM]
type = ftp
pass = XXX
user = XXX
host = XXX

[GOOGLE_CLOUD_STORAGE]
type = gcs
service_account_file = <>
project_number = XXX
```
A log from the command that you were trying to run with the `-vv` flag
N/A