I have a process that writes files incrementally (for example, it may write 100 KB, then 2 MB more after 30 s). How can I upload these files continuously so none of the data is lost? Also, the files in the source folder are deleted on a 30-day cycle, so the copies need to be kept on the server side (Google Drive and Mega, for example)?
If you run rclone copy /path/to/local remote: --max-age 10m, that is a quick way of copying only the files modified within the last 10 minutes. You can then run this from cron every 5 minutes.
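Those two pieces fit together in a crontab entry like this — a sketch, assuming rclone is already configured with a remote named remote: and that /path/to/local is your source folder (both are placeholders):

```shell
# Every 5 minutes, copy files modified in the last 10 minutes.
# The 10m/5m overlap guards against a run that starts late or is skipped.
*/5 * * * * rclone copy /path/to/local remote: --max-age 10m --log-file /var/log/rclone-copy.log
```

The --log-file flag is optional but useful for checking later that the periodic copies actually ran.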
You can use rclone delete remote: --min-age 30d to delete all the files on the remote that are older than 30 days. Check first with --dry-run.
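That cleanup can also be scheduled from cron; a sketch, again assuming the remote is named remote: (run the command once by hand with --dry-run first to preview what would be deleted):

```shell
# Once a day at 03:00, delete remote files older than 30 days.
0 3 * * * rclone delete remote: --min-age 30d --log-file /var/log/rclone-delete.log
```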