In my current setup I have created a remote for an S3 bucket and run the rclone sync command via crontab every 5 minutes. I am wondering how rclone sync works internally, and I have a couple of queries I am looking for answers to.
a. For rclone sync, are all the files scanned on the source and destination every time, or is there some mechanism by which rclone keeps track of previously synced files so they can be skipped? I presume it keeps a record of already-synced files somewhere; kindly let me know where it keeps such information and how we can view and interpret it.
b. Do we need any special settings for syncing big files, e.g. 10 GB or more?
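To make the question concrete, I mean flags like the following (the remote name `s3remote`, the bucket path, and the values are just examples I found, not settings I currently use):

```shell
# Example only: candidate tuning flags for large S3 uploads.
# --s3-chunk-size          size of each multipart chunk
# --s3-upload-concurrency  chunks uploaded in parallel per file
# --transfers              number of files transferred in parallel
rclone sync "D:\alpha\Test-structure" s3remote:my-bucket/backup --s3-chunk-size 64M --s3-upload-concurrency 4 --transfers 4
```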
c. Is there a better way of running it than scheduling it via cron?
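For reference, my current crontab entry looks roughly like this (paths, remote name, and log location are placeholders):

```shell
# Run the sync every 5 minutes and append output to a log file.
*/5 * * * * rclone sync /data/source s3remote:my-bucket/backup --log-file /var/log/rclone.log
```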
d. If a file is deleted from the source during a sync, how is that handled? A use case for clarity: say 10 files are marked for sync, 8 files of 10 GB each and 2 files of 1 MB each. The sync starts with a 10 GB file, but in the meantime a 1 MB file is deleted. How is that handled during the sync?
Your replies will be highly appreciated.
Thanks,
Ali Sajjad
I observed some strange behavior, and I am not sure if rclone provides a way to override it.
If there is a process, say A, which is continuously writing to a file, and rclone sync then starts, process A can no longer write to the file and throws an error message which says:
"The process cannot access the file 'D:\1.txt' because it is being used by another process"
He's not running Unix/Linux, judging by the Windows path in the error he quoted:
The issue is that you have a file in use. On Windows, you either need to use VSS (someone has a nice tutorial on that) or close the file before you copy it.
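If closing the file is not an option, another pattern worth trying (an assumption on my part, not a fix for the lock itself) is to skip files that changed very recently using rclone's `--min-age` filter, so a file still being written is left for a later scheduled run:

```shell
# Skip anything modified in the last 15 minutes; such files get picked up
# by a later run once the writer has finished with them.
# Remote name and bucket path are placeholders.
rclone sync "D:\alpha\Test-structure" s3remote:my-bucket/backup --min-age 15m
```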
Thanks, mates, for taking the time to answer my queries. It seems I missed a few details, which caused some confusion. Let me explain the scenario below; I wish to understand if there is any flag or workaround to override this behavior.
Test case: if a file is continuously being written/updated and rclone is triggered to sync it to the S3 bucket, write operations should continue without any impact. Result: failed. Actual behavior: rclone locks the file and write operations fail.
Setup: we have a PowerShell script in a Windows environment that just appends lines to a file.
**PowerShell script:** this script just updates the file D:\alpha\Test-structure\1.txt
param([string]$src)

# Append a line to $src\<n>.txt in a tight loop, logging each write.
while ($true)
{
    $numArray = (1..1)
    foreach ($number in $numArray)
    {
        # Build the path in a quoted string; a bare $src$number.txt
        # does not expand the way you might expect.
        $path = "$src\$number.txt"
        if (Test-Path $path)
        {
            Add-Content $path " Hello"
            Write-Host "$(Get-Date) : New content appended to $path"
        }
    }
}
Add-Content : The process cannot access the file 'D:\alpha\Test-structure\1.txt' because it is being used by another process.
At E:\script\modify_present_file_in_D.ps1:9 char:2
+ Add-Content $src\$number.txt " Charlie one 2 three testing 221144 'n Charlie on ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : WriteError: (D:\alpha\Test-structure\1.txt:String) [Add-Content], IOException
    + FullyQualifiedErrorId : GetContentWriterIOError,Microsoft.PowerShell.Commands.AddContentCommand