For mount sftp: why is right-clicking on an exe file so slow that it freezes?

Thanks for the logs @prizi - it doesn't look like fast fingerprint helps much here - it will help the first time the fingerprint is checked, but the time will be cached in the object after that. It looks like the OS already read the time so we had it cached.

(If you are using an S3-based backend then using --use-server-modtime will make your mount a lot snappier, at the expense of only seeing the times the objects were uploaded rather than their actual modtimes.)

The problem seems to be in here

12:37:18.585687 DEBUG : wasabi/media/apps/3d/zbrush/install/ZBrush_4R4_Installer_WIN.exe(0xc0004da3c0): openPending:
12:37:18.589093 DEBUG : wasabi/media/apps/3d/zbrush/install/ZBrush_4R4_Installer_WIN.exe: vfs cache: checking remote fingerprint "701579568,2022-11-01 17:59:46 +0000 UTC,debebb104c00cbfb3b123c2f75299367" against cached fingerprint "701579568,2022-11-01 17:59:46 +0000 UTC,debebb104c00cbfb3b123c2f75299367"
12:37:18.589633 DEBUG : wasabi/media/apps/3d/zbrush/install/ZBrush_4R4_Installer_WIN.exe: vfs cache: truncate to size=701579568
12:37:19.417656 DEBUG : wasabi/media/apps/3d/zbrush/install: Added virtual directory entry vAddFile: "ZBrush_4R4_Installer_WIN.exe"
12:37:19.418188 DEBUG : wasabi/media/apps/3d/zbrush/install/ZBrush_4R4_Installer_WIN.exe(0xc0004da3c0): >openPending: err=<nil>

or

12:37:23.151494 DEBUG : wasabi/media/apps/3d/zbrush/install/ZBrush_4R4_Installer_WIN.exe(0xc000061100): openPending:
12:37:23.151494 DEBUG : wasabi/media/apps/3d/zbrush/install/ZBrush_4R4_Installer_WIN.exe: vfs cache: checking remote fingerprint "701579568,2022-11-01 17:59:46 +0000 UTC,debebb104c00cbfb3b123c2f75299367" against cached fingerprint "701579568,2022-11-01 17:59:46 +0000 UTC,debebb104c00cbfb3b123c2f75299367"
12:37:23.151494 DEBUG : wasabi/media/apps/3d/zbrush/install/ZBrush_4R4_Installer_WIN.exe: vfs cache: truncate to size=701579568
12:37:23.974167 DEBUG : wasabi/media/apps/3d/zbrush/install: Added virtual directory entry vAddFile: "ZBrush_4R4_Installer_WIN.exe"
12:37:23.974715 DEBUG : wasabi/media/apps/3d/zbrush/install/ZBrush_4R4_Installer_WIN.exe(0xc000061100): >openPending: err=<nil>

There is an 800ms gap here which is almost certainly networking. But what was rclone doing? Since rclone already has the file cached at this point, it shouldn't need to make any network transactions.

There are 8 of these sequences in the log. For some reason 2 of them don't have any delay, but the other 6 each have about 800ms of delay. In total that is 4.8s, which would go a long way towards explaining this problem.
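
The delays above were measured from the log timestamps; a small hypothetical Python helper (just an illustration, not part of rclone) shows how such a gap can be computed from two adjacent log lines:

```python
from datetime import datetime

# Two consecutive log lines from the first sequence above.
lines = [
    "12:37:18.589633 DEBUG : ...: vfs cache: truncate to size=701579568",
    "12:37:19.417656 DEBUG : ...: Added virtual directory entry ...",
]

def gap_ms(a, b):
    """Return the gap in milliseconds between two timestamped log lines."""
    fmt = "%H:%M:%S.%f"
    ta = datetime.strptime(a.split(" ", 1)[0], fmt)
    tb = datetime.strptime(b.split(" ", 1)[0], fmt)
    return (tb - ta).total_seconds() * 1000

print(f"{gap_ms(lines[0], lines[1]):.0f}ms")  # prints 828ms
```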

To get an idea of what is happening, can you run your test with --dump headers as well as --vfs-cache-mode full --use-server-modtime --network-mode --buffer-size=0 please?

This will make an even more bulky log, but it will have the HTTP transactions of what rclone was doing, which will hopefully give a clue.

Using --fast-fingerprint is probably better than using a cache. With fast fingerprint there should be no need to make any transactions to the remote.

The difference it makes with S3 is that without it we need the modtime, which is stored as metadata on the object. This takes an extra HEAD request to retrieve. However we only fetch it once and cache it in the object, so effectively we already have a fast fingerprint cache (at least for S3, though the other backends all cache anything that is expensive to calculate).
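
That caching pattern can be sketched roughly like this (a hypothetical Python illustration, not rclone's actual S3 backend code): the modtime costs one extra HEAD request the first time, and is then cached in the object so subsequent fingerprints are cheap.

```python
# Stand-in for real HEAD requests; we count how many are made.
head_requests = []

def fetch_modtime_via_head():
    head_requests.append(1)            # simulates one HEAD request
    return "2022-11-01 17:59:46 +0000 UTC"

class S3Object:
    """Hypothetical object that caches its modtime after the first fetch."""
    def __init__(self, size):
        self.size = size
        self._modtime = None           # not fetched yet

    def modtime(self):
        if self._modtime is None:      # only the first call does the HEAD
            self._modtime = fetch_modtime_via_head()
        return self._modtime

    def fingerprint(self):
        return f"{self.size},{self.modtime()}"

obj = S3Object(701579568)
obj.fingerprint()
obj.fingerprint()                      # cached: no second HEAD request
assert len(head_requests) == 1
```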

No prob. Here's a new log.

The full mount command was:

rclone mount wasabi: Z:
-vv
--dump headers
--log-format time,microseconds
--vfs-cache-mode full
--cache-dir "%cd%\cache"
--use-server-modtime
--buffer-size=0
--network-mode

Happy to provide further logs if needed.


Well colour me surprised, but there are no HTTP transactions in that 800ms gap.

16:39:02.017488 DEBUG : wasabi/media/apps/3d/zbrush/install/ZBrush_4R4_Installer_WIN.exe: vfs cache: checking remote fingerprint "701579568,2022-11-01 17:59:46 +0000 UTC,debebb104c00cbfb3b123c2f75299367" against cached fingerprint "701579568,2022-11-01 17:59:46 +0000 UTC,debebb104c00cbfb3b123c2f75299367"
16:39:02.018031 DEBUG : wasabi/media/apps/3d/zbrush/install/ZBrush_4R4_Installer_WIN.exe: vfs cache: truncate to size=701579568
16:39:02.826896 DEBUG : wasabi/media/apps/3d/zbrush/install: Added virtual directory entry vAddFile: "ZBrush_4R4_Installer_WIN.exe"

I wonder - could it take Windows 800ms to truncate a file to the size it is already at?

Here is a binary which doesn't do the truncate if it doesn't need to.

v1.61.0-beta.6555.857ec7001.fix-vfs-pause on branch fix-vfs-pause (uploaded in 15-30 mins)
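
The idea behind the change can be sketched like this (hypothetical Python just to illustrate; the real fix is in rclone's Go VFS code): skip the truncate entirely when the file is already the requested size.

```python
import os
import tempfile

def truncate_if_needed(path, size):
    """Return True if a truncate was performed, False if it was skipped."""
    if os.path.getsize(path) == size:
        return False                 # already the right size - skip
    with open(path, "r+b") as f:
        f.truncate(size)
    return True

# Create a small test file of a known size.
fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, "wb") as f:
    f.truncate(685136)               # small stand-in size

skipped = truncate_if_needed(path, 685136)  # False: no-op avoided
resized = truncate_if_needed(path, 4096)    # True: actually truncated
os.remove(path)
```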

If that isn't any faster then I'll have to send you a binary with lots of extra debug to pin down what is happening.

That would be quintessential Windows behaviour in my opinion lol.

Using v1.61.0-beta.6555.857ec7001.fix-vfs-pause I generated this log based on this mount command:

rclone mount wasabi: Z:
-vv
--log-format time,microseconds
--vfs-cache-mode full
--cache-dir "%cd%\cache"
--use-server-modtime
--buffer-size=0
--network-mode

Thanks a lot, just tried right-clicking on a 1.8GB exe and the response was almost instant, BUT then I checked the log and found this:

11:14:52.901862 DEBUG : Some/Path/Sequel_3_Trial_Windows.exe: vfs cache: truncate to size=1834024472
11:14:53.115853 DEBUG : Some/Path/Sequel 3: Added virtual directory entry vAddFile: "Sequel_3_Trial_Windows.exe"

That is 210ms without network communication on a very fast laptop, so there may be an issue here. I am using v1.60 on a fully up-to-date Windows 11 with OneDrive as backend.

I will see if I can make it more pronounced by slowing CPU or similar, and then try reproducing with a debugger, if I can figure out how to link/debug with cmount (using Windows).

I am interested if this is something you already have, otherwise it may be quicker to find in my debugger.

You're welcome. That's a huge executable! How long does it take to show file properties?

Less than a second, my immediate reaction was "unable to reproduce" until I checked the log.

Now I have reduced the CPU to 25% and can see it clearly; the above sequence now takes 800ms, so something CPU-bound is going on.

Using VSCode? If so, some real quick pointers based on my setup, if it helps...

In launch.json (for debug):

"name": "Launch",
"type": "go",
"request": "launch",
"mode": "debug",
"program": "${workspaceFolder}",
"buildFlags": "-tags cmount",
"env": {
	"CGO_ENABLED": "0"
},

In settings.json (for intellisense etc):

{
    "go.buildTags": "cmount"
}

In tasks.json (for build):

"label": "Go build task",
"type": "shell",
"options": {
	"env": {
		"CGO_ENABLED": "0"
	}
},
"command": "go build -x -tags cmount",
"group": "build",


It's immediately clear from the log that my attempt to not truncate the file didn't work :frowning:

I just tested it locally on Linux and it does appear to work....

So either it is something to do with your system (eg Windows) or could you have run the wrong binary? Your log doesn't have the very start where rclone prints its version so I can't check.

I've been attempting to replicate this locally. I've written a Python program which simulates what Windows does to the exe (based on the above).

Can either of you try this on the exe file in the mount and report back its output?

"""
Read parts out of the executable like windows does
"""

import sys
import time

if len(sys.argv) < 2:
    print(f"Syntax: {sys.argv[0]} <file>")
    raise SystemExit(1)

file_name = sys.argv[1]

t0 = time.time()

def dt():
    return time.time() - t0

def readAt(f, size, off):
    """
    Read size bytes from offset
    """
    f.seek(off)
    bs = f.read(size)
    assert len(bs) == size
    print(f"{dt():.6f} {file_name}: read {size:6} bytes at {off:6}")

f = open(file_name, "rb")
readAt(f, 4096, 0)
readAt(f, 16384, 840192)
readAt(f, 15360, 872960)
f.close()

f = open(file_name, "rb")
readAt(f, 4096, 0)
f.close()

f = open(file_name, "rb")
readAt(f, 32768, 0)
readAt(f, 32768, 839680)
readAt(f, 32768, 880640)
readAt(f, 32768, 0)
f.close()

f = open(file_name, "rb")
readAt(f, 32768, 0)
f.close()

f = open(file_name, "rb")
readAt(f, 32768, 0)
f.close()

f = open(file_name, "rb")
readAt(f, 4096, 0)
readAt(f, 16384, 840192)
readAt(f, 15360, 872960)
readAt(f, 4096, 0)
readAt(f, 16384, 840192)
readAt(f, 15360, 872960)
f.close()

Run it like this

$ python3 test_bin.py /mnt/tmp/bin.exe 
0.001172 /mnt/tmp/bin.exe: read   4096 bytes at      0
0.001438 /mnt/tmp/bin.exe: read  16384 bytes at 840192
0.001769 /mnt/tmp/bin.exe: read  15360 bytes at 872960
0.002156 /mnt/tmp/bin.exe: read   4096 bytes at      0
0.002891 /mnt/tmp/bin.exe: read  32768 bytes at      0
0.003285 /mnt/tmp/bin.exe: read  32768 bytes at 839680
0.003428 /mnt/tmp/bin.exe: read  32768 bytes at 880640
0.003438 /mnt/tmp/bin.exe: read  32768 bytes at      0
0.003915 /mnt/tmp/bin.exe: read  32768 bytes at      0
0.004429 /mnt/tmp/bin.exe: read  32768 bytes at      0
0.004951 /mnt/tmp/bin.exe: read   4096 bytes at      0
0.005035 /mnt/tmp/bin.exe: read  16384 bytes at 840192
0.005164 /mnt/tmp/bin.exe: read  15360 bytes at 872960
0.005169 /mnt/tmp/bin.exe: read   4096 bytes at      0
0.005175 /mnt/tmp/bin.exe: read  16384 bytes at 840192
0.005181 /mnt/tmp/bin.exe: read  15360 bytes at 872960

And give it the path to the file.

The timestamps are how long the program has been running so we should see delays in there.

I haven't managed to reproduce this on Linux yet.

I made a new binary to try

v1.61.0-beta.6556.7eefa3428.fix-vfs-pause on branch fix-vfs-pause (uploaded in 15-30 mins)

This has extra debugging in it to show how long the truncate bits and pieces take. There are two operations, setting the file sparse and truncating it, and with the microsecond log we should see what is going on.

11:30:56.253333 DEBUG : 20M: _truncateToCurrentSize: need truncate - current size -1 new size 0
11:30:56.253335 DEBUG : 20M: _truncate: starting
11:30:56.253368 DEBUG : 20M: _truncate: start SetSparse
11:30:56.253371 DEBUG : 20M: _truncate: end SetSparse
11:30:56.253372 DEBUG : 20M: vfs cache: truncate to size=0
11:30:56.253374 DEBUG : 20M: _truncate: start Truncate
11:30:56.253388 DEBUG : 20M: _truncate: end Truncate

You can drop the --dump headers now - I think we've eliminated that.

Could it be disk bound? Any way of checking that?


Sorry Nick, that'll be my fault! :man_facepalming: My previous comment should now lead you to the correct log.

You were right, not truncating is indeed faster. Using that version of rclone I was able to show file properties in <1s (reduced from ~4s).

Thanks @prizi

That certainly is an improvement - maybe that is the problem fixed? I don't know!

There is still one 800ms delay now, I don't know why.

11:53:15.564783 DEBUG : wasabi/media/apps/3d/zbrush/install/ZBrush_4R4_Installer_WIN.exe: vfs cache: checking remote fingerprint "701579568,2022-11-01 17:59:46 +0000 UTC,debebb104c00cbfb3b123c2f75299367" against cached fingerprint "701579568,2022-11-01 17:59:46 +0000 UTC,debebb104c00cbfb3b123c2f75299367"
11:53:16.413351 DEBUG : wasabi/media/apps/3d/zbrush/install: Added virtual directory entry vAddFile: "ZBrush_4R4_Installer_WIN.exe"

This binary has more timing on the truncate and sparse calls - it would be very interesting to see the timings on those

Yes, big improvement! Fixed from my point of view, but my usage probably differs from @wslrj.

Using v1.61.0-beta.6556.7eefa3428.fix-vfs-pause I generated this log. The first line shows version and params, the rest is just output from a right click > properties action.

Thank you for testing @prizi

I'm still not happy that we've got to the bottom of the slowdown, as there is still one unexplained 800ms delay in your log.

Here is another version from which I've temporarily removed the fix which sped it up, and added a lot more logging. Hopefully this will pinpoint exactly what the problem is.

Do you think you could do a log for me with it?

Thank you

v1.61.0-beta.6558.9fbb52610.fix-vfs-pause on branch fix-vfs-pause (uploaded in 15-30 mins)

No prob!

Using v1.61.0-beta.6558.9fbb52610.fix-vfs-pause I generated this log. The first line shows version and params, the rest is just output from a right click > properties action.

Neat! Here's what I get:

0.579700 R:\wasabi\media\apps\3d\zbrush\install\ZBrush_4R4_Installer_WIN.exe: read   4096 bytes at      0
0.690440 R:\wasabi\media\apps\3d\zbrush\install\ZBrush_4R4_Installer_WIN.exe: read  16384 bytes at 840192
0.694472 R:\wasabi\media\apps\3d\zbrush\install\ZBrush_4R4_Installer_WIN.exe: read  15360 bytes at 872960
1.561880 R:\wasabi\media\apps\3d\zbrush\install\ZBrush_4R4_Installer_WIN.exe: read   4096 bytes at      0
2.400448 R:\wasabi\media\apps\3d\zbrush\install\ZBrush_4R4_Installer_WIN.exe: read  32768 bytes at      0
2.400448 R:\wasabi\media\apps\3d\zbrush\install\ZBrush_4R4_Installer_WIN.exe: read  32768 bytes at 839680
2.400990 R:\wasabi\media\apps\3d\zbrush\install\ZBrush_4R4_Installer_WIN.exe: read  32768 bytes at 880640
2.400990 R:\wasabi\media\apps\3d\zbrush\install\ZBrush_4R4_Installer_WIN.exe: read  32768 bytes at      0
3.231740 R:\wasabi\media\apps\3d\zbrush\install\ZBrush_4R4_Installer_WIN.exe: read  32768 bytes at      0
4.063332 R:\wasabi\media\apps\3d\zbrush\install\ZBrush_4R4_Installer_WIN.exe: read  32768 bytes at      0
4.896236 R:\wasabi\media\apps\3d\zbrush\install\ZBrush_4R4_Installer_WIN.exe: read   4096 bytes at      0
4.896236 R:\wasabi\media\apps\3d\zbrush\install\ZBrush_4R4_Installer_WIN.exe: read  16384 bytes at 840192
4.896763 R:\wasabi\media\apps\3d\zbrush\install\ZBrush_4R4_Installer_WIN.exe: read  15360 bytes at 872960
4.897322 R:\wasabi\media\apps\3d\zbrush\install\ZBrush_4R4_Installer_WIN.exe: read   4096 bytes at      0
4.897322 R:\wasabi\media\apps\3d\zbrush\install\ZBrush_4R4_Installer_WIN.exe: read  16384 bytes at 840192
4.897875 R:\wasabi\media\apps\3d\zbrush\install\ZBrush_4R4_Installer_WIN.exe: read  15360 bytes at 872960

Thanks again @prizi

@ncw A quick update on work in progress (take it with a grain of salt):

The log shows calls to _createFile: when the file is already created in the cache, and these calls are slow. My guess is that it is file.SetSparse, which may cause Windows to unsparse/resparse the file.

I also made another observation: it seems the exe file is not cleaned from the cache, even if Explorer is closed. I am using --vfs-cache-max-age=10s --vfs-cache-poll-interval=15s to clean the cache, but I always have to restart the mount to make it clean. I didn't pay attention to this before, but reading the code, I guess it is important.

Together this makes me suspect an issue with the cache item.opens counter causing this piece of code to recreate the file instead of just returning nil.
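
The suspected logic, roughly (a hypothetical Python illustration only, not rclone's actual code): the expensive create/open of the cache file should happen only on the first open, so if the opens counter is wrong it would run again for a file that is already open.

```python
class CacheItem:
    """Sketch of a cache item guarding its expensive open with a counter."""
    def __init__(self):
        self.opens = 0      # number of concurrent opens of this item
        self.created = 0    # how many times the expensive path has run

    def open(self):
        if self.opens == 0:
            self.created += 1   # expensive: open/create the cache file
        self.opens += 1

    def close(self):
        self.opens -= 1

item = CacheItem()
item.open()
item.open()                 # second open should skip the expensive path
assert item.created == 1
item.close()
item.close()
```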

VS Code debugger up and running on branch fix-vfs-pause on Windows, thanks a lot @albertony

Time to systematically prove/disprove my suspicion, but first a break...


I have gained a better understanding and located the OS function causing the long response time.

First a small log snippet with some additional timing info:

# first time opening the exe file (not present in cache) - time: 1ms
18:02:40.504120 DEBUG : Sequel3/Sequel_3_Trial_Windows.exe: _createFile: item=0xc0008d2700 osPath="\\\\?\\C:\\Users\\olefr\\AppData\\Local\\rclone\\vfs\\OneDrive\\NoBackupFolder\\rclone\\Sequel3\\Sequel_3_Trial_Windows.exe"
18:02:40.504120 DEBUG : Sequel3/Sequel_3_Trial_Windows.exe: file.OpenFile BEGIN
18:02:40.504120 DEBUG : Sequel3/Sequel_3_Trial_Windows.exe: file.OpenFile END
18:02:40.504120 DEBUG : Sequel3/Sequel_3_Trial_Windows.exe: file.SetSparse BEGIN
18:02:40.504120 DEBUG : Sequel3/Sequel_3_Trial_Windows.exe: file.SetSparse END
18:02:40.504652 DEBUG : Sequel3/Sequel_3_Trial_Windows.exe: _save: item=0xc0008d2700
18:02:40.505348 DEBUG : Sequel3/Sequel_3_Trial_Windows.exe: >_save: err=<nil>
18:02:40.505348 DEBUG : Sequel3/Sequel_3_Trial_Windows.exe: >_createFile: err=<nil>

# second time opening the exe file (now present in cache) - time: 168ms
18:02:42.393932 DEBUG : Sequel3/Sequel_3_Trial_Windows.exe: _createFile: item=0xc0008d2700 osPath="\\\\?\\C:\\Users\\olefr\\AppData\\Local\\rclone\\vfs\\OneDrive\\NoBackupFolder\\rclone\\Sequel3\\Sequel_3_Trial_Windows.exe"
18:02:42.393932 DEBUG : Sequel3/Sequel_3_Trial_Windows.exe: file.OpenFile BEGIN
18:02:42.555946 DEBUG : Sequel3/Sequel_3_Trial_Windows.exe: file.OpenFile END
18:02:42.555946 DEBUG : Sequel3/Sequel_3_Trial_Windows.exe: file.SetSparse BEGIN
18:02:42.555946 DEBUG : Sequel3/Sequel_3_Trial_Windows.exe: file.SetSparse END
18:02:42.555946 DEBUG : Sequel3/Sequel_3_Trial_Windows.exe: _save: item=0xc0008d2700
18:02:42.561497 DEBUG : Sequel3/Sequel_3_Trial_Windows.exe: >_save: err=<nil>
18:02:42.561497 DEBUG : Sequel3/Sequel_3_Trial_Windows.exe: >_createFile: err=<nil>

The above shows that the time-consuming operation is opening the sparse cache file for read/write. file.OpenFile performs an OS call that @ncw probably knows a lot better than me.

I think it could be very interesting to see whether this only happens to rclone/Go or whether it is more general Windows behaviour that also happens when e.g. Python opens the sparse file. Perhaps I can persuade @prizi to do this little test:

  1. Start the mount and run the Python script against the mount (Z:) - no need for the log, we have one already
  2. Stop the mount while the (sparse) exe is still in the cache
  3. Run the Python script and point it at the sparse file in the cache folder; you can see it at the top of your debug log, it is something like C:\Users....\AppData\Local\rclone\vfs...

Is it faster or is it the same?


Finally, I mentioned above that there might be an issue with the cache item.opens counter; that is not the case. It looks like Windows isn't (immediately) closing the file after the last read-only operation. I will do some more tests to understand this and report back if I find something that needs to be taken into account in rclone.

Here you are! I've simplified the paths to make them easier to read:

# Running test_bin.py on wasabi-mount version of exe.
0.677459 Z:\wasabi-mount\ZBrush_4R4_Installer_WIN.exe: read   4096 bytes at      0
0.789495 Z:\wasabi-mount\ZBrush_4R4_Installer_WIN.exe: read  16384 bytes at 840192
0.793837 Z:\wasabi-mount\ZBrush_4R4_Installer_WIN.exe: read  15360 bytes at 872960
1.658077 Z:\wasabi-mount\ZBrush_4R4_Installer_WIN.exe: read   4096 bytes at      0
2.529078 Z:\wasabi-mount\ZBrush_4R4_Installer_WIN.exe: read  32768 bytes at      0
2.529615 Z:\wasabi-mount\ZBrush_4R4_Installer_WIN.exe: read  32768 bytes at 839680
2.529615 Z:\wasabi-mount\ZBrush_4R4_Installer_WIN.exe: read  32768 bytes at 880640
2.530153 Z:\wasabi-mount\ZBrush_4R4_Installer_WIN.exe: read  32768 bytes at      0
3.369196 Z:\wasabi-mount\ZBrush_4R4_Installer_WIN.exe: read  32768 bytes at      0
4.207057 Z:\wasabi-mount\ZBrush_4R4_Installer_WIN.exe: read  32768 bytes at      0
5.033581 Z:\wasabi-mount\ZBrush_4R4_Installer_WIN.exe: read   4096 bytes at      0
5.034149 Z:\wasabi-mount\ZBrush_4R4_Installer_WIN.exe: read  16384 bytes at 840192
5.034687 Z:\wasabi-mount\ZBrush_4R4_Installer_WIN.exe: read  15360 bytes at 872960
5.034687 Z:\wasabi-mount\ZBrush_4R4_Installer_WIN.exe: read   4096 bytes at      0
5.034687 Z:\wasabi-mount\ZBrush_4R4_Installer_WIN.exe: read  16384 bytes at 840192
5.035250 Z:\wasabi-mount\ZBrush_4R4_Installer_WIN.exe: read  15360 bytes at 872960

# Running test_bin.py on sparse-cache version of exe (in --cache-dir).
0.000000 D:\sparse-cache\ZBrush_4R4_Installer_WIN.exe: read   4096 bytes at      0
0.000000 D:\sparse-cache\ZBrush_4R4_Installer_WIN.exe: read  16384 bytes at 840192
0.001164 D:\sparse-cache\ZBrush_4R4_Installer_WIN.exe: read  15360 bytes at 872960
0.001164 D:\sparse-cache\ZBrush_4R4_Installer_WIN.exe: read   4096 bytes at      0
0.001164 D:\sparse-cache\ZBrush_4R4_Installer_WIN.exe: read  32768 bytes at      0
0.001164 D:\sparse-cache\ZBrush_4R4_Installer_WIN.exe: read  32768 bytes at 839680
0.001164 D:\sparse-cache\ZBrush_4R4_Installer_WIN.exe: read  32768 bytes at 880640
0.002177 D:\sparse-cache\ZBrush_4R4_Installer_WIN.exe: read  32768 bytes at      0
0.002177 D:\sparse-cache\ZBrush_4R4_Installer_WIN.exe: read  32768 bytes at      0
0.002177 D:\sparse-cache\ZBrush_4R4_Installer_WIN.exe: read  32768 bytes at      0
0.002177 D:\sparse-cache\ZBrush_4R4_Installer_WIN.exe: read   4096 bytes at      0
0.003179 D:\sparse-cache\ZBrush_4R4_Installer_WIN.exe: read  16384 bytes at 840192
0.003179 D:\sparse-cache\ZBrush_4R4_Installer_WIN.exe: read  15360 bytes at 872960
0.003179 D:\sparse-cache\ZBrush_4R4_Installer_WIN.exe: read   4096 bytes at      0
0.003179 D:\sparse-cache\ZBrush_4R4_Installer_WIN.exe: read  16384 bytes at 840192
0.003179 D:\sparse-cache\ZBrush_4R4_Installer_WIN.exe: read  15360 bytes at 872960

Thank you. That is very useful.

The Python log indicates that the OS isn't doing anything special here, as the Python script replicates the problem perfectly.

It doesn't replicate on Linux though, indicating that this is a Windows-only problem.

The call _createFile is somewhat misnamed. What it does is create the file handle, i.e. open the file. What SetSparse does is mark the handle so that the OS creates sparse files. I think it just sets a bit on the file handle, so it should be very quick. I don't think it sparses the file.

It does look like it is something in the _createFile function that is causing the slowdown.

Looking at @prizi's logs, the candidate must be one of these calls. We have SetSparse measured in the log and it takes < 1µs, so it must be the OpenFile.

	fd, err := file.OpenFile(osPath, os.O_RDWR, 0600)
	err = file.SetSparse(fd)

Digging into what file.OpenFile does... It calls Windows CreateFile with flags

access = GENERIC_READ | GENERIC_WRITE
sharemode = FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE
createmode = OPEN_EXISTING

The only unusual flags here are the sharemode flags. These are set so rclone can rename and delete open files, to try to make Windows file handles a bit more compatible with Unix ones. In particular FILE_SHARE_DELETE is one rclone adds which the Go runtime doesn't normally include.

I managed to replicate this on my test VM, and I can see it is the Open call in _createFile which is taking all the time, about 500ms.

11:50:46.144759 DEBUG : rclone.exe: _createFile Open: starting
11:50:46.637412 DEBUG : rclone.exe: createFile Open: ending
11:50:46.754571 DEBUG : rclone.exe: _createFile Open: starting
11:50:47.251635 DEBUG : rclone.exe: createFile Open: ending
11:50:58.133531 DEBUG : rclone.exe: _createFile Open: starting
11:50:58.661945 DEBUG : rclone.exe: createFile Open: ending
11:50:59.331610 DEBUG : rclone.exe: _createFile Open: starting
11:50:59.820970 DEBUG : rclone.exe: createFile Open: ending
11:51:01.769478 DEBUG : rclone.exe: _createFile Open: starting
11:51:02.228604 DEBUG : rclone.exe: createFile Open: ending
11:51:02.545052 DEBUG : rclone.exe: _createFile Open: starting
11:51:02.841924 DEBUG : rclone.exe: createFile Open: ending
11:51:02.863319 DEBUG : rclone.exe: _createFile Open: starting
11:51:03.360622 DEBUG : rclone.exe: createFile Open: ending
11:51:03.434117 DEBUG : rclone.exe: _createFile Open: starting
11:51:03.921204 DEBUG : rclone.exe: createFile Open: ending
11:51:03.975281 DEBUG : rclone.exe: _createFile Open: starting
11:51:04.472218 DEBUG : rclone.exe: createFile Open: ending

I managed to replicate this on my test VM. Here is the code with more debug still which times the Open call

v1.61.0-beta.6559.f21568564.fix-vfs-pause on branch fix-vfs-pause (uploaded in 15-30 mins)

I've tried changing the parameters to CreateFile (removing the sharemode flags), reverting back to the Go standard create file, and a few other things, and I haven't managed to speed things up yet!

Will continue to experiment but interested in any ideas anyone might have!

Yes, the closes of the file are somewhat asynchronous to the application calling close(). This causes a lot of extra work for the VFS, but it's how Linux works too.


I've tried to make the equivalent of test_bin.py in Go.

This uses exactly the same calls as rclone and I'd hope would replicate the behaviour when used on the file in the cache.

It doesn't though and I don't know why :frowning:

Here is test_bin.zip - it contains the source code and Linux and Windows binaries (test_bin.exe). It should work just like the test_bin.py program.