Make a Mounted Google Drive Search Faster in Windows Explorer

What is the problem you are having with rclone?

When I mount my Google Drive, it takes about an hour to do a file name search on the drive in Windows Explorer. I just want to do a word search on video and picture file names, not document content. How can I make these searches in Windows Explorer as fast as going to Google Drive and typing in the search box? I use Windows 10 64-bit.

What is your rclone version (output from rclone version)

v1.56.0

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone mount --vfs-cache-mode off --cache-dir local:/temp/ remote:/ local:/mount

The rclone config contents with secrets removed.

Paste config here

A log from the command with the -vv flag

Paste log here

hello and welcome to the forum,

rclone maintains a cache for the directory and file names.
normally, rclone builds that on the fly as you navigate into a folder.

let's say that i start my rclone mount and need emby to re-scan the entire thing to look for new media files.
that can take a very long time, as rclone has to navigate the entire dir/file structure of the remote.
so before i have emby do that scan, i have rclone pre-cache the file details.
now, when emby scans for new media, all that needed info is local on the machine and the scan goes very fast.

  1. add --rc to the mount command
  2. after the mount is running, do rclone rc vfs/refresh recursive=true

about --cache-dir local:/temp/: unless there is a specific reason to use a remote, just use the local path.
for example, c:\temp
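put together, the two steps above might look like this minimal batch sketch (remote: and c:\mount are examples, adjust to your own remote name and mount point):

```bat
@echo off
rem step 1: start the mount with the remote control server enabled.
rem "start" runs it in its own window so this script can continue.
start rclone mount --rc remote: c:\mount
rem give the mount a moment to come up before calling it
timeout /t 20
rem step 2: pre-cache the entire dir/file structure
rclone rc vfs/refresh recursive=true
```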

The second step is to input this line, right?

rclone rc vfs/refresh recursive=true C:/temp cache/

By the way, does this "rc" command's temporary folder need to remain there after rclone closes in order to have that data again at the next rclone startup? The reason I am asking is that my temp drive is a RAM drive that is wiped with every computer restart.

or just this line

rclone rc vfs/refresh recursive=true

run that command as given.

the dir cache info only exists while the rclone mount is running.
ram drive or not, each time you rclone mount, you need to rclone rc vfs/refresh

however, the files that have been downloaded do survive a reboot.
that does not apply if using a ram disk.

Ok I will give it a try. Will mark your reply as solution if it works. Thanks.

Maybe I am doing this wrong, but: I applied my first line in a cmd window. That cmd window could then no longer accept new commands, so I opened another command window pointing to my rclone and applied the second line with the "rc" command.

If this was indeed the correct way to apply the second command, then it didn't work. My searches were just as slow as before.

not sure what that means?
please post a debug log

https://forum.rclone.org/t/search-a-file-on-windows-explorer/26169

second command, then it didn't work

I meant the "rc" command line. The first line is just the mount command line.

C:\Users\A\Desktop\rclone>rclone rc vfs/refresh recursive=true -vv
2021/09/05 15:04:42 DEBUG : rclone: Version "v1.56.0" starting with parameters ["rclone" "rc" "vfs/refresh" "recursive=true" "-vv"]
2021/09/05 15:09:43 DEBUG : 2 go routines active
2021/09/05 15:09:43 Failed to rc: connection failed: Post "http://localhost:5572/vfs/refresh": net/http: timeout awaiting response headers
C:\Users\A\Desktop\rclone>PAUSE
Press any key to continue . . .

I checked the option "Don't use the index when searching in file folder for system files (searches might take longer)"

That didn't improve search speed.

the rclone rc command failed.

post the mount debug log. if it is very large, then post the top 20 lines.

as long as a command is running, the cmd window will not accept new commands.
to work around that, prefix the command with start
start rclone mount ....

might want to add --no-console to the rclone mount command.

Ok, before getting the debug log: I run a batch file for my commands. Maybe you can help me correct anything I am doing wrong.

@ECHO OFF
cd C:\Users\A\Desktop\rclone
start rclone mount --rc --vfs-cache-mode off --no-console --cache-dir D:/temp remote:/ C:/mount -vv
rclone rc vfs/refresh recursive=true -vv

  1. since you are not using a log file, i would not use --no-console.
  2. start forks the rclone mount and returns immediately before the rclone mount is active.
    so there is a chance that rclone rc will try to access a mount that is not ready and error out.
    i would add timeout /t 20 between the rclone mount and the rclone rc
  3. would be better to use cd /d, not cd

@ECHO OFF
cd /d C:\Users\A\Desktop\rclone
start rclone mount --rc --vfs-cache-mode off --no-console --log-file=mylogfile.txt --cache-dir D:/temp remote: C:/mount -vv
timeout /t 20
rclone rc vfs/refresh recursive=true -vv

Log File

2021/09/06 19:24:41 DEBUG : rclone: Version "v1.56.0" starting with parameters ["rclone" "mount" "--rc" "--vfs-cache-mode" "off" "--no-console" "--log-file=mylogfile.txt" "--cache-dir" "D:/temp" "remote:" "C:/mount" "-vv"]
2021/09/06 19:24:41 NOTICE: Serving remote control on http://localhost:5572/
2021/09/06 19:24:41 DEBUG : Creating backend with remote "remote:"
2021/09/06 19:24:41 DEBUG : Using config file from "C:\Users\A\.config\rclone\rclone.conf"
2021/09/06 19:24:42 DEBUG : Network mode mounting is disabled
2021/09/06 19:24:42 DEBUG : Mounting on "C:\mount" ("remote")
2021/09/06 19:24:42 DEBUG : Google drive root '': Mounting with options: ["-o" "attr_timeout=1" "-o" "uid=-1" "-o" "gid=-1" "--FileSystemName=rclone" "-o" "volname=remote"]
2021/09/06 19:24:42 DEBUG : Google drive root '': Init:
2021/09/06 19:24:42 DEBUG : Google drive root '': >Init:
2021/09/06 19:24:42 DEBUG : /: Statfs:
2021/09/06 19:24:42 DEBUG : Google drive root '': read info from Shared Drive "remote drive [1]"
2021/09/06 19:24:42 DEBUG : /: >Statfs: stat={Bsize:4096 Frsize:4096 Blocks:274877906944 Bfree:274877906944 Bavail:274877906944 Files:1000000000 Ffree:1000000000 Favail:0 Fsid:0 Flag:0 Namemax:255}, errc=0
2021/09/06 19:24:42 DEBUG : /: Getattr: fh=0xFFFFFFFFFFFFFFFF
2021/09/06 19:24:42 DEBUG : /: >Getattr: errc=0
2021/09/06 19:24:42 DEBUG : /: Readlink:
2021/09/06 19:24:42 DEBUG : /: >Readlink: linkPath="", errc=-40
The service rclone has been started.
2021/09/06 19:24:42 DEBUG : /: Statfs:
2021/09/06 19:24:42 DEBUG : /: >Statfs: stat={Bsize:4096 Frsize:4096 Blocks:274877906944 Bfree:274877906944 Bavail:274877906944 Files:1000000000 Ffree:1000000000 Favail:0 Fsid:0 Flag:0 Namemax:255}, errc=0
2021/09/06 19:24:42 DEBUG : /: Getattr: fh=0xFFFFFFFFFFFFFFFF
2021/09/06 19:24:42 DEBUG : /: >Getattr: errc=0
2021/09/06 19:24:42 DEBUG : /: Getattr: fh=0xFFFFFFFFFFFFFFFF
2021/09/06 19:24:42 DEBUG : /: >Getattr: errc=0
2021/09/06 19:24:42 DEBUG : /: Opendir:
2021/09/06 19:24:42 DEBUG : /: OpenFile: flags=O_RDONLY, perm=-rwxrwxrwx
2021/09/06 19:24:42 DEBUG : /: >OpenFile: fd=/ (r), err=
2021/09/06 19:24:42 DEBUG : /: >Opendir: errc=0, fh=0x0
2021/09/06 19:24:42 DEBUG : /: Releasedir: fh=0x0
2021/09/06 19:24:42 DEBUG : /: >Releasedir: errc=0
2021/09/06 19:25:00 DEBUG : rc: "vfs/refresh": with parameters map[recursive:true]
2021/09/06 19:25:00 DEBUG : : Reading directory tree
2021/09/06 19:25:09 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded)
2021/09/06 19:25:09 DEBUG : pacer: Rate limited, increasing sleep to 1.004651028s
2021/09/06 19:25:09 DEBUG : pacer: Reducing sleep to 0s
2021/09/06 19:25:09 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded)
2021/09/06 19:25:09 DEBUG : pacer: Rate limited, increasing sleep to 1.510359739s
2021/09/06 19:25:09 DEBUG : pacer: Reducing sleep to 0s
2021/09/06 19:25:09 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded)
2021/09/06 19:25:09 DEBUG : pacer: Rate limited, increasing sleep to 1.385573975s
2021/09/06 19:25:10 DEBUG : pacer: Reducing sleep to 0s
2021/09/06 19:25:42 DEBUG : Google drive root '': Checking for changes on remote

looks like rclone is getting throttled by gdrive.
that could be the reason searching with windows explorer is so slow.

have you done this? if not, you should do so and test again.
https://rclone.org/drive/#making-your-own-client-id
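once you create your own client id and secret, they go into the drive section of your rclone.conf, something like this sketch (all values here are placeholders, and the remote name "remote" is just an example):

```
[remote]
type = drive
client_id = your-client-id.apps.googleusercontent.com
client_secret = your-client-secret
scope = drive
token = {"access_token":"..."}
```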

You have to use the search function via https://drive.google.com/ unless there are just a few files.

Windows Explorer has to crawl each file, which takes forever when you have many files, because google limits it heavily to prevent abuse.

The throttling here seems to be happening while the refresh command is running. We need to know if it ever completes successfully. OP, you should see a status of "OK" at the end. Searching should not be slow at all once priming is done. I regularly search through tens of thousands of files and folders.

good point. the truth is, i made a choice to pretend not to notice,
that i would miss something we know could not happen

so it must be that, i set a trap for you, yes, yes, now that i am thinking about it....
from a lurking vigilante in the background to a poster in public :wink:

i know of your experience with vfs/refresh and gdrive. i deflected, made mention of the need for a client id, without mentioning that the pacer issues began after that vfs/refresh

it all makes perfect sense to me!

now, back to reality.

in your rclone vfs/refresh commands, there are no flags to deal with pacer issues, correct?
given that the OP has not posted the config, do you think having a client id would resolve that or should the OP add some flags?

It shouldn't be necessary to run with additional flags, but I do have a few:

rclone rc vfs/refresh recursive=true --drive-pacer-burst 200 --drive-pacer-min-sleep 10ms --timeout 30m --user-agent *******

I never have any issues running a refresh, but I also never look at any logs.

The above runs through roughly 820 TB in four minutes.

Well, everything in my rclone config is at default.

The google drive I am connecting to is a shared drive, and it usually runs into bandwidth limits. From reading this thread, I guess I can do nothing about the slow search when google is throttling my connection. It is kind of stupid that I cannot use Explorer to find files simply by file name.

that is why i do not use gdrive.

you can try the vfs/refresh command that was shared by @VBB