Rclone mount folder disappears?

What is the problem you are having with rclone?

I started with rclone just yesterday to mount Google Drive on my Mac. I first mounted the drive without sudo, but there were many errors while copy-pasting files, and I could not get it to work unless I used sudo, which I assume gives me the permissions to read and write.

I generally unmounted the drive through the eject button in the Finder sidebar.
My mount command was:
sudo rclone mount --daemon --allow-other --vfs-read-chunk-size 32M --vfs-cache-max-size 10GiB --poll-interval 15s --vfs-cache-mode full --cache-dir /Users/ishaanrathod/rclone/cache --disable About GDrive: /Users/ishaanrathod/GDrive

I tried many values before settling on the command above (e.g. --vfs-read-chunk-size 256M, etc.). Today, while I was browsing the mounted drive, folders started disappearing, and finally the whole folder /Users/ishaanrathod/GDrive disappeared completely.

After a restart the folder appeared again. I mounted it again and it worked great for a while, but then it disappeared once more. I created a new empty folder at /Users/ishaanrathod/rclone/GDrive and mounted that with the same command, because I was not able to create a new empty folder with the same name under /Users/ishaanrathod/: it said a folder with that name already existed, yet I can't visually locate it anywhere on the system. Currently I am working with the new folder under /Users/ishaanrathod/rclone/, but I don't know if this one will disappear too.

Please let me know what I'm doing wrong here; I'm a complete noob at this.

Run the command 'rclone version' and share the full output of the command.

rclone v1.65.2

  • os/version: darwin 14.3 (64 bit)
  • os/kernel: 23.3.0 (arm64)
  • os/type: darwin
  • os/arch: arm64 (ARMv8 compatible)
  • go/version: go1.21.6
  • go/linking: dynamic
  • go/tags: cmount

Are you on the latest version of rclone? You can validate by checking the version listed here: Rclone downloads
-->YES

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

sudo rclone mount --daemon --allow-other --vfs-read-chunk-size 32M --vfs-cache-max-size 10GiB --poll-interval 15s --vfs-cache-mode full --cache-dir /Users/ishaanrathod/rclone/cache --disable About GDrive: /Users/ishaanrathod/GDrive

The rclone config contents with secrets removed.

[GDrive]
type = drive
scope = drive
token = {"access_token":"XXX","token_type":"Bearer","refresh_token":"XXX","expiry":"2024-02-07T02:42:29.387606+05:30"}
team_drive = 

A log from the command with the -vv flag

2024/02/07 22:03:32 DEBUG : rclone: Version "v1.65.2" starting with parameters ["rclone" "mount" "--daemon" "--allow-other" "--vfs-read-chunk-size" "32M" "--vfs-cache-max-size" "10GiB" "--poll-interval" "15s" "--vfs-cache-mode" "full" "--cache-dir" "/Users/ishaanrathod/rclone/cache" "--disable" "About" "GDrive:" "/Users/ishaanrathod/GDriv" "-vv"]
2024/02/07 22:03:32 DEBUG : Creating backend with remote "GDrive:"
2024/02/07 22:03:32 DEBUG : Using config file from "/Users/ishaanrathod/.config/rclone/rclone.conf"
2024/02/07 22:03:32 DEBUG : Reset feature "About"
2024/02/07 22:03:33 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: Quota exceeded for quota metric 'Queries' and limit 'Queries per minute' of service 'drive.googleapis.com' for consumer 'project_number:202264815644'.
Details:
[
  {
    "@type": "type.googleapis.com/google.rpc.ErrorInfo",
    "domain": "googleapis.com",
    "metadata": {
      "consumer": "projects/202264815644",
      "quota_limit": "defaultPerMinutePerProject",
      "quota_limit_value": "420000",
      "quota_location": "global",
      "quota_metric": "drive.googleapis.com/default",
      "service": "drive.googleapis.com"
    },
    "reason": "RATE_LIMIT_EXCEEDED"
  },
  {
    "@type": "type.googleapis.com/google.rpc.Help",
    "links": [
      {
        "description": "Request a higher quota limit.",
        "url": "https://cloud.google.com/docs/quota#requesting_higher_quota"
      }
    ]
  }
]
, rateLimitExceeded)
2024/02/07 22:03:33 DEBUG : pacer: Rate limited, increasing sleep to 1.182693159s
2024/02/07 22:03:33 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: Quota exceeded for quota metric 'Queries' and limit 'Queries per minute' of service 'drive.googleapis.com' for consumer 'project_number:202264815644'.
Details:
[
  {
    "@type": "type.googleapis.com/google.rpc.ErrorInfo",
    "domain": "googleapis.com",
    "metadata": {
      "consumer": "projects/202264815644",
      "quota_limit": "defaultPerMinutePerProject",
      "quota_limit_value": "420000",
      "quota_location": "global",
      "quota_metric": "drive.googleapis.com/default",
      "service": "drive.googleapis.com"
    },
    "reason": "RATE_LIMIT_EXCEEDED"
  },
  {
    "@type": "type.googleapis.com/google.rpc.Help",
    "links": [
      {
        "description": "Request a higher quota limit.",
        "url": "https://cloud.google.com/docs/quota#requesting_higher_quota"
      }
    ]
  }
]
, rateLimitExceeded)
2024/02/07 22:03:33 DEBUG : pacer: Rate limited, increasing sleep to 2.285942813s
2024/02/07 22:03:34 DEBUG : pacer: Reducing sleep to 0s
2024/02/07 22:03:34 DEBUG : Google drive root '': 'root_folder_id = 0ACz4OyPlCn5BUk9PVA' - save this in the config to speed up startup
2024/02/07 22:03:39 DEBUG : rclone: Version "v1.65.2" finishing with parameters ["/usr/local/bin/rclone" "mount" "--daemon" "--allow-other" "--vfs-read-chunk-size" "32M" "--vfs-cache-max-size" "10GiB" "--poll-interval" "15s" "--vfs-cache-mode" "full" "--cache-dir" "/Users/ishaanrathod/rclone/cache" "--disable" "About" "GDrive:" "/Users/ishaanrathod/GDriv" "-vv"]

There is no need to use sudo. I am on macOS and have never had to resort to it.
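As a sketch, the same mount run as your normal user would look like the command below (paths copied from the command earlier in the thread; --allow-other is dropped, since it is only needed when other users must see the mount and is usually what pushes people towards sudo):

```shell
rclone mount --daemon \
  --vfs-cache-mode full \
  --vfs-cache-max-size 10GiB \
  --vfs-read-chunk-size 32M \
  --poll-interval 15s \
  --cache-dir /Users/ishaanrathod/rclone/cache \
  --disable About \
  GDrive: /Users/ishaanrathod/GDrive
```

Run without sudo, everything rclone creates (cache, config, mount point) stays owned by your user, which avoids the permission problems described below.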

Are you using macFUSE or FUSE-T? Either way, make sure you have the latest version.

I recommend FUSE-T, as the macFUSE kext is simply the wrong solution with the latest macOS. It is a kernel extension, which can introduce all sorts of problems. Using it with sudo especially is asking for trouble.

How do I know if I'm using macFUSE or FUSE-T? I think it's macFUSE, AFAIK, but I need to be sure.

Also, is copy-pasting reliable through a FUSE-T mount? I do not want to use the CLI or Rclone Browser; that's just too time-consuming.

It is your computer... so you should know what you installed :)

Here are links to both:

https://osxfuse.github.io

You will also find info there on how to uninstall.

Yes, it all works, no problem. I use it daily.
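If you want to check from the terminal, one rough way is to look for each package's default install locations. These paths are assumptions based on the standard installers, not something rclone documents, so adjust if you installed either package somewhere unusual:

```shell
#!/bin/sh
# Look for macFUSE's filesystem bundle and FUSE-T's library in their
# default install locations and report which one is present.
detect_fuse() {
  if [ -d /Library/Filesystems/macfuse.fs ]; then
    echo "macFUSE"
  elif [ -e /usr/local/lib/libfuse-t.dylib ]; then
    echo "FUSE-T"
  else
    echo "neither"
  fi
}
detect_fuse
```

You can also try `pkgutil --pkgs | grep -i fuse`, which lists installer package IDs mentioning FUSE.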


Alright, I will report back. Thank you so much!

When trying without sudo, you should first delete /Users/ishaanrathod/rclone/cache and /Users/ishaanrathod/GDrive. Most likely they now have the wrong permissions. Or just set their ownership and permissions back to your usual user (with the -R option).

I installed FUSE-T with its package installer and applied full security by removing kernel extension access, then restarted my system. I hope that's right.

I'm sorry, I do not understand the -R option, but I did delete the /Users/ishaanrathod/GDrive folder and the cache folders created inside /Users/ishaanrathod/rclone/cache/, though not the cache folder I created manually. Is that right?

UPDATE: I tried running without sudo and got this error:
2024/02/07 22:49:04 Failed to load config file "/Users/ishaanrathod/.config/rclone/rclone.conf": open /Users/ishaanrathod/.config/rclone/rclone.conf: permission denied

This is good. You should never lower security unless you have a very special reason, and definitely not just for running rclone mount.

You have to change the ownership/permissions of all the files and directories, so chown and chmod have to be run recursively. Deleting everything will spare you the trouble of learning how to do it.

Since you used sudo before, the files now have the wrong ownership/permissions. Change them, or delete them and create them again.
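For example, something along these lines should hand everything back to your user. The paths are taken from the commands earlier in the thread (including the config file from the "permission denied" error above); adjust them if yours differ:

```shell
# Take back ownership of the rclone config, cache and mount point that
# running rclone under sudo may have left owned by root.
sudo chown -R "$USER" \
  /Users/ishaanrathod/.config/rclone \
  /Users/ishaanrathod/rclone/cache \
  /Users/ishaanrathod/GDrive

# Restore read/write access for your user; the capital X grants execute
# (traverse) permission on directories without marking plain files executable.
chmod -R u+rwX \
  /Users/ishaanrathod/.config/rclone \
  /Users/ishaanrathod/rclone/cache \
  /Users/ishaanrathod/GDrive
```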

Now, your issues with throttling:

To avoid it with gdrive you need your own client_id/secret. Follow the documented steps and set it up.

Afterwards, the easiest approach is probably to delete the remote and create it again, this time providing your freshly minted parameters.

You might also have to add some transaction- or transfer-limiting flags. I am not familiar with gdrive, but you will find a lot of threads on this subject on this forum. As a bare minimum you definitely need your own client_id.
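For reference, once you have created your own OAuth credentials in the Google Cloud console, the remote's section in rclone.conf ends up looking roughly like this (the client_id/client_secret values below are made-up placeholders, not real credentials):

```
[GDrive]
type = drive
client_id = 123456789-example.apps.googleusercontent.com
client_secret = YOUR_CLIENT_SECRET
scope = drive
token = {"access_token":"XXX","token_type":"Bearer","refresh_token":"XXX","expiry":"..."}
```

You get there either by deleting and recreating the remote with `rclone config` and entering the new client_id/secret when prompted, or by editing the file and re-authorizing.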

I do not exactly understand what the error in the log means. Can you explain why I would need a client ID/secret and what problem it would solve?

It is all explained in the documentation. Just read it, and if something is not clear we can clarify.

While creating the client ID, it asks me to add scopes in the "OAuth Consent Screen", like .../auth/docs, .../auth/drive, and .../auth/drive.metadata.readonly. Where do I find these scopes? They are not listed in the available scopes.

It is all there... but fair point that maybe the docs are not perfect. If you think you can clarify them, please submit a PR. It is very valuable when others see things a bit differently :)

Here is my take on this, simplified but maybe clearer:

The links do not open: "You are receiving this error either because your input OAuth2 scope name is invalid or it refers to a newer scope that is outside the domain of this legacy API."

Which link? Sorry, no idea what you are referring to.

I'm sorry, I did not understand your previous comment. Can you please tell me which scopes I should select from below?

As per the documentation... I am not going to read it for you :)

Search (filter) for these strings:

https://www.googleapis.com/auth/docs
https://www.googleapis.com/auth/drive
https://www.googleapis.com/auth/drive.metadata.readonly

As I remember, that was easier than going through those looong lists.

The Google dev console is very ugly and not easy to use. I think the same :)

Exactly what I'm saying: when I filter with these strings, nothing matches. Should I just paste these strings in and add them manually to the table?