Ok, I figured out the slow scan issues I was having. I had the Plex media mounted over SMB. I moved it to the local machine and scans now finish almost instantly. Really happy about that, because I had thought it was just a con of using a cloud storage medium.
In the process of switching away from SMB, I created a new instance of rclone for the Plex machine. Since I now had two instances of rclone with the same configuration, I thought I would test the copying again.
Unfortunately, I had the same slow speeds copying with rsync locally. I did notice something strange, though: copying with rsync from one machine gave slow speeds, but playing back media on the other machine through a different rclone mount caused the first machine's copy to speed up.
So there's some sort of interaction that is going on.
I'll continue to do more testing, but something makes me feel that changing my IP will result in the issue going away…
Do you ever get duplicate files when using rclone copy?
I have been having this issue occasionally and have been using rclone dedupe to correct it. A quick search shows that it's a Google Drive issue, but I seem to get them pretty frequently.
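For reference, the cleanup I've been running is roughly this (remote name is mine, and the mode is just the one I picked, not a recommendation):

rclone dedupe --dedupe-mode newest GD: -v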
Also I have another weird issue. I get these messages every so often:
NOTICE: Encrypted drive 'direct-decrypt:': ChangeNotify was unable to decrypt "413167835.LZ": illegal base32 data at input byte 9
That particular file is not within the scope of direct-decrypt: at all.
I have it set up now so that GD: has a root_folder_id pointing to a folder called "Sync", and direct-decrypt: decrypts the files in the "Sync" directory. That particular file, "413167835.LZ", is outside the "Sync" directory, in a completely different folder.
[GD]
type = drive
client_id =
client_secret =
service_account_file =
token =
root_folder_id = ["Sync Folder"]
[direct-decrypt]
type = crypt
remote = GD:
filename_encryption = standard
password =
password2 =
I've also used this config and had the same problems.
[GD]
type = drive
client_id =
client_secret =
service_account_file =
token =
[direct-decrypt]
type = crypt
remote = GD:Sync
filename_encryption = standard
password =
password2 =
It's really strange because the error message only comes up occasionally, right after this:
2019/04/14 22:20:34 DEBUG : Google drive root '': Checking for changes on remote
Is there a way to get rclone to show the full directory path when it gives errors for those files? I know they are in my drive somewhere but not inside the rclone folder.
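In the meantime, a sketch of how I could hunt for the file manually, assuming a second remote (hypothetical name GD-root) pointing at the drive root with no root_folder_id set:

rclone lsf -R --include "413167835.LZ" GD-root:

That should print the full relative path of any match, which would at least confirm where it actually lives.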
Does anything seem off with that command? I'm honestly not sure whether buffer-size or checkers make a difference with copy, and I can remove them if they don't affect performance.
I am not sure why you are setting any buffer size. Google limits you to roughly 10 transactions per second, so setting 16 checkers is going to 403 a lot, given that the default transfers is 4. I'd run with 4 and 4, or something along those lines.
drive-chunk-size is useful on copy commands; 32M or 64M is generally the sweet spot for Google.
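For example, something like this (paths are placeholders, not my actual command):

rclone copy /local/media GD: --transfers 4 --checkers 4 --drive-chunk-size 64M -v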
The command I use is pretty straightforward, with an exclude:
and got nothing. There were no errors except for some 403 rate errors. Nothing even resembling the error I mentioned earlier.
So do you mean I should just leave checkers at the default 4? Why did you adjust your checkers and transfers to 3? How did you know whether 32M or 64M is better for the chunk size?
That is my rclone.conf file exactly as is. There was no "directory_name_encryption" entry, yet the directory names were being encrypted anyway. That's what I was trying to say. I think I also just answered my own question: it appears that now, when editing rclone.conf, rclone always adds a directory_name_encryption line and sets it to either true or false. Mine didn't have anything, which might be because a previous version of rclone didn't require it.
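For reference, once rclone writes it out, the crypt section looks like this (true is the default when the key is absent, which would explain why the names were being encrypted anyway):

[direct-decrypt]
type = crypt
remote = GD:Sync
filename_encryption = standard
directory_name_encryption = true
password =
password2 =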
The error with the files isn't gone. My mount logs still show it occasionally; it's not consistent. I believe it is a bug, because those files aren't even in the path of the rclone crypt mount.
With uploading, what should I be looking for to know what works better?
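The kind of comparison I have in mind is just re-running the same upload with different chunk sizes and watching the transfer stats, something like (paths are placeholders):

rclone copy /local/media GD: --drive-chunk-size 32M -P
rclone copy /local/media GD: --drive-chunk-size 64M -P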
I've had time to do more testing on this issue. I connected a laptop directly to my Comcast modem and received a public IP on the device. No router or anything in between.
I then attempted to download files from:
Google Drive Web Interface
I had similar results to before. Speeds were at a steady 2MB/s.
Rclone Rsync copy in Ubuntu VM
Same results as web interface, slightly slower. Speeds started at 10MB/s then dropped to 1.5MB/s.
Speedtest.net and DSLreports
Received results of ~950 Mbps down and ~40 Mbps up.
I had been searching around and saw this issue, but I'm unsure how to use it.
Ok, I have been doing some testing with Wireshark to determine the IP that Chrome downloads were coming from, to see what I could add to the hosts file. After adding several entries to the hosts file, I was still unable to get it to use the IP I entered (I keep seeing different IPs in Wireshark).
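The entries were along these lines, pinning www.googleapis.com (the host rclone's Drive backend talks to) to a fixed address; the IP below is just a documentation-range placeholder, not a real one I used:

203.0.113.10 www.googleapis.com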
Anyway, I noticed that my downloads from Chrome have returned to full speed again, whereas they were slow yesterday. Copy speeds from the rclone mount have also gone back to full speed. I made no configuration changes, so I have no clue what happened or whether the issue will reappear.