It seems my transfers have slowed to 0 bytes/s and I can't see any reason why. What is going on, and why is this happening? Should I limit my number of transfers or rate-limit myself? I see many people transferring at “200 MB/s” and doing 1 TB or so, or 450 GB in 24 hours. Why can't I do that?? All I want to do is transfer 4 TB to this Google Education account (unlimited storage), and I can't get very far without running into these damn issues…
I'm not familiar with Plex or anything like that; I only know of rclone's sync and copy commands. Should I try something else?
Mostly what I'm doing is uploading media (MKV) files and using Google Drive as a backup archive. What should I look into for transferring these files from a CLI (preferred), or maybe a GUI if necessary?
Well, for one, you don't want to use --transfers 8 with Google Drive, because Google Drive is heavily API-request limited; fewer concurrent transfers will work better. You might also want to try --drive-chunk-size 128M.
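Putting those two suggestions together, a minimal sketch (the remote name gdrive: and the paths here are placeholders, not from the original post):

```shell
# Fewer concurrent transfers plus larger upload chunks tends to suit
# Google Drive's API limits better than many parallel streams.
# "gdrive:" and the paths are placeholders for your own remote/paths.
rclone copy /path/to/media gdrive:backup \
  --transfers 2 --drive-chunk-size 128M -v
```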
Transferring 30 GB in 45 minutes is a perfectly good pace (the absolute daily limit is 750 GB). The problem is that you ended up with zero useful data in the end because of your command flags…
Also, without -v or -vv there's not enough information to know whether Google is throttling you or not.
You don't actually mention where you are transferring from.
Are you uploading from home, a seedbox, a dedicated server, etc.?
What is the upload speed of whichever of those you use? Personally I always found the default of 4 simultaneous transfers optimal; absolutely nothing will be gained by using --transfers 8, IMHO.
It’s the GD upload quota limit that Google implemented last year.
Basically you can only upload about 750 GB a day in total. Beyond that, the transfer rate drops to 0 until midnight (in the timezone of the Google server you are using).
Some people set --bwlimit 8000 so that they don't transfer more than 750 GB a day; otherwise, just let it hit the quota, because the next day the transfer will still continue. People also set --tpslimit 9.
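To see why 8000 works as a ceiling: rclone's --bwlimit takes KiB/s by default, so 8000 KiB/s sustained for a full day comes in just under the quota. A quick check:

```shell
# 8000 KiB/s * 86400 s/day, expressed in (decimal) GB/day
echo "$(( 8000 * 1024 * 86400 / 1000000000 )) GB/day"
# prints "707 GB/day" -- just under the 750 GB daily quota
```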
Google the phrase “Google API 403 quota limit” or something like that and you will find tons of discussion about it.
I'm guessing you're running this as a one-time sync and will then only sync the changes?
Have a look at the issues I had in the past:
Now, I can tell you that in my case it helped a lot to have two different setups in rclone: I use my own API key to sync all of my local (home) data, and another one to replicate an “offsite data” folder I keep only on Google Drive. That way, if somebody deletes something in the offsite folder by mistake, I can grab it back thanks to the backup I keep on Google Drive itself.
Once the full backup was finally completed, the syncs no longer consumed the whole 500/750 GB daily limit, so everything worked OK.
Now, as for the limits: I was told 500 GB in my thread, but now it seems it's 750 GB. One issue I've been having recently is that I moved off Plex Cloud (it was just s*it and kept getting stuck), so I use Google's Drive File Stream and linked a few folders to my own Plex server. That generated so much traffic that every day I'd get banned until midnight. So I'm hoping Plex will finish what it's doing soon.
I don't want to go too far off topic, but I started wondering: I know how much data rclone uses, but I don't know how much data Plex downloads for its metadata updates. I've had about four chats with Google's business support, and they were horrible. They would not release any official figure for the actual daily limit, nor is there an alert in Google Drive's Admin console to notify you when you're approaching your quota, so that you could at least proactively stop a process instead of having the whole thing locked. It's ridiculous that nobody who uses G Suite for Business has ever complained about the lack of alerts; I just can't believe it! Anyhow, I've submitted an idea about this (upvote here if you like it: https://www.cloudconnect.goog/ideas/14077)
All of this to say that yes, Google Drive for Business is probably c*ap if used at all intensively. It's worth it only for small businesses that don't need a lot of data exchange. But hey, you have unlimited storage…
And one more thing: I was told by their support that if your account gets locked, you can unlock it from your own Admin console. However, if you only have a single account, you can't, and you have to contact them to get it unlocked early (which I did, and guess what? They asked me to wait until the day after…).
I've got screenshots of those chats that are among the stupidest things you may look at today.
There are multiple 403 errors. The rate-limit error actually refers to the limit on API requests per minute, which is roughly somewhere between 10 and 100. When you're limited by the 750 GB daily quota, you'll see 403s with both normal rate errors and user rate errors, appearing at random.
Also, use -vv to help people understand what's wrong.
--checkers=8 is the default, so that flag shouldn't actually change anything.
Also, if you really don't want to see 403 rate-limit errors, you can use --tpslimit 1 or something,
but one type of 403 rate-limit error is unavoidable and perfectly fine to see, because it only lasts 10-60 seconds. The other 403 error lasts until midnight for the Google server. However, I'll note that in EST my Google Drive daily quota resets at 7 pm, because the server is using GMT. So it's not your local time, nor even the local time of your server, but the time the Google server is using, which is likely the same as a GCE VM's internal clock. Midnight GMT is probably a very common reset time.
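A quick way to see what midnight GMT corresponds to locally (a sketch assuming GNU date and the standard tz database; the date is arbitrary):

```shell
# Midnight UTC/GMT, the likely quota-reset moment, expressed in US Eastern time.
# In February (EST = UTC-5) this prints "19:00", i.e. 7 pm the previous evening.
TZ=America/New_York date -d '2018-02-22 00:00 UTC' +%H:%M
```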
If I were you I'd probably try something like --transfers 1 or --transfers 2, although this is likely to slow you down unnecessarily; without better details on how much throughput you're able to achieve, it's hard to tell whether anything is actually wrong.
I'd guess Google Drive would likely accept roughly 200,000 1 KB files per day, or 750 1 GB files (files can be much larger than this; it's just a simple example). With a mix of file sizes and file counts you'll likely land somewhere in between. Large files make it easy to hit the 750 GB daily quota; tiny files make it impossible, because there's a flat limit on how many files Google will take in per minute (roughly, I dunno, 50-100ish). That comes from its API throttling, which works differently from its bandwidth throttling, although both, for some silly reason, use error 403.
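Using the poster's own rough figure of 50-100 file creations per minute (a guess, not an official limit), the per-file ceiling works out to the same order of magnitude:

```shell
# If the ~50-100 files/minute API cap is the bottleneck rather than bandwidth,
# the daily file-count ceiling lands in this range:
echo "$(( 50 * 60 * 24 )) to $(( 100 * 60 * 24 )) files/day"
# prints "72000 to 144000 files/day"
```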
One thing that will confuse the issue, though, is that while the user rate limit message is being given, you'll also still get the other message alongside it at the same time, presumably because spamming API requests that all result in error messages means you're sending too many API requests on top of having reached your daily quota.
So the two possibilities are a spam of the latter message, or both messages together (I've never seen the user message entirely on its own for very long).
If you have the Plex options “Generate video preview thumbnails”, “Generate chapter thumbnails”, “Upgrade media analysis during maintenance”, or “Perform extensive media analysis during maintenance” turned on, you'll likely get banned, as Plex downloads and reads a lot of each video file to do its indexing.
@left1000 So here's the thing: I only have 4 transfers going at a time, and generally I get to around 40 GB before my transfer speed drops to zero.
I run with -vvv and even then I get no info as to why my transfers suddenly stop. It's not a network problem, because I've tried over different networks, VPNs, etc.
What I see is that when I kill rclone (Ctrl + Z,X,C) and start it again, I get a 403 rate-limit error and then my uploads resume. Other times my uploads don't even start, and the stats output just refreshes with nothing but a 0 byte/s transfer rate.
It's not like I have a ton of transfers and random stuff going on that would send that many API requests. I just can't wrap my head around why I'm getting these issues.
@gforce Can you please explain, at a high level (or not), how your Plex project avoids these issues? Also, to be clear, I'm uploading media files and such to Google Drive, not downloading/streaming yet.
@aj1252 I'll get back to you on what the error is. I'm not certain which one it is, but I'd assume it's “API Rate Limit Exceeded”. I get to 40 GB before I hit those errors, let alone 750 GB.
Edit: Here's a paste of my transfers from today. You have to scroll a lot, but about three quarters of the way down you'll see my speed go from ~2 MB/s to 0 in maybe 30 seconds. https://pastebin.com/raw/23TNnng2
I'm having a similar issue here: after 2 hours or so, rclone copy just stops, with no errors and no messages in the log. I am copying from gdrive to gdrive using: /usr/bin/rclone copy /root/clone/unencrypted/film GCrypt:film --no-traverse --no-update-modtime --transfers=3 --checkers=3 --min-age 180m --drive-chunk-size 128M -vv --stats 30s --log-file=/root/clone/copy.log
Did you try the latest beta? I improved the timeout code recently, which might help: after --timeout with no transfer, it should kill the connection and retry.
Can you post a log somewhere with -vv?
That is a good log showing the problem! The problem appears to start at about 2018/02/22 13:34:17. I would have expected things to time out 5 minutes after that, at 13:39; unfortunately, the log didn't go on long enough!
Can you try the latest beta and see if it does time out after --timeout (5 minutes by default)?
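To test that, a sketch of re-running the copy command quoted earlier in the thread with an explicit --timeout (--timeout is a real rclone flag; 5m is its default, spelled out here just to make the behavior visible in the log):

```shell
# Same source/destination as the command above; after 5 minutes with no
# data transferred, the beta should kill the stalled connection and retry.
rclone copy /root/clone/unencrypted/film GCrypt:film \
  --timeout 5m --transfers=3 --checkers=3 \
  --drive-chunk-size 128M -vv --log-file=/root/clone/copy.log
```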
I spent some time debugging a similar issue which turned out to be some sort of driver problem.