ACD oauth proxy broken!

It is not an API limit. It is a per-account limit, so using another app won’t help, and uploading from Google Cloud Compute won’t bypass it either. You just have to wait or use multiple accounts to upload.


That doesn’t explain why I can still upload through the website, but whatever, that’s not too important. More importantly: is it a 24-hour ban or a 24-hour limit? As in, will it refresh to 0 at midnight my time? Midnight GMT? Or 24 hours from the time the message triggered?

The limit is not necessarily reset at midnight; in my experience it resets at the time of day the account was created. So it is a 24-hour limit: every 24 hours you can upload another 750GB (approximately; in my case it is more like 800GB). You won’t be banned for 24 hours.


That is extremely helpful, and explains why my limit was actually 759GB (I left that detail out as I thought it insignificant), as well as giving me an exact time for the resets. So thank you for helping me fully understand how this quota works.

Hmm, so at roughly 4pm Tuesday I hit my 750GB daily quota. At 6am Wednesday it had still not refreshed. At 6pm Wednesday it had refreshed. This seems to indicate my daily reset timer falls between 6am and 3pm (because if it had reset at 3:30, when I was halfway through using it, in theory I should’ve gotten another 300GB or so that day? Or can it only reset once it’s fully used up?)…

The odd thing, though, is that this account’s “hello, your account was just created” email arrived Sat, Feb 3, 2018 at 1:42 AM. Maybe it’s not the Google account’s creation time, but whatever time that Google account first used Drive? (Which is a time I don’t know.)

edit:
nevermind, my quota just reset at roughly 6:20am, despite me having used my entire 750GB around 4am.
edit2:
never-nevermind, this quota refresh seems “fake”: after only 12GB I’ve been throttled to the point that I cannot upload anything at all, and rclone is trying each file 10 times and then erroring out. Maybe I do need a 24-hour wait function built into rclone. The too-many-API-requests error has a different text message than the 750GB quota error, so in theory rclone could treat them differently. If I had a 24-hour wait function, I might actually get to leave the house this month.

I think too many API requests causes: "Error 403: Rate Limit Exceeded, rateLimitExceeded"
and the 750GB daily quota causes: "Error 403: User rate limit exceeded, userRateLimitExceeded"
although I might be wrong.
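If rclone did treat them differently, a crude external version of that 24-hour wait could be scripted around it. This is only a sketch based on the error text quoted above, untested; the local path, remote name, and log file are placeholders:

#!/bin/bash
# Retry an rclone copy forever; sleep a full day when the daily
# quota error shows up, a few minutes for anything else.
# /local/path and gdrive:backup are placeholders.
while true; do
    : > rclone.log    # start each attempt with a fresh log
    rclone copy /local/path gdrive:backup --log-file rclone.log --log-level INFO && break
    if grep -q "userRateLimitExceeded" rclone.log; then
        sleep 86400   # 750GB daily quota hit: wait 24 hours
    else
        sleep 600     # some other failure: retry in ten minutes
    fi
done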

Just snuck another 1GB through. Still super confused; I really wish I knew what time of day my reset was.

You could apply a bandwidth limit using --bwlimit=8650k, which should net you roughly 750GB in 24 hours, avoiding a ban and allowing a continuous transfer until completion.
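The rough arithmetic behind that figure, for anyone checking (rclone’s k means KiB/s): 8650 KiB/s is about 8.86 MB/s, and over the 86,400 seconds in a day that comes to roughly 765 GB, which is in the same ballpark as the quota; shaving it to around 8400k leaves a safety margin (~743 GB/day). A sketch of the invocation, with placeholder paths:

# /local/path and gdrive:backup are placeholders
rclone copy /local/path gdrive:backup --bwlimit 8650k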


Welp, it still hasn’t reset, and I’m bored/tired/need to leave, so either the daily quota does not reset at one specific time each day, or my specific time falls roughly between 12:30pm and 3:30pm…
Which would be annoying: after checking 21 of the 24 hours of the day, my daily reset time landed in the window I missed. Although it would be even more annoying if it’s 24 hours from the time at which the message appears :frowning:

edit:
11pm: quota appears reset. The daily ban first arrived at 4:30am today and was lifted before 11:30pm. Guess I’m still wrong and @sebiTNT is still right.

@boltn a bwlimit won’t work, because while Google Drive maxes out at accepting around 150-180 megabytes per second when transferring large files, when transferring small files it can only manage something like 200 kilobytes per second. I guess I won’t care about any of this once I get my initial migration finished; 750GB per day will be more than enough.

Thanks, @left1000. I have been procrastinating, but I’ll definitely plunge into this migration project in the next few days, as things with acd and/or gdrive might change before I can forget about amazon completely.

a new problem I ran into:

It’s really going to be hard for me to get all 20TB moved before the middle of March, isn’t it, Google! I managed to upload roughly 4TB this first week, but I’ve run into so many temporary bans and problems that I’m slightly worried I won’t make my goal.

@carlyuanliu
if you have a large amount of data, I’d start ASAP; the API request limits and the 750GB daily quota make migrating to G Suite a pain.
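If it helps, you can throttle both at once in a single invocation; a sketch, with placeholder remotes and numbers you’d want to tune:

# stay under the daily upload quota and ease off the API at the same time;
# acd: and gdrive:migrated are placeholders
rclone copy acd: gdrive:migrated --bwlimit 8650k --tpslimit 5 --transfers 4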

Thanks, @left1000! I have less than 1TB and 70-some days left. If it comes to it, I’ll resort to downloading everything to a local drive and then uploading to my destination. Not sure if this would help you though, with the acd daily limit and all.

The reason I want to use rclone is that it’s the only way to copy/move files while preserving each file’s date/time.

I also depend on rclone to get me a complete listing of all files in a cloud drive. It’s funny that none of the cloud drives will give me a file listing, let alone a normal DOS DIR-style listing.

By the way, if anyone can give me a program that converts an rclone file listing into a DIR listing, I would greatly appreciate it. I tried using Access, but I hit its table size limit. Then again, perhaps I should start a different thread for this.
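Something like this awk filter might be a starting point (a rough sketch, assuming the listing comes from rclone lsl, which prints size, timestamp, and path on each line; note it collapses repeated spaces inside file names, and the remote name is a placeholder):

# convert rclone lsl output ("size date time path") into a
# DIR-style "date time size name" layout; gdrive: is a placeholder
rclone lsl gdrive: | awk '{
  size = $1; date = $2; time = substr($3, 1, 8)  # drop fractional seconds
  $1 = $2 = $3 = ""                              # leave only the path
  sub(/^ +/, "")                                 # trim the leftover blanks
  printf "%s  %s  %15s  %s\n", date, time, size, $0
}'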

Anyway, another reason I am fleeing acd is that they do not provide a search function for double-byte file names. This was confirmed by their customer service, though it may or may not be accurate, as the guy did not sound like he knew his stuff.

You could always pair an rclone cache with rclone ls to search any remote in any fashion you want. I will agree, though, that Amazon’s own keyword search system for ACD is awful.
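For example (an untested sketch; the remote name and keyword are placeholders), dump the listing once and search it locally, which also works fine with double-byte names:

# acd: is the remote name; "keyword" is whatever you are searching for
rclone ls acd: > acd-files.txt
grep -i "keyword" acd-files.txt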

Randomly today, ACD vomited 30,000 files into ACD’s root directory structure (a location where I’ve never kept files).
Yesterday I’d finished moving 100% of my files into the trash.
This can’t possibly be rclone’s fault, since rclone hasn’t been running.
What gives with these files? They don’t even seem to all be from the same directory, or from the same day I deleted them; most are from a random media folder, yet some are random portions of video game backups.

Figured I’d mention it here; I wonder if anyone else has ever observed this behavior.

I’ve seen this too. When running the integration tests, rclone deletes lots of files, and these then appear randomly in the root directory for no reason I’ve been able to determine!

Anyone else been having issues with ACD and rclone + alternative auth methods yesterday or today? I’ve been rate limited / receiving error code 429 for over 24 hours now :frowning:

Argh, never going to get my data moved in time.

pacer: low level retry 2/10 (error HTTP code 429: "429 Too Many Requests": response body: "{}")

Any info appreciated!

I saw quite a few files appear in the acd root some time ago and wasn’t sure what happened. If it was caused by acd, then I assume I can safely delete all of them, right?

Yes, that’s what I did; I deleted them.
Of course I didn’t just blindly delete them: I looked at a few of the filenames, and the magically appearing files were in fact files I had manually deleted off of ACD in previous days.

The acd connection seems to be broken again.
Here is the command and the message that I got:
rclone lsd acd:/
2018/03/31 22:42:06 Failed to create file system for "acd:/": failed to get endpoints: HTTP code 429: "429 Too Many Requests": response body: "{"message":"Rate exceeded"}"

Another example:
rclone rmdirs acd: --dry-run
2018/03/31 23:00:51 Failed to create file system for "acd:": failed to get endpoints: HTTP code 429: "429 Too Many Requests": response body: "{"message":"Rate exceeded"}"

But my deadline is quickly approaching, so I will try to use MultCloud to move my acd data to GDrive or somewhere else.

I have been using the acd client to download my 890 GB of data, but the process gets slower and slower after a couple of hours. I tried stopping the download, waiting a while, and restarting it. It’s just too much trouble.

Hopefully I will be done with acd soon. Very bad experience.