ACD BAN/Alternatives discussion (error HTTP code 429)

I read that some people were banned after uploading lots of data from ACD to gdrive. Does someone have a rough idea of how much data per day I could transfer safely from ACD to gdrive without any ban risk?

Bandwidth:
Ingress is free (download from ACD to the VM)
Egress to Google services (e.g. upload to gdrive) is free (currently a promotional offer)

Let me correct this:
15 TB of ingress generates about 1% egress, so roughly 150 GB at €0.12 per GB = €18 for traffic (which shortens my ~9 free days only insignificantly).

Not like a mount. odrive uses a sync agent, like Dropbox, so you need to have copies of all files on a local HDD.

What is a PLAIN remote?

I cannot answer this. I hope they will just ban uploads in that case, which is impossible via rclone anyway at the moment.

I will report back if I get banned after the 15 TB are transferred (or not :smiley:).

Do not use a crypt remote over the gdrive remote for the initial transfer.
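
If I read that right, the idea is to copy the already-encrypted files between the plain remotes, so nothing gets decrypted and re-encrypted in transit, and only afterwards layer a crypt remote with the same keys over the gdrive copy. A minimal sketch, assuming plain remotes named acd: and gdrive: with the crypt data stored under acd:encrypted (the names and path are just placeholders):

    # Copy the ciphertext as-is between the plain remotes (no decrypt/re-encrypt):
    rclone copy acd:encrypted gdrive:encrypted -v
    # Afterwards, configure a new crypt remote pointing at gdrive:encrypted with
    # the SAME password/salt as the ACD crypt remote, so the copied files decrypt unchanged.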

Wait what? Is that something that might happen? I’m about 200GB into the ACD --> GSuite transfer right now. I was going to leave it running for the next 3-4 days straight. Is Google known to ban people for too much data transfer?

I think he means the other way around. Amazon is going to ban you for excessive transfer.

AFAIK Google just rate limits you if you overdo it.
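
If you want to stay well under whatever that limit is, rclone's throttling flags are the obvious lever; a minimal sketch, where the remote names and numbers are placeholder guesses, not known-safe values:

    # Throttle the transfer instead of running flat out:
    rclone copy acd: gdrive:backup --transfers 4 --checkers 8 --bwlimit 8M -v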

Ah gotcha. Sounds plausible at this point. Has this happened? That would be a pretty nasty call if they banned my account. “Look MFers, I am trying to pull my data off your network because I need linux access and you blocked it. Gimme my data”

:slight_smile:

I just tried transferring (from a remote VPS running Windows 7 64-bit) with ExpanDrive (7-day trial) directly from ACD to Google Drive, and it’s barely usable: really slow (600-700 KB/sec!) with fairly frequent errors.

Do you guys not want to see what the response is from ncw before running for the Google hills?

IF this gets resolved in the next few days, I might still consider ACD as a backup, if it comes back “again”.

This incident and Amazon’s “I can do whatever I want with your account” terms of service just reminded me of the importance of a backup.

IF this gets resolved :slight_smile:

Having a copy on gdrive and ACD is never a bad idea. I had this and was using a sync task to sync them once a week or so, so my gdrive is up-to-date except for the last few days. Unfortunately, I made the mistake of using different encryption keys for gdrive and ACD, which complicates getting the last few days a little bit.
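
For reference, the weekly sync task itself can be as simple as a cron entry; a minimal sketch, assuming the encrypted blobs live under plain remotes acd:encrypted and gdrive:encrypted (which only avoids re-encryption if both crypt remotes share the same keys - exactly what I got wrong):

    # /etc/cron.d/acd-gdrive-sync (illustrative): every Sunday at 03:00, mirror
    # the encrypted files from ACD to gdrive without decrypting them.
    0 3 * * 0  myuser  rclone sync acd:encrypted gdrive:encrypted --log-file /var/log/acd-gdrive-sync.log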

Yes, a response would be great - but if Amazon refuses to even answer...

That’s why ACD is my backup. I have:

  • Synology NAS with RAID5/SHR as a primary data store (photos/docs/tv/movies)
  • Offline backup (a 4TB USB drive which I periodically refresh after checking my data integrity - it’s my ransomware insurance)
  • ACD as a cloud/offsite backup

I don’t back up the TV/movies - they’re all retrievable by other means (re-ripping or downloading). So if ACD/rclone is permanently borked, I’ll just terminate my ACD account, re-subscribe and re-upload to GDrive/GSuite. I only switched to ACD last year because GDrive did 1TB or 10TB but nothing in between (and I need about 1.5TB). Ironically, they now do a 2TB plan, so I may go back to them anyway (although it’s still 3x the price of ACD’s ‘unlimited’ storage).
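
For what it’s worth, that re-upload would presumably just be a one-off rclone copy from the NAS; a rough sketch, where the paths and the gdrive: remote name are placeholders, not my actual setup:

    # One-off re-upload of the irreplaceable data from the Synology to Google Drive:
    rclone copy /volume1/photo gdrive:backup/photo --transfers 4 -v
    rclone copy /volume1/docs gdrive:backup/docs --transfers 4 -v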

Amazon sucks on more levels than just this. Final straw for me.

Yeah, I have to admit, I was pretty disappointed to find that

  • their app sucks bigly
  • the Synology Cloud Station ACD sync sucks bigly
  • Prime Photos doesn’t have search or image recognition like GPhotos does (it’s available in Family Vault apparently, but that’s still US-only)
  • Transfer rates can be really slow
  • Document support is nowhere near as good as GDrive

Can you elaborate on how to execute the recursive sync? The odrive Linux CLI documentation is really poor and I’m at a loss as to how to make odrive download my ACD content to the server. Is it possible to choose which directories to download?

Exactly the same issue here; here’s my current attempt:

python "$HOME/.odrive-agent/bin/odrive.py" sync "$HOME/odrive-agent-mount/cloud provider/backup/fileso.cloudf"

How do I make this recursive? I’ve tried the normal arguments


Edit: I see you need to script it, can anyone help with this?
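
Not an official recipe, but the usual community workaround is a loop: syncing a .cloudf only expands that one folder level into more .cloudf/.cloud placeholders, so you keep syncing whatever placeholders are left until none remain. A rough sketch, assuming the same mount path as the command above (adjust TARGET to the folder you actually want):

    #!/bin/bash
    # Recursively download an odrive folder by repeatedly syncing the .cloudf
    # (folder) and .cloud (file) placeholders until none are left.
    ODRIVE_PY="$HOME/.odrive-agent/bin/odrive.py"
    TARGET="$HOME/odrive-agent-mount/cloud provider/backup"   # placeholder path

    while true; do
        placeholders=$(find "$TARGET" -name "*.cloudf" -o -name "*.cloud")
        [ -z "$placeholders" ] && break
        echo "$placeholders" | while IFS= read -r f; do
            python "$ODRIVE_PY" sync "$f"
        done
        # If a sync keeps failing, the same placeholder will come back around;
        # Ctrl-C out rather than looping forever.
    done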