Gdrive to Gdrive backup, fastest way?


#1

I want to back up my primary Gdrive (around 100TB) to a new account as a backup. I tried this before using a Google Cloud Compute instance, but together with the 750GB/day upload limit it was too slow (30 days in it had duplicate files, so I stopped). Once the backup is in sync, keeping it in sync is not a problem.
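For what it's worth, rclone can ask Google to copy files server-side between two Drive remotes, which avoids streaming the data through your VM (the 750GB/day quota still applies). A sketch, assuming remotes named `gdrive-main:` and `gdrive-backup:` (the remote names are placeholders, not from this thread):

```shell
# Server-side copy between two Google Drive remotes.
# --drive-server-side-across-configs asks Google to copy inside its own
# infrastructure instead of downloading and re-uploading through this box;
# --drive-stop-on-upload-limit exits cleanly when the daily quota is hit.
rclone copy gdrive-main: gdrive-backup: \
  --drive-server-side-across-configs \
  --drive-stop-on-upload-limit \
  --checksum --transfers 8 --progress
```

Run daily from cron and it resumes where the quota cut it off the day before.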

Bonus question: is a backup even needed? Most files are media, and a backup would only protect against user error (me deleting something, or Sonarr and Radarr going berserk and deleting). If something happens to Google, both my primary and my backup are gone (an unlikely case).


#2

I would only back up to another cloud provider, since if Sonarr/Radarr goes berserk you can always undelete from the trash.

I'm using Dropbox and Gdrive synced, just in case either provider changes the rules or drops API support for rclone and/or other tools.

My main is Dropbox, due to it having no API/upload/download limitations.

Google Drive sync is lagging a bit behind since I download 2TB of new stuff within a day.

Plex stats

Radarr stats

Sonarr stats

Waiting for the new Sonarr so I can redownload where the cutoff is unmet (replace all web downloads with Blu-ray releases).


#3

Ah yes, I forgot about the trash on Gdrive :grinning: that's one less worry. You mention Dropbox; do they have unlimited or near-unlimited storage now?


#4

You could also clean up/empty the Gdrive trash once per day with a script:
rclone cleanup gdrive:
and maybe add a size check, e.g. if the size changed by more than X, do not perform the cleanup.
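A minimal sketch of that size guard, assuming you keep the previous size in a state file between runs (the 100 GiB threshold and the remote name are made-up examples):

```shell
#!/bin/sh
# Decide whether it is safe to empty the trash: skip the cleanup when the
# remote shrank by more than a threshold, since a big drop could mean a
# mass deletion you still want to be able to undo from trash.
THRESHOLD_BYTES=$((100 * 1024 * 1024 * 1024))  # 100 GiB, arbitrary example

# should_cleanup PREV_BYTES CUR_BYTES -> prints "yes" or "no"
should_cleanup() {
  prev=$1; cur=$2
  if [ $((prev - cur)) -gt "$THRESHOLD_BYTES" ]; then
    echo no    # size dropped a lot: hold off on emptying the trash
  else
    echo yes
  fi
}

# In a real cron job you would fetch the current size with something like
#   cur=$(rclone size gdrive: --json | jq -r .bytes)
# compare it against the stored previous value, and only if should_cleanup
# says yes run:
#   rclone cleanup gdrive:
```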

As for Dropbox, it's unlimited; however, when you reach less than 30% free space you need to contact support so they add you more space. I do it on live chat every few months and it takes around 5 minutes.

It was a pain at the beginning when I was syncing my 120TB library to Dropbox, since I was contacting them daily to add more space. Google also has a 10TB daily download transfer limit, and I was transferring between 7 and 8TB daily.

Dropbox, I think, gives you 30TB to start, so once you are at 21TB you contact them, and then it depends: some operators will give you 10TB more, some 20TB. Once I got over 100TB they started giving me 30TB per request.

I love Dropbox since you don't need any rclone cache; you can mount it directly and it works perfectly.
The only downside is that it's quite expensive, since you need 3 accounts in a team for unlimited = $54/month.
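A direct mount like that can be as simple as the following (remote name, mount point, and flags are illustrative, not the poster's exact setup):

```shell
# Mount the Dropbox remote directly, with no separate cache backend.
# --allow-other lets other users (e.g. the Plex service account) read it;
# --daemon backgrounds the mount once it is ready.
mkdir -p /mnt/dropbox
rclone mount dropbox: /mnt/dropbox \
  --allow-other \
  --daemon
```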


#5

Yeah, I keep forgetting to script the trash cleanup, so I run it manually when needed. I'll research Dropbox; it's always interesting to have a backup plan in place for Google. If they enforce the 5 users for unlimited I would still do it, but you never know, maybe they pull an ACD :wink: Do you use Dropbox only for backup, or did you also test it as a fallback for Plex?


#6

Now I use Dropbox as main and Gdrive as fallback.


#7

Looking at the Dropbox website now: is that Dropbox Business (Advanced)? The no-API-limit part sounds nice; are there any hidden limits (file access limits, etc.) like on Gdrive?
You're running Dropbox without any VFS layer? Just directly mounted? How are the speed and bandwidth compared with Gdrive?


#8

Directly mounted, no speed or bandwidth limits, and it works perfectly; I'm achieving max speeds per upload/download.

This is the traffic on my download/upload server (I run sync & Plex on different servers).

99% of my stuff is from Usenet, i.e. no torrent seeding.


#9

That’s really interesting. You’re moving the media directly to the Dropbox mount?
I currently have a download server with 20TB in RAID 10 for staging; downloads are processed to MP4 and put in a mergerfs local/gdrive media directory, offloading to Gdrive with rclone daily. I had API / file access count issues because the local media agent would rescan whole series and Gdrive didn’t like that, so I wanted a big buffer for all series. If Dropbox doesn’t have those issues, that would be a real advantage in my case.
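That local/remote layering looks roughly like this (the paths, policy, and schedule here are illustrative, not the poster's actual config):

```shell
# Overlay the local staging area on top of the Gdrive mount; new files are
# created on the first (local) branch, reads fall through to either branch.
mergerfs /srv/media-local:/mnt/gdrive /srv/media \
  -o use_ino,category.create=ff,allow_other

# Daily offload: move anything that has sat locally for a day to the remote.
rclone move /srv/media-local gdrive:media --min-age 24h
```

Plex only ever sees `/srv/media`, so files keep the same path before and after the offload.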


#10

No, I'm still using rclone move from local to Dropbox.


#11

Well, wouldn't you know: we're talking about Dropbox and today I got Google API errors :face_with_raised_eyebrow:

I've got a trial now and am setting up a test server. Looking through the Dropbox info, the Business plan shouldn't have API limits, but digging deeper it states 25,000 calls for data transport partners. I'll chat with support about this when they open.
Do you also see this in your admin dashboard?


#12

This is what I see in my dashboard:

Dropbox Business Advanced Plan (includes 3 licenses) + Unlimited API Call Rate Limit + 2128 x 100GB of Space + 3 x 1TB of Space

And if I check the docs:

Data transport limit
Introduction
Calls to certain API endpoints will count as data transport calls for any Dropbox Business teams with a limit on the number of data transport calls allowed per month. This applies only to certain Dropbox Business plans, and not other types of accounts.

There were days when I was uploading around 8TB daily, and I never hit any limitation.

Tautulli / PlexPy stats


#13

As far as I can tell from reading the forums and Dropbox info, it's part of their new plans. I presume you've had that plan for a while now? I uploaded 112GB to Dropbox through rclone and the speeds seem fine; that used 4% of the API limit. If the limit applies only to uploads to Dropbox, then a max of roughly 3TB can be uploaded in a month (149MB chunks). I still need to chat with support; the chat was closed last time I checked.
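The back-of-the-envelope math behind that 3TB figure, assuming each 149MB chunk upload costs one of the 25,000 monthly data transport calls (that per-chunk accounting is my assumption, not confirmed by Dropbox):

```shell
# 25,000 calls/month x 149 MB per chunk, rounded down to whole TB.
calls=25000
chunk_mb=149
total_mb=$((calls * chunk_mb))          # 3,725,000 MB
echo "$((total_mb / 1000 / 1000)) TB"   # prints "3 TB" (~3.7 TB before rounding)
```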


#14

I was researching Dropbox as well, in case Drive goes up in smoke. I'll be quite interested in what you find out about the limits and performance of their current plans.


#15

Well, at the moment I'm waiting for Dropbox to respond to my mail, but no reply yet. If the API limit is only for uploads I might consider it after testing the speed; if it's for both up and down, then it's not for me.


#16

Wow!
What is your server specification for 17 concurrent transcodes?


#17

Me too. It's always good to have a backup. And I guess 3x15 EUR for that much file space is not a huge amount of money, especially if you can split it with friends :smiley:


#18

Well, for one, it's not 3x15€ but 3x25$, or 3x20$ if you pay annually. I've contacted support via their forum and the API limit is only for uploads. There is a business plan without upload limits, and I'm waiting on sales to tell me the costs etc. Going by what's on their forum, I guess it will be 10 licenses for unlimited API calls.


#19

Yes, you're right:
3x20 USD (≈18 EUR) billed monthly,
or 540 EUR billed yearly (3x15 EUR / month).


#20

Check again, because you'll need Business Advanced for unlimited storage. Advanced is $20 per user per month billed yearly, or $25 per user billed monthly.