Migrating 100 TB+ from Google Drive to Dropbox

So as per the title, I currently have in excess of 100 TB of both my company's and my personal data archived in Google Workspace across 5 user accounts, and I'm looking to migrate everything over to a 3-user Dropbox Business account.

I have everything encrypted via rclone, and I want to maintain that encryption on Dropbox and continue using my rclone mount in the same way once the transfer is complete.

I'm presuming (hoping and praying :grinning:) this will be a fairly simple case of adding Dropbox to my rclone .conf file, then setting up an rclone copy script to duplicate all of the data, followed by a tweak to my mount script.

As this is such a big job and I am shifting my company data at the same time, I want to be very careful I don't mess this up, so it would be great to get a helping hand with the best way to script this

Would I also be right in thinking there is a max of 10 TB that can be transferred per day? I have tried getting this info from Google, but that was like hitting my head against a brick wall :roll_eyes:

Thanks in advance for the assist

check out https://rclone.org/crypt/#backing-up-a-crypted-remote

and when i need to do large data transfers between two providers, i rent a cloud vm and run rclone on that
https://black.host/unmetered-vps-hosting

Thanks for that.
Can I ask what the advantage of renting a cloud based virtual machine to do this is?

well, mostly, it depends on the quality of internet connection that is running rclone.
maybe your connection is not stable, has low speeds, isp data caps, etc...

for $12.00 a month, i can rent a cloud vm with a 1Gbps connection and run rclone on that.

before starting the migration, maybe @Animosity022 has some battle tested suggestions.

Ah yes of course, that makes perfect sense!

I wasn't thinking about the local limitations, as I hadn't factored in that the transfer would pass through the machine running rclone

So ideally I want a cloud based VM with a nice fast connection :wink:

Really no magic other than I ran a copy and waited :slight_smile:

I'm curious about how long the process actually was. Did you experience any speed/data limitations either at the Google or Dropbox end, or was the connection saturated?

You can only move about 10 TB per day per user; that's what I found from testing. None of that is documented, and Google will tell you nothing.

My favorite day was closing out my GSuite account.

Thanks gents

I notice in the rclone documentation that the advised way to back up an encrypted remote is using sync.

Bearing in mind this is going to an empty Dropbox folder, is there any advantage/disadvantage to using sync as opposed to copy for this procedure?

This was the basic script I ran...

rclone sync -i RichFlixCrypt: PeaPodCrypt:

If for any reason I have to stop and start the process again would I just add

--ignore-existing

to the script? Or would that only be required for copy as opposed to sync?

could run your command as is, but would not use --ignore-existing

as this is a first time copy/sync and the dest is empty, you could tweak the command.
but given the large amount of data, not sure it is worth it.

just make sure to saturate the internet connection, perhaps tweaking --checkers and --transfers
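as a starting point, something like this (flag values are just guesses to experiment with, not tested recommendations):

    rclone sync RichFlixCrypt: PeaPodCrypt: --checkers 16 --transfers 8 --progress

then adjust --checkers and --transfers up or down until the connection is saturated.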

No, they will do exactly the same thing if the target (Dropbox) is empty when you start, the source (Google) doesn't change at all, and you do not use some of the more advanced flags (e.g. --no-traverse) - even if you have to restart during the migration.

Each has its advantages and disadvantages, e.g.
copy will protect you from accidentally deleting files at the target (e.g. when forgetting the path)
sync will allow you to migrate from a source that is changing in the process (assuming you do several syncs)
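To illustrate the difference with hypothetical remote names (imagine a stray file exists at the target but not the source):

    rclone copy source:dir target:dir   # stray file at target is left alone
    rclone sync source:dir target:dir   # stray file at target is deleted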

not sure that is correct, as you want to back up the dir containing the files in their crypted state.
as the redacted config file was not posted, cannot share exact command.

@asdffdsa is right, you want to transfer the encrypted files directly, so your first trial command should probably be something like

  rclone sync --dry-run RichFlix: PeaPod:

@Animosity022 is the expert here, but I guess you can speed it up if adding these flags:

--checkers=16 --transfers=8 --drive-pacer-min-sleep=10ms

I suggest you first test with a small subfolder and then make sure you can see and use the files with your new mount.

Also, please make sure you are using the latest version of rclone, that is v1.59.2.

not sure that is correct, as the goal is for rclone not to decrypt/re-crypt the files.
that would force rclone to download+decrypt from gdrive and then re-crypt+upload it to dropbox.

should copy the dir that contains the already crypted files, not the crypt remote
something more like
rclone sync --dry-run gdrive:crypt dropbox:crypt

Agree, I think we mean the same, just making different guesses on the information we don't know (yet).

That is why I removed the Crypt part of the remote names.

Thanks guys, so I'm confused about where you think my command is wrong?

RichFlixCrypt and PeaPodCrypt are the names of my encrypted remotes. The intention is to copy over the entire remote from Google to Dropbox

I can also see encrypted files that match the ones on Google appearing on Dropbox

Where am I going wrong?

please post the redacted output from these two commands:

rclone config show RichFlixCrypt:
rclone config show PeaPodCrypt:

please post your redacted config file and i can share the exact command you need.

These are the relevant sections from my .conf file

[RichFlix]
type = drive
client_id = ??????
client_secret = ?????
scope = drive
root_folder_id =
service_account_file =
token = {"access_token":"???????

[RichFlixCrypt]
type = crypt
remote = RichFlix:/The Skull/
filename_encryption = standard
directory_name_encryption = true
password = ?????????
password2 = ????????

[PeaPod]
type = dropbox
token = ????????

[PeaPodCrypt]
type = crypt
remote = PeaPod:/Rich/SKULL/
filename_encryption = standard
directory_name_encryption = true
password = ???????
password2 = ??????
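Based on that config, and assuming both crypt remotes use the same passwords, the command suggested above would sync the underlying encrypted directories directly (dry-run first, then remove --dry-run):

    rclone sync --dry-run "RichFlix:/The Skull" "PeaPod:/Rich/SKULL"

That way rclone moves the already-encrypted files as-is, instead of downloading and decrypting from Google Drive and then re-encrypting and uploading to Dropbox.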