Transfer 2TB of data from a Google Drive Edu account into Google One

What is the problem you are having with rclone?

I am trying to figure out how to transfer 2TB of data from my edu account to my personal (5TB) Google Drive account. I am a complete newbie with rclone, and over the past few days I have managed to create two remotes (one for the edu account and one for the personal account) using a private Client ID. Now I don't know what to do next. I have tried reading through the documentation, but it's really hard for me to fully understand as it's not a topic I am very savvy on.

For example, I know that to transfer files I should probably use a command like:

rclone copy source:path destination:path

and that adding this flag will avoid using my own internet connection:

--drive-server-side-across-configs

However, I don't know how to piece things together.

Additionally, I am aware there is a 750GB daily limit and that this flag should automate pausing when the limit is reached and continuing again the day after:

--drive-stop-on-upload-limit

Again, I am not sure how all of this works. Would anyone be able to assist me in accomplishing this transfer? I am very happy to look at documentation if you'd be so kind as to link it to me, but as a heads up I might have doubts and questions due to my lack of knowledge on the topic. Thank you so much!

Run the command 'rclone version' and share the full output of the command.

rclone v1.60.0
- os/version: darwin 12.5.1 (64 bit)
- os/kernel: 21.6.0 (x86_64)
- os/type: darwin
- os/arch: amd64
- go/version: go1.19.2
- go/linking: dynamic
- go/tags: cmount

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

I don't think this applies to me as I'm trying to figure out what to do next.

The rclone config contents with secrets removed.

I am not sure what this means.

A log from the command with the -vv flag

It says the -vv command can't be found. How do I add a flag?


I think you are pretty much there.

If you'd posted your config I could write exactly what you need to type, but I'll assume you've got two remotes called edu and personal.

First check the two remotes are working - this should list the root directory of each. Don't proceed until these commands both do what you expect.

rclone lsf edu:
rclone lsf personal:
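If both remotes are set up correctly, each command prints that remote's top-level entries one per line, with directories marked by a trailing slash - something like this (hypothetical names):

Documents/
Photos/
notes.txt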

You then want to type

rclone copy -P --dry-run edu: personal: --drive-server-side-across-configs --drive-stop-on-upload-limit

Remove the --dry-run when it looks like things are working. When it gets to the upload limit and stops, wait 24 hrs and run it again. The -P flag will show you what progress rclone is making.

Note that this dumps all your stuff in the root of personal: - if you've got stuff in there already you might want to put it in a subdirectory such as personal:backup_of_edu.
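For example (a sketch using the remote names above and the same flags), a copy into that subdirectory would be:

rclone copy -P --dry-run edu: personal:backup_of_edu --drive-server-side-across-configs --drive-stop-on-upload-limit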

Hello Nick - you have no idea how much I appreciate your help!

I have renamed the remotes to "edu" and "personal" to make sure we are on the same page - great suggestion.

I've also run

rclone lsf edu:
rclone lsf personal:

and they both come back with the list of folders in their main directory, which I think is great news.

I have put all the folders of personal into a 00_old folder so I can simply dump the edu folders in the main directory of personal - another great suggestion, thank you!

You mentioned:

"Remove the --dry-run when it looks like things are working."

This might be a series of dumb questions, but:

  • How do I see if things are working?
  • Do I remove --dry-run while it's running? Or does running the whole line do a simulation and flag whether things are good to go?
  • You said that after it reaches the daily limit it will stop by itself. After 24 hrs, will I need to copy and paste the same command, and will it pick up from where it left off? I would like to avoid uploading things multiple times or having the progress stop at the same point it did the day before (I'm sure you thought of that, but always better to make sure).

Thank you infinitely for helping out with this.

hi, there are a few ways.
- -P, for progress, will display the real-time stats: how many files have been copied, the speed of the copy, and an estimate of the time to complete.
- use a log file, which will show what rclone did and display any errors:
--log-file=/path/to/log.txt

you cannot change the flag while rclone is running,
so you would run the command with --dry-run, make sure the output looks ok, then run the command again without --dry-run

you can lower the overall transfer speed so you never hit the 750GiB limit in a 24 hour period:
add --bwlimit=8M
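For the arithmetic behind that number: 750 GiB spread over 24 hours is 750 × 1024 MiB ÷ 86400 s ≈ 8.9 MiB/s, so capping the bandwidth at 8M keeps you just under the daily quota.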

and you might do a real test on a small subdir before running the command on the full 2TiB.
something like this, just change the name of the subdir to the name of a real dir and change the path of the log file to match your system:
rclone copy -P -v --dry-run edu:subdir personal:subdir --drive-server-side-across-configs --drive-stop-on-upload-limit --log-file=/path/to/log.txt
and if that looks ok, remove --dry-run and run the command again.
then check the log file for errors or issues.

Yes, it will pick up from where it left off.

rclone will check if files already exist (with the same size and modification date) and only copy if needed.
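with -vv you can watch this happening; the skip messages in the log look something like this (hypothetical file name, wording from memory rather than a real run):

DEBUG : somefile.txt: Size and modification time the same (differ by 0s, within tolerance 1ms)
DEBUG : somefile.txt: Unchanged skipping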

Hello, thanks for your answer.

Just to make sure I understand it (I have also googled it!): a dry run is a way of checking that all files are OK and ready to move?

What is the usual speed, and what does 8M mean in terms of speed?

Wouldn't running this code on the whole drive be a better idea? Or do you think it'll take a long time?

Between limiting the speed and this, which one do you feel is more "newb" friendly? I am just afraid I am going to mess it up one way or another.

Thanks both for your input!

correct.

this is what i would do in your case,

  1. run the code on the whole drive with --dry-run
  2. run the command on a subdir without --dry-run
  3. run the code on the whole drive without --dry-run

Oh - on reddit someone was helping me set this up and they suggested this:

If you add "--drive-server-side-across-configs" (it does not matter where you add it, at the beginning right after the rclone command or at the very end; rclone will detect it automatically) it will try to tell drive to copy across the servers without going over your own internet line.

Do you think this would work?

ok. i missed --drive-server-side-across-configs in my last response, which i have since edited.
ncw suggested that flag and it was included in my example command up above.

you will need to set gdrive permissions correctly
https://forum.rclone.org/t/unable-to-sync-across-drive-server-side-configs/26836

Ohh... I see! Do you know how to set up the right permissions as the thread suggests?
They don't seem to offer a proper (as in idiot-proof, lol) solution, and I'm not sure where I'd begin in setting up the right permissions...

that is mentioned at the end.
added the source as a shortcut in my destination and it's working now

https://rclone.org/drive/#shortcut
"In the first example this creates a shortcut from the "source_item" which can be a file or a directory to the "destination_shortcut"

Got it, thanks. So sorry, I wasn't trying to be lazy; I genuinely didn't understand what that person meant by that comment.

I tried to modify that code to my needs but I don't think I got it right - I'm really sorry, but it's so difficult for me to understand which terms I need to substitute with my drives' names and which ones I need to leave as they are.

Given that I just want to transfer the edu main directory (and all its contents) to the personal main directory, would something like this work?

rclone backend shortcut drive: edu: personal:

Again, I'm sorry for my incompetence; I'm trying my best but I'm getting very confused now. I even tried to go back to find the rest of the code, and now I'm not even sure which one we decided was the best to go with among the different options we have written down.

Could you help me by bringing all the code together?

I would just like to run the code, leave it be for a few days, even a week if necessary, and not worry about it. I don't know how to use rclone well enough to make decisions on what to include or not in the code, and I'm getting very discouraged. :frowning:

i do not use gdrive much.
maybe somebody else can comment about rclone backend shortcut

well, just do a simple copy using your local internet connection,
and then you do not have to experiment with --drive-server-side-across-configs and rclone backend shortcut.

what is the result of an internet speedtest?

or you might try a commercial service, which i know zero about, and get paid tech support.
https://www.multcloud.com/tutorials/copy-from-one-google-drive-to-another-1234.html

or try
https://www.makeuseof.com/tag/transfer-files-google-drive-accounts/

https://www.nucleustechnologies.com/google-drive-migration/

I think that might be the best solution, to use my own internet connection (by the way, I'm on 94 Mbps down / 14 Mbps up)...

Is this the right code to run?

rclone copy -P -v --dry-run edu: personal: --drive-server-side-across-configs --drive-stop-on-upload-limit --bwlimit=8M --log-file=/path/to/log.txt

I'm assuming I would need to specify the path for the log?

not sure if this is in your skill set, but you can rent a cheap cloud virtual machine and run rclone on that.
that would have a 1Gbps connection.
i use https://black.host/unmetered-vps-hosting

14 Mbps = 1.75 MB/s and at that speed, at best, 13+ days to complete.
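For the arithmetic: 14 Mbps ÷ 8 bits per byte = 1.75 MB/s, and 2 TB ≈ 2,000,000 MB, so 2,000,000 ÷ 1.75 ≈ 1,140,000 seconds, or roughly 13.2 days of continuous uploading.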

so using your local internet connection, there is no need for --drive-server-side-across-configs,
and given the speed of the internet connection, no need for --drive-stop-on-upload-limit or --bwlimit=8M.

rclone copy -P -v --dry-run edu: personal: --log-file=/path/to/log.txt

correct

Thank you for your input - once again very appreciated.

I have no idea how to use a virtual machine and I felt that I was making good progress on rclone... So I think I want to keep trying in this direction. If I see this path is not doable then I'll think about something else, but it's too early to give up.

I think I might need to go back to trying to bypass my internet speed... 13+ days seems like a very long time - I'm also taking into consideration the possibility that I mess something up and will need to run it again or something. Plus, I honestly wouldn't be able to deal with the anxiety of waiting that long, ha!

Anyway... My head is spinning from this overload of new info, so I think I'll sleep on it and try again tomorrow - I'm off work and might get the chance to research and understand this topic a bit more.

And hopefully, in the meantime, someone else can also join the conversation and help us set this up correctly!
Thank you again for sticking with me this far, I'm sure I made your eyes roll more than once! :sweat_smile:

yes, if someone could help with the exact rclone backend shortcut command,
then the rest of all this becomes easily doable.

Why not go with ncw's solution above? It takes a few days, is simple and straightforward, and you'd be done. Trying to find the efficient way sometimes makes things more complex, as dealing with shortcuts and whatnot adds complexity.

No need to worry about bandwidth; just let it copy ~750GB per day and be done with it. There's no harm in hitting the upload limit, as it resets daily and you can still upload via the normal web client.


hey @Animosity022,

ncw's solution requires --drive-server-side-across-configs, which, so far, the OP cannot figure out.
i shared a bunch of howto guides with pictures...

am i missing something about that?

What's the issue with typing that in and going, other than I'd probably dump into a folder rather than the root?