I am trying to figure out how to transfer 2TB of data from my edu account to my personal (5TB) Google Drive account. I am a complete newbie with rclone, and over the past few days I have managed to create two remotes (one for the edu account and one for the personal account) using a private Client ID. Now I don't know what to do next. I have tried reading through the documentation, but it's really hard for me to fully understand as it's not a topic I am very savvy on.
For example, I know that to transfer files I should probably use this command line:
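(The command itself got lost from the original post; based on the remote names used later in the thread, it was presumably something like the following, with `edu:` and `personal:` being the two configured remotes:)

```shell
# Copy everything from the edu remote's root into the personal remote's root.
# Files that already exist and are unchanged on the destination are skipped.
rclone copy edu: personal:
```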
and that using this other line will prevent the transfer from using my own internet line:
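(The missing line here is presumably the flag quoted from Reddit later in this thread:)

```shell
# Ask Google Drive to copy between the two accounts server-side,
# instead of downloading and re-uploading through your own connection:
rclone copy edu: personal: --drive-server-side-across-configs
```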
However, I don't know how to piece things together.
Additionally, I am aware there is a 750GB daily limit, and that using this command should automate pausing when the limit is reached and continuing again the day after:
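(The flag in question, judging from the command suggested later in the thread, is presumably:)

```shell
# Make rclone exit cleanly when Drive reports the daily upload quota,
# instead of retrying endlessly; re-run the same command the next day
# and it resumes, skipping files that were already copied:
rclone copy edu: personal: --drive-stop-on-upload-limit
```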
Again, I am not sure how all of this works. Would anyone be able to assist me in accomplishing this transfer? I am very happy to look at documentation if you'd be so kind as to link it to me, but as a heads up, I might have doubts and questions due to my lack of knowledge on the topic. Thank you so much!
Run the command 'rclone version' and share the full output of the command.
os/version: darwin 12.5.1 (64 bit)
os/kernel: 21.6.0 (x86_64)
Which cloud storage system are you using? (eg Google Drive)
The command you were trying to run (eg rclone copy /tmp remote:tmp)
I don't think this applies to me as I'm trying to figure out what to do next.
Paste config here
The rclone config contents with secrets removed.
I am not sure what this means
Paste config here
A log from the command with the -vv flag
It says the -vv command can't be found. How do I add a flag?
Hello Nick - you have no idea how much I appreciate your help!
I have renamed the remotes "edu" and "personal" to make sure we are on the same page, great suggestion.
I've also run
rclone lsf edu:
rclone lsf personal:
and they both come back with the list of folders in their main directory, which I think is great news.
I have put all the folders of personal into a 00_old folder so I can simply dump the edu folders into the main directory of personal - another great tip, thank you!
This might be a series of dumb questions but:
How do I see if things are working?
Do I remove --dry-run while it's running? Or does running the whole line do a simulation first, or flag whether things are all good to go?
You said that after it reaches the daily limit it will stop by itself. After 24 hours, will I need to copy and paste the same command, and will it pick up from where it left off? I would like to avoid uploading things multiple times, or having the transfer stop at the same point it did the day before (I'm sure you thought of that, but it's always better to make sure).
hi, there are a few ways. -P / --progress will display the real-time stats: how many files have been copied, the copy speed, and an estimate of the time to complete the copy.
-- use a log file, which will show what rclone did and would display any errors: --log-file=/path/to/log.txt
you cannot change the flag while rclone is running,
so you would run the command with --dry-run, make sure the output looks ok, then run the command again without --dry-run
--bwlimit can lower the overall transfer speed so you never hit the 750GiB limit in a 24 hour period.
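(Assuming the flag meant here is --bwlimit: 750 GiB spread over 86,400 seconds works out to roughly 9.1 MiB/s, so capping a little below that stays under the quota:)

```shell
# 8.5 MiB/s * 86400 s ≈ 717 GiB/day, safely under the 750 GiB daily quota
rclone copy edu: personal: --bwlimit 8.5M
```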
and you might do a real test on a small subdir before running the command on the full 2TiB.
something like this, just change the name of the subdir to the name of a real dir and change the path of the log file to match your system: rclone copy -P -v --dry-run edu:subdir personal:subdir --drive-server-side-across-configs --drive-stop-on-upload-limit --log-file=/path/to/log.txt
and if that looks ok, remove --dry-run and run the command again.
then check the log file for errors or issues.
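(Putting the pieces from this thread together, the full-run version once the subdir test looks good would presumably be:)

```shell
# Full transfer: edu root -> personal root, with progress, server-side
# copies, a clean stop at the daily quota, and a log file to review.
rclone copy -P -v edu: personal: \
  --drive-server-side-across-configs \
  --drive-stop-on-upload-limit \
  --log-file=/path/to/log.txt
# Re-run the same command each day; already-copied files are skipped.
```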
Oh - on Reddit someone was helping me set this up and they suggested this:
If you add "--drive-server-side-across-configs" (it does not matter where you add it, e.g. right after the rclone command or at the very end; rclone will detect it automatically), it will try to tell Drive to copy across the servers without going over your own internet line.
Ohh... I see! Do you know how to set up the right permissions as the thread suggests?
They don't seem to offer a proper (as in idiot proof lol) solution and not sure where I'd begin in setting up the right permissions...
Got it, thanks. So sorry, I wasn't trying to be lazy; I genuinely didn't understand what that person meant by that comment.
I tried to modify that code to my needs but I don't think I got it right - I'm really sorry, but it's so difficult for me to understand which terms I need to substitute with my drives' names and which ones I need to leave as they are.
Given that I just want to transfer the edu main directory (and all its contents) to the personal main directory, would something like this work?
rclone backend shortcut drive: edu: personal:
Again, I'm sorry for my incompetence; I'm trying my best but I'm getting very confused now. I even tried to go back to find the rest of the code, and now I'm not even sure which of the different options we've written down we decided was the best to go with.
Could you help me by bringing all the code together?
I would just like to run the code, leave it be for a few days, even a week if necessary, and not worry about it. I don't know how to use rclone well enough to make decisions on what to include or not in the code, and I'm getting very discouraged.
Thank you for your input - once again very appreciated.
I have no idea how to use a virtual machine and I felt that I was making good progress on rclone... So I think I want to keep trying in this direction. If I see this path is not doable then I'll think about something else, but it's too early to give up.
I think I might need to go back to trying to bypass my internet speed... 13+ days seems like a very long time - I'm also taking into consideration the possibility that I mess something up and will need to run it again or something. Plus, I honestly wouldn't be able to deal with the anxiety of waiting that long, ha!
Anyway... My head is spinning from all of this overload of new info, so I think I'll sleep on it and try again tomorrow - I'm off work and might get the chance of researching and understanding this topic a bit more.
And hopefully, in the meantime, someone else can also join the conversation and help us set this up correctly!
Thank you again for sticking with me this far, I'm sure I made your eyes roll more than once!
Why not go with ncw's solution above? It takes a few days, is simple and straightforward, and you'd be done. Trying to find the most efficient way sometimes makes things more complex, as dealing with shortcuts and whatnot adds complexity.
No need to worry about bandwidth and just let it copy ~750GB per day and be done with it. There's no harm in hitting upload limit as it resets daily and you can still upload via the normal web client.