"Clever" moving

Hi,

I have a GSuite Drive account holding a lot of huge rclone-encrypted content, mostly large files. I was thinking about the most convenient way to move all the files onto a new GSuite account, and I came up with the following process:

1.) Using rclone plus a script, I'd iterate over every directory and recreate the directory structure I had on the old account.
2.) Inside each folder on the source Drive, I'd create a shareable link for every single file.
3.) On the target Drive I'd add these shared files, then make a copy, delete the original, and rename the copy back to the original name.
4.) Since the "Add to my Drive" option puts the file into the root directory by default, I'd also use a script to move the files one by one into the right folders (see the sketch after this list).

  • Depending on the file size I might decide whether to go that way; if a file is reasonably small and there is no need for this copy trick, I'd just transfer it normally...
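
A minimal sketch of what step 4 could look like, assuming a made-up remote name `newacct` for the target account and a pre-built mapping of file names to folders; as far as I know, moves within the same Drive remote are done server-side, so the data wouldn't pass through my machine:

```bash
#!/usr/bin/env bash
# Hypothetical mapping file: each line is "filename<TAB>destination folder".
while IFS=$'\t' read -r name folder; do
    # rclone moveto within the same Drive remote is a server-side move,
    # so the file data never touches this machine.
    rclone moveto "newacct:${name}" "newacct:${folder}/${name}"
done < file-to-folder.tsv
```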

As far as I know the Google Drive API supports all the mentioned operations. I am not sure about the "Add to my Drive" one, but I am about all the others. The question, however, is whether rclone supports all of these as well. Also, I could never figure out how to use rclone from scripts. Is there a way to avoid interactive mode completely and add new remotes using command-line flags only? Right now I am manually appending lines to rclone.conf.
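
For the scripting part, rclone config create can set up a remote non-interactively, which would avoid hand-editing rclone.conf. A rough sketch, with a made-up remote name and a token obtained beforehand with rclone authorize (which parameters are worth passing depends on the setup):

```bash
# Obtain an OAuth token once (opens a browser and prints a token blob):
rclone authorize "drive"

# Create the remote without the interactive wizard; the token JSON from the
# previous step is passed in as a config parameter:
rclone config create newdrive drive \
    scope drive \
    token '{"access_token":"...","token_type":"Bearer","refresh_token":"...","expiry":"..."}'
```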

Also, do you see any other bottlenecks in this idea?

Thanks

What is the goal you are trying to accomplish?

I'd like to transfer huge files as fast as possible without routing them through my server, since Drive -> dedi -> Drive connections are limited to around 20 MB/s for me, and I have quite a few files larger than 100 GB.

You can only copy 750 GB per day anyway, so it really doesn't matter much.

I know, but maybe that limit wouldn't apply this way. And it'd still be a lot faster.

You can just run rclone copy a: b: --bwlimit 8.5M and let it run.
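
For what it's worth, the 8.5M figure is roughly the daily quota spread over 24 hours, give or take rounding and the GB-vs-GiB question:

```bash
# 750 GB spread evenly over 86400 seconds, expressed in MiB/s
# (rclone parses "8.5M" as 8.5 MiB/s):
echo "750 * 1000^3 / 86400 / 1024^2" | bc -l   # ≈ 8.28
```

8.5M is just a convenient round number in that ballpark.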

But what's the point of rate limiting if I'd like to copy the files as fast as possible? I have a gigabit network to both sides, but Drive seems to limit me to 20 MB/s upload, which is going to be slow for migrating large files like 128 GB datasets. That's why I wanted to go with this method.

I am not following unfortunately.

Going as fast as you can, you'd just hit the 750 GB daily limit sooner and would have to wait for the reset anyway.

There are ways to use service accounts and other things that you can find in other spots, but I'm not an advocate of breaking their quota limits.

Good luck!

Okay, no worries. I have to apologize since my English is well... Far from top notch. Let me rephrase the whole thing!

So I want to transfer a lot of large binary data from one Drive account to another. Using rclone you can do that very simply with rclone copy sourcemount:/encryptedfilename targetmount:/encryptedfilename, but that way rclone downloads the stream to my computer while uploading the incoming data straight to the other account. My computer therefore acts as a middleman, and because rclone uploads are limited to about 20 MB/s in my experience, these files take a long time to migrate. (The quota also kicks in eventually, but that's the least of my worries right now.)

So the idea is simple. I already have all the files in one Drive folder. The Drive web UI offers a "create shareable link" option for a file, which gives me a link that anyone can open. I assume rclone could generate these too, which would let me put this into a script. What can I do with these shared links? From the web I can open them in the target account, and there is an option to clone the file into my own storage. However, simply adding a shared file still depends on the original file on the source Drive; it's like a symlink between accounts, so if the original file is removed, the target loses access as well. That's why I'd make a real copy of the freshly added binary inside the target account and then remove the original "symlink" that points to the source Drive account.
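
To make that "copy, then drop the symlink" step concrete, here's a minimal sketch, assuming the shared file has already been added to the new account's My Drive and a made-up remote name newacct; as I understand it, a copy whose source and destination are on the same Drive remote is done server-side (whether deleting the shared original only removes it from the new account or also affects the source depends on ownership, so I'd test on a throwaway file first):

```bash
# Server-side copy inside the target account: creates a file actually owned
# by the new account, without downloading anything locally.
rclone copyto "newacct:bigfile.bin" "newacct:Datasets/bigfile.bin"

# Remove the shared original (the cross-account "symlink") once the real copy exists.
rclone deletefile "newacct:bigfile.bin"
```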

I am sure the last step would be easy with rclone, since I know it can already list these files, so the copy part is simple. Let me make my questions clear:

  1. Is there a way I can mass-generate public share links using rclone? If I read it right, the Drive API has support for this. And if yes, is there a way to get prettified JSON output, or anything I can parse easily from a script, containing only the created URL(s) for the files? (See the first sketch after this list.)

  2. How would you go about copying an entire folder structure (only the directory tree, not the actual files) from one remote to another? I hope rclone has something for that, but this could also be solved by mounting both remotes and using a find + exec combination on the directories... :thinking: (See the second sketch after this list.)
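
For question 1: rclone link remote:path can create a public link for a single file, but as far as I know it only prints the bare URL, so any JSON or other structured output would have to be assembled by the script itself. A rough sketch, with a made-up remote name oldacct:

```bash
# Walk every file on the source remote and ask Drive for a shareable link.
# Output is "path<TAB>url", which is easy to parse later.
rclone lsf -R --files-only oldacct: | while read -r f; do
    url=$(rclone link "oldacct:${f}")
    printf '%s\t%s\n' "$f" "$url"
done > share-links.tsv
```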
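
For question 2, no mount should be needed: rclone lsf can list just the directories recursively, and rclone mkdir can recreate them on the other side. A minimal sketch with the same made-up remote names:

```bash
# Recreate the directory tree (no file data) from the old account on the new one.
rclone lsf -R --dirs-only oldacct: | while read -r d; do
    rclone mkdir "newacct:${d}"
done
```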

This way I guess I can also bypass hitting the quota limit, since I've just moved well over 1 TB by hand from the web UI.

It's a bit hacky, but what I'd like to achieve is a real move, not a complete re-upload.

And last but not least, you can't imagine how thankful I am for your quick responses and active presence on such silly questions. :smiley: The whole rclone project is a godsend of a tool, and I have never seen a forum as well moderated as this one. And uhh, I know I require a lot of patience to handle, so apologies for that!

Oh never mind, I can actually share whole folders, so I only have to script the cloning part, which is going to be easy (a rough sketch below). Thanks a lot again for all the help!
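
For reference, roughly how I imagine scripting that cloning part — a sketch only, with made-up remote names; it relies on the drive backend's shared_with_me option and the --drive-server-side-across-configs flag, which I'd double-check against the docs and test on a small folder first:

```bash
# A second remote that points at the new account's "Shared with me" view
# (it still needs the same auth/token as the main newacct remote).
rclone config create newacct-shared drive shared_with_me true

# Copy the folder shared by the old account into the new account's own storage.
# With server-side copies allowed across the two configs, the data should stay
# inside Google rather than passing through this machine.
rclone copy "newacct-shared:MigratedStuff" "newacct:MigratedStuff" \
    --drive-server-side-across-configs
```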
