Gdrive to Gdrive - Suggestions

Hey guys,

I opened this thread in the past: Google Drive for Business - slow. Basically I was “complaining” about Gdrive-to-Gdrive transfers being super slow.
Since then I let it go a bit, as I kept what was really important on two NAS units I owned.

I’ve decided it’s time to get rid of one of the NAS units, so I’m ready to tackle this issue once again. I’ve noticed the script worked eventually, until a month or so ago, when again I could see it identifying a ton of “duplicated” files (which apparently is normal) and then transferring nothing.
This is the line of code I use for that (PowerShell, so the variables are the locations):

./rclone.exe sync --checkers=25 --transfers=25 $sourcefolder_offsitedata $destinationfolder_offsitedata --backup-dir $destinationHistoricFolder_offsitedata --suffix $suffix_offsitedata

Source and destination are the same Gdrive remote, which uses my own API key I got from Google.

Another thing I tried was adding a new remote configured the same way (same API key), just with a different name. It worked for about 2 hours (during which I could see it transferring ~400 GB). Whenever I called that line of code afterwards, it behaved exactly the same as the original script.

I’m now trying something else: a new remote, this time using no API key (so the default “next, next” configuration), and it seems to be going fine for now. I’m just afraid to stop it, so I applied QoS to rclone.exe to make sure I had some bandwidth left to work with :slight_smile:
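As an aside, rclone can also throttle itself with its `--bwlimit` flag, which avoids the need for OS-level QoS. A sketch using the same (hypothetical) PowerShell variables as the sync line above; the 40M cap is just an illustrative value for a 500/500 Mbps line:

```shell
# Cap rclone's own bandwidth (40 MByte/s here) so some headroom
# is left for interactive use, instead of shaping rclone.exe via QoS.
./rclone.exe sync $sourcefolder_offsitedata $destinationfolder_offsitedata `
  --checkers=25 --transfers=25 `
  --bwlimit 40M `
  --backup-dir $destinationHistoricFolder_offsitedata --suffix $suffix_offsitedata
```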

Now that you have all the info, my question is: what do you suggest here? Hopefully this solution with two remotes and two different API keys works better, and I’d be happy with that, but I’m looking for suggestions.

In the other thread we discussed Google Drive File Stream, which I eventually got during the testing period. Unfortunately, even though it’s great, it doesn’t work for the sort of backup I’m running: it basically downloads ALL of the files and then re-uploads them, and the uploads are much slower than the downloads (File Stream doesn’t seem as fast as rclone when uploading; rclone/Gdrive saturate my 500/500 Mbps line), which killed the OS drive :slight_smile:


Instead of using the same Gdrive remote, create two rclone remotes pointing to the same destination and it will do server-side copies. That said, Google imposes a lot of limits on this, so it won’t be really fast, but it will get the job done; I do this all the time. For the initial copy it’s likely better to simply do a regular copy up, but the ongoing maintenance sync will work fine. That’s how I’ve done it.
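A minimal sketch of that setup, assuming two remotes named `gdrive1:` and `gdrive2:` that were both authorised against the same Google Drive account (remote names and the `Backups` path are illustrative):

```shell
# rclone.conf ends up with two independently authorised Drive remotes:
#   [gdrive1]
#   type = drive
#   ...
#   [gdrive2]
#   type = drive
#   ...
# Syncing between them keeps the transfer inside Google where Drive
# permits server-side copies; rclone falls back to download+upload otherwise.
./rclone.exe sync gdrive1:Backups gdrive2:Backups -v
```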

You should probably run rclone dedupe from a recent rclone (1.38) to fix the duplicated files and directories. Duplicated files and directories confuse rclone!

I’ve done that, but when I used the same API key it gave me the same issue. It’s still working at the minute (~6.5 hours since I launched it), using 2 remotes: the original one with my own API key and a second one with the standard config.

I don’t have duplicated files/directories though :grinning:

I’m afraid I’m about to hit the same issue as in the past: lots of files not being uploaded (no errors, though) and stuck at 0 Bytes/s.

Zero bytes/s means you’ve hit the Google upload limit.

Do you know where I can find more about Gdrive limits? Is there a hard upload limit per day, or if I stop the script and try again in an hour will it continue just fine? I did indeed just get the first error about it:

Note that after the first upload I think I’m looking at a maximum of ~50 GB of changes per day, which should be easily processed going forward.

I don’t think they publish them. No one has ever seen them published. I believe the drive-to-drive limit, though, is around 500GB.

Then what? :slight_smile: Is there a time frame I should wait? I just stopped it and started it again, and indeed it’s still complaining that the limit was exceeded.
For a G Suite for Business account that’s a bit ugly; I mean, this prevents large businesses from working (not my case, and I’m guessing they have a solution to let people temporarily hammer their servers). I’m not even encrypting, so I’m sure they dedupe a lot on their end.

@ncw Regarding the duplicated files, this is what happens when the script starts. I checked: no duplicated files in Gdrive (nor when accessing the path with rclone):

ls with rclone:

It’s a 24-hour limit. You’ll need to let some time lapse or just let it keep trying.

Thanks for that! I use a single script that runs every 12 hours. I’ll probably split it and run this Gdrive-to-Gdrive part separately every 24 hours. I just don’t get why they impose a limit on Gdrive-to-Gdrive but still allow me to upload from anywhere to Gdrive (better that way :slight_smile:).
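One way to put the Gdrive-to-Gdrive part on its own 24-hour cycle on Windows is a scheduled task. A rough sketch using `schtasks` (the task name, script path, and start time are all made up):

```shell
REM Create a daily task that runs only the drive-to-drive sync at 02:00.
schtasks /Create /TN "RcloneDriveToDrive" /SC DAILY /ST 02:00 ^
  /TR "powershell.exe -File C:\scripts\gdrive-to-gdrive.ps1"
```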

That error message suggests that you are using an old version of rclone… I suggest you run rclone dedupe with 1.38 or the latest beta - this will detect duplicate directories which are probably the problem here. You won’t see duplicate directories with rclone ls.

Thanks, I’ll look at upgrading to the latest as soon as I have more time. Replacing the .exe file didn’t work, and I think I might need to re-create the remotes (not an issue).
But… if I really had duplicated folders, shouldn’t I see them in either Google Drive File Stream or the Gdrive web interface?

You shouldn’t have to re-create the remotes.

You should, but remember they may be arbitrarily far up the hierarchy.

Great, thanks. For the sake of whoever might be interested in this topic: after the first sync, which took about 3 days (due to Gdrive’s transfer limits), everything started working OK.
I’m using two different remotes; the primary uses my own API key and the other was configured with the default process.

One important thing to note: the 0 Bytes/s errors I had before opening this thread (when trying to copy from/to the same remote) were not due to a high query rate nor to a high amount of data transferred; nothing was being transferred, and the Google API console showed no more than 40-50k queries against a limit of a million.

Hi, not sure whether to create a new topic for my question or add it on here. Apologies in advance.

I am mainly going to be using my new GSuite for Business account to back up concert photos (they currently sit on a Synology NAS, probably around 8 TB).

My download/upload speed is not great: ~40 Mbps down / 7 Mbps up.
I can upload about 3 GB per hour with my home ISP.

I have about 40 GB uploaded, but I wanted to try backing up these photos to my personal Google account too.
I tried “rclone copy gdrive1: gdrive2:”
but it seemed to be downloading and then re-uploading the images (which would take ages).

Can I do it all on Google’s end, without the data having to go through my slow upload?

I tried sharing the Photos folder from the GSuite Gdrive to the personal Gdrive,
then logged into the personal account and chose ‘Save to My Drive’.
The photos seem to be there now… I think(?)
I don’t have any rclone command-line checking going on, so can I assume this method works fine? (Or are the files still tied to the business GSuite?)

I think you should have created a new post; however, I can answer this for sure: you cannot with rclone (or any other software, really…). As a 3rd party, for now, it will always have to pass the data through wherever rclone is running (your PC).
Not too sure about the sharing/saving through Gdrive, though.


Google provides a “copy” command in their API. rclone should make use of it but doesn’t. It is fairly easy, in concept, to implement.

I do Google-to-Google transfers using the API in my sample scripts, PERL-CloudSync (google it).

rclone calls that concept “server-side copy”, and you can see in the overview that it is supported for Google Drive.
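For reference, a sketch of the two-remote approach, assuming `gsuite:` and `personal:` are remotes for the two accounts (names are illustrative). Note that copying between two different accounts generally cannot be fully server-side unless the source is visible to the destination account; rclone versions newer than this thread added the `--drive-server-side-across-configs` flag to ask Drive to copy server-side across remotes where permissions allow:

```shell
# Try a server-side copy between two Drive remotes; where Google
# refuses, rclone falls back to downloading and re-uploading
# through the machine running rclone.
rclone copy gsuite:Photos personal:Photos --drive-server-side-across-configs -v
```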