Move encrypted files and folders

-n, --dry-run

Do a trial run with no permanent changes. Use this to see what rclone would do without actually doing it. Useful when setting up the sync command, which deletes files in the destination.

ok, it may take a long time haha. And after that, what happens? Is the transfer done by itself?

I don't want the source to be deleted though. I will do it myself

if the dry-run output looks good, and you are ready for rclone to transfer the files, then remove --dry-run and run the command again.

How am I going to know that everything is correct? With -vv I see this happening: <7> DEBUG: pacer: Rate limited, increasing sleep to 16.620293166s
<7> DEBUG: pacer: low level retry 2/10 (error googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded)
<7> DEBUG: pacer: Rate limited, increasing sleep to 16.930981511s
<7> DEBUG: pacer: Reducing sleep to 0s
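Those pacer lines describe rclone backing off when Google's API rate-limits it: the sleep between calls grows on each 403, then drops back once calls succeed. A minimal sketch of that idea (my own illustration in Python, not rclone's actual Go implementation, and the exact growth/cap values are assumptions):

```python
# Illustrative sketch of a "pacer": exponential backoff on rate-limit
# errors, reset on success. Not rclone's real code; constants are made up.

class Pacer:
    def __init__(self, min_sleep: float = 0.0, max_sleep: float = 16.0):
        self.min_sleep = min_sleep
        self.max_sleep = max_sleep
        self.sleep = min_sleep  # current delay between API calls, seconds

    def on_rate_limited(self) -> float:
        # "Rate limited, increasing sleep to ...": double the delay,
        # starting from 1s, capped at max_sleep.
        self.sleep = min(max(self.sleep * 2, 1.0), self.max_sleep)
        return self.sleep

    def on_success(self) -> float:
        # "Reducing sleep to 0s": a successful call resets the delay.
        self.sleep = self.min_sleep
        return self.sleep

p = Pacer()
for _ in range(6):
    p.on_rate_limited()
print(p.sleep)   # 16.0 — capped at max_sleep
p.on_success()
print(p.sleep)   # 0.0
```

So the log you pasted is rclone coping with throttling, not failing.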

have you done this?
https://rclone.org/drive/#making-your-own-client-id

I used this before but not anymore

I recreated it; now I have to add it to the Gsuitee and Gsuite configurations in rclone.

it looks okay now. I'll let it run to the end :slight_smile:

Does this sound okay to you? I don't want to lose everything :slight_smile:

<7>DEBUG : Google drive root 'Personnel': Waiting for checks to finish
<7>DEBUG : Google drive root 'Personnel': Waiting for transfers to finish
<7>DEBUG : Waiting for deletions to finish
<5>NOTICE:
Transferred: 39.599T / 39.599 TBytes, 100%, 55.219 TBytes/s, ETA 0s
Transferred: 37579 / 37579, 100%
Elapsed time: 8m59.2s

<7>DEBUG : 18 go routines active

what was the command?

rclone sync Gsuitee:/Personnel Gsuite:/Personnel --drive-server-side-across-configs --dry-run -vv

you cannot lose anything, as rclone sync does not delete files in the source.

i would:

  • remove --dry-run
  • run the command for 10 minutes
  • kill the command
  • add a new crypt, let's call it grcypt2 pointing to Gsuite:Personnel
  • make sure a rclone ls grcypt2: shows the file names in their decrypted state
  • download a file from grcypt2: and make sure it looks good
  • run the command again without --dry-run and let rclone sync all the files

i get these errors again:

<7>DEBUG : pacer: low level retry 8/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
<7>DEBUG : pacer: Rate limited, increasing sleep to 16.889790654s

as i pointed out earlier in the post, that is not an error.

not a gdrive expert, but you can transfer 750 GB per 24-hour period.
if you exceed that limit, gdrive will force rclone to slow down or prevent rclone from copying files.
you might try adding https://rclone.org/drive/#drive-stop-on-upload-limit

you can read about that at
https://developers.google.com/drive/api/v3/handle-errors#resolve_a_403_error_usage_limit_exceeded

and you can log into the google console and look at the API quota limits

ah ok, I didn't think this limit applied between a shared drive and the main drive. But I tested it and it works: I created another rclone remote and I can see what has been transferred :slight_smile:

I added --tpslimit 1 and --bwlimit 4M; I will see if that limits it.

rclone sync Gsuitee:/Personnel Gsuite:/Personnel --drive-server-side-across-configs --tpslimit 1 --bwlimit 4M -vv

i recently learned that --bwlimit does nothing when combined with --drive-server-side-across-configs.

might want to use
https://rclone.org/drive/#drive-stop-on-upload-limit

i don't know what --drive-stop-on-download-limit is, but it doesn't stop anything :roll_eyes:

<7>DEBUG : pacer: Rate limited, increasing sleep to 16.149769542s
<7>DEBUG : to7it9cdp0b3g9rvgkp9mbccik/sdaprimkt8cia7pjdhe6i8vppcrnscrdrtcv1k0fabd6blm1crlg/53ak0mbha1s4tk386k0r6npk00/1an42lqrb409kvikvrri5056q190svtv2th082vg68l183vl7cbel488epapt5srmaoglkdcacb0l84mbt1fjrpkm5a184rguahtsn3lmql9lp5btfqej6afp19fu7jofc3191ee1t5rlu6qrosd096kr0: Received error: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded - low level retry 1/10

I will do some tests in the days to come. Not having to route everything through my home connection is already great!! Thank you for your help :slight_smile:

If I compare the space used before and after the transfer, it does come to about the 750 GB, so I have to limit the transfer speed to 4M, which will give 650 GB per day. I will try --bwlimit 4M and give feedback!
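For what it's worth, a back-of-envelope check of what a given --bwlimit allows per day (my own arithmetic, not from rclone's docs; rclone's "4M" means 4 MiB/s) suggests 4M comes out closer to 360 GB/day than 650:

```python
# Daily volume implied by an rclone --bwlimit, vs Google Drive's
# ~750 GB / 24 h upload quota. rclone's "NM" suffix means N MiB/s.

def daily_volume_gb(bwlimit_mib_per_s: float) -> float:
    bytes_per_day = bwlimit_mib_per_s * 1024**2 * 86400
    return bytes_per_day / 1e9  # decimal GB, as Google states the quota

print(round(daily_volume_gb(4), 1))   # 362.4 GB/day
print(round(daily_volume_gb(8), 1))   # 724.8 GB/day, just under the quota
```

Note, though, that as pointed out above, --bwlimit has no effect on server-side transfers, so this only matters for transfers that pass through your machine.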

sure, glad to help.

as i pointed out in the post above, you should remove --bwlimit, as it does nothing when combined with --drive-server-side-across-configs.
the reason is rclone is not using your local bandwidth.

fortunately everything already transferred is picked up again, so the transfer continues quickly!

I found my mistake. I used: --drive-stop-on-download-limit instead of --drive-stop-on-upload-limit=true :sweat_smile:

The transfer stops as soon as there is a limit reached error with:
rclone sync Gsuitee:/Personnel Gsuite:/Personnel --drive-server-side-across-configs --drive-stop-on-upload-limit=true -vv

but if I use rclone sync -i, the transfer does not stop; it remains on standby.