Retrieving remote data that does not exist locally?

What is the problem you are having with rclone?

No problems as such; really just looking for verification that my plan for recovering data is on the right track. With the Google Drive gravy train coming to an end, I'm looking to retrieve all files that exist on my encrypted Google Drive but not on my local storage. I'm using rclone on Windows Server 2012 R2 in this case, with actual storage on a mergerfs + SnapRAID volume. Before I go nuking my Google Drive service, I'd appreciate some eyes on my approach/commands below to make sure they look solid.

Run the command 'rclone version' and share the full output of the command.

C:\Tools\rclone>rclone.exe version
rclone v1.63.0
- os/version: Microsoft Windows Server 2012 R2 Standard (64 bit)
- os/kernel: 6.3.9600.0 (x86_64)
- os/type: windows
- os/arch: amd64
- go/version: go1.20.5
- go/linking: static
- go/tags: cmount

Are you on the latest version of rclone? You can validate by checking the version listed here: Rclone downloads

yes

Which cloud storage system are you using? (eg Google Drive)

Google Drive (encrypted)

The command you were trying to run (eg rclone copy /tmp remote:tmp)

  1. Get a list of files that exist on the remote, but not on local.
rclone.exe cryptcheck M:\ GCrypt:/SecBackup --checkers 40 --modify-window=1s --tpslimit=20 --exclude-from .\excludes.txt --error=.\log\CheckCryptError.txt --missing-on-src=.\log\CryptRestore.txt
  2. Batch file to recover files from the remote.
FOR /F "usebackq delims=" %%G IN (".\log\CryptRestore.txt") DO rclone.exe copy "GCrypt:/SecBackup/%%G" M:\ --buffer-size 32M --drive-chunk-size 256M --transfers 20 --checkers 40 --modify-window=1s --tpslimit=20 --exclude-from .\excludes.txt -P --stats-log-level INFO --log-level DEBUG --log-file=.\log\Rclone.txt
  3. Rerun step 1, which should result in no files being logged to .\log\CryptRestore.txt.
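One thing worth knowing about step 2: copying a single file with `rclone copy GCrypt:/SecBackup/%%G M:\` places it in the root of M:\, so any file that lives in a subdirectory on the remote loses its directory structure. rclone's `--files-from` flag reads the same list in a single invocation and recreates the relative paths on the destination. A sketch, assuming `.\log\CryptRestore.txt` contains one remote-relative path per line (which is what `--missing-on-src` writes):

```
rclone.exe copy GCrypt:/SecBackup M:\ --files-from .\log\CryptRestore.txt --buffer-size 32M --drive-chunk-size 256M --transfers 20 --checkers 40 --tpslimit=20 -P --stats-log-level INFO --log-level DEBUG --log-file=.\log\Rclone.txt
```

This also avoids spawning one rclone process (and one OAuth token refresh) per file, which matters with `--tpslimit` in play.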

The rclone config contents with secrets removed.

[GDrive]
type = drive
scope = drive
token = {"access_token":"Bunch-O-AlphaNumeric"}
client_id = Bunch-O-AlphaNumeric.apps.googleusercontent.com
client_secret = Bunch-O-AlphaNumeric
root_folder_id = Bunch-O-AlphaNumeric

[GCrypt]
type = crypt
remote = GDrive:/SecureBackup
filename_encryption = standard
directory_name_encryption = true
password = Bunch-O-AlphaNumeric
show_mapping = true

On the surface it looks like your way will work (only testing can prove it 100%), but is there any reason why you do not use rclone sync?

Thanks for taking a look.

Yeah, sync would likely do it with much less fuss. The main reason I'm not using the sync command is control: the method above gives me a defined, human-readable list of the files to be brought back. I also use SnapRAID locally, which has given me some unexpected results in the past due to modification times. That, plus the thought of sync deleting files on the destination (my local storage in this scenario) if they weren't found on the source (Google Drive), has me a bit cautious.
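For what it's worth, the deletion concern with sync can be mitigated without giving up its convenience: `--dry-run` previews everything sync would do without touching either side, and `--backup-dir` moves files that would be deleted or overwritten into a holding area instead of removing them. A sketch, assuming a hypothetical holding path E:\rclone-trash (the backup dir must not overlap the sync destination):

```
rclone.exe sync GCrypt:/SecBackup M:\ --dry-run --log-level INFO --log-file=.\log\SyncPreview.txt
rclone.exe sync GCrypt:/SecBackup M:\ --backup-dir E:\rclone-trash --log-level INFO --log-file=.\log\Sync.txt
```

Reviewing the dry-run log first gives much the same auditable file list as the cryptcheck approach, with deletions clearly marked before anything happens.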

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.