Empty Google Drive Trash

Just wanted to share my experience with emptying my Google Drive trash. Google's UI for emptying the trash is awful and extremely lacking, especially for the kind of people who use rclone.

The cleanup command was added to rclone to help with this. It basically calls the Drive API's emptyTrash method (a DELETE request), which can be demoed here: https://developers.google.com/drive/api/v2/reference/files/emptyTrash?apix=true#try-it
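For reference, here is a minimal sketch (my own illustration, not rclone's code) of the emptyTrash call that the link above lets you demo. Per the linked v2 docs it is an authenticated DELETE on the trash collection; `access_token` is a placeholder, and the request is only built here, not sent.

```python
import urllib.request

# Drive API v2 emptyTrash endpoint, per the documentation linked above
EMPTY_TRASH_URL = "https://www.googleapis.com/drive/v2/files/trash"

def build_empty_trash_request(access_token: str) -> urllib.request.Request:
    """Build (but do not send) the emptyTrash DELETE request."""
    return urllib.request.Request(
        EMPTY_TRASH_URL,
        method="DELETE",
        headers={"Authorization": f"Bearer {access_token}"},
    )

# To actually send it (needs a valid OAuth token):
#     urllib.request.urlopen(build_empty_trash_request(token))
```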

I used the command and it sort of worked, but even Nick (rclone's developer) has commented on other posts on this forum that no one really knows what's going on or why it sometimes doesn't work. For example, I tried the DELETE request directly in Google's developer portal (link above) and the trash did not empty. Same with the cleanup command. It's not rclone's fault; it's Google's bad implementation of a simple delete operation.

But I did find a solution that worked for me (be careful, however, as it can use a lot of API requests and get you rate limited). Use your own secret/key, and I would recommend using a separate secret/key/project from the one you use for your production rclone setup.

Here's the command to run (add --dry-run to the first run so you can verify exactly what will be deleted):
rclone delete gdrive-root: --drive-trashed-only --drive-use-trash=false --verbose=2 --fast-list

gdrive-root: a custom remote that has its own client/key and does not have a folder ID configured (it operates on the root of the Google Drive)

--drive-trashed-only: This is absolutely crucial. It restricts the operation to files in the trash, which appear in the same folder structure they had when they were deleted from the drive.

--drive-use-trash=false: Also critical, as it stops deleted files from going back to the trash. By default, a Google Drive remote sends deleted files to the Drive trash; since we obviously want to empty the trash, setting this to false bypasses that default and deletes permanently.

--fast-list: builds the list of files in memory. It uses fewer transactions (good for API usage) at the cost of a little more memory.

--verbose=2: prints verbose output of the operation, useful for keeping an eye on progress
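Putting the flags together, here's a hedged sketch (my own illustration, not part of the original post) that builds the exact argv once, so the --dry-run pass and the real pass can't drift apart. The remote name gdrive-root: is the example from above; swap in your own.

```python
import subprocess

def purge_cmd(remote: str = "gdrive-root:", dry_run: bool = True) -> list[str]:
    """Build the trash-purging delete command described above."""
    cmd = [
        "rclone", "delete", remote,
        "--drive-trashed-only",     # only touch files already in the trash
        "--drive-use-trash=false",  # delete permanently instead of re-trashing
        "--verbose=2",              # progress output
        "--fast-list",              # fewer transactions, a little more memory
    ]
    if dry_run:
        cmd.append("--dry-run")     # defaults to previewing, deleting nothing
    return cmd

def purge(remote: str = "gdrive-root:", dry_run: bool = True) -> int:
    """Run the command; flip dry_run to False only after checking the preview."""
    return subprocess.run(purge_cmd(remote, dry_run)).returncode
```

Note that dry_run defaults to True, so calling `purge()` with no arguments only previews; you have to opt in to the destructive run.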

After running this, I emptied 3 TB of data that was just junk. I hope this helps someone who needs to empty their trash.


You can just use:

rclone cleanup remote:

https://rclone.org/commands/rclone_cleanup/

As mentioned, I do use it, but after running that command via cron for almost a week, my Drive trash was still massive, with thousands of files. It only removed about 2% of my trash in a week. But after running the command in my post, I deleted the remaining 98% in about 2 minutes.

Edit: To further clarify my original post, the inability to empty the trash via the cleanup/emptyTrash API call is not rclone's fault. I followed Google's instructions on the developer portal; the call completed with a success response, but all the files remained. So it's definitely not rclone's fault, as the problem is reproducible outside of rclone. My command above is more of a workaround: it simply enumerates the trashed files and deletes them permanently, bypassing the emptyTrash call.

I think the issue might be related to running it multiple times?

I cleaned a few thousand files in 5 minutes as I checked back on it.

Are you using a team drive or regular drive?

Regular. And the interval was daily.

For me, rclone cleanup has always worked eventually, but it has taken some time (hours)! I typically just run it once and then wait.

I haven't deleted that much data though.

@ncw, I figured that had to do with it. I just got impatient :) and gave this a shot. It worked quickly and effectively. Do you think you could add a --force-cleanup flag to the cleanup command? Something like this:

rclone cleanup gdrive: --force-cleanup

If the forced flag is used, it flips the cleanup command into a delete command with the trash options. So basically a shortcut.
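To illustrate the proposal (this is a hypothetical sketch of the suggested behaviour, not a real rclone flag), the expansion would roughly be:

```python
def expand_cleanup(remote: str, force_cleanup: bool = False) -> list[str]:
    """Expand `rclone cleanup` as the proposed --force-cleanup flag would."""
    if not force_cleanup:
        # normal behaviour: the emptyTrash-based cleanup
        return ["rclone", "cleanup", remote]
    # forced: fall back to the enumerate-and-delete workaround from earlier
    return ["rclone", "delete", remote,
            "--drive-trashed-only", "--drive-use-trash=false"]
```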

That's a great idea. Can you open a new issue on GitHub for it?


added issue #3964

@jiru I tried your implementation; however, it doesn't remove the folders. I don't know if I just need to be patient with them, but were they deleted successfully for you?

Some questions/assumptions:

  • Is your remote: the root of the Google Drive?

  • Did you get any output on the screen (if you set it to verbose=2)?

It did delete folders for me.

  • Yes, it's the root of my drive.

  • For the files, yes; for the folders, no.

Replacing delete with ls returns no results either.

EDIT: Tried the latest beta with the same results. Also, do you have a team drive to test with? Or was everything tested on normal drives?

Can you please try creating an empty folder, sending it to the trash, and seeing if you can delete it? (I think that, in my case, rclone deleted all the files contained in those folders first, so the folders are just sitting empty in my trash.)

rmdir doesn't work either; it returns 403 (Forbidden) errors.

Your method is so awesome!
I had around 100k files in my trash (around 20 TB, I think).
It took some time, but it did the job neatly.
rclone cleanup never did the job for me.

FWIW
Here is a script that dedupes, removes empty dirs, and deletes trash. If you put this in crontab, it will keep all/selected remotes nice and clean.
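The script itself isn't reproduced in the thread, but as a rough sketch of that kind of maintenance pass (my own guess at the shape, not the poster's actual script), one could build the per-remote commands like this, with the name filter mentioned later:

```python
def maintenance_cmds(remotes: list[str], pattern: str = "") -> list[list[str]]:
    """Per-remote dedupe / empty-dir prune / trash purge commands (a sketch)."""
    cmds = []
    for name in remotes:
        if pattern and pattern not in name:
            continue  # filter, e.g. "tv" keeps only remotes containing "tv"
        cmds += [
            ["rclone", "dedupe", f"{name}:"],                    # merge dupes
            ["rclone", "rmdirs", f"{name}:", "--leave-root"],    # prune empties
            ["rclone", "delete", f"{name}:",                     # purge trash
             "--drive-trashed-only", "--drive-use-trash=false"],
        ]
    return cmds
```

A caller would run each argv list with subprocess (or print them first to review), e.g. from a cron job.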


Glad it worked for you! Go comment on the issue (linked above) to get some traction on the feature request.


Hi, do you mind explaining this line, please?
"cleanremotes accepts a command line filter now. e.g. ./cleanremotes tv will only clean remotes that have tv in them"
Did you mean all the remotes whose names contain the word tv?

Using the delete command, I get a 403 API quota error. How can I solve it?

What was the command you used?
What is the exact error message?

rclone delete [p#1 username]: --drive-trashed-only --drive-use-trash=false --verbose=2 --fast-list

2020/06/13 23:30:12 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=202264815644, userRateLimitExceeded)
2020/06/13 23:30:12 DEBUG : pacer: Rate limited, increasing sleep to 1.770183715s
2020/06/13 23:30:12 DEBUG : pacer: Reducing sleep to 0s
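For what it's worth, those DEBUG lines are rclone's pacer handling the 403 for you: on a rate-limit error it grows the sleep between requests, and on a success it drops back down. A rough illustration of that pattern (my own sketch, not rclone's actual pacer code):

```python
import random

def next_sleep(current: float, rate_limited: bool,
               max_sleep: float = 16.0) -> float:
    """Grow the delay on a rate-limit error, reset it on success."""
    if rate_limited:
        # roughly double the delay with some jitter, capped at max_sleep
        grown = max(current, 0.5) * 2 * (0.75 + random.random() / 2)
        return min(grown, max_sleep)
    return 0.0  # matches the "Reducing sleep to 0s" line in the log above
```

So the retries are expected behaviour; the run just slows down until the quota stops being exceeded.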