Working with Google Photos "all requests" Quota limit (10k)?

What is the problem you are having with rclone?

Goal: copy all Google Photos content (photos and videos, ~50k items) to local storage.

I'm aware of the limitations of the Google Photos backend:

Google Photos Library API has a "General quota limit" of 10,000 requests per project per day

There appears to be no way of increasing that limit for the purposes of this exercise, even if you're willing to pay.

I created my own Google Cloud project and client ID; however, I hit the 10k limit after around 35,000 files had been copied. I then created a second project in another account, and it seems that most of the second project's request quota was burned in the 'check' process rather than in downloading files not yet downloaded: I hit the 10k limit on the second project after only about 3k additional files were downloaded.

For example, from the copy using the second project, at the point where the 10k request limit was hit:

Transferred:   	   11.851 GiB / 11.851 GiB, 100%, 120.592 KiB/s, ETA 0s
Errors:               103 (retrying may help)
Checks:             20134 / 20134, 100%
Transferred:         3742 / 13749, 27%
Elapsed time:     58m26.5s
  • There were a heap of 429 RESOURCE_EXHAUSTED errors leading up to the above, and then further requests returned 403 Forbidden.

Is there any way of invoking rclone so that it skips the 'check' phase, i.e. carries on where it left off but with a different client ID and secret?

Aside: I could try using "start_year", but it's unclear whether rclone fetches files in chronological order, so I suspect that won't work.
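
A further thought, sketched rather than tested: rclone has a global --tpslimit flag (with --tpslimit-burst) that paces all HTTP transactions. Since 10,000 requests per day works out to roughly 0.115 requests/second, capping the rate at 0.1 should in principle keep a long-running copy under the daily quota; whether Google counts every HTTP transaction against the "all requests" limit is an assumption on my part, and the copy would take days to finish:

% rclone copy --progress --create-empty-src-dirs \
    --gphotos-start-year 1900 --gphotos-include-archived \
    --tpslimit 0.1 --tpslimit-burst 10 \
    remote: ./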

Run the command 'rclone version' and share the full output of the command.

% rclone version
rclone v1.60.1
- os/version: darwin 12.4 (64 bit)
- os/kernel: 21.5.0 (arm64)
- os/type: darwin
- os/arch: arm64
- go/version: go1.19.3
- go/linking: dynamic
- go/tags: none

Which cloud storage system are you using? (eg Google Drive)

Google Photos

The command you were trying to run (eg rclone copy /tmp remote:tmp)

% rclone copy --progress --create-empty-src-dirs --gphotos-start-year 1900 --gphotos-include-archived remote: ./

The rclone config contents with secrets removed.

[remote]
type = google photos
read_only = true
start_year = 1900
include_archived = true
token = {"access_token":"...","token_type":"Bearer","refresh_token":"...","expiry":"2022-11-25T13:58:24.069736+11:00"}
client_id = ...
client_secret = ...

I think that is probably the best plan if you can. Google Photos has an API to return just the files in a given year. You could break it down into year/month as well, I guess.
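
If it helps, the Google Photos backend presents that API as virtual directories (media/by-year/<year>, media/by-month/<year>/<year>-<month>, and so on), so the per-year approach can be done with plain paths; a minimal sketch, assuming that documented layout:

% rclone lsd remote:media/by-year
% rclone copy --progress remote:media/by-year/2015 ./2015
% rclone copy --progress remote:media/by-month/2015/2015-06 ./2015-06

Each run then only lists and checks a single year (or month) of the library instead of the whole thing.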
