Google Photos rate limits

Greetings,

I started using rclone to download from Google Photos, but unless I set a tight --tpslimit I get several service errors (429 Too Many Requests). So is the best approach to use --tpslimit=1? My plan is to use this to check/copy on a regular basis (several times daily).

Now using:
rclone -vv --tpslimit=1 copy gphotos:/media/by-month/2019 down/gphotos/2019

Thanks. Mark.

The Google Drive backend has something called the "drive pacer", which is essentially a flexible rate limit that makes rclone more or less obey Google's limits on API calls.

Google Photos used to be part of Google Drive I believe, so I would assume it either has a slightly tweaked version of the pacer, or simply uses the same one. In short, I doubt such a severe limit should be necessary. The general rule is that the default settings (4 transfers, no tpslimit) should work for a remote out of the box. That said, Google Photos support is still fairly new and might have quirks and issues.

Are you using a custom OAuth key? If you never made one and are still using the shared key, that COULD be related if and when lots of other rclone users are pushing the system. It could also be irrelevant to your specific problem - but it's generally advised to make your own if you can.
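If you do set up your own key, the relevant fields end up in your rclone.conf - roughly like this sketch (the client_id and client_secret values here are placeholders; you would get real ones from the Google API Console):

```
# ~/.config/rclone/rclone.conf -- "gphotos" remote using a personal OAuth key
# (placeholder credentials - create your own in the Google API Console)
[gphotos]
type = google photos
client_id = 123456789-example.apps.googleusercontent.com
client_secret = EXAMPLE-SECRET
```

With your own key, you are no longer sharing a quota pool with every other rclone user on the default key.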

Finally, there has been a whole slew of Google Photos related posts recently, due to it being fairly new. It would be worth doing a quick search-by-date and looking to see if anyone already reported similar issues.
https://forum.rclone.org/search?q=google%20photos%20order%3Alatest_topic

Sorry that I don't have a more specific solution to offer, but I have not yet used Google Photos extensively myself, so my experience with it is limited.

A few 429 errors are normal - they are Google's way of telling rclone to slow down.

Adding --tpslimit is a good idea if you want to be kind to Google Photos.
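For example (reusing the remote and paths from the command above), --tpslimit can be combined with --tpslimit-burst to keep a low average rate while still allowing short bursts of calls:

```shell
# Average ~1 API call per second, but allow bursts of up to 10
# (paths are from the earlier example; adjust for your own remote)
rclone copy --tpslimit 1 --tpslimit-burst 10 -v \
    gphotos:media/by-month/2019 down/gphotos/2019
```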

Note also that the by-day directory is very intensive on API calls, so use the by-month or by-year directories if you can!