Should I assume rclone has been tuned for Google Drive's limits, or do I have to tune it myself?
Is it really creating too many requests and hence hitting the Google limit?
Sounds like you probably didn't configure your own client ID to be used, but since you deleted the whole help and support template, we have no idea about your config, version, or error logs.
Have you added client_id = XXX and client_secret = XXX to rclone.conf after creating the Google Drive remote?
If so, they are not being used: the client ID and secret have to be bound to the OAuth token. The easiest way to ensure this is to create the remote from scratch, providing client_id/client_secret during the configuration process.
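For reference, a fresh remote with its own client credentials can be created non-interactively, something like the sketch below (the remote name gdrive and the placeholder credentials are mine; adjust to taste):

```
# Create the remote with your own OAuth client from the start,
# so the token rclone obtains is bound to these credentials
# (this will open a browser for the OAuth consent step):
rclone config create gdrive drive \
    client_id=XXX.apps.googleusercontent.com \
    client_secret=YYY

# Verify what was stored for the remote:
rclone config show gdrive
```

If you already have a remote and only swapped the credentials in the config file afterwards, re-running the OAuth step (e.g. with rclone config reconnect gdrive:) is what actually ties the new client ID to a fresh token.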
Then, if you still see Error 403: User rate limit exceeded, you have to slow down your copy command.
Start by adding the --tpslimit 10 --tpslimit-burst 0 flags, then increase/decrease --tpslimit until the errors are gone. There is no single best value here; you have to find what works best for you.
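As a concrete starting point (the source path and remote name here are made up):

```
# Cap rclone at 10 HTTP transactions per second with no burst,
# and show progress while copying:
rclone copy /local/data gdrive:backup \
    --tpslimit 10 --tpslimit-burst 0 -P
```

If 10 still triggers 403s, halve it; if it runs clean for a while, you can nudge it back up.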
If it is really using up the limit, I wonder if I can confirm that anywhere on a Google webpage?
I may want to try asking Google to raise the limit.
Thanks.
Realistically, you never run out of Google API quota, as that's rarely the problem unless you are using the overloaded default rclone client ID.
Chances are, if you aren't seeing any API hits on the API console page, you aren't actually using the key you made. Your best bet is to make a new remote and connect it.
Additionally, Google has daily upload and download quotas: 750GB seems to be the upload limit and ~10TB the download limit. If you hit those, you wait 24 hours and there is nothing you can do. Google will not raise, alter, or change them for you. Ask away and you'll get denied.
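To put the 750GB/day figure in perspective, a quick back-of-the-envelope calculation (using the quota numbers quoted above, which are community-observed rather than official, and an assumed 100 MB/s link speed):

```python
# Rough arithmetic around the commonly quoted Drive upload quota.
UPLOAD_QUOTA_GB_PER_DAY = 750       # quoted daily upload quota
SECONDS_PER_DAY = 24 * 60 * 60

# Average upload rate that exhausts the quota in exactly one day:
sustained_mb_s = UPLOAD_QUOTA_GB_PER_DAY * 1000 / SECONDS_PER_DAY
print(f"~{sustained_mb_s:.1f} MB/s sustained fills 750 GB in 24 h")

# Conversely, on a fast 100 MB/s link the quota is gone in a few hours:
hours_to_hit = UPLOAD_QUOTA_GB_PER_DAY * 1000 / 100 / 3600
print(f"at 100 MB/s the quota is hit after ~{hours_to_hit:.1f} h")
```

In other words, anything faster than roughly 9 MB/s on average will run into the quota before the day is over, which is why bulk migrations hit it so easily.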
Very likely I have hit the 750GB daily upload limit. Is it officially stated by Google, or just known from experience? I used G Suite several years ago, and it seems there was no such limitation then.