Assuming that the setup is working as it should, that unnecessary advanced scanning not suited to cloud storage is disabled, and that there isn't any weird issue of rapidly opening and closing files going on...
Then the API will basically be a non-issue.
With default settings you will need one API call to request each 128 MB chunk for reading. This could be set even higher if needed, but even on a very high bitrate stream it means API calls will be quite infrequent when serving a single viewer.
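As a concrete illustration, here is a hedged sketch of an rclone mount using the chunked-read flags this refers to. The remote name "gdrive" and the mount point /mnt/media are placeholders for your own setup; the flags themselves are real rclone options.

```shell
# --vfs-read-chunk-size sets the size of the first ranged read per file
# (default 128M). rclone then doubles the chunk size on each subsequent
# request for the same file, up to --vfs-read-chunk-size-limit, so a long
# sequential stream costs very few API calls.
rclone mount gdrive:media /mnt/media \
  --vfs-read-chunk-size 128M \
  --vfs-read-chunk-size-limit 2G
```

Raising the initial chunk size trades slightly slower seek starts for even fewer API calls per stream.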
You can (with the default quota) make 1000 requests per 100 seconds, in other words 10/sec on average under sustained load. That should be theoretically sufficient to serve dozens of users. You will likely be bandwidth-limited before this becomes an issue.
You effectively have no cap on the number of requests per day. You can do 10/sec 24/7 for 864,000 API calls in a day. The separate daily transfer limits are 10 TB of download and 750 GB of upload.
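A quick back-of-the-envelope check of those figures, using only the quota numbers stated above:

```shell
# Default quota: 1000 requests per 100-second window.
quota_requests=1000
quota_window=100                          # seconds
rate=$((quota_requests / quota_window))   # sustained requests per second
per_day=$((rate * 24 * 60 * 60))          # calls if held 24/7
echo "${rate} req/sec -> ${per_day} API calls/day"
```

At the default 128 MB first chunk, even a fraction of that call budget covers far more streams than the 10 TB/day download cap allows.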
API limits mostly come into play when accessing loads of files, which for Plex mostly means the more in-depth scans that actually open each file to make thumbnails, fetch stream data, etc. With a large library these should therefore not run automatically but only be triggered manually (and preferably not often). It's the sort of thing you do on a maintenance day when the system will not be used much.
If you are still not convinced, you can go take a look at the API usage yourself in the Google Cloud Console (the APIs & Services dashboard for your project):
On that page you will get a nice graph and statistics about API calls, the percentage of errors (a small percentage is normal), and more.
Let me also clarify that you cannot get "banned" from the API for over-requesting. You can only run into the 1000-per-100-seconds quota. If you spam too many requests you will just get a 403 (rate limit exceeded) error back, and rclone will simply re-request a second or two later (i.e. it deals with it automatically).
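For completeness, rclone's built-in pacer is what spaces out Drive requests and backs off on those 403s. A hedged sketch of the relevant knobs (real rclone flags; the values shown are just the defaults, and the remote/mount names are placeholders), which you normally never need to touch:

```shell
# The Drive backend pacer sleeps between calls and exponentially backs
# off when it sees rate-limit errors, so quota hits self-heal.
rclone mount gdrive:media /mnt/media \
  --drive-pacer-min-sleep 100ms \
  --drive-pacer-burst 100 \
  --low-level-retries 10
```

100 ms minimum sleep with a burst of 100 keeps rclone at roughly the 10 req/sec sustained rate the quota allows.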