ncw is correct, this won’t work this way.
However, it can work if you calculate the amount of data you want to transfer beforehand and pre-define the accounts that are going to upload it. Let me put it in a narrative.
You have 10,000GB (roughly 10TB) that you want to upload, but each account is only allowed to upload 750GB per day. You create 10 service accounts; each one can push 750GB a day, so you allot 745GB per account as a safety margin: 745GB for SA1, 745GB for SA2, and so on. (10 accounts cover about 7.45TB per pass, so the remainder rolls over.) Once an account hits its 24-hour limit, you cycle to the next service account.
I haven’t tested this theory yet, but I’m working on a script to do this automatically for you. It calculates the total amount of data being uploaded, divides it into 745GB chunks, and uploads them until it’s done. I’ll let you guys know how this goes on my personal setup.
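To make the chunking step concrete, here is a minimal sketch of how the planning half of such a script might work. Everything here is hypothetical (the function names, and the idea of planning from a list of file sizes); the actual upload step is left out, since that would be done by whatever tool you use (e.g. rclone pointed at a different service account file per batch).

```python
# Hypothetical sketch: greedily group files into per-account batches
# that each stay under a daily quota. The 745 GiB figure is the safety
# margin below the 750GB/day cap discussed above.

DAILY_QUOTA = 745 * 1024**3  # bytes per service account per day

def plan_batches(file_sizes, quota=DAILY_QUOTA):
    """Split a list of file sizes (bytes) into batches whose totals
    each fit under `quota`. Batch N would be uploaded by SA number N+1."""
    batches = [[]]
    used = 0
    for size in file_sizes:
        if size > quota:
            raise ValueError("a single file exceeds the daily quota")
        if used + size > quota:
            batches.append([])   # start a new batch -> next service account
            used = 0
        batches[-1].append(size)
        used += size
    return batches

# Example with toy numbers: a 5-unit quota and three 3-unit files
# forces one file per batch, i.e. one service account each.
plan = plan_batches([3, 3, 3], quota=5)
```

Each resulting batch maps to one service account; if there are more batches than accounts, the leftover batches simply wait for the next 24-hour window.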
The goal of the script is to push 10TB of data per day, although in reality I’ll only be doing about 2TB a day.