I’m on a gigabit broadband connection and I’ve pushed 40TB to Team Drives, typically by setting “--max-transfer 748G” and cycling through multiple user credentials whenever the max-transfer limit is hit. To date that has worked well. However, I’ve now come up against a 2.5TB file which is causing me problems — it actually took me several days to realize that my copy was effectively stalling on the same file. My scheme only works if the file is smaller than 748G. I also vary the bandwidth by time of day using the rc API and core/bwlimit.
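For context, here’s a sketch of the rotation scheme. The remote names are placeholders — each would be configured with a different user’s credentials against the same Team Drive — and the commands are echoed rather than executed, since each pass takes hours:

```shell
# Placeholder remotes, one per set of user credentials:
remotes="user1-drive user2-drive user3-drive"
count=0
for remote in $remotes; do
  # Each pass stops itself just under the daily quota; the next
  # remote picks up where the previous one left off.
  echo "rclone copy /data ${remote}:team --max-transfer 748G"
  count=$((count + 1))
done
echo "passes queued: ${count}"

# During the day I throttle via the remote control API, e.g.:
#   rclone rc core/bwlimit rate=8M    # daytime cap
#   rclone rc core/bwlimit rate=off   # overnight, no limit
```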
What’s the best way to deal with files that are larger than Google’s daily upload limit? Looking at the forums, it seems controlling the bandwidth is what many people are doing. Is that the preferred means of dealing with large files, or are there other recommended approaches?
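For reference, my understanding of the arithmetic behind the bandwidth-limiting approach, assuming the oft-cited ~750GB/day upload quota (a figure from the forums, not officially documented):

```shell
# A 2.5TB file vs. a ~750GB/day quota: at full gigabit the quota is
# hit mid-file, the upload fails, and the transfer restarts from zero
# the next day -- so it never finishes.
FILE_GB=2500
QUOTA_GB=750
SECONDS_PER_DAY=86400

# Sustained rate that keeps one day's upload under the quota,
# in MB/s (using 1 GB = 1000 MB, integer math):
MAX_RATE=$(( QUOTA_GB * 1000 / SECONDS_PER_DAY ))
echo "max sustained rate: ${MAX_RATE} MB/s"

# Days needed to push the whole file at that rate (rounded up):
DAYS=$(( (FILE_GB + QUOTA_GB - 1) / QUOTA_GB ))
echo "days to finish: ${DAYS}"

# i.e. something like:  rclone copy big.file remote: --bwlimit 8M
```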