@GBoudreau Your problems with Drive started at the same time mine did. It's weird that --drive-upload-cutoff 1000T has worked for you and others but not for me. Would you mind sharing what other options (if any) you're passing to rclone? Prior to November 15, I was passing no other options to rclone.
For what it's worth, I am wrapping my Drive remote with a Crypt remote.
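In case it's relevant, here's roughly what my setup and command look like. This is only a sketch: the remote names (gdrive, gcrypt), paths, and token are placeholders, not my real config.

```
# rclone.conf (sketch; names and values are placeholders)
[gdrive]
type = drive
scope = drive
token = {"access_token":"..."}

[gcrypt]
type = crypt
remote = gdrive:encrypted
password = ***
password2 = ***
```

And the upload itself is just something like:

```
rclone copy /local/backups gcrypt:backups --drive-upload-cutoff 1000T -P
```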
I'm seeing weird results uploading a 3 GB file to Google Drive using this method...
Yeah, with my larger file I don't think this is going to work.
Yep, looks like the transfer just keeps retrying over and over:
172.217.14.202:443: wsasend: An existing connection was forcibly closed by the remote host. - low level retry 6/10
I'm on Rclone 1.55; do I have to be on Rclone 1.60?
Never mind, 1.60 works with this command, yay!
I created a single 13.55 GiB test file, then tried to upload it with --drive-upload-cutoff 1000T, but when progress reached about 90% the transferred size doubled (13.55 GiB became 27.1 GiB). This kept looping over and over without finishing or printing any error. Looks weird, right?
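For reference, the command I was testing with was along these lines (the file name and remote name here are just placeholders):

```
rclone copy /tmp/test-13.5GiB gdrive:test --drive-upload-cutoff 1000T -P -vv
```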
I've also been seeing this issue with Google Drive over the last few days. Prior to that, rclone has been rock solid for multiple years -- thanks for that!
Sure, it just seems that using --drive-upload-cutoff essentially disables chunked uploads. That only helps for operations like copy or move, though. I guess I'm wondering if there's any way to achieve similar functionality with rclone mount.
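To be concrete, something like the sketch below is what I'd want to run. The remote name and mount point are placeholders, and I haven't verified that the backend flag behaves the same way when passed to mount:

```
rclone mount gcrypt: /mnt/gdrive \
  --drive-upload-cutoff 1000T \
  --vfs-cache-mode writes
```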
Wow, that seems to have done it, haha, but I needed to restart for it to work. I'll keep tabs on this thread in case Google fixes this issue on their end. Thanks again for your help.