Varying upload speeds on Google Drive

Has anyone recently (within the last 1-2 weeks) had speed variations crop up for Google Drive uploads? I have a nightly cron job that has worked for years, uploading a couple hundred gigs per night, and I've recently started getting speeds of less than 1 MB/s when it's usually 20-50 MB/s. I can start a copy of a file, stop it after a few seconds, start it again, and get wildly different results, as seen below. I just ran these back to back.


$ rclone copy --progress .local-encrypt/TvSrY11syg1N0J-xQgEhNknb/qUpled4Ga8NNIUxonwUWmPaL/-I,PrTeuu3JGXRT8GlmslZn3/sx9zNyIbJ5aKM0KjaE3K-A95XYHOAll3f2a1ugbLaiG0oJl0InhcVFf4wCZnKfEcwmZM0ReKtiyEXKOx0Gn7Ep-u gdrive:/
Transferred:        2.434M / 856.752 MBytes, 0%, 616.529 kBytes/s, ETA 23m38s
Errors:                 0
Checks:                 0 / 0, -
Transferred:            0 / 1, 0%
Elapsed time:          4s
Transferring:
 * sx9zNyIbJ5aKM0KjaE3K-A…ZM0ReKtiyEXKOx0Gn7Ep-u:  0% /856.752M, 606.937k/s, 24m1s
^C

$ rclone copy --progress .local-encrypt/TvSrY11syg1N0J-xQgEhNknb/qUpled4Ga8NNIUxonwUWmPaL/-I,PrTeuu3JGXRT8GlmslZn3/sx9zNyIbJ5aKM0KjaE3K-A95XYHOAll3f2a1ugbLaiG0oJl0InhcVFf4wCZnKfEcwmZM0ReKtiyEXKOx0Gn7Ep-u gdrive:/
Transferred:      110.371M / 856.752 MBytes, 13%, 24.166 MBytes/s, ETA 30s
Errors:                 0
Checks:                 0 / 0, -
Transferred:            0 / 1, 0%
Elapsed time:        4.5s
Transferring:
 * sx9zNyIbJ5aKM0KjaE3K-A…ZM0ReKtiyEXKOx0Gn7Ep-u: 12% /856.752M, 23.045M/s, 32s^C
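One way to make these back-to-back runs comparable is to script short probes and pull the reported rate out of each one. This is only a sketch: the sample line below is copied from the output above, and in a real probe it would come from something like `timeout 15 rclone copy --stats-one-line <file> gdrive:/ 2>&1 | tail -n 1` (both flags exist in rclone; the file path is whatever you're testing with).

```shell
# Parse the rate field out of an rclone "Transferred:" progress line.
# The line is comma-separated: bytes, percent, rate, ETA.
sample='Transferred:        2.434M / 856.752 MBytes, 0%, 616.529 kBytes/s, ETA 23m38s'
rate=$(printf '%s\n' "$sample" | awk -F', ' '{print $3}')
printf 'run rate: %s\n' "$rate"   # run rate: 616.529 kBytes/s
```

Run in a loop, this gives a timestamped series of rates you can hand to your host instead of eyeballing `--progress` output.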

Helpful info:

  • 1 Gbps symmetric connectivity on a dedicated server.
  • Hosted at Wholesale Internet, so it's in a data center with consistent connectivity.
  • Using a paid G Suite Google Drive.
  • Using a personal API key, not the default.

Speedtest CLI is consistently giving normal speed results such as the one below.

Retrieving speedtest.net server list...
Selecting best server based on ping...
Hosted by Giant Communications, Inc. (Haviland, KS) [113.97 km]: 14.246 ms
Testing download speed................................................................................
Download: 850.16 Mbit/s
Testing upload speed......................................................................................................
Upload: 652.45 Mbit/s
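Note the units when comparing these: Speedtest reports Mbit/s while rclone reports MBytes/s, so divide by 8 before comparing. The conversion below uses the numbers above:

```shell
# 652.45 Mbit/s upstream from Speedtest, expressed in rclone's MBytes/s
awk 'BEGIN { printf "%.1f MBytes/s\n", 652.45 / 8 }'   # 81.6 MBytes/s
```

That ~81 MBytes/s is in the same range as the nightly rclone rates shown below, so the Speedtest result and the historical rclone numbers are consistent with each other.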

I do a transfer every night at the same time and it's pretty consistent:

Transferred:   	  300.149G / 300.149 GBytes, 100%, 80.770 MBytes/s, ETA 0s
Transferred:   	  382.505G / 382.505 GBytes, 100%, 95.846 MBytes/s, ETA 0s
Transferred:   	   60.208G / 60.208 GBytes, 100%, 95.960 MBytes/s, ETA 0s
Transferred:   	  217.346G / 217.346 GBytes, 100%, 91.004 MBytes/s, ETA 0s
Transferred:   	  141.844G / 142.370 GBytes, 100%, 89.216 MBytes/s, ETA 6s
Transferred:   	  142.373G / 142.373 GBytes, 100%, 89.002 MBytes/s, ETA 0s
Transferred:   	  102.412G / 102.883 GBytes, 100%, 69.413 MBytes/s, ETA 6s
Transferred:   	  102.893G / 102.893 GBytes, 100%, 69.188 MBytes/s, ETA 0s
Transferred:   	  123.427G / 123.427 GBytes, 100%, 88.992 MBytes/s, ETA 0s
Transferred:   	  148.590G / 148.590 GBytes, 100%, 81.589 MBytes/s, ETA 0s
Transferred:   	  277.503G / 277.503 GBytes, 100%, 93.611 MBytes/s, ETA 0s
Transferred:   	  477.132G / 477.132 GBytes, 100%, 89.420 MBytes/s, ETA 0s
Transferred:   	   56.672G / 56.672 GBytes, 100%, 81.899 MBytes/s, ETA 0s
Transferred:   	   78.948G / 78.948 GBytes, 100%, 75.752 MBytes/s, ETA 0s
Transferred:   	  130.869G / 130.931 GBytes, 100%, 82.107 MBytes/s, ETA 0s
Transferred:   	  130.936G / 130.936 GBytes, 100%, 81.988 MBytes/s, ETA 0s
Transferred:   	  255.083G / 256.351 GBytes, 100%, 90.391 MBytes/s, ETA 14s
Transferred:   	  256.354G / 256.354 GBytes, 100%, 89.715 MBytes/s, ETA 0s
Transferred:   	  128.726G / 128.726 GBytes, 100%, 62.104 MBytes/s, ETA 0s
Transferred:   	   65.498G / 65.498 GBytes, 100%, 78.713 MBytes/s, ETA 0s
Transferred:   	  146.199G / 146.199 GBytes, 100%, 83.889 MBytes/s, ETA 0s
Transferred:   	  136.844G / 136.844 GBytes, 100%, 78.534 MBytes/s, ETA 0s

I can confirm the same issue.
It's on Google's side. Try changing the user agent with --user-agent.

@Harry Thanks for the tip. Unfortunately I’m still seeing the same inconsistency with the --user-agent string set to Chrome.

$ rclone copy --progress /tmp/Screen\ Recording\ 2019-11-14\ at\ 8.25.46\ PM.mov gdrive:/ --user-agent "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.77 Safari/537.36"
Transferred:        5.996M / 2.082 GBytes, 0%, 708.989 kBytes/s, ETA 51m10s
Errors:                 0
Checks:                 0 / 0, -
Transferred:            0 / 1, 0%
Elapsed time:        8.6s
Transferring: 
Screen Recording 2019-11-14 at 8.25.46 PM.mov:  0% /2.082G, 703.470k/s, 51m34s 
^C

$ rclone copy --progress /tmp/Screen\ Recording\ 2019-11-14\ at\ 8.25.46\ PM.mov gdrive:/ --user-agent "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.77 Safari/537.36"
Transferred:        5.340M / 2.082 GBytes, 0%, 597.889 kBytes/s, ETA 1h41s
Errors:                 0
Checks:                 0 / 0, -
Transferred:            0 / 1, 0%
Elapsed time:        9.1s
Transferring:
Screen Recording 2019-11-14 at 8.25.46 PM.mov:  0% /2.082G, 603.984k/s, 1h0m4s
^C

$ rclone copy --progress /tmp/Screen\ Recording\ 2019-11-14\ at\ 8.25.46\ PM.mov gdrive:/ --user-agent "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.77 Safari/537.36"
Transferred:      230.559M / 2.082 GBytes, 11%, 26.589 MBytes/s, ETA 1m11s
Errors:                 0
Checks:                 0 / 0, -
Transferred:            0 / 1, 0%
Elapsed time:        8.6s
Transferring: 
Screen Recording 2019-11-14 at 8.25.46 PM.mov: 10% /2.082G, 26.000M/s, 1m13s
^C

It's more likely to be your peering or something else related to location/setup/networking/etc. if you haven't changed anything else.

I haven't seen any change, as my example from last night shows.

Transferred:       59.605G / 59.605 GBytes, 100%, 87.850 MBytes/s, ETA 0s
Errors:                 0
Checks:                30 / 30, 100%
Transferred:           15 / 15, 100%
Elapsed time:    11m34.7s

Did you do any updates kernel wise or something else along those lines?

WSI hosts out of Kansas and this configuration has worked for roughly five years with them (one year on this server). I'm trying to get WSI support to help investigate at the network level, but that's tough to do without definitive evidence of the issue. Uploading to S3 and OneDrive works as expected from this server, and Google Drive uploads work as expected from my home and work networks on a different machine with the same OAuth credentials. As for kernel upgrades, I've updated as part of trying to get this working. This server is running Ubuntu 18.04 with kernel 4.15.18-041518-generic. The issue cropped up on rclone v1.50.1, and I've updated to v1.50.2 and then to v1.51.0 as it was released this morning; it has persisted through all three versions.

That's a pretty old kernel version, but if it was working before and that hasn't changed, I'm not sure it would be relevant.

Can you share a debug log of the copy?

A debug log of two back-to-back transfers, each cancelled after about 20 seconds, is below. This server does have Canonical's Livepatch Service running, so the kernel is patched regularly. Do you think the kernel could be part of this problem? I've been looking into updating to 5.3 or 5.4 but am not sure how comfortable I am with a kernel that isn't a mainline release from Ubuntu for 18.04. This is a headless box with no straightforward recourse if it goes sideways, but if an upgrade (or a kernel rollback) might resolve the issue, I'm open to it, as I've got 1.6 TB of data waiting to upload and it's continuing to pile up.

$ rclone copy --progress --log-level DEBUG /tmp/Screen\ Recording\ 2019-11-14\ at\ 8.25.46\ PM.mov gdrive:/
2020/02/01 14:36:56 DEBUG : rclone: Version "v1.51.0" starting with parameters ["rclone" "copy" "--progress" "--log-level" "DEBUG" "/tmp/Screen Recording 2019-11-14 at 8.25.46 PM.mov" "gdrive:/"]
2020/02/01 14:36:56 DEBUG : Using config file from "/home/plex/.config/rclone/rclone.conf"
2020-02-01 14:36:56 DEBUG : gdrive: Loaded invalid token from config file - ignoring
2020-02-01 14:36:56 DEBUG : gdrive: Saved new token in config file
2020-02-01 14:36:57 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Need to transfer - File not found at Destination
2020-02-01 14:36:57 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 0 length 8388608
2020-02-01 14:37:11 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 8388608 length 8388608
Transferred:       12.121M / 2.082 GBytes, 1%, 619.089 kBytes/s, ETA 58m25s
Transferred:            0 / 1, 0%
Elapsed time:        20.0s
Transferring:
 * Screen Recording 2019-11-14 at 8.25.46 PM.mov:  0% /2.082G, 622.495k/s, 58m6s
 ^C

$ rclone copy --progress --log-level DEBUG /tmp/Screen\ Recording\ 2019-11-14\ at\ 8.25.46\ PM.mov gdrive:/
2020/02/01 14:37:20 DEBUG : rclone: Version "v1.51.0" starting with parameters ["rclone" "copy" "--progress" "--log-level" "DEBUG" "/tmp/Screen Recording 2019-11-14 at 8.25.46 PM.mov" "gdrive:/"]
2020/02/01 14:37:20 DEBUG : Using config file from "/home/plex/.config/rclone/rclone.conf"
2020-02-01 14:37:20 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Need to transfer - File not found at Destination
2020-02-01 14:37:20 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 0 length 8388608
2020-02-01 14:37:21 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 8388608 length 8388608
2020-02-01 14:37:22 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 16777216 length 8388608
2020-02-01 14:37:22 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 25165824 length 8388608
2020-02-01 14:37:22 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 33554432 length 8388608
2020-02-01 14:37:22 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 41943040 length 8388608
2020-02-01 14:37:23 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 50331648 length 8388608
2020-02-01 14:37:23 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 58720256 length 8388608
2020-02-01 14:37:23 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 67108864 length 8388608
2020-02-01 14:37:23 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 75497472 length 8388608
2020-02-01 14:37:24 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 83886080 length 8388608
2020-02-01 14:37:24 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 92274688 length 8388608
2020-02-01 14:37:24 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 100663296 length 8388608
2020-02-01 14:37:25 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 109051904 length 8388608
2020-02-01 14:37:25 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 117440512 length 8388608
2020-02-01 14:37:25 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 125829120 length 8388608
2020-02-01 14:37:26 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 134217728 length 8388608
2020-02-01 14:37:26 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 142606336 length 8388608
2020-02-01 14:37:26 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 150994944 length 8388608
2020-02-01 14:37:26 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 159383552 length 8388608
2020-02-01 14:37:27 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 167772160 length 8388608
2020-02-01 14:37:27 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 176160768 length 8388608
2020-02-01 14:37:27 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 184549376 length 8388608
2020-02-01 14:37:27 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 192937984 length 8388608
2020-02-01 14:37:28 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 201326592 length 8388608
2020-02-01 14:37:28 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 209715200 length 8388608
2020-02-01 14:37:28 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 218103808 length 8388608
2020-02-01 14:37:28 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 226492416 length 8388608
2020-02-01 14:37:29 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 234881024 length 8388608
2020-02-01 14:37:29 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 243269632 length 8388608
2020-02-01 14:37:29 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 251658240 length 8388608
2020-02-01 14:37:30 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 260046848 length 8388608
2020-02-01 14:37:30 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 268435456 length 8388608
2020-02-01 14:37:30 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 276824064 length 8388608
2020-02-01 14:37:30 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 285212672 length 8388608
2020-02-01 14:37:31 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 293601280 length 8388608
2020-02-01 14:37:31 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 301989888 length 8388608
2020-02-01 14:37:31 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 310378496 length 8388608
2020-02-01 14:37:31 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 318767104 length 8388608
2020-02-01 14:37:32 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 327155712 length 8388608
2020-02-01 14:37:32 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 335544320 length 8388608
2020-02-01 14:37:32 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 343932928 length 8388608
2020-02-01 14:37:32 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 352321536 length 8388608
2020-02-01 14:37:33 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 360710144 length 8388608
2020-02-01 14:37:33 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 369098752 length 8388608
2020-02-01 14:37:33 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 377487360 length 8388608
2020-02-01 14:37:34 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 385875968 length 8388608
2020-02-01 14:37:34 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 394264576 length 8388608
2020-02-01 14:37:34 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 402653184 length 8388608
2020-02-01 14:37:34 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 411041792 length 8388608
2020-02-01 14:37:35 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 419430400 length 8388608
2020-02-01 14:37:35 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 427819008 length 8388608
2020-02-01 14:37:35 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 436207616 length 8388608
2020-02-01 14:37:36 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 444596224 length 8388608
2020-02-01 14:37:36 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 452984832 length 8388608
2020-02-01 14:37:36 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 461373440 length 8388608
2020-02-01 14:37:36 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 469762048 length 8388608
2020-02-01 14:37:37 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 478150656 length 8388608
2020-02-01 14:37:37 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 486539264 length 8388608
2020-02-01 14:37:37 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 494927872 length 8388608
2020-02-01 14:37:37 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 503316480 length 8388608
2020-02-01 14:37:38 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 511705088 length 8388608
2020-02-01 14:37:38 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 520093696 length 8388608
2020-02-01 14:37:38 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 528482304 length 8388608
2020-02-01 14:37:39 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 536870912 length 8388608
2020-02-01 14:37:39 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 545259520 length 8388608
2020-02-01 14:37:39 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 553648128 length 8388608
2020-02-01 14:37:39 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 562036736 length 8388608
2020-02-01 14:37:40 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 570425344 length 8388608
2020-02-01 14:37:40 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 578813952 length 8388608
2020-02-01 14:37:40 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 587202560 length 8388608
2020-02-01 14:37:41 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 595591168 length 8388608
2020-02-01 14:37:41 DEBUG : Screen Recording 2019-11-14 at 8.25.46 PM.mov: Sending chunk 603979776 length 8388608
Transferred:          584M / 2.082 GBytes, 27%, 28.205 MBytes/s, ETA 54s
Transferred:            0 / 1, 0%
Elapsed time:        20.7s
Transferring:
 * Screen Recording 2019-11-14 at 8.25.46 PM.mov: 27% /2.082G, 28.055M/s, 55s
 ^C
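The two logs are actually a nice apples-to-apples measurement. Each "Sending chunk" is a fixed 8388608 bytes, which is rclone's default 8 MiB `--drive-chunk-size`. In the slow run, the second chunk doesn't start until 14 seconds after the first (8 MiB in ~14 s is roughly 600 kBytes/s, matching the progress line); in the fast run, chunks go out about four per second:

```shell
# 8388608-byte chunks at 4 chunks/s, expressed in MiB (2^20 bytes)
awk 'BEGIN { printf "%d MiB/chunk, 4 chunks/s = %d MiB/s\n", 8388608/2^20, 4*8388608/2^20 }'
# 8 MiB/chunk, 4 chunks/s = 32 MiB/s
```

That ~32 MiB/s agrees with the ~28 MBytes/s the fast run reports, so the throttling is in how fast each chunk drains over the wire, not in rclone's chunking.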

That kernel is from November 2018, so not very recent.

I've avoided the 5.x kernels until they're more stable, which is why I run Debian; stability and consistency are more important to me.

Updated to 5.3.0-28-generic with no changes. WSI is still looking into it.

I can confirm the same problem. For the past few days (maybe a few weeks already) I've also been experiencing obvious upload throttling on my end.

I'm using a home fiber connection (300/150 Mbps, if that matters) and have consistently hit the cap on uploads ever since I set up rclone back in April 2019. I'm using a Google Team Drive with "unlimited storage" and have over 4 TB of data already stored.

I have an organization account set up solely for this Google Drive environment. I enabled the Drive API in the Cloud Console and created a project solely for the purpose of giving rclone its own key, then created a key and credentials (full ownership). It was working wonders and uploads were blazing fast.

Then, all of a sudden, the maximum I get is 3x ~450 Kbps transfers (roughly 1.5 Mbps, close to the speed @dabigc has reported in his attempts). I have tried three different VPN services with high throughput (I could guarantee at least half my maximum upload through those VPNs and have validated uploads to other servers and services), but the result is the same hard cap of ~450 Kbps per transfer nonetheless.

I have also tried creating another API client_id. To no avail.

Then I added another Google account to the drive roster and, within that completely new (personal) account, created a Cloud Console project, enabled the Drive API, created an OAuth application, created yet another brand-new API key, and re-configured the rclone remotes. To no avail; same upload throttling.

I have also tried using ANOTHER Google Team Drive that my organization account has access to (and which was uploading at max speed until around the same time the first one stopped), and uploads there are also limited to ~450 Kbps per transfer.

This may sound ridiculous, but if you change those files' names (e.g., add some suffix, like file_test.mkv), then you will get max speed again.

Already tried that. Same results.

WSI has narrowed the issue down to one of their backbone providers and cut them out of the path while they work to resolve this. Speeds went back to normal as soon as they made this change, which confirms the issue was isolated to the network and outside of my control.

Nice. That makes the most sense based on your story too!

Happy to see they isolated it; that's impressive.

Did they give you the name of the backbone provider with issues? I'm having similar issues to what you were experiencing, and before reaching out to my provider it would be nice to have the name, to see whether my provider uses the same "backbone provider with issues" as well.

They didn't volunteer the provider info but I've asked if they can share it with me. If they do, I'll report back with it.

The problem was affecting HE's peering at the Equinix DA1 exchange; Hurricane Electric (HE) was the provider in question.