Migrating from Google Drive to Dropbox was proceeding without error, but only at 50MiB/s, so I added a second computer/internet connection to the mix.
I started getting a lot of
NOTICE: too_many_requests/.: Too many requests or write operations. Trying again in 300 seconds.
messages, so I moved from sync to async batch mode, which didn't seem to do much. Should I lower --transfers, or does that not matter in sync mode?
edit: More specifically, if --transfers is lower than --dropbox-batch-size, how can the batching work? Alternatively, clearly asking such a silly question means I've misunderstood the concept.
edit2: Stopping it, waiting 5-10 minutes, and restarting seems to always fix it, BUT letting the automated retry timer restart it seems to never work... why? Probably because the automatic timer assumes I'm not targeting the same Dropbox account from two VPSes at once. I wonder if there's a way to adjust the too-many-requests wait timer so I can run 2 VPSes unattended; it doubles my speed but breaks after about an hour because of this.
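For what it's worth, rclone exposes these knobs directly. Here's a hedged sketch (remote names and flag values are my own illustrative guesses, not tuned recommendations) pairing async batching with the global --tpslimit throttle to stay under Dropbox's per-account rate limit:

```shell
# Sketch only: remote names and values are assumptions, tune for your setup.
# --transfers controls parallel uploads; --dropbox-batch-size controls how many
# completed uploads are committed per batch. If --transfers is smaller than the
# batch size, the batch just fills more slowly and is flushed when full or when
# --dropbox-batch-timeout expires, so the combination still works.
rclone sync gdrive: dropbox: \
  --transfers 8 \
  --dropbox-batch-mode async \
  --dropbox-batch-size 100 \
  --dropbox-batch-timeout 10s \
  --tpslimit 12 \
  --progress
```

Since Dropbox rate limits apply per account, two VPSes writing to the same account share one budget, so roughly halving --tpslimit on each machine may help more than tweaking retry timers.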
Did you create your own Dropbox app ID and secret for each? Otherwise you are using rclone's defaults and sharing them with everyone else who doesn't create their own.
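For anyone unsure what that looks like in practice: after registering an app in the Dropbox App Console, you put its key/secret into the remote so you get your own rate-limit budget. A sketch of the resulting rclone.conf stanza (the values are placeholders, and `dropbox` is just an example remote name):

```
[dropbox]
type = dropbox
client_id = YOUR_APP_KEY
client_secret = YOUR_APP_SECRET
token = {"access_token":"...","expiry":"..."}
```

You can also set client_id/client_secret interactively via `rclone config` when creating or editing the remote.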
I am also a victim of the change in Google Drive's policy, and I am preparing to switch to Dropbox. The problem is that, given the large volume of data I have accumulated and the speed of my internet connection, it could take me more than a year to transfer everything. Do you know of a server provider that I can use to download/encrypt/upload files from Google Drive to Dropbox, which can provide me with decent speeds without costing too much?
This seedbox is different from other seedbox providers in that you can install Ubuntu on the box and use it to transfer your files from Google Drive to Dropbox.
I've bitten the bullet and gotten a box.com subscription after LayerOnline's cancellation. Using rclone's chunker overlay on the lowest unlimited plan, speeds are quite incredible. I have hit no limitations (besides my source), and am currently using --transfers=50 without any API errors.
I should have roughly 20TB transferred by this time tomorrow, and will update on results.
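For anyone curious how the chunker setup looks: chunker is an rclone overlay remote that splits large files into parts before they reach the backend, which matters because Box caps individual file sizes. A sketch of the config; the remote names, path, and chunk size are my assumptions, not the poster's actual settings:

```
[box]
type = box
# ...oauth token etc. from `rclone config`...

[box-chunked]
type = chunker
remote = box:media
chunk_size = 4Gi
```

You then point your transfers at `box-chunked:` instead of `box:`, and chunker transparently splits and reassembles files on the way through.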
Done a Webtropia VPS; works fine, great upload speed.
Still need to understand what happens when I reach the 40TB/month traffic cap: will it stop working? Will they bill me more? Is 40TB combined inbound and outbound traffic?
Now I have another issue: I uploaded about 5TB to Dropbox (currently on a trial with 3 licenses) and I receive
ERROR : XXXXXXXX : Not deleting source as copy failed: upload failed: path/insufficient_space/...
but nowhere in the Dropbox console can I see any error message about running out of space; in the billing section I see
Usage
Your plan includes as much space as you need and finite API calls. Contact support if you have any questions.
I didn't use the Dropbox Advanced plan's trial, to avoid its storage limits.
Within the first month I hit 100TB and further storage was temporarily put on hold; after repeated asking they refunded the first month's payment as an apology.
I used rclone via a seedbox to transfer everything. When I reached the cap, I selectively deleted some large folders and continued transferring the rest. The trick is that you can undelete within 180 days, so once they increase the storage cap, I'll undelete the less important folders.
The rclone command I used transfers files most-recently-modified first, so I have the most recent stuff available quickly: `rclone sync google: dropbox:/ --ignore-existing --progress --order-by modtime,desc`
So far I'm only using 3 accounts but I'll add more once I feel confident that Dropbox is a good new home.
Feel free to ask me questions if you are in a similar situation.
40TB is upload; they only count upload. You get charged per TB once you exceed 40TB. In the server menu under bandwidth you can set a limit at which they automatically throttle you to 200Mbit so they don't charge you extra.
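You can also cap this from the rclone side so a single run never blows through the 40TB allowance. A hedged sketch using the global --bwlimit and --max-transfer flags (remote names and values are illustrative, not recommendations):

```shell
# 40TB/month averages out to roughly 15 MB/s sustained, so a 12 MiB/s cap
# leaves headroom. Note --max-transfer is a per-run cap, not per-month;
# rclone stops cleanly once that much data has been transferred.
rclone sync gdrive: dropbox: \
  --bwlimit 12M \
  --max-transfer 35T \
  --progress
```

Combining the provider-side throttle with --max-transfer gives a double safety net against overage charges.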