I'm trying to start uploading ~600K files (350TB total) to an Enterprise Box.com account. Box.com Enterprise accounts are limited to 100K API calls per month. I have not purchased the Box account yet because I wanted to check if it was even viable for my use case first. If the API limiter caps me after checking the timestamps on 100K files, I'll have to find another solution.
Run the command 'rclone version' and share the full output of the command.
v1.63.0
Which cloud storage system are you using? (eg Google Drive)
Box
The command you were trying to run (eg rclone copy /tmp remote:tmp)
Yes, I've confirmed with my Box sales rep that they're fine with this amount of data. The source files are located on local storage on the same LAN as the host that will run rclone (storage is mounted on that host via SMB).
I'm not sure whether Box supports --fast-list; I can check with them (unless someone here happens to know). That does seem like a viable option, though.
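For anyone checking the same thing locally: rclone can report what a backend supports. Assuming a remote named box: is already configured, something like the following should show whether ListR (the call behind --fast-list) is available; I haven't verified the exact output against Box myself.

rclone backend features box:

Look for "ListR" under "Features" in the JSON it prints.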
well, there are multiple workarounds for that, such as filters.
let's take this simplified example.
--- the initial transfer has completed.
--- rclone is run once a day.
--- within the last 24 hours, a single source file is modified.
in that case, only one source file needs to be compared and copied to the dest, at the cost of a few api calls.
rclone copy ./source box: --max-age=24h
with that command, rclone will list the source, find the files modified within the last 24 hours, and check+copy only those modified files against box.com.
rclone copy ./source box: --max-age=24h --no-check-dest --retries=1
with that command, api calls are reduced further by not checking the dest; rclone simply copies the files.
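just a sketch of an optional extra step: you can preview what a filtered run would pick up before it copies anything, e.g.

rclone copy ./source box: --max-age=24h --dry-run -vv

--dry-run makes no changes on the dest and -vv logs which files would be copied. note that without --no-check-dest it still lists the dest, so it is not completely free of api calls. for the real runs you could also add something like --tpslimit=4 to pace the http transactions, though the right value depends on your limits.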
Interesting idea. What if that new file (or set of files) took 50 hours to upload, and at the 24-hour mark even more new files were created? I imagine Rclone would then miss those new files on its next run because they're 26 hours old. Is that correct or am I missing something?
you would need to adjust the time period based on your specific use case and total api usage.
perhaps --max-age=72h would work.
and there are some possible edge cases that rclone might miss.
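one possible safety net, just my suggestion and untested at your scale: run an occasional unfiltered pass, say weekly or monthly, to catch anything the filtered daily runs missed, e.g.

rclone copy ./source box: --tpslimit=2

a full pass lists everything on both sides, so it is the most expensive in api calls, which is why you would run it rarely and pace it with --tpslimit.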
I'm going to follow the advice in your first post and make a trial account to see how things look. I've heard reports that normal Rclone operations might not count against the monthly API limit, so I want to experiment a bit. I'll report back here for the benefit of future adventurers.
It appears that normal rclone operations do not count against the monthly API limit. I created a 14-day trial account, uploaded a few thousand files, and ran a Platform Activity Report to check API call usage; the report came back empty, indicating no API calls had been made.
It seems that limit is only for custom applications that need a specific API key to work. Not sure I totally understand the difference between that and what rclone is doing, but hey, problem solved!
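For what it's worth, my understanding (not confirmed by Box): by default rclone talks to Box through rclone's own shared OAuth application rather than a custom app you register yourself, which may be why nothing shows up under "custom applications". If you ever wanted the traffic attributed to your own app, the box backend does accept your own credentials in the remote config, roughly like this (placeholder values):

[box]
type = box
client_id = <your app client id>
client_secret = <your app client secret>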
in my experience, such statistics are not real-time but aggregated over a time period.
and based on the link you shared, that seems to be the case with box.
so i would wait 24+ hours and check again.
I'm in the same situation as you: I also get a blank report, but I don't trust it. Given how much I have uploaded, I think it is impossible that no API calls were made.
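One client-side sanity check (untested, just a sketch): run a small copy with request dumping enabled and count the HTTP transactions rclone actually makes, e.g.

rclone copy ./smalltest box:apitest -vv --dump headers 2>&1 | grep -c "HTTP REQUEST"

The paths here are just placeholders. Assuming the debug output still marks each request with an "HTTP REQUEST" line (this may vary by version), the count gives a rough per-file API cost to compare against whatever the report eventually shows.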
I've been running my upload job on a Box Business Plus eval plan (50K API calls per month). The admin console indicates that rclone has nearly 70K "activities" and the upload is still running with no errors.
This is why I was worried about even trying Box.com: I looked at the API calls for my rclone Google Projects and they average 6-7 million calls/month. I have about the same amount of data as you and have moved ~228TB to Dropbox so far.
The other thing about Box.com is they have that 5GB max file size limit. Are you getting around that with rclone's new feature to break apart the files? Last I knew, that feature was still in beta; I'm interested in knowing if it works well for streaming 4K files.
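For reference, the file-splitting feature mentioned above is presumably the chunker overlay remote, which wraps another remote and transparently splits large files into fixed-size chunks. A minimal, untested config sketch (the remote name, path, and chunk size are just examples, not something verified against Box's 5GB limit):

[box-chunked]
type = chunker
remote = box:archive
chunk_size = 4G

You would then copy to box-chunked: instead of box:. Note that files land on Box as multiple chunk objects, so they are only readable back through the chunker remote.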
According to the reply from this forum user, if you exceed the API call limit, Box will not stop you, but there may be additional fees. Have you encountered this situation?