What is the problem you are having with rclone?
My educational institution is limiting the total storage pool on Google Drive Edu. They asked me to tidy up and shrink my drive, so I'm trying to migrate to a Workspace Enterprise Drive. The problem is one huge folder (Arq backups) with >10TB of data and 1M+ files/folders that I'm unable to copy.
I cannot move files or transfer ownership as it is outside of the university's security setup. I have to copy.
I've been copying 750GB daily. The archive is still incomplete, and rclone eventually exits with an error:
ERROR : [redacted]: Failed to copy: googleapi: Error 403: The limit for this folder's number of children (files and folders) has been exceeded., numChildrenInNonRootLimitExceeded
- os/version: darwin 13.1 (64 bit)
- os/kernel: 22.2.0 (x86_64)
- os/type: darwin
- os/arch: amd64
- go/version: go1.19.4
- go/linking: dynamic
- go/tags: none
Which cloud storage system are you using? (eg Google Drive)
Workspace Edu and Enterprise Plus Google Drive
The command you were trying to run (eg
rclone copy /tmp remote:tmp)
rclone copy "Drive,shared_with_me:Arq Backup Data" "Drive:Arq Backup Data" -P --checkers=40 --transfers=20 --drive-server-side-across-configs --fast-list -vv --stats=30s
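Since the daily 750GB quota is involved, a quota-aware variant can stop cleanly at the limit and simply be re-run each day (a sketch; `--max-transfer`, `--cutoff-mode` and `--drive-stop-on-upload-limit` are standard rclone flags):

```shell
# Re-run daily (e.g. from cron). --cutoff-mode soft lets in-flight
# transfers finish, and --drive-stop-on-upload-limit makes rclone exit
# when Google reports the daily upload quota error.
rclone copy "Drive,shared_with_me:Arq Backup Data" "Drive:Arq Backup Data" \
  --max-transfer 750G --cutoff-mode soft \
  --drive-stop-on-upload-limit \
  -P --stats=30s
```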
I've already tried with no effect on the error:
- successfully deduplicating the destination,
- keeping destination at root level and not nested deeper.
Time is precious. I will appreciate any thoughts.
Looks like you probably want to use the WebUI and clean that up as the API won't do it.
Resolve a 403 error: Number of items in folder was exceeded
numChildrenInNonRootLimitExceeded error occurs when the limit for a folder's number of children (folders, files, and shortcuts) has been exceeded. There's a 500,000 item limit for folders, files, and shortcuts directly in a folder. Items nested in subfolders don't count against this 500,000 item limit. For more information on Drive folder limits, refer to Folder limits in Google Drive.
How would you approach it practically? I'm trying to copy the folder fresh with another account. But besides deleting and re-trying would you have any advice?
I would hire a VM with enough storage and use rclone to transfer the directory to the VM.
Then re-organise the directory as necessary.
I'd then upload the re-organised directory with rclone.
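For the re-organising step, a minimal Python sketch of one possible approach: shard a flat directory's files into numbered subfolders so that no single folder exceeds a cap (the `shard_directory` helper and the `shard-NNNN` naming are illustrative assumptions, not part of rclone; in practice the cap would be below Google's 500,000 limit):

```python
import shutil
from pathlib import Path

def shard_directory(src: str, cap: int) -> None:
    """Move the files directly inside `src` into numbered subfolders
    (shard-0000, shard-0001, ...), each holding at most `cap` files."""
    root = Path(src)
    # Snapshot the file list first so newly created shard folders
    # are not re-scanned while we move things.
    files = sorted(p for p in root.iterdir() if p.is_file())
    for i, f in enumerate(files):
        shard = root / f"shard-{i // cap:04d}"
        shard.mkdir(exist_ok=True)
        shutil.move(str(f), str(shard / f.name))
```

After sharding locally, the whole tree can be uploaded with a single `rclone copy` since the limit only applies to direct children of a folder.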
It's peculiar the files are just fine on one Google Drive and "won't fit" on another.
Google probably changed their policy at some point in the past but left existing directories alone. That would be my guess.
Does the source directory have > 500,000 entries?
Way over, but nested in multiple directories. I don't know of an easy way to find out whether any particular directory holds over 500K objects.
You could use rclone lsf -R to make a recursive listing of the directory. I'd probably do that and analyse it with a little Python program.
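A small Python sketch of that analysis, assuming the listing was saved with something like `rclone lsf -R remote:path > listing.txt` (with `lsf -R`, directory entries end in `/` and paths are relative to the listed root; the function names here are illustrative):

```python
from collections import Counter

def count_children(listing_lines):
    """Count direct children (files + folders) per directory from an
    `rclone lsf -R` listing, where directory entries end with '/'.
    The listed root itself is represented by the empty string ''."""
    counts = Counter()
    for line in listing_lines:
        path = line.rstrip("\n").rstrip("/")
        if not path:
            continue
        # The parent is everything before the last '/', or the root.
        parent = path.rsplit("/", 1)[0] if "/" in path else ""
        counts[parent] += 1
    return counts

def over_limit(listing_lines, limit=500_000):
    """Return only the directories that exceed Drive's per-folder limit."""
    return {d: n for d, n in count_children(listing_lines).items() if n > limit}
```

Running `over_limit(open("listing.txt"))` would then point straight at the offending folder(s).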