Couldn't list files: 413 Request Entity Too Large on copying

What is the problem you are having with rclone?

I cannot use rclone to copy files from Microsoft SharePoint; apparently some folders are too large and rclone cannot list them.

Run the command 'rclone version' and share the full output of the command.

rclone v1.59.0
- os/version: ubuntu 22.04 (64 bit)
- os/kernel: 5.15.0-1011-aws (x86_64)
- os/type: linux
- os/arch: amd64
- go/version: go1.18.3
- go/linking: static
- go/tags: none

Which cloud storage system are you using? (eg Google Drive)

OneDrive (SharePoint), trying to copy to local.

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone copy -vv --files-from files.list.test remote: ./test
# It also fails with the following
rclone copy -vv remote: ./test

The rclone config contents with secrets removed.

[remote]
type = webdav
url = https://example.sharepoint.com/sites/files/
vendor = sharepoint
user = file_copier@example.onmicrosoft.com
pass = <redacted>

A log from the command with the -vv flag

BigFolder: error reading source directory: couldn't list files: 413 Request Entity Too Large

Is there any option that chunks the listing output? Can I avoid listing in any way?

Actually the error is slightly different when I don't use a file list. I get:

Failed to create file system for "remote:BigFolder": read metadata failed: 413 Request Entity Too Large

Do these folders have lots and lots of entries in them? Do you know how many?

Rclone should be doing the listings in chunks already, but you can vary the chunk size to see if this helps:

  --onedrive-list-chunk int          Size of listing chunk (default 1000)

Try halving it until it works.
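For example, applied to the original copy command it would look something like this (a sketch, assuming the flag is honoured by this remote):

rclone copy -vv --onedrive-list-chunk 500 --files-from files.list.test remote: ./test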

Unfortunately I do not control the OneDrive server; I just have credentials for it and the list of files I should copy. The list has around 80k unique files split across two folders. I tried using --onedrive-list-chunk 500 and I still get the same error.

Thanks for the help btw

I also tested with smaller values, like --onedrive-list-chunk 1, and I still get the same error :confused:

Can you do rclone lsf -R remote: -vv --dump bodies and post the output here? I'm assuming that is going to fail in the same way. It may have sensitive file names in it which need redacting, but it shouldn't have any auth info.
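For reference, the full command with the debug output captured to a file (the log file name here is just an example; rclone writes its log to stderr by default):

rclone lsf -R remote: -vv --dump bodies 2> lsf-dump.log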

I guess there's not much more there; it seems Microsoft just returns 413 :slightly_frowning_face:

2022/08/02 14:35:56 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2022/08/02 14:35:57 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2022/08/02 14:35:57 DEBUG : HTTP REQUEST (req 0xc0000a2f00)
2022/08/02 14:35:57 DEBUG : PROPFIND /sites/BigFolder HTTP/1.1
Host: example.sharepoint.com
User-Agent: rclone/v1.59.0
Cookie: <cookie monster>
Depth: 1
Referer: https://example.sharepoint.com/sites/BigFolder/
Accept-Encoding: gzip

2022/08/02 14:35:57 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2022/08/02 14:35:57 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2022/08/02 14:35:57 DEBUG : HTTP RESPONSE (req 0xc0000a2f00)
2022/08/02 14:35:57 DEBUG : HTTP/2.0 413 Request Entity Too Large
Accept-Ranges: bytes
Cache-Control: private,max-age=0
Content-Security-Policy: frame-ancestors 'self' teams.microsoft.com *.teams.microsoft.com *.skype.com *.teams.microsoft.us local.teams.office.com *.powerapps.com *.yammer.com *.officeapps.live.com *.office.com *.stream.azure-test.net *.microsoftstream.com *.dynamics.com *.microsoft.com;
Date: Tue, 02 Aug 2022 14:35:57 GMT
Expires: Mon, 18 Jul 2022 14:35:57 GMT
Microsoftsharepointteamservices: 16.0.0.22713
Ms-Cv: oFcSj11gAECy7K237HR1qQ.0
P3p: CP="ALL IND DSP COR ADM CONo CUR CUSo IVAo IVDo PSA PSD TAI TELo OUR SAMo CNT COM INT NAV ONL PHY PRE PUR UNI"
Public-Extension: http://schemas.microsoft.com/repl-2
Request-Id: 8f1257a0-605d-4000-b2ec-adb7ec7475a9
Set-Cookie: <cookie monster>; domain=sharepoint.com; path=/; SameSite=None; secure; HttpOnly
Set-Cookie: <cookie monster>; path=/; SameSite=None; secure; HttpOnly
Set-Cookie: <cookie monster>; domain=sharepoint.com; path=/; SameSite=None; secure; HttpOnly
Set-Cookie: FedAuth=<cookie monster>; path=/; SameSite=None; secure; HttpOnly
Spiislatency: 1
Sprequestduration: 98
Sprequestguid: 8f1257a0-605d-4000-b2ec-adb7ec7475a9
Strict-Transport-Security: max-age=31536000
X-1dscollectorurl: https://eu-mobile.events.data.microsoft.com/OneCollector/1.0/
X-Ariacollectorurl: https://browser.pipe.aria.microsoft.com/Collector/3.0/
X-Cache: CONFIG_NOCACHE
X-Content-Type-Options: nosniff
X-Databoundary: None
X-Frame-Options: SAMEORIGIN
X-Ms-Invokeapp: 1; RequireReadOnly
X-Msdavext_error: 1966171; The%20attempted%20operation%20is%20prohibited%20because%20it%20exceeds%20the%20list%20view%20threshold%2e
X-Msedge-Ref: Ref A: C70A14F814D045568AE4C16C028C9A2F Ref B: DB3EDGE3217 Ref C: 2022-08-02T14:35:57Z
X-Powered-By: ASP.NET
X-Sharepointhealthscore: 2
Content-Length: 0

2022/08/02 14:35:57 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2022/08/02 14:35:57 Failed to create file system for "remote:BigFolder": read metadata failed: 413 Request Entity Too Large

Maybe the only insightful line is this one (it URL-decodes to "The attempted operation is prohibited because it exceeds the list view threshold."):

The%20attempted%20operation%20is%20prohibited%20because%20it%20exceeds%20the%20list%20view%20threshold%2e

That appears to be using the WebDAV protocol, not the OneDrive one?

Maybe the list view threshold is configurable by the admin? (The SharePoint list view threshold defaults to 5000 items, which a folder holding ~80k files would easily exceed.)
