After using rclone successfully for the last few years to pull data off of SharePoint, I am getting a 403 Forbidden when I run lsf. It worked on Friday, and today it is not working. I logged into the website in an incognito window using the username and password, copied the URL from the rclone remote config, and was able to retrieve the page with no problem. Here is the remote config command:
rclone config create mqms webdav url "Documents/Shared Documents/Forms/AllItems.aspx?id=/sites/FieldEng/ContPM/Files/ATOM Node Data/2025 MQMS Equip/2025 MQMS Equip Page- 09 Sep/" vendor "sharepoint" user pass --non-interactive
The program hasn't changed since its inception. Exact error returned:
rclone lsf --absolute --files-only --max-age 10h --min-age 5m mqms:
2025/09/08 18:18:03 ERROR : : error listing: couldn't list files: 403 FORBIDDEN: 403 Forbidden
2025/09/08 18:18:03 Failed to lsf with 2 errors: last error was: error in ListJSON: couldn't list files: 403 FORBIDDEN: 403 Forbidden
Run the command 'rclone version' and share the full output of the command.
rclone v1.57.0-DEV
os/version: redhat 8.10 (64 bit)
os/kernel: 4.18.0-553.69.1.el8_10.x86_64 (x86_64)
os/type: linux
os/arch: amd64
go/version: go1.16.12
go/linking: dynamic
go/tags: none
Which cloud storage system are you using? (eg Google Drive) Microsoft SharePoint
Please run 'rclone config redacted' and share the full output. If you get command not found, please make sure to update rclone.
I can't confirm that. In the Windows world everything works fine for me. I have the same issue as stated by @asdffdsa. My current workaround is a Windows machine running PowerShell scripts. This is only a workaround; I want to have it on a Linux device.
The OneDrive protocol doesn't play well with my tenant. It will only find subfolders two levels deep into my Documents folder, in my case most likely due to the large number of subfolders present.
I have not seen a response since posting the rclone config redacted output, but I have tried the copy command instead of lsf. I get a new message with that:
2025/09/12 14:24:06 CRITICAL: Failed to create file system for "mqms:MQMS Equip Page 09_11_2025.xlsx": read metadata failed: 403 FORBIDDEN: 403 Forbidden
I have also worked with my SharePoint admin. He verified that WebDAV is enabled and that my user has Contribute access.
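One way to check WebDAV access independently of rclone is to issue the same kind of request it makes for a listing: a PROPFIND with Depth: 1. Below is a minimal stdlib sketch; the host, path, and cookie values are placeholders, not the poster's real site. A 403 from this call would reproduce the error outside rclone, pointing at the server/tenant rather than the client:

```python
import http.client

def propfind_headers(cookie: str) -> dict:
    # Depth: 1 asks for the collection's immediate children,
    # which corresponds to an `rclone lsf` listing.
    return {"Depth": "1", "Cookie": cookie}

def propfind_status(host: str, path: str, cookie: str) -> int:
    """Send a WebDAV PROPFIND and return the HTTP status code."""
    conn = http.client.HTTPSConnection(host, timeout=30)
    conn.request("PROPFIND", path, headers=propfind_headers(cookie))
    status = conn.getresponse().status
    conn.close()
    return status

# Example (placeholder values):
# propfind_status("contoso.sharepoint.com",
#                 "/sites/Example/Shared%20Documents/",
#                 "rtFa=...;FedAuth=...")
# A 207 Multi-Status means the listing works; 403 means blocked.
```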
$ rclone lsf --absolute --files-only --max-age 10h --min-age 5m -vv --dump headers remote:
2025/09/15 16:34:40 NOTICE: Automatically setting -vv as --dump is enabled
2025/09/15 16:34:40 DEBUG : --min-age 5m0s to 2025-09-15 16:29:40.807661232 +0000 UTC m=-299.965650240
2025/09/15 16:34:40 DEBUG : --max-age 10h0m0s to 2025-09-15 06:34:40.807682918 +0000 UTC m=-35999.965628563
2025/09/15 16:34:40 DEBUG : rclone: Version "v1.71.0" starting with parameters ["rclone" "lsf" "--absolute" "--files-only" "--max-age" "10h" "--min-age" "5m" "-vv" "--dump" "headers" "remote:"]
2025/09/15 16:34:40 DEBUG : Creating backend with remote "remote:"
2025/09/15 16:34:40 DEBUG : Using config file from "/home/.config/rclone/rclone.conf"
2025/09/15 16:34:40 DEBUG : found headers:
2025/09/15 16:34:40 DEBUG : You have specified to dump information. Please be noted that the Accept-Encoding as shown may not be correct in the request and the response may not show Content-Encoding if the go standard libraries auto gzip encoding was in effect. In this case the body of the request will be gunzipped before showing it.
2025/09/15 16:34:40 DEBUG : You have specified to dump information. Please be noted that the Accept-Encoding as shown may not be correct in the request and the response may not show Content-Encoding if the go standard libraries auto gzip encoding was in effect. In this case the body of the request will be gunzipped before showing it.
2025/09/15 16:34:40 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2025/09/15 16:34:40 DEBUG : HTTP REQUEST (req 0xc0004fa280)
2025/09/15 16:34:40 DEBUG : POST /extSTS.srf HTTP/1.1
Host: login.microsoftonline.com
User-Agent: rclone/v1.71.0
Content-Length: 1455
Accept-Encoding: gzip
It looks like OneDrive/SharePoint WebDAV is (going to be?) deprecated. I tried to edit "Permission Level" in "Site permissions" and found this line: "Use Remote Interfaces - Use SOAP, Web DAV, the Client Object Model or SharePoint Designer interfaces to access the Web site. (Deprecated)".
A workaround for now is to use cookies, as suggested in the topic Sharepoint synchronization - #8 by codeye, and wait for an update from either Microsoft (maybe not) or rclone (maybe yes):
[NAME_OF_YOUR_REMOTE]
type = webdav
url = YOUR_SHAREPOINT_URL
vendor = other
user = XXX
pass = XXX
headers = Cookie,rtFa=xxx;FedAuth=xxx
To make the cookie workaround automatic, you could use Playwright to log in headless and refresh the cookies daily.
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Wed Sep 17 00:25:09 2025

# For cron jobs:
python3 get_cookis.py rclone_remote_name usr pwd https://<YOURS>.sharepoint.com/<YOURS>/

@author: ttllttttlltt
"""
from playwright.sync_api import sync_playwright
import sys
import time
import configparser

account_name = sys.argv[1]
EMAIL = sys.argv[2]
PASSWORD = sys.argv[3]
SHAREPOINT_URL = sys.argv[4]

rcloneconf = '<PATH_TO_YOUR_rclone.conf>/rclone.conf'

rtfastr = None
fedauthstr = None

with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)  # start with headless=False for debugging
    context = browser.new_context()
    page = context.new_page()
    page.goto(SHAREPOINT_URL)

    # Step 1: Enter email/username
    page.wait_for_selector('input[name="loginfmt"]')
    page.fill('input[name="loginfmt"]', EMAIL)
    page.click('input[type="submit"]')

    # Step 2: Enter password
    page.wait_for_selector('input[name="passwd"]')
    page.fill('input[name="passwd"]', PASSWORD)
    page.click('input[type="submit"]')

    # Step 3: Dismiss the "Stay signed in?" prompt if it appears
    try:
        page.wait_for_selector('input[id="idBtn_Back"]', timeout=5000)
        page.click('input[id="idBtn_Back"]')
    except Exception:
        pass

    # Wait until the SharePoint page is fully loaded
    # page.wait_for_load_state("networkidle")
    # page.wait_for_selector("div[data-automationid='SiteHeader']", timeout=10000)
    time.sleep(15)

    # Get the session cookies rclone needs
    cookies = context.cookies()
    for c in cookies:
        if c['name'] == 'rtFa':
            rtfastr = c['value']
        if c['name'] == 'FedAuth':
            fedauthstr = c['value']

    # Save session for reuse
    context.storage_state(path="sharepoint_state.json")

    browser.close()

if rtfastr is None or fedauthstr is None:
    sys.exit("Login did not yield rtFa/FedAuth cookies; check credentials or selectors.")

# Write the cookie header back into the rclone config
config = configparser.ConfigParser()
config.read(rcloneconf)
config[account_name]['headers'] = 'Cookie,rtFa=' + rtfastr + ';FedAuth=' + fedauthstr
with open(rcloneconf, 'w') as configfile:
    config.write(configfile)
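After a cron run, you can sanity-check that the script actually rewrote the remote's section before pointing rclone at it. A small sketch, using a throwaway temp file with made-up contents in place of the real ~/.config/rclone/rclone.conf (section name and values are placeholders):

```python
import configparser
import os
import tempfile

# Hypothetical rclone.conf contents standing in for the real file.
sample = """[mqms]
type = webdav
url = https://example.sharepoint.com/sites/Example/
vendor = other
user = XXX
pass = XXX
headers = Cookie,rtFa=abc;FedAuth=def
"""

with tempfile.NamedTemporaryFile('w', suffix='.conf', delete=False) as f:
    f.write(sample)
    path = f.name

config = configparser.ConfigParser()
config.read(path)
headers = config['mqms']['headers']
os.unlink(path)

# The headers line should carry both cookies, or the login step failed silently.
print(headers.startswith('Cookie,rtFa=') and 'FedAuth=' in headers)
# True
```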
Rclone worked really well for me for a long time to access OneDrive/SharePoint using WebDAV, until mid-year, when it stopped working and started returning those 403 FORBIDDEN errors. Since then I've only seen a couple of reports about it on this forum, and there's no straightforward fix that lets rclone access these remotes again. If it is definitely no longer possible to use rclone this way, you guys should update the documentation and clarify that here:
Yes, I agree, Thomas. I am still waiting on a solution. I am also trying other approaches (Microsoft's API), but it is really locked down, and I am waiting on a token just to be able to see if it works.