Rclone + Plex = error s1001 network

What is the problem you are having with rclone?

nobody "I think"

What is your rclone version (output from rclone version)

Rclone - rclone v1.53.3 - os/arch: linux/amd64 - go version: go1.15.5

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Ubuntu 20.04, 64-bit

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone mount cloud: /home/cloud/ --allow-other --drive-use-trash --dir-cache-time 48h --vfs-read-chunk-size 128M --vfs-read-chunk-size-limit 2G --buffer-size 256M --cache-chunk-no-memory --attr-timeout 1s --use-mmap --log-level INFO --fast-list &

The rclone config contents with secrets removed.

[cloud]
type = drive
client_id =
client_secret =
scope = drive
token =

A log from the command with the -vv flag

Hello everyone, I'm new and I'll explain my problem.

I installed Plex on a dedicated server,
then installed rclone and mounted Google Drive.

The problem is that Plex doesn't read my files; it gives the error s1001 network.

I've seen other open topics like this before, but I don't understand how to solve the problem.

I can see the files if I type ls,
but if I go into the Plex video info it doesn't give me any information.

I hope I've given you all the info you need.

The template has the information needed and you neglected to fill it out :frowning:

We spent a lot of time making the template, which collects all the relevant information, so if you could use it, that would be superb.

is that okay?

I didn't answer some of the questions because I'm inexperienced.

you need to post

  • the rclone command
  • debug log with the error in it.

you need to run the real mount command and use a debug log.
change --log-level INFO to --log-level DEBUG

Best to use default values for flags unless you are 1000% sure what each flag does.
Really, the fewer flags the better.

also
--fast-list does nothing on a mount, so you can remove that.
--drive-use-trash is not needed, as the default is true.
--cache-chunk-no-memory does nothing, so remove that; you are not using the cache backend.
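Applying those suggestions, the trimmed-down command would look something like this (remote name and mount point taken from the thread; a sketch, not a guaranteed fix):

```shell
# Original mount command with the redundant flags removed:
#   --fast-list             (no effect on a mount)
#   --drive-use-trash       (true by default)
#   --cache-chunk-no-memory (only affects the cache backend)
# and the log level raised from INFO to DEBUG.
rclone mount cloud: /home/cloud/ --allow-other --log-level DEBUG
```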

2020/12/06 17:21:47 DEBUG : rclone: Version "v1.53.3" starting with parameters ["rclone" "mount" "cloud:" "--log-level" "INFO" "to" "--log-level" "DEBUG"]
2020/12/06 17:21:47 DEBUG : Creating backend with remote "cloud:"
2020/12/06 17:21:47 DEBUG : Using config file from "/root/.config/rclone/rclone.conf"
2020/12/06 17:21:47 DEBUG : Google drive root '': root_folder_id = "" - save this in the config to speed up startup
2020/12/06 17:21:47 Fatal error: Can not open: to: open to: no such file or directory

no, that is not correct.

you need to use a log file and set log level to DEBUG

https://rclone.org/docs/#log-file-file

https://rclone.org/docs/#log-level-level

Can you give me an example of the command?

I've tried every way, but it doesn't give me any results.

I do this: rclone mount cloud: --log-level NOTICE --log-file=notice.txt

try
rclone mount cloud: /home/cloud/ --allow-other --log-level=DEBUG --log-file=/home/rclonelog.txt

2020/12/06 17:45:00 DEBUG : rclone: Version "v1.53.3" starting with parameters ["rclone" "mount" "cloud:" "/home/cloud/" "--log-level" "DEBUG" "--log-file=debug.txt"]
2020/12/06 17:45:00 DEBUG : Creating backend with remote "cloud:"
2020/12/06 17:45:00 DEBUG : Using config file from "/root/.config/rclone/rclone.conf"
2020/12/06 17:45:01 DEBUG : Google drive root '': root_folder_id = "0AHuM_j48UAXKUk9PVA" - save this in the config to speed up startup
2020/12/06 17:45:01 Fatal error: Directory is not empty: /home/cloud/ If you want to mount it anyway use: --allow-non-empty option

good, we are making progress.

  1. what is in that folder?
  2. make sure there are no other rclone mount commands running in the background, as you were using & in your previous commands
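To check point 2, something like the following can be run first (pgrep and fusermount are assumptions about the environment, not commands from the thread):

```shell
# List any rclone processes still running from an earlier '&' launch
pgrep -a rclone || echo "no rclone processes running"

# If a stale mount is found, unmount it before retrying, e.g.:
#   fusermount -u /home/cloud
```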

Yes, my material is in that folder.

I haven't mounted the cloud with rclone using that command before.

I'll try mounting it without those flags you mentioned.

You could have an rclone mount running in the background, since you used & in your earlier commands.

In the folder, do you see the files from the cloud: remote on Google Drive?

Do an ls /home/cloud and post the output.

now it works

Thank you very much!

glad we got it working!

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.