Getting input/output error while uploading


As my PC is running Windows and my server is running xpenology, my idea was to have either a Debian VM or a rpi3 serve an rclone share for my network.
At first I tried the Debian VM route. I mounted my encrypted gdrive (only the free 15 GB, for testing purposes) and created an SMB share.
Unfortunately I wasn't able to write to that share from my PC, though I was able to move, delete and rename files as well as create new folders.

As I didn't have the time and was stuck (I don't have much experience with Linux), I gave up on that attempt.
So now I want to give it a new try.

This time my OpenVPN rpi3 has to do the work. I guess it should be fast enough, as my internet connection is only 100/40 Mbit.
So I was planning to have two SMB shares.

The first one is the upload path, which is on an exFAT-formatted HDD connected to the rpi3.
The second one is the gdrivecrypt mount.
My idea was to copy all files that need uploading to the upload path, and to create a cron job that moves those files to gdrivecrypt.
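That move step could look something like the sketch below (a hypothetical helper, not tested against this exact setup; the paths are the ones mentioned later in this thread):

```shell
#!/bin/sh
# Hypothetical cron helper for this setup: moves everything that has
# landed in the upload share onto the rclone crypt mount.

move_uploads() {
    src="$1"
    dst="$2"
    for f in "$src"/*; do
        [ -e "$f" ] || continue    # nothing to do if the upload folder is empty
        mv -- "$f" "$dst"/
    done
}

# For this thread's paths, the cron job would call:
# move_uploads /media/UploadTemp /media/cloudcrypt/media
```

Run it from crontab, e.g. every 15 minutes: `*/15 * * * * /usr/local/bin/move-to-cloud.sh`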

The first few times it worked quite well.
But now I get an I/O error:
mv: failed to close '/media/cloudcrypt/media/testfile2.mkv': Input/output error

May that be because of my rclone options?

ExecStart=/usr/bin/rclone mount gdrivecrypt: /media/cloudcrypt \
--allow-other \
--buffer-size 1G \
--dir-cache-time 72h \
--drive-chunk-size 32M \
--fast-list \
--umask 002 \
--vfs-read-chunk-size 128M \
--vfs-read-chunk-size-limit off
ExecStop=/usr/bin/fusermount -uz /media/cloudcrypt

What can I do different?
Or is the whole solution, working with these two folders/paths a bad idea?

You are using 1G per file opened.
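Since `--buffer-size` is allocated per open file, worst-case memory use grows with the number of concurrent transfers. A quick back-of-envelope (the file count is just an example):

```shell
# Rough worst-case read-buffer memory for the mount above
buffer_mib=1024     # --buffer-size 1G
open_files=4        # e.g. four files being read/written at once
echo "$((buffer_mib * open_files)) MiB"   # prints "4096 MiB"
```

On a rpi3 with 1 GB of RAM, even a single 1G buffer is too much.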

Do you have a log from rclone?

So now I set it to
--buffer-size 100m \

But still I am not able to upload the files. Not even one file at a time works.

Where can I find the logfile?

You need to put

--log-level DEBUG --log-file /tmp/somelog.log

Well, I would like to provide you with a logfile, but it seems to be way too big.

I used the following command to copy it to the cloudcrypt path, as I wasn't able to copy files from the other smb share /media/UploadTemp:
sudo cp /tmp/somelog.log /media/cloudcrypt

But the logfile keeps growing; right now it is about 200 MB. Something seems to be wrong here :smiley:

Doing the log with --log-level INFO will probably show the problem (look for lines which say ERROR in them) and it will be much smaller! If not then we'll need the DEBUG log.
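Once such a log exists, the interesting lines can be pulled out with grep (log path as used earlier in this thread):

```shell
# Print only the ERROR lines from the rclone log.
# grep exits non-zero when nothing matches, so fall back to a note.
LOG=/tmp/somelog.log
grep ERROR "$LOG" 2>/dev/null || echo "no ERROR lines (or no log yet) in $LOG"
```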

Okay, I'll give that a try.

But apart from the error: is it a good approach to first copy the files to an upload folder and then have a cron job move them to the mounted rclone crypt drive?

Oh, I feel a little dumb right now. The gdrive I used for testing purposes was simply full.
I didn't know the bin doesn't empty itself.
So, next try; I will report back to you :wink:

Turned out the problem was really just my gdrive which was full.

Now it works.
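For what it's worth, both the quota and the bin can be handled from the command line, without the web UI (assuming the crypt wraps a Google Drive remote, as in the mount command above):

```shell
# Check the remaining quota on the remote behind the crypt
rclone about gdrivecrypt:

# Empty the Google Drive bin ("trash") from the command line
rclone cleanup gdrivecrypt:
```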

Hooray! Glad it works :slight_smile: The error message in the log probably would have said that the drive was full. Unfortunately that gets lost in translation through the VFS layer and comes out as an I/O error, which isn't wonderfully helpful.

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.