Questions? Ask them here! [full now - please start a new topic per question]

[crypthubic]
type = crypt
remote = remotehubic:backup
filename_encryption = standard

Hello, I am trying to install rclone on a NAS NSA325v2 Zyxel, but when I enter this command:
sudo cp -f rclone /usr/sbin/

I get this error:
cp: cannot create regular file '/usr/sbin/rclone': Read-only file system

Could anyone please help me?

thanks

That looks OK. And you don't see anything in the backup container?

does rclone ls remotehubic:backup show the encrypted files?

Maybe it is just Hubic control panel caching stuff?

Looks like /usr/sbin is read only. What does cat /proc/mounts say? You might be able to mount it rw, do the copy, then mount it ro:

mount -o remount,rw /usr
# do the copy
mount -o remount,ro /usr

Though /usr may not be the right thing to use depending on the contents of /proc/mounts
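As a quick sketch of what to look for (the device, filesystem, and options shown are hypothetical examples, not necessarily what your NAS will report):

```shell
# A sample /proc/mounts line from a box where /usr is a read-only loop mount
# (hypothetical; run "cat /proc/mounts" to see your real entries):
line='/dev/loop0 /usr ext2 ro,relatime 0 0'

# Field 4 holds the mount options; "ro" means read-only, "rw" writable.
opts=$(printf '%s\n' "$line" | awk '{print $4}')

case "$opts" in
  ro*) echo "/usr is read-only; a remount may be needed" ;;
  rw*) echo "/usr is already writable" ;;
esac
```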

I've got what is no doubt a noob question indicative of a lack of Linux knowledge.

I've got rclone working, sending encrypted backups to my Google Drive account. I've also got it set up to require a password to decrypt the configuration file.

I'd like to keep this password, but I'm finding it constrains my ability to pipe the output of the rclone command to, say, grep: nothing happens, because rclone is waiting for me to enter the config decryption password. This seems to rule out cron jobs too.

What am I missing?

Thanks for the quick answer, from root as root, I tried to enter:

mount -o remount,rw /usr

but I got the following error. I guess the system is very protected:

mount: cannot remount block device /dev/loop0 read-write, is write-protected

In order to avoid this problem, can rclone be installed on other directories?

I have these instructions; maybe with different ones I could install successfully:

unzip rclone-v1.33-linux-arm.zip
cd rclone-v1.33-linux-arm
#copy binary file
sudo cp rclone /usr/sbin/
sudo chown root:root /usr/sbin/rclone
sudo chmod 755 /usr/sbin/rclone
#install manpage
sudo mkdir -p /usr/local/share/man/man1
sudo cp rclone.1 /usr/local/share/man/man1/
sudo mandb

Thanks
Bye

You can set the password as an environment variable RCLONE_CONFIG_PASS - that should help.

So do

export RCLONE_CONFIG_PASS=your_password

Note if you do that your password will go into your history file which you may not want so you might prefer

read -sr RCLONE_CONFIG_PASS
export RCLONE_CONFIG_PASS

which will read the password without echoing it or storing it in your history
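For unattended runs (e.g. from cron), one approach is to keep the password in a file readable only by its owner and export it from the script. This is a sketch; the file path and remote name are assumptions, not rclone conventions:

```shell
# Store the config password in a file only the owner can read (demo path;
# in real use pick something like ~/.rclone-pass and create it once).
PASSFILE="${TMPDIR:-/tmp}/rclone-pass.demo"
printf 'my_config_password\n' > "$PASSFILE"
chmod 600 "$PASSFILE"

# In the cron script, load and export the variable before calling rclone:
RCLONE_CONFIG_PASS="$(cat "$PASSFILE")"
export RCLONE_CONFIG_PASS
# rclone sync /local/dir remote:backup   # remote name is hypothetical

rm -f "$PASSFILE"   # demo cleanup; a real password file would stay put
```

The password still sits on disk, so this only makes sense if you trust the machine's other users less than you trust its root account.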

You can install rclone anywhere you like; you'll just have to use it with the full path. So if you install it in /home/user you would then run it using /home/user/rclone.
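For example (the directory choice is arbitrary; ~/bin is just a common convention):

```shell
# Install into a per-user directory instead of the read-only /usr/sbin.
mkdir -p "$HOME/bin"
# cp rclone "$HOME/bin/" && chmod 755 "$HOME/bin/rclone"   # from the unzip dir

# Either call it by full path:
#   "$HOME/bin/rclone" ls remotehubic:backup
# ...or add the directory to PATH for this session:
export PATH="$HOME/bin:$PATH"
```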

You don't need to install the manpage - that is optional.

Your command shows the encrypted files:

~$ rclone ls remotehubic:backup
    73943 ctp0gg71paabob3cqmugfrijlk/r5urcnhs0l58i7tv3bfbhqrdl0/b75alt419ak66os68h7ck12lqg/fjm3loeof4j98tauntaiho6mgc
 37763102 hn09ah77vob7ke96rl0f4b76rfm9vtq0pdrbft911qf6kspgotv1irfg4kh70cn61bi83sjoorf5u
   533818 ctp0gg71paabob3cqmugfrijlk/aff2tu65r6dt4nk295q809h4c3a0j07hriqm0sklhd1qsho61adg
    73943 r5urcnhs0l58i7tv3bfbhqrdl0/b75alt419ak66os68h7ck12lqg/fjm3loeof4j98tauntaiho6mgc
   533818 aff2tu65r6dt4nk295q809h4c3a0j07hriqm0sklhd1qsho61adg
   533818 r5urcnhs0l58i7tv3bfbhqrdl0/aff2tu65r6dt4nk295q809h4c3a0j07hriqm0sklhd1qsho61adg
    73943 b75alt419ak66os68h7ck12lqg/fjm3loeof4j98tauntaiho6mgc
   533818 ctp0gg71paabob3cqmugfrijlk/cmkfve147egnrfm60vaksbhd2g/aff2tu65r6dt4nk295q809h4c3a0j07hriqm0sklhd1qsho61adg
    73943 ctp0gg71paabob3cqmugfrijlk/cmkfve147egnrfm60vaksbhd2g/b75alt419ak66os68h7ck12lqg/fjm3loeof4j98tauntaiho6mgc
   533818 ctp0gg71paabob3cqmugfrijlk/r5urcnhs0l58i7tv3bfbhqrdl0/aff2tu65r6dt4nk295q809h4c3a0j07hriqm0sklhd1qsho61adg
    73943 ctp0gg71paabob3cqmugfrijlk/b75alt419ak66os68h7ck12lqg/fjm3loeof4j98tauntaiho6mgc

But when I go to Hubic.com I do not see anything, so I am not sure where they are. I mean, they are indeed in the bucket… but where? Invisible on Hubic.com.
This is not OK, because since they are not counted against my storage space I cannot see how much space these files take. Hubic.com says my files take 1.67MB, which is not correct (and actually I see only one file, the one uploaded without crypt… it seems that Hubic.com has problems showing/counting encrypted files or, my guess, file formats it does not recognize).

Hi. I copied my Drive folder to my computer using rclone. I've deleted a few files in the local copy, and now I want to sync the local copy back to Drive. However, rclone returns errors when trying to sync Google-format files. Is there a filter for skipping these? Or some other method I should try?

Hubic doesn't seem to show you anything except the default bucket any more, which is strange…

You can see other buckets by fiddling with the URL so if you go to: https://hubic.com/home/browser/#backup/ you should see the files.

I expect hubic doesnā€™t update that counter very often - how does it look now?

What errors do you get?

Can you give an example of a file that is causing the problem?

If the problem files have particular names you can skip them using the filters, or you could put them all in a particular directory and skip that using filters.

Using your link I can see the files and folders, but there is no way to see them unless I use your link (that is, when I go to Hubic.com and browse I still cannot see them).
The counter still shows 1.67MB, so it's wrong or does not count (for whatever reason) encrypted files.
Well, I see this is a Hubic issue, not really an rclone one, so maybe I should stop bothering here…

I'm sure the hubic browser used to be able to show other buckets in an easy-to-use way.

I have stuff in other buckets and the hubic size shown is correct, so maybe it just takes a while to measure it.

Yeah, I will check again later and report back.

Have you thought about mega.co.nz? They offer a huge 50GB for free; it would be great to use rclone for that too.

Thanks again for your support. Appreciated.

I get several errors like this during the sync:
2016/10/08 07:00:12 Infrastructure support/Information (IT) Systems/Disaster Recovery/Planning docs/5 Disaster Recovery Plan Development.pptx: Failed to copy: can't update a google document

This file and the others like it originated as google documents online, were copied to my computer in the initial download, and now apparently can't be reconciled with the online versions. None of the local copies have changed in any way. It appears to me that the sync of "google_document.added_extension" to "google_document" isn't working.

Then at the end of the sync:
2016/10/08 07:00:49 Google drive root 'SIL Documents': not deleting files as there were IO errors
2016/10/08 07:00:49 Attempt 3/3 failed with 16 errors and: not deleting files as there were IO errors
2016/10/08 07:00:49 Failed to sync: not deleting files as there were IO errors

I see.

rclone downloads google docs as a convenience so you can back up your docs on drive, but it won't upload them again to drive. Issue #565 discusses this.

There are several workarounds… You could exclude "*.pptx". I'd recommend you put your google docs in a different directory though and exclude that.
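A sketch of the two approaches (the local path, remote name, and "Google Docs" directory name here are hypothetical):

```shell
# 1) Skip the problem files by extension:
#      rclone sync /local/docs drive:docs --exclude "*.pptx"
# 2) Keep Google docs in one directory and exclude the whole tree:
#      rclone sync /local/docs drive:docs --exclude "Google Docs/**"

# The same "*.pptx" glob idea, checked in plain shell:
f="Planning docs/5 Disaster Recovery Plan Development.pptx"
case "$f" in
  *.pptx) echo "excluded" ;;
  *)      echo "kept" ;;
esac
```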

Thanks for the clarifications and suggestions. I have too many files that are shared with many different people to make either of those solutions workable. I'll content myself for now with syncing from Drive to my computer and not the other direction.


That's great and it works. But is there some way to make the environment variable persistent that is not itself a security risk?

Thanks. I actually tried adding the AMAZON:subfolder to the crypt remote before, and it had no effect.

After reading your note, I went back to look at it again and found that my config changes were not being updated for some reason. So I found the rclone.conf file and manually edited it. And what do you know, it actually works.

On the down side, it means I can't actually add any more remotes until I figure out the problem with the config file.